
Incomplete


A good friend asked me to read a book, and this past weekend I read most of it. Since I do a fair amount of research, the book annoyed me. How? The author cited studies, some of which I estimate were relatively small, and wrote that subjects who did X had fewer cancers or heart problems than people who did Y; his thesis was that the evidence was overwhelmingly in favor of X. However, the author never said how many people were in each study he mentioned, nor how many actually contracted cancer or heart trouble. He used only percentages, which, to me, is a warning signal.

Suppose a study had 24 people, and two who followed X got cancer while four who were devotees of Y got cancer. It's technically fair to say that twice as many people who followed Y got cancer as those who followed X. But is it really fair? For one thing, it's a very small study, and in a small study, deviations of that size are normal and to be expected. It is distinctly not fair to use a small study to report double the incidence of anything; there is simply not a big enough sample. And finally, the reader would feel differently if he knew the writer was talking about four out of 24 people. If one uses a phrase like "twice as many," it's supposed to carry statistical meaning.
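To put a rough number on that intuition, here is a minimal sketch in Python that runs a Fisher's exact test on the hypothetical 24-person split above. The 2-of-12 and 4-of-12 figures are the illustrative numbers from this example, not data from any study the book actually cited.

# A minimal sketch: does "twice as many" mean anything in a 24-person study?
# The 2-of-12 vs. 4-of-12 split is the hypothetical example above, not data
# from any actual study.
from scipy.stats import fisher_exact

x_cancer, x_healthy = 2, 10   # group following X: 2 of 12 got cancer
y_cancer, y_healthy = 4, 8    # group following Y: 4 of 12 got cancer

table = [[x_cancer, x_healthy],
         [y_cancer, y_healthy]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"Odds ratio: {odds_ratio:.2f}, p-value: {p_value:.2f}")
# The p-value comes out around 0.6, far above any conventional 0.05
# threshold, so the "twice as many" difference could easily be chance.

In other words, with a sample this small, doubling the raw percentage tells the reader almost nothing unless the counts come with it.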

So, if you're producing a study, or even illustrating an idea such as the last 10 years of performance of Fund X vs. Fund Y, please be complete, not incomplete.

Have a complete week and do good and complete work, okay?