Principles and guidelines for applied statistics

What I would like to contribute regarding principles and guidelines for the practice of statistics is nowhere near as deep as Sander Greenland’s wonderful words, which have given me food for thought on several ideas I’d like to return to in the future. For now I’d like to offer these practical ideas.

First, I spend as much time on refining measurements as on anything else, especially the outcome variable Y. Researchers do not have a good understanding of the damage done by poor precision/resolution in Y, often going so far as to do the dreaded “responder analysis.” In general, statisticians spend too little time on measurement. Stephen Senn wrote an excellent article about this. Stunningly, he was criticized by other statisticians who felt that measurement is not the domain of statistics. Far too often statisticians, especially at pharmaceutical companies but also frequently in academia, have completely abdicated their responsibility to maximize power and precision by taking clinicians’ choices of measurements for granted, not even pointing out that in some cases this unnecessarily triples the required sample size.
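The cost of a “responder analysis” is easy to see by simulation. The sketch below, with illustrative parameter choices of my own (effect size 0.5 SD, dichotomization at the median, 64 patients per arm), compares the power of analyzing a continuous Y directly against the power after dichotomizing it into responder/non-responder:

```python
# Illustrative simulation (not from the post): power lost by
# dichotomizing a continuous outcome Y ("responder analysis").
# Effect size, threshold, n, and alpha are all assumed values.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def power(n, delta=0.5, threshold=0.0, sims=2000, alpha=0.05):
    """Estimate power of two analyses of the same trial, n per arm."""
    cont_hits = dich_hits = 0
    for _ in range(sims):
        a = rng.normal(0.0, 1.0, n)    # control arm
        b = rng.normal(delta, 1.0, n)  # treated arm, shifted by delta SDs
        # Analysis of the continuous outcome: two-sample t-test
        if stats.ttest_ind(a, b).pvalue < alpha:
            cont_hits += 1
        # "Responder" analysis: dichotomize Y at the threshold,
        # then compare responder proportions with a chi-square test
        table = [[(a > threshold).sum(), (a <= threshold).sum()],
                 [(b > threshold).sum(), (b <= threshold).sum()]]
        chi2, p, dof, expected = stats.chi2_contingency(table)
        if p < alpha:
            dich_hits += 1
    return cont_hits / sims, dich_hits / sims

p_cont, p_dich = power(n=64)
print(f"power, continuous Y:   {p_cont:.2f}")
print(f"power, dichotomized Y: {p_dich:.2f}")
```

With these settings the continuous analysis sits near the conventional 0.80 power while the dichotomized analysis falls well below it, despite using the same patients. Dichotomizing a normal Y at its median has asymptotic efficiency of roughly 2/π ≈ 0.64 relative to using Y itself, and the loss grows when the cutpoint moves away from the median, which is how the multiplied sample sizes mentioned above arise.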

I’ve laid out the general principles of my statistical practice here; they are reproduced below.

  1. Use methods grounded in theory or extensive simulation
  2. Understand uncertainty, and realize that the most honest approach to inference is a Bayesian model that takes into account what you don’t know (e.g., Are variances equal? Is the distribution normal? Should an interaction term be in the model?)
  3. Design experiments to maximize information
  4. Understand the measurements you are analyzing and don’t hesitate to question how the underlying information was captured
  5. Be more interested in questions than in null hypotheses, and be more interested in estimation than in answering narrow questions
  6. Use all information in data during analysis
  7. Use discovery and estimation procedures not likely to claim that noise is signal
  8. Strive for optimal quantification of evidence about effects
  9. Give decision makers the inputs (other than the utility function) that optimize decisions
  10. Present information in ways that are intuitive, maximize information content, and are correctly perceived
  11. Give the client what she needs, not what she wants
  12. Teach the client to want what she needs