Project description
With numerous failures to replicate, common misreporting of results, widespread failure to publish non-significant results or to share data, and considerable potential for bias due to the flexibility of data analysis and researchers' tendency to exploit that flexibility, psychological science is said to be experiencing a crisis of confidence. These issues lead to the dissemination of false positive results and inflate effect size estimates in meta-analyses, which in turn results in poor theory building, an inefficient scientific system, wasted resources, lower trust in psychological science, and outcomes that are less useful for society. Having contributed to the literature highlighting these problems, I now aim in this ERC project to improve psychological science by offering novel solutions to five vexing challenges:

(1) I want to counter misreporting of results by using our new tool statcheck in several studies of reviewers' tendency to demand perfection, and by applying it to actual peer review (the kind of consistency check involved is sketched below).
(2) I want to counter the biasing effects of common explorations of data (p-hacking) by advocating and studying pre-registration, and by developing promising new approaches, blind analysis and cross-validation using differential privacy, that simultaneously allow exploration and confirmation with the same data.
(3) I want to counter the common problem of selective outcome reporting in psychological experiments by developing powerful latent variable methods that render it fruitless not to report all outcome variables in a study.
(4) I want to counter the problem of publication bias by studying and correcting misinterpretations of non-significance.
(5) I want to develop and refine meta-analytic methods that allow for the correction of biases that currently inflate effect size estimates and obscure moderation.

The innovative tools I develop have the potential to improve the way psychologists (and other scientists) analyse data, disseminate findings, and draw inferences.
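To make concrete the kind of misreporting targeted in (1), the sketch below recomputes the p-value implied by a reported t-test result and flags reports whose stated p-value does not match it. This is a minimal Python illustration of the general idea only, not the statcheck tool itself (statcheck is an R package); the function name check_t_report, the rounding tolerance, and the example values are assumptions made for illustration.

```python
# Minimal illustration of a statcheck-style consistency check (not the
# statcheck package itself): recompute the p-value implied by a reported
# t-test result and compare it with the p-value stated in the paper.
from scipy import stats

def check_t_report(t_value: float, df: int, reported_p: float,
                   decimals: int = 2) -> dict:
    """Compare a reported two-tailed p-value with the p-value recomputed
    from t(df) = t_value, allowing for rounding to `decimals` places."""
    recomputed_p = 2 * stats.t.sf(abs(t_value), df)
    consistent = abs(recomputed_p - reported_p) < 0.5 * 10 ** -decimals
    # A "gross" inconsistency: reported and recomputed p-values fall on
    # different sides of the conventional alpha = .05 threshold.
    decision_error = (reported_p < .05) != (recomputed_p < .05)
    return {"recomputed_p": round(recomputed_p, 4),
            "consistent": consistent,
            "decision_error": decision_error}

# "t(28) = 2.20, p = .04" is consistent within rounding;
# "t(28) = 1.50, p = .04" is flagged as a decision error.
print(check_t_report(2.20, 28, 0.04))
print(check_t_report(1.50, 28, 0.04))
```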