Innovative Methods for Psychology: Reproducible, Open, Valid and Efficient
Interesting projects (code | title | budget | status)
PID2019-104080GB-I00 | META-ANALISIS: ESTUDIO DE SU REPLICABILIDAD Y SU PAPEL EN LA... | 30K€ | Closed
PID2019-104033GA-I00 | EXPLORANDO NUEVAS PREGUNTAS EN LAS SINTESIS DE LA INVESTIGAC... | 36K€ | Closed
PSI2017-82490-P | META-ANALISIS Y SESGO DE PUBLICACION: DESARROLLO DE LA ESTRA... | 19K€ | Closed
TransparencyMeters | Transparency instruments to quantify the method transparency... | 161K€ | Closed
PHIL_OS | A Philosophy of Open Science for Diverse Research Environmen... | 2M€ | Closed
PID2021-122404NB-I00 | REFORMULANDO EL MODELO META-ANALITICO DE EFECTOS ALEATORIOS | 50K€ | Closed
IMPROVE project information
Project duration: 66 months
Start date: 2017-05-29
End date: 2022-11-30
Participation deadline: none.
Project description
With numerous failures to replicate, common misreporting of results, widespread failure to publish non-significant results or to share data, and considerable potential for bias due to the flexibility of data analyses and researchers' tendency to exploit that flexibility, psychological science is said to be experiencing a crisis of confidence. These issues lead to the dissemination of false-positive results and inflated effect size estimates in meta-analyses, which in turn produce poor theory building, an inefficient scientific system, wasted resources, lower trust in psychological science, and outcomes that are less useful for society. After having contributed to the literature highlighting these problems, the goal of my ERC project is to improve psychological science by offering novel solutions to five vexing challenges:
(1) I want to counter misreporting of results by using our new tool statcheck in several studies of reviewers' tendency to demand perfection and by applying it to actual peer review.
(2) I want to counter the biasing effects of common explorations of data (p-hacking) by promoting and studying pre-registration and by developing promising new approaches, blind analysis and cross-validation using differential privacy, that simultaneously allow exploration and confirmation with the same data.
(3) I want to counter the common problem of selective outcome reporting in psychological experiments by developing powerful latent variable methods that make it fruitless not to report all outcome variables in a study.
(4) I want to counter the problem of publication bias by studying and correcting misinterpretations of non-significance.
(5) I want to develop and refine meta-analytic methods that allow for the correction of biases that currently inflate effect estimates and obscure moderation.
The innovative tools I develop have the potential to improve the way psychologists (and other scientists) analyse data, disseminate findings, and draw inferences.
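Of the methods named above, statcheck is an existing R package that extracts APA-style reported statistics from manuscripts and recomputes their p-values to flag inconsistencies. As a rough illustration of the kind of consistency check involved (a simplified Python sketch, not the statcheck implementation; the function name and the tolerance rule here are hypothetical), consider:

```python
from scipy import stats

def check_reported_p(t_value, df, reported_p, alpha=0.05, tol=0.005):
    """Recompute a two-tailed p-value from a reported t statistic and
    compare it with the p-value stated in the paper; `tol` allows for
    rounding in the reported value."""
    recomputed_p = 2 * stats.t.sf(abs(t_value), df)
    consistent = abs(recomputed_p - reported_p) <= tol
    # A gross ("decision") error: reported and recomputed p-values fall on
    # different sides of the significance threshold.
    decision_error = (reported_p < alpha) != (recomputed_p < alpha)
    return recomputed_p, consistent, decision_error

# "t(28) = 2.20, p = .04": the recomputed p is about .036, so the rounded
# report is consistent and the significance decision is unaffected.
print(check_reported_p(t_value=2.20, df=28, reported_p=0.04))
```

For challenge (5), the baseline that bias-correcting meta-analytic methods start from is the standard random-effects model, in which study effect sizes are pooled with weights combining within-study sampling variance and the estimated between-study variance. The following is a minimal sketch of the textbook DerSimonian-Laird estimator with hypothetical toy data; it is background for the challenge, not the project's proposed reformulation.

```python
import numpy as np

def random_effects_meta(effects, variances):
    """Pool study effect sizes with DerSimonian-Laird random-effects weights."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                      # fixed-effect weights
    mean_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - mean_fixed) ** 2)            # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)          # between-study variance
    w_star = 1.0 / (v + tau2)                        # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se, tau2

# Five hypothetical standardized mean differences with their sampling variances.
print(random_effects_meta([0.30, 0.12, 0.45, 0.05, 0.25],
                          [0.04, 0.03, 0.05, 0.02, 0.03]))
```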