Commit 44d4431

Add some references about reproducibility of research. (#19)
1 parent f8a6342 commit 44d4431

File tree: 3 files changed, +104 −2 lines

bibliography.bib

Lines changed: 87 additions & 0 deletions
@@ -489,3 +489,90 @@ @dataset{knight_2025_17250038
 	doi = {10.5281/zenodo.17250038},
 	url = {https://doi.org/10.5281/zenodo.17250038},
 }
+
+@article{open_science_collaboration2015,
+	title={Estimating the reproducibility of psychological science},
+	author={{Open Science Collaboration}},
+	journal={Science},
+	year={2015},
+	volume={349},
+	number={6251},
+	pages={aac4716},
+}
+
+@article{camerer2016,
+	title={Evaluating replicability of laboratory experiments in economics},
+	author={Camerer, Colin F. and others},
+	journal={Science},
+	year={2016},
+	volume={351},
+	number={6280},
+	pages={1433--1436},
+}
+
+@article{camerer2018,
+	title={Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015},
+	author={Camerer, Colin F. and others},
+	journal={Nature Human Behaviour},
+	year={2018},
+	volume={2},
+	pages={637--644},
+}
+
+@article{silberzahn2018,
+	title={Many analysts, one dataset: Making transparent how variations in analytical choices affect results},
+	author={Silberzahn, Raphael and others},
+	journal={Advances in Methods and Practices in Psychological Science},
+	year={2018},
+	volume={1},
+	number={3},
+	pages={337--356},
+}
+
+@article{breznau2022,
+	title={Observing many researchers using the same data and hypothesis reveals a hidden universe of uncertainty},
+	author={Breznau, Nate and others},
+	journal={PNAS},
+	year={2022},
+	volume={119},
+	number={44},
+	pages={e2203150119},
+}
+
+@article{sandve2013,
+	title={Ten simple rules for reproducible computational research},
+	author={Sandve, Geir Kjetil and others},
+	journal={PLoS Computational Biology},
+	year={2013},
+	volume={9},
+	number={10},
+	pages={e1003285},
+}
+
+@article{wilson2017,
+	title={Good enough practices in scientific computing},
+	author={Wilson, Greg and others},
+	journal={PLoS Computational Biology},
+	year={2017},
+	volume={13},
+	number={6},
+	pages={e1005510},
+}
+
+@article{stodden2018,
+	title={Enhancing reproducibility for computational methods},
+	author={Stodden, Victoria and Seiler, Jennifer and Ma, Zhaokun},
+	journal={PNAS},
+	year={2018},
+	volume={115},
+	number={11},
+	pages={2561--2570},
+}
+
+@book{turingway2022,
+	title={The Turing Way: A Handbook for Reproducible, Ethical and Collaborative Data Science},
+	author={{The Turing Way Community}},
+	year={2022},
+	note={Zenodo. DOI: 10.5281/zenodo.3233853}
+}

paper/main.pdf

3.46 KB (binary file, contents not shown)

paper/main.tex

Lines changed: 17 additions & 2 deletions
@@ -151,11 +151,26 @@ \section*{Main}\label{sec:introduction}
 strategies against small, unrepresentative sets of opponents. Such practices
 bias conclusions and weaken claims about the relative performance of new
 strategies.
+
+These challenges are not limited to the \IPD{} literature.
+Reproducibility failures have been widely documented across the social sciences
+and economics, with large-scale replication projects revealing that only around
+half of published findings hold up under independent
+scrutiny~\cite{open_science_collaboration2015, camerer2016, camerer2018}. Computational
+research adds further complexity, as analytic flexibility and non-transparent
+workflows can yield highly variable conclusions even from identical
+data~\cite{silberzahn2018, breznau2022}. Within game theory and related modelling work,
+the challenge of reproducibility intersects with simulation code, algorithmic
+implementation, and data provenance.
 An important step toward addressing this issue has been the
 \texttt{Axelrod-Python} project~\cite{AxelrodProject}, an open-source Python
 package that provides a comprehensive framework for implementing and testing
 \IPD{} strategies. The library includes a wide variety of strategies from the
-literature, together with detailed documentation and usage examples. By
+literature, together with detailed documentation and usage examples.
+This project illustrates best practice by providing fully open, tested, and version-controlled
+artifacts, embodying community principles outlined in reproducibility
+guides~\cite{wilson2014, sandve2013, wilson2017, stodden2018, turingway2022}.
+By
 providing open, executable implementations, \AXL{} makes it possible to test
 strategies under common conditions and compare their performance systematically,
 and it has therefore been used in ongoing research~\cite{Harper2017,

@@ -713,7 +728,7 @@ \section*{Conclusion}\label{sec:discussion}
 the first effort to package and reproduce, according to contemporary best
 practices, code originally written in the 1980s. The archived materials~\cite{knight_2025_17250038} (at
 \url{https://doi.org/10.5281/zenodo.17250038})
-are curated to high standards of reproducible research~\cite{wilson2014} and
+are curated to high standards of reproducible research~\cite{wilson2014, sandve2013, wilson2017, stodden2018, turingway2022} and
 accompanied by a fully automated test suite. All changes to the original code
 were made systematically and transparently, with complete records available at
 (\url{https://github.com/Axelrod-Python/axelrod-fortran}).
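The diff above argues that open, executable strategy implementations let researchers test \IPD{} strategies under common conditions. A minimal self-contained sketch of what such a match looks like is below. This is not the Axelrod-Python API; the function names are illustrative, and the payoffs are the standard prisoner's dilemma values R=3, P=1, T=5, S=0.

```python
# Sketch only: illustrative names, not the Axelrod-Python library API.
# Payoff to the row player for (my_move, their_move); "C" cooperate, "D" defect.
PAYOFFS = {
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(opponent_history):
    """Cooperate on the first turn, then copy the opponent's last move."""
    return "C" if not opponent_history else opponent_history[-1]

def always_defect(opponent_history):
    """Defect unconditionally."""
    return "D"

def play_match(strategy_a, strategy_b, turns):
    """Play `turns` rounds and return cumulative (score_a, score_b)."""
    hist_a, hist_b = [], []  # each list records the *opponent's* moves
    score_a = score_b = 0
    for _ in range(turns):
        move_a = strategy_a(hist_a)
        move_b = strategy_b(hist_b)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print(play_match(tit_for_tat, always_defect, 5))  # -> (4, 9)
```

Because a match is just a deterministic function of the two strategies and the turn count, results like these can be pinned in an automated test suite, which is the kind of reproducibility practice the commit's added references describe.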
