Scholarly research is at the forefront of innovation, especially with a breadth of new technologies that can enhance the research process. However, in a race for scholars to produce more and more new findings, documentation practices and reproduction of results may be neglected. Lack of validation through reproduction can lead to a general distrust of scholarly research and experiments, but a more generous approach to information sharing could be the answer to this issue. Scholars have connected socially for centuries to share their ideas, and this practice has led to some truly innovative ideas that have shaped our world today. With willing participants sharing their ideas and their research methods, new findings can be reproduced and validated, creating a stronger and more trustworthy community of scholars.
Consume, Reproduce, Extend and Connect:
Sustaining Our Research Lifecycle
by Richard P. Johnson
The continuing dramatic increase in computational power has spurred a new age of scientific computational analysis that has allowed us to test new ideas and simulate complex systems in ways that were previously impossible. We can now simulate biological systems, process huge amounts of observational data and predict interactions between weather systems with greater and greater precision. However, in our race to produce these results, our ability to capture, document, share and reproduce our experiments has not kept pace with our ambition. This challenge has put the most critical piece of establishing scientific theories in jeopardy: proper vetting through reproduction and validation of results. As a community, we are still very immature in our methods and processes for effectively capturing computational analysis and in turn reproducing the conditions necessary to verify results. One could argue that if we are still making progress, it may be enough to accept these limitations and shift our proving ground from the lab to the real world. In other words, do we really need to document information in order to share it for others to reproduce our results? There are critical concerns lying beneath the surface that we must consider.
The Roots of Inspiration
Regardless of whether someone consumes past work to examine, reproduce, validate or incorporate it, the process of consuming and considering the work and ideas of others is a critical part of the scholarly process. In his 2010 TED Talk, Steven Johnson outlines how innovation typically happens through the organic process of scholarly dialogue. He posits that the proliferation of new ideas during the Age of Enlightenment in the 18th century emerged largely from a new culture of scholars gathering together in the coffeehouses of Europe. What is it about this social environment that is so impactful? When you think of brilliant advancements over the past few centuries, these sparks of innovation seem to come from nowhere. Furthermore, a certain amount of hyper-focus may seem to produce eureka moments in isolation from others. Sir Isaac Newton is well known to have been extremely reclusive when he produced his greatest works. However, even Newton built upon the work of others, such as his derivation in the Principia of Johannes Kepler's laws of planetary motion. In turn, when you examine the work and human influences of those individuals, the real picture emerges.
Johnson concludes that eureka moments are actually the product of dialogue and of ideas gestating over time; the eureka moment is simply the moment when everything crystallizes. The development of neural networks is an apt analogy for how ideas are formed: connections are made through a network of individuals, with one idea branching from another. Consider the classic example of Benjamin Franklin's kite experiment in June 1752, often thought of as a leap in understanding far ahead of his peers. Upon closer examination, however, Franklin was not the first to suggest that lightning was electricity, but the first to suggest that it could be proved experimentally. Franklin had active correspondence with many colleagues in France and England. Papers from electricians such as John Freke in England and Johann Heinrich Winkler in Germany had already noted similarities between lightning and electrical discharge in 1746. Franklin was not even the first to perform his experiment: Thomas-François Dalibard performed a variation of the experiment with an iron rod one month before Franklin's attempt.
The dissemination of Benjamin Franklin's results also embodies a critical part of the process: consideration and validation of his work. He detailed his experiments through a series of letters that he shared with Peter Collinson, who in turn shared them with the Royal Society of London, whose members then reproduced his experiments. Interestingly, in Franklin's first publication of the experiment in the Pennsylvania Gazette he did not claim to have conducted the experiment himself, which has contributed to speculation that Franklin may not have conducted the experiment at all. Regardless, Franklin inspired the work of other researchers and scholars, and his correspondence and dissemination of his findings still underpin the scholarly communications process of today.
Scholarly Communication Today
For many, the topic of scholarly communication probably evokes digital publishing, discussion forums and peer-reviewed journals. However, digging deeper reveals that scholarly communication is fundamentally rooted in a scholarly dialogue that has existed for centuries. Broken into its fundamental elements, this dialogue boils down to a few steps: consume, consider and extend. Each of the primitive elements of the scientific method – observe, hypothesize, gather data, test, refine and conclude – can also be mapped to one of these fundamental steps. If we expand the process beyond one individual, toward communities establishing scientific theories, trust in parallel work cannot be established without proper validation and testing; therefore, in the scientific scholarly workflow we need to insert a fourth step, "reproduce," before "extend."
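The lifecycle steps above can be sketched in code. This is a minimal illustration, not from the article: the specific mapping of each scientific-method primitive to a lifecycle step is a hypothetical assignment for demonstration, since the article names the mapping but does not enumerate it.

```python
# Fundamental steps of the scholarly dialogue, with "reproduce" inserted
# before "extend" as the article argues.
LIFECYCLE = ["consume", "consider", "reproduce", "extend"]

# Hypothetical mapping of scientific-method primitives to lifecycle steps;
# the article states each primitive maps to one step but does not list which.
METHOD_TO_STEP = {
    "observe": "consume",
    "hypothesize": "consider",
    "gather data": "extend",
    "test": "reproduce",
    "refine": "consider",
    "conclude": "extend",
}

def ordered_steps(primitives):
    """Return the lifecycle steps touched by the given primitives, in order."""
    touched = {METHOD_TO_STEP[p] for p in primitives}
    return [step for step in LIFECYCLE if step in touched]
```

Modeling the workflow this way makes the article's point concrete: every activity of the scientific method lands somewhere in the consume-consider-reproduce-extend cycle.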
As we have shifted to digital documentation and analytical tools that cannot simply be documented and shared in a text document, we have gained both more power and greater complexity in the materials surrounding experiments. This complexity presents a paradox: it is now both more critical and more challenging to share all the information necessary to consume and understand the work of our peers. Our methods are largely tuned to capturing text-based supplementary materials such as handwritten lab notes and calculations; the diverse set of digital materials and formats, and the computational environments themselves, are proving extremely difficult to capture and share. As Stodden and Miguez state, "Without the data and computer codes that underlie scientific discoveries, published findings are all but impossible to verify." The difficulty of obtaining this information challenges the integrity of our system.
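One possible first step toward capturing a computational environment is to record its basic facts as metadata archived alongside the data and code. The sketch below is one illustrative approach, not a method from the article, and records only the coarsest details; real reproducibility tooling captures far more (package versions, hardware, random seeds).

```python
# Sketch: snapshot the basic computational environment so that others can
# at least attempt to recreate the conditions of an experiment.
import json
import platform
import sys

def capture_environment():
    """Record basic facts of the computational environment as metadata."""
    return {
        "python_version": sys.version.split()[0],
        "implementation": platform.python_implementation(),
        "platform": platform.platform(),
        "machine": platform.machine(),
    }

# The snapshot can be serialized and archived next to the data and code
# it describes.
snapshot = json.dumps(capture_environment(), indent=2)
```

Even a crude snapshot like this addresses part of the gap Stodden and Miguez identify: without it, a reader cannot even know what environment produced a published result.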
Consider a simplified view of a sample network of materials and scholarship in which Researcher 1 wrote software for a lab that produced a set of data and subsequently a publication. That data is then referenced and reused in an experiment by Researcher 2, producing a second publication. Now consider that the original data and software are archived and shared in one repository, while the derivative data and publications reside in other databases and repositories. Even if a single repository possesses the technical capability, intellectual property and copyright concerns may prevent related scholarship from being accessed or stored in one location.
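The sample network above can be modeled as a small directed graph. This is a minimal sketch under assumed names (the artifact and repository labels are illustrative, not from the article); it shows how tracing even a two-researcher lineage already spans multiple repositories.

```python
# Sketch: a provenance graph of scholarly artifacts, each hosted somewhere,
# with edges from an artifact to the artifacts derived from it.
from collections import defaultdict

class ScholarlyGraph:
    """Directed graph of scholarly artifacts and their hosting repositories."""

    def __init__(self):
        self.edges = defaultdict(list)   # artifact -> derived artifacts
        self.repository = {}             # artifact -> hosting repository

    def add(self, artifact, repo):
        self.repository[artifact] = repo

    def link(self, source, derived):
        self.edges[source].append(derived)

    def repos_needed(self, artifact):
        """Every repository a reader must reach to trace an artifact's lineage."""
        seen, stack, repos = set(), [artifact], set()
        while stack:
            node = stack.pop()
            if node in seen:
                continue
            seen.add(node)
            repos.add(self.repository[node])
            stack.extend(self.edges[node])
        return repos

# Hypothetical instance of the article's example network.
g = ScholarlyGraph()
g.add("software-1", "repo-A"); g.add("data-1", "repo-A")
g.add("publication-1", "repo-B"); g.add("publication-2", "repo-C")
g.link("software-1", "data-1")
g.link("data-1", "publication-1")
g.link("data-1", "publication-2")
```

Tracing `software-1` forward already touches three separate repositories; real scholarship networks are far larger, which is the heart of the access problem.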
The often-cited U.S. Office of Science and Technology Policy 2013 memo on increasing access to the results of federally funded scientific research is just one official recognition of the need to capture related materials (data and software) in order to properly understand, validate, reuse and inspire the work of others. Franklin might never have conceived of his experiment had he not consumed the works of other scholars and made personal connections with scholars in Europe. The scattering of related materials across multiple systems requires us to add an additional step to our process: "connect."
Remaking Our Process
The scholarly sharing and dissemination process of Franklin's day was slow; however, even with our rapid communications technology, we must still work out how to sustain these elemental steps going forward. Though some have come to this realization, our technology, processes and systems are only beginning to address these needs. Ensuring future access to related information requires capturing and preserving information at multiple steps in the research lifecycle. In the United States, efforts like the SHARE project, led by the Association of Research Libraries and the Center for Open Science, and the VIVO and RMap projects are built to capture and link connections between scholarly works. OpenAIRE in Europe and La Referencia in Latin America aggregate records of scholarly publications and data, similar to SHARE's coverage of North America. The National Data Service is an effort largely focused on tools embedded in the computational environment and presents great potential to put tools into the hands of researchers so that the computational environment can be properly captured for others. The Confederation of Open Access Repositories (COAR) also works with the international repository community to align practices and policies. Numerous technology platforms are used for preservation and data management, such as Fedora Repository, DSpace, EPrints, iRODS and the Open Science Framework, plus a whole landscape of vendor-hosted services. Artificial barriers still exist between these services, though, and we must be persistent and steadfast despite any stumbles.
In many cases, these systems are governed by separate implementing organizations and geopolitical boundaries. The application of standards (that is, a common way to structure and encode information) is also still inconsistent. Imagine researchers each working in a different language, and then imagine each speaks a language used by only a dozen individuals. Even where standards are applied, the proliferation of standards used in the structure and exchange of information further complicates what is necessary to even approach a universal network of information sharing. A culture with incentives and pressures to produce novel findings has also lowered the incentives to validate previous results. Consequently, there is less funding for validation and reproduction of results and, more importantly, less prestige in doing so. This environment, in turn, lowers trust and confidence in the system as a whole, leaving room for more and more falsified experiments to be accepted with minimal scrutiny.
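The standards problem above can be made concrete with a toy crosswalk: when two repositories describe the same record in different schemas, every exchange requires a field-by-field translation, and unmapped fields are silently lost. The field names below are purely illustrative and not drawn from any real metadata standard.

```python
# Sketch: a metadata crosswalk between two hypothetical repository schemas.
# Any field of schema A with no counterpart in schema B is dropped in transit.
CROSSWALK = {            # schema A field -> schema B field
    "creator": "author",
    "title": "title",
    "date_issued": "publication_date",
}

def translate(record_a):
    """Translate a schema-A record to schema B, dropping unmapped fields."""
    return {CROSSWALK[k]: v for k, v in record_a.items() if k in CROSSWALK}
```

With dozens of schemas in circulation, the number of crosswalks needed grows quadratically, and each one can lose information, which is why standards proliferation undermines a universal network of sharing.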
A study from the Center for Open Science by Nosek et al. that attempted to replicate 100 social science experiments revealed that only 36% of the replications produced significant results. They note, however, that some results are expected not to be reproducible and that when they combined results from the original data with data from the replications, 68% produced significant results. This number is still relatively low and casts doubt on the integrity of our current system of scholarly review. A follow-up study in progress to replicate cancer studies is also revealing flaws in our system. While initial efforts have been unable to replicate results, this may speak more to a lack of proper documentation than to the integrity of the results themselves, yet it still leaves doubt about which studies we can trust and which we cannot.
These issues will not be solved by will alone, nor can they be solved with one coordinated project or system. Overcoming these barriers will continue to take multiple iterations (generations) of software and systems, organizations working together and real dollars. It will take time to organize communities around these efforts, along with a willingness to maintain the diversity of systems necessary to meet a comprehensive set of needs. Resolving the cultural issues will require the willing to absorb pain, demonstrate success and create new incentives before new cultural norms emerge. I would argue it should be as prestigious to discredit or disprove the findings of another as it is to posit a new hypothesis. We love to debate, and we need to maintain this component of our research process. Otherwise, the integrity of our system may be only an illusion.
Resources Mentioned in the Article
 Johnson, S. (July 2010). Steven Johnson: Where good ideas come from [Video file]. Retrieved from www.ted.com/talks/steven_johnson_where_good_ideas_come_from
 The Kite Experiment, 19 October 1752. (2016). Founders Online. Retrieved from http://founders.archives.gov/documents/Franklin/01-04-02-0135. [Original source: The Papers of Benjamin Franklin, vol. 4, July 1, 1750, through June 30, 1753, L. W. Labaree. (Ed.). New Haven: Yale University Press, 1961, pp. 360–369.]
 Benjamin Franklin. (2017). Wikipedia. Retrieved from https://en.wikipedia.org/wiki/Benjamin_Franklin
 Shiffer, M. B. (2003). Did Franklin really fake the kite experiment? History News Network. Retrieved from http://historynewsnetwork.org/article/1770
Stodden, V., & Miguez, S. (2014). Best practices for computational science: Software infrastructure and environments for reproducible and extensible research. Journal of Open Research Software, 2(1), e21, 1-6. doi: http://doi.org/10.5334/jors.ay
 U.S. Office of Science and Technology Policy (2013). Memorandum for the heads of executive departments and agencies: Increasing access to the results of federally funded scientific research. Retrieved from https://obamawhitehouse.archives.gov/blog/2013/02/22/expanding-public-access-results-federally-funded-research
 Nosek, B. A., Spies, J., & Motyl, M. (May 25, 2012). Scientific utopia: II. Restructuring incentives and practices to promote truth over publishability. Retrieved from https://arxiv.org/abs/1205.4251v2
 Baskin, P. (2017). An attempt to replicate top cancer studies casts doubt on reproducibility itself. Chronicle of Higher Education. Retrieved from www.chronicle.com/article/An-Attempt-to-Replicat-Top/238934
Richard Johnson is co-program director, Digital Initiative and Scholarship, and head of Data Curation and Digital Library Solutions at the Hesburgh Libraries, University of Notre Dame. He can be reached at rjohns14<at>nd.edu