I was privileged to be at a meeting between JISC (UK) and NSF (US). Every paragraph of the report is worth reading – I quote a few…
The fundamental conclusions of the workshop were:
• The widespread availability of digital content creates opportunities for new forms of research and scholarship that are qualitatively different from traditional ways of using academic publications and research data. We call this “cyberscholarship”.
• The widespread availability of content in digital formats provides an infrastructure for novel forms of research. To support cyberscholarship, such content must be captured, managed, and preserved in ways that are significantly different from conventional methods.
As with other forms of infrastructure, common interests are best served by agreement on general principles that are expressed as a set of standards and approaches that, once adopted, become transparent to the user. Without such agreements, locally optimal decisions may preclude global advancement. Therefore, the workshop concluded that:
• Development of the infrastructure requires coordination at a national and
international level. In Britain, JISC can provide this coordination. In the United States, there is no single agency with this mission; we recommend an inter-agency coordinating committee. The Federal Coordinating Council for Science, Engineering and Technology (FCCSET), which coordinated much of the US government’s role in developing high performance computing in the 1990s, provides a good model for the proposed Federal Coordinating Council on Cyberscholarship (FC3S). International coordination should also engage organizations such as the European Strategy Forum on Research Infrastructures (ESFRI), the German research foundation DFG, and the Max Planck Digital Library.
• Development of the content infrastructure requires a blend of interdisciplinary research and development that engages scientists, technologists, and humanities scholars. The time is right for a focused, international effort to experiment, explore, and finally build the infrastructure for cyberscholarship.
• We propose a seven-year timetable for implementation of the infrastructure. The first three years will focus on developing and testing a set of prototypes, followed by implementation of coordinated systems and services.
Computer programs can analyze vast amounts of information that could never be processed manually. This is sometimes referred to as “data-driven science”, and some have described it as a new paradigm of research. That may be an overstatement, but there is no doubt that digital information is leading to new forms of scholarship. In a completely different field, the humanities researcher Gregory Crane recently made the simple but profound observation, “When collections get large, only the computer reads every word.” A scholar can read only one document at a time, but a supercomputer can analyze millions, discovering patterns that no human could observe.
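Crane’s point can be illustrated with a minimal sketch: even the simplest machine “reading” touches every word of every document, something no scholar could do at scale. The corpus below is a toy stand-in, not real data.

```python
from collections import Counter
import re

def word_counts(documents):
    """Tally every word across a corpus -- the machine reads every word."""
    counts = Counter()
    for text in documents:
        counts.update(re.findall(r"[a-z']+", text.lower()))
    return counts

# A toy stand-in for a collection too large for any human to read in full.
corpus = [
    "The observatory recorded the transit of the planet.",
    "Patterns in the data emerged only after millions of records were compared.",
    "No human reader could compare millions of records by hand.",
]

counts = word_counts(corpus)
print(counts.most_common(3))
```

Trivial as it is, this is the seed of the pattern-discovery the report describes: once every word of millions of documents is counted, statistical regularities become visible that no sequential human reading could surface.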
The National Virtual Observatory describes itself as “a new way of doing astronomy, moving from an era of observations of small, carefully selected samples of objects in one or a few wavelength bands, to the use of multiwavelength data for millions, if not billions of objects. Such datasets will allow researchers to discover subtle but significant patterns in statistically rich and unbiased databases, and to understand complex astrophysical systems through the comparison of data to numerical simulations.” From: http://www.us-vo.org/
The workshop participants set the following goal:
Ensure that all publicly-funded research products and primary resources will be readily available, accessible, and usable via common infrastructure and tools through space, time, and across disciplines, stages of research, and modes of human expression.
The shortcomings of the current environment for scholarly communication are well-known and evident. Journal articles include too little information to replicate an experiment. Restrictions justified by copyright, patents, trade secrets, and security, together with the high costs of access, add up to a situation that is far from optimal. Yet this suboptimal system has vigorous supporters, many of whom benefit from its idiosyncrasies.
For example, the high cost of access benefits those who belong to wealthy organizations that can afford it. Journal profits subsidize academic societies. Universities use publication patterns as an approximate measure of excellence.
Younger scholars, who grew up with the Web, are less likely to be restrained by the habits of the past. Often – but not always – they are early adopters of innovations such as web search engines, Google Scholar, Wikipedia, and blog-science. Yet they come under intense pressure early in their careers to conform to the publication norms of the past.
… and so the final proposal
… a seven-year target for the implementation of the infrastructure for
cyberscholarship. The goal of establishing an infrastructure for cyberscholarship by 2015 is aggressive but achievable when coordinated with other initiatives in the U.S., Britain, and elsewhere. A three-phase program is proposed over seven years: a three-year research prototype phase that explores several competing alternatives, a one-year architecture specification phase that integrates the best ideas from the prototypes, and a three-year research and implementation phase in which the content infrastructure is deployed and research on value-added services continues. Throughout the seven years, an evaluation component will provide the appropriate focus on measurable capability across comparable services. A “roadmap” for the program is suggested in the following figure.
[… it’s too large to cut, so you’ll have to read it for yourselves…]
… and the details …