
Optimizing Research Sharing in the European Research Area: Cyberinfrastructure, Quality and Open Access

September 29, 2009

Jean-Claude Guédon

Main speaker session 1.5


I. Introduction

The issue of Open Access to the scientific and scholarly literature is obviously not self-contained. In the present context, it must be related to the deeper question of building a “research area”. It happens that the European case is particularly interesting because it also intersects the problem of building across national borders. As such, it raises issues that have a degree of universality about them.

It is in this spirit that the following little text has been written. Laboured or toiled would be better terms: these are difficult questions and addressing them within the narrow compass of a very few pages turned out to be much more challenging than anticipated.

Essentially, the argument begins with a rapid historical tour, followed by a disquisition about an important distinction: that between quality and excellence. Both of these parts set the background for the concluding section where the term cyberinfrastructure is used as a way to bring in the human and institutional questions, the networking strategies and the migration beyond the disciplinary straitjackets. Open Access, as will be obvious, is a crucial element of this construction, but Open Access will also derive much of its meaning from being embedded in this wider framing of scientific and scholarly communication. In other words, Open Access is not an absolute, but rather a means to a variety of very important ends, such as the building of optimal and vibrant research areas.

II. A Quick Historical Context

Scientific communication emerged in the 17th century and, broadly speaking, it has gone through four historical phases:

1.       Journals and academies (ca. 1665-1820). This period sees the invention of the scientific journal in 1665. Access is no problem for the “Gentlemen of science”.

2.       Scientific societies (1820-1945).

Both of these phases see the natural philosophers and scientists firmly in control of the communication system. Printing is outsourced, but printers do the printing and little else. Access is often organized through the bartering of journals.

3.       Commercial publishers and globalised scientific societies (1945-present). With the Second World War, Germany lost its pre-eminence in scientific publishing. Some commercial publishers, following the example set by Robert Maxwell, seize the opportunity opened by the decline of European societies, and create firms that grow and merge. Eugene Garfield’s Science Citation Index (SCI) opens the era of scientific metrics.

The advent of the third phase has led to rapidly rising subscription prices, and more difficult access to the scientific literature for many researchers. Publishers have increased their control over journals whenever possible, for example by owning them. Through investments in new journals that open new positions for editors, i.e. gate-keepers, publishers directly influence the power structure of science. This trend only intensifies in the next, partially concurrent, digital phase. Access becomes more difficult for most.

4.       Digitization and the Internet (late 1980’s-present). Digitization has transformed many aspects of publishing:

a.       The sale of objects (printed issues of periodicals) is replaced by licensing of access to a database of articles. Contractual arrangements tend to trump copyright and authors’ rights provisions.

b.       The wholesale, discounted, licensing of access to all the titles of a given company (the “Big Deal”) becomes possible because the costs of copying and disseminating are close to zero.

c.       The transformation of journal collections into digital article collections allows locking up public domain materials.

d.       Publishers take on a wider set of tasks that are ever more closely related to content policy (investing in new journals, new marketing strategies, linking articles from company journals to stimulate use and, therefore, citation, etc.).

e.       Publishers invent new forms of business plans for OA publications (for example “author-pays”, or “Open Choice” in hybrid journals). Granting agencies and, more recently, some institutions (universities, research centres, etc.) begin to adapt financially to this shift.

f.         Many researchers and their institutions take advantage of the low entry barrier into digital publishing to create new journals[1]. They often rely on local subsidies, volunteer work and, for a few, on the “author-pays” approach (for example the Public Library of Science publications).

g.       Some governments, aware of the fact that scientific publishing forms an essential phase of scientific research, and that its cost is but a tiny fraction (1 to 2%) of the cost of research, simply subsidize scientific publishing directly. This is the SciELO model in Latin America, with participation from South Africa, Spain and Portugal.

h.       A powerful movement, based on self-archiving and the creation of various types of depositories (institutional, central and subject-based – the “green road”), has provided an important alternative (and complement) to the OA journal route (the “gold road”). It coexists, rather than competes, with the present publishing system, and many publishers allow it with various constraints[2].

The elements that make up phase 4 above reflect the fundamental fact that computers, especially when they are networked, form a highly disruptive technological milieu[3]. Open Access is one of the more visible symptoms of the “disruptive” transformation of scientific publishing. It signals the present, or potential, presence of important shifts in power relations between publishers, scientists and research institutions. Recently, all of these elements, including the network technologies that make them possible, have been subsumed under the term “cyberinfrastructure”[4].

Designing a European Research Area amounts to sketching out the contours of a new equilibrium where the “great scientific conversation” is optimized. Its communication system must rest on a secure financial basis. It must also fulfill a number of tasks that scientific communities need in order to work. This includes unambiguous attribution of discovery (with date), preservation of the scientific archive, good branding capacity, openness of digital documents to algorithmic treatment (or “open computation”, to use Clifford Lynch’s phrase), and easy access for researchers (but shouldn’t publicly funded research results be available to all citizens?). Digitization can lead to a very closed or a very open world, but Open Access has been making significant, if not always coordinated, inroads. Its opponents can no longer ignore it, or even scoff at it.

III. Excellence is not a Synonym of Quality

Designing a European Research Area requires clear objectives. Does it seek to foster quality research, or does it seek excellence? The two terms are often used as if they were more or less synonymous, but this assumption should be questioned. Excellence cannot substitute for quality, or vice-versa. Upon closer inspection, these two words reveal complex relations that merit detailing.

Quality is based on thresholds, on minimum requirements. It guarantees function, capacity or competence. Factories, for example, practise “quality control”, not “excellence control”. Likewise, our schools pass or fail students on the basis of whether they demonstrate sufficient skills. This is the fundamental meaning of a “passing grade”. It is true that educational systems also include gradations above the passing grade, but these act as quality gradients and thus simply refine the primary objective of schools. The end result is the definition of levels of quality, not of excellence.

Schools have long internalized the complex relationship between quality and excellence. Some means of evaluation in schools are designed to identify who is best in a group, in other words excellence. This means that quality and excellence can complement each other. Because such groups also have quality control, they generally turn out to be of outstanding quality as well. However, if we imagine identifying the best in a set that would be devoid of quality control, the result could easily be very poor. In short, quality can exist without reference to excellence, but the reverse is not true.

The relationship between excellence and quality is neither direct, nor linear. Mediocre systems can harbour elements of excellence but some basic level of quality is still required for excellence to occur at all. Conversely, excellence signals the probable existence of some level of quality in the background. However, evaluating the level of quality through the occasional presence of excellence is not reliable, and neither is predicting the appearance of excellence through the measurement of quality.

To identify the best, it is necessary to design special kinds of “games” or “contests”. Obviously, the rankings emerging from such contests will depend in part on the rules and how they are applied. In the world of research and in the context of “Big Science”, to use Derek de Solla Price’s formulation, armies of laboratory experimenters relentlessly produce more and more data and observations, ultimately distilled in the form of articles. All this work depends on quality, not excellence. Scientists do compete, of course, but the competition rules may be based on criteria that lead to identifying surprising forms of excellence. For example, if competition is based on impact (i.e. the number of citations received), it may lead to writing review essays, or to publishing laboratory techniques, rather than standard research articles[5]. These are useful tasks, but they hardly qualify as groundbreaking work. Conversely (and ironically), coming up with truly radical concepts in a given field may lead, initially at least, to resistance and even rejection on the part of peer reviewers.

Part of the present scientific “game” also includes the title (and prestige) of the journal where scientists publish. Before SCI was used to calculate “impact factors”, journal prestige was largely based on impressionistic consensual perception: scientists, particularly leading scientists, “knew” which journals mattered[6]. Bibliographies tended to institutionalize and formalize this rough form of consensus. However, bibliographies also act as a quality threshold because they need to include some journals and exclude others. SCI organized itself in the same way, but it did so on the basis of an explicit concept of “core science”. As a result, quality was redefined on a new basis: being selected by SCI became (and still remains) the passing grade for a journal.

SCI also redefined the terms of the competition between journals by designing an “impact factor” (IF). It is important to note that the IF emerged from outside the research communities. In this fashion, SCI created an expression of visibility and popularity that looked objective because it was quantified. The IF thus provided a powerful tool to gauge competition among journals. It turned into an even stranger device when research administrators, or their surrogates (for example tenure and promotion committees), began to extend the use of the IF to individuals, essentially treating their capacity to publish in journals with high impact factors as a reliable index of their intellectual quality. Doing so has led to a profound redefinition of individual excellence and has affected the careers of many; for example, it has affected the ways in which juries allocate research grants. The apparent objectivity of a number has regularly been used to facilitate decision-making.
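Since the IF carries so much weight in this argument, it is worth recalling how it is computed. The following is a minimal sketch of Garfield’s standard two-year formula; the journal and its counts are hypothetical.

```python
# Garfield's two-year impact factor: citations received in year Y to
# items published in Y-1 and Y-2, divided by the number of citable
# items published in Y-1 and Y-2.

def impact_factor(citations_to_prev_two_years: int,
                  citable_items_prev_two_years: int) -> float:
    return citations_to_prev_two_years / citable_items_prev_two_years

# Hypothetical journal: 420 citations in 2009 to its 2007-2008
# articles, of which there were 150.
print(f"IF(2009) = {impact_factor(420, 150):.2f}")  # IF(2009) = 2.80
```

Note how little the number says about any individual article: a handful of heavily cited papers (or review essays, as noted above) can carry an entire journal.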

Besides the measurement problems just noted, the confusion between excellence and quality can easily increase the invisibility of what is not included in “core” science. This issue is regularly discussed in terms of “lost science”[7], where it generally refers to difficulties met by scientists and publications in developing nations. But the reasoning also applies to richer countries, because focusing on the competitive best simply ignores the “solidly good”. It accentuates the skewing of distribution curves that are already naturally skewed[8]. As a result, it increases the relative power of those at the top at the expense of those on the periphery. This is obviously an important consideration to take into account in the designing of a European Research Area. Before anything else, the confusion between excellence and quality also leads to a waste of intellectual potential[9].
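The natural skew mentioned in footnote 8 can be made concrete with a small calculation. The sketch below assumes Lotka’s classic inverse-square law of scientific productivity (the number of authors publishing n papers falls off as 1/n²), one common model of this skew; under that assumption, roughly the top 7-8% of authors account for half of all papers, in the same range as the figure cited.

```python
# Skewed productivity under Lotka's inverse-square law: the relative
# number of authors publishing exactly n papers is proportional to 1/n^2.

MAX_PAPERS = 100
authors = {n: 1 / n**2 for n in range(1, MAX_PAPERS + 1)}

total_authors = sum(authors.values())
total_papers = sum(n * count for n, count in authors.items())

# Walk down from the most productive authors until half of all papers
# are accounted for.
papers_seen = authors_seen = 0.0
for n in sorted(authors, reverse=True):
    authors_seen += authors[n]
    papers_seen += n * authors[n]
    if papers_seen >= total_papers / 2:
        break

print(f"{100 * authors_seen / total_authors:.1f}% of authors "
      f"produce half of the papers")  # about 7.5% in this model
```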

IV. ERA, cyberinfrastructure and Open Access

The term “research area” points in a particular direction in as uncontroversial a manner as is possible. The European Union is a project, a work-in-progress, and it is made up of member states that come with their usual accoutrement of boundaries and territories. The point of building Europe, as a political and economic construct, is to make the boundaries more porous, to conceive of these boundaries as sources of rich diversity rather than pillars of murderous adversity.

Such goals do not necessarily agree with the notion of national territory. According to geographers, a territory is space that has been structured by means of communication. Building a nation-state rests in part on structuring the national space with various means of communication, including radio and television. Then came the radically supra-national Internet.

With the goal of a European Research Area emerges the possibility of a new kind of territory, one that may be competing with the classical territories of nation-states. Speaking in terms of an “area” finesses the need to speak either in terms of amorphous space or the organized hardness of territories. However, the phrase stands as no more than a placeholder, a problem-to-be-solved, rather than as a label for a well-defined concept.

Part of the solution to the construction of Europe comes from the new technologies. For this reason, the notion of “cyberinfrastructure”, I believe, can be an important lever. But the notion of cyberinfrastructure introduced here is not simply technical; it fully includes the human factor in the development equation, and human choices in technical decisions. In a recent posting on the site “Academic Evolution”, Professor Gideon Burton has aptly described “cyberinfrastructure” in the following way: “[It] is built as much upon social parameters, intellectual property provisions, and academic evaluation systems as computer systems. It requires us to reconceptualize what is consequential about scholarly work beyond traditional genres or methods of academic publishing.”[10]

The term “cyberinfrastructure” fits well with the expression “great conversation” used earlier to describe the essence of scientific research. Let us recall some crucial points included in the “great conversation”:

1.       Science is built through producing papers that are published to be tested by colleagues. This is the very foundation of the “great conversation”.

2.       Scientific authors “exist” only to the extent that their published words enter the conversation and are used by others. Authors, from this perspective, behave like nodes in a network and science is a highly distributed process taking place within networks.

3.       Networks, while generally sporting fuzzy boundaries, keep together and remain identifiable through intense exchanges passing through their nodes. This suggests that a network will grow stronger with better access to the “great conversation”. This is why Open Access is the optimal form communication can take within any network, and why it is so crucial for the optimal carrying out of scientific research[11].
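As a toy illustration of point 3, the sketch below (nodes and links hypothetical) counts how many ordered pairs of researchers can still exchange results through the network when all links are open, versus when most exchanges are blocked, as under paywalls.

```python
from collections import deque

def reachable_pairs(nodes, edges):
    """Count ordered pairs (a, b) such that b is reachable from a."""
    adj = {n: set() for n in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    count = 0
    for start in nodes:
        seen, queue = {start}, deque([start])
        while queue:
            for nxt in adj[queue.popleft()]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        count += len(seen) - 1
    return count

nodes = ["A", "B", "C", "D", "E"]
open_links = [("A","B"), ("B","C"), ("C","D"), ("D","E"), ("A","E")]
paywalled = [("A","B"), ("C","D")]  # most exchanges blocked

print(reachable_pairs(nodes, open_links))  # 20: every pair connected
print(reachable_pairs(nodes, paywalled))   # 4: the conversation fragments
```

Blocking even a few links fragments the conversation disproportionately, which is the network-theoretic core of the case for Open Access.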

The previous three points show why a cyberinfrastructure is not simply a technological issue; but they also demonstrate why the “great conversation”, viewed as a network, fits so well with the technological notion of networks, in particular the Internet. I would suggest that the word “cyberinfrastructure” begins to provide substance to the term “area” in the following manner:

1.       It provides a technological foundation to the research area (exactly as postal systems and printing were necessary foundations for the Republic of science in the 17th century);

2.       It clearly lays out what is needed to work well: full, immediate, open access to all of science. This translates into Open Access for scientific publications and data;

3.       Following Gideon Burton’s remarks, it includes the need to rethink a number of crucial elements in the workings of scientific communication, some of which – the easiest – are technical, some of which are human. The most important element, in my opinion, is the issue of evaluation of research results. The issue of property is also important, but it should be approached as a way to serve the “great conversation” rather than the reverse. In other words, patent issues should not be neglected, but neither should they be allowed to slow down scientific progress;

4.       Cyberinfrastructure, viewed through its technical dimension, can be extended almost indefinitely. However, networks, for example a European network, can sport a degree of cohesion and a clear identity through the management of the communication flows. There is no question that Europe must communicate with the rest of the world; however, it can do so while remaining visible as “Europe” because even more intense ties, including face-to-face ties through personnel mobility, can be built in the European network. Establishing patterns of European careers for European researchers is another way to create identity[12]. In this way, the European network will truly exist and, because of its very existence, it will be in a better position to engage with other networks.

A European cyberinfrastructure is actually well on its way to being implemented. In a great many member states, initiatives have been taken that are moving in the right direction. Among these, the following are particularly important:

1.       A harmonization of European diplomas to facilitate the movement of students, teachers and researchers throughout the Union;

2.       A political will to bring this freedom of movement to its highest possible pitch;

3.       A series of decisions taken by the European University Association (EUA) to improve quality and accountability (for example the Institutional Evaluation Programme, the Examining Quality Culture project).

4.       Preservation of digital documents also rests on the existence of networks, as the LOCKSS approach, sketched after this list, easily demonstrates[13];

5.       Open Access activities, particularly regarding depositories, and support for researchers publishing in OA journals.
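The LOCKSS principle invoked in point 4 (“Lots Of Copies Keep Stuff Safe”) can be sketched in a few lines. The model below is deliberately simplified and its data hypothetical; the real system uses a far more elaborate polling protocol. Several sites hold copies of a document, compare fingerprints, and repair any copy that disagrees with the majority.

```python
import hashlib
from collections import Counter

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

# Three hypothetical sites holding copies of the same article.
replicas = {
    "site-1": b"Article text ...",
    "site-2": b"Article text ...",
    "site-3": b"Article texX ...",  # silent corruption at site-3
}

# Each site "votes" with the fingerprint of its copy.
votes = Counter(fingerprint(c) for c in replicas.values())
majority_hash, _ = votes.most_common(1)[0]
good_copy = next(c for c in replicas.values()
                 if fingerprint(c) == majority_hash)

for site, content in replicas.items():
    if fingerprint(content) != majority_hash:
        print(f"repairing {site} from the majority copy")
        replicas[site] = good_copy
```

The robustness of the archive grows with the number of independent nodes, which is precisely why preservation is a network property.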

However, and more fundamentally, it is the quality issue that should act as the guiding light. European activities should push for high quality standards in all research activities. In parallel, complementary strategies can be designed to identify the best researchers and promote European excellence. Various forms of competition, prizes and “games” can help achieve this goal, but the important point is to keep quality control and excellence competition both separate and mutually supportive. Never should one be confused with the other.

What is still missing in these developments is a coordinated effort to construct an optimal European Research Area. The Summary of Responses to the Questionnaire to CREST Members and Observers (dated June 9, 2009) demonstrates this fully. Clearly, the national and European perspectives create a pattern of interference that displays elements of convergence, but also idiosyncratic elements. This is not surprising in the present European context, but it is precisely the obstacle to overcome. On one hand, ERA is mentioned regularly, but it still looks more like a hope than a reality; on the other hand, various projects and initiatives, some financed by the European Commission, some by member states, contribute interesting building blocks to a possible common construction. However, the plans for this construction are still largely absent.

To this end I would suggest the following policy moves:

1.       Designing a strong technical cyberinfrastructure for the whole of Europe is important for two reasons:

a.       It will provide a level playing field for all parts of Europe when it comes to access and communication;

b.       It will provide an apparently neutral discussion ground for member states, through which human, social, economic and even political issues can be negotiated, but negotiated in a transposed vocabulary that should help the process to move forward.

2.       All the European strategies aiming at facilitating the movement of scholars, their encounters face-to-face, and their collaboration in various projects, should be strongly emphasized: broadband communication among Europeans is what ensures the density and the structuring of conversation needed to give it a specifically European tonality (without cutting it off, once again, from the rest of the world). But non-technical solutions such as harmonized diplomas and career paths, European quality assurance, etc. are also part of this strategy;

3.       All the European strategies aiming at opening up access to the literature must be strongly encouraged. This includes more than supporting Open Access as a broad principle; it means giving specific direction and purpose to certain important facets of Open Access. Open Access, for example, can be used as a very efficient tool to establish quality control. I will suggest only two possibilities for Open Access here, but they are crucial (and obviously, more could be developed):

a.       Following up on the excellent example of Brazil and a dozen other countries, including a few from Europe, a “public option” should be offered to scientific authors. That public option would take the form of stable, viable[14], rigorously managed, scientific journals that would be simply financed as part of the governmental efforts to support scientific research[15]. In this way, authors could decide where to publish. They would know that the public journals are at least as rigorous as the commercial ones. They would know that publishing in such journals would automatically place their articles in Open Access. Such journals could certainly demonstrate a degree of authority, visibility and prestige equal or superior to most of their commercial counterparts. Scientific society journals could automatically become part of such a system, thus reconnecting publishing with scientific communities. Finally, these journals, by their presence, would act as quality control instruments on commercial journals;

b.       Building on the excellent work done by DRIVER[16] and all the national initiatives aiming at developing institutional, central or subject-based repositories, European countries could network these repositories in such a way as to create the possibility of measuring the quality of the articles entrusted to this network of collected digital documents. In other words, and even though many, if not most, of the articles collected would already be the product of some process of peer review attached to the journals where they were first published, they could be submitted to the judgment of international juries that could rank these articles according to some scales to be devised. To put it in a slightly different way, the network of institutional, central and subject-based repositories could easily become the foundation for an entirely new kind of quality evaluation that would complement or even correct the results of journal-based peer review and quality assumed on the basis of title prestige.
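In practice, repository networks of the DRIVER type are stitched together by harvesting metadata over the OAI-PMH protocol. The sketch below, with a hypothetical endpoint URL, fetches Dublin Core records from a single repository; pooled across many repositories, such records would form the corpus submitted to the article-level evaluation just proposed.

```python
import urllib.request
import xml.etree.ElementTree as ET

BASE_URL = "https://repository.example.eu/oai"  # hypothetical endpoint
OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

# ListRecords with the mandatory Dublin Core metadata format.
url = f"{BASE_URL}?verb=ListRecords&metadataPrefix=oai_dc"
with urllib.request.urlopen(url) as response:
    tree = ET.parse(response)

for record in tree.iter(f"{OAI}record"):
    identifier = record.find(f"{OAI}header/{OAI}identifier")
    title = record.find(f".//{DC}title")
    if identifier is not None and title is not None:
        print(identifier.text, "-", title.text)
```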

The point of these two proposals is quite simple: they would help the research communities of Europe, and the institutions with which they are associated, to recover some of the control they have been losing over the content and orientation of research, as was pointed out in the third and fourth historical phases sketched out at the beginning of this text. They would also feed Open Access into a broader project than is usually the case, and would give it fuller meaning. In particular, Open Access would not only fit clearly within a European cyberinfrastructure, but it would become an essential element of it. Without Open Access, the European Research Area will not provide the right tools for the optimal working of the “great conversation” and for the best possible integration of new member states.

[1]    Many of these journals can be found in the Directory of Open Access Journals.

[2]    The Romeo list is well known in this regard.

[3]    The expression “disruptive technology” became public in 1995. See Bower, Joseph L. & Christensen, Clayton M., “Disruptive Technologies: Catching the Wave”, Harvard Business Review, January-February 1995.

[4]    The word “cyberinfrastructure” gained purchase through the “Report of the National Science Foundation Blue-Ribbon Advisory Panel on Cyberinfrastructure”, often called the “Atkins Report” after the name of its Chair.

[5]    With regard to review essays, it is known that journal editors occasionally include such articles to boost the impact factor of their journals.

[6]    In a sense, it worked a little like the recent exercises carried out by the European Science Foundation to define the significant humanities journals in the world.

[7]    The term was used in W. Gibbs, “Trends: Lost Science in the Third World”, Scientific American, August 1995, pp. 92-99.

[8]    For example, it is known that only 6% of a given set of scientists produce about 50% of all the articles emanating from that set.

[9]    In the case of the Third World, see my “Open Access and the divide between ‘mainstream’ and ‘peripheral’ science”. The Brazilian publication (in Portuguese), originally slated for 2008, is still pending. An Italian translation is forthcoming as well.

[10] Gideon Burton teaches Renaissance Literature at Brigham Young University in Utah, USA.

[11]  For an important study of networks from this perspective, see Yochai Benkler, The Wealth of Networks (New Haven: Yale University Press, 2006).

[12]  The Bologna process plays an important part in this regard.

[14]  I use the term viable rather than sustainable because the latter really implies a stable status within a market context. This implies the overarching presence of markets in all situations. This point is questionable, of course, but without even reaching into metaphysical spheres, it is clear that what must endure, for science, is not journals, but facts, ideas, theories and concepts.

[15]  Let us remember, once more, that the relative cost of publishing research to the cost of supporting research is very small – 1 or 2% – and let us remember that most people agree that publishing is an integral part of the research cycle.

[16]  DRIVER (Digital Repository Infrastructure Vision for European Research) is well known; further details can be found on the project’s website.


3 Responses to “Optimizing Research Sharing in the European Research Area: Cyberinfrastructure, Quality and Open Access”

  1. Chris Armbruster Says:

    This contribution by Jean-Claude Guedon highlights that open access is not an end but a means to enhancing and simplifying scholarly communication. In the context of ERA, the value of OA might be to enhance the quality of research throughout the area because all researchers, whatever their location, have equal and timely access to scientific information. For this, OA must be wired into the cyberinfrastructure of ERA. So far, so good, i.e. policy recommendations 1. and 2. are sound. But recommendation 3. does not follow. Neither is it obvious why the state should directly sponsor scholarly journals (though scholarly publishing receives much subsidy in one way or another), nor can we be assured that repositories will deliver, especially in a mix-and-match approach (though some of the large subject-based repositories have a track record of enhancing scholarly communication). To stimulate debate I wish to register two objections:

    – The point about scholarly journals (if not conceived only as minutes of science or a career advancement service) is scholarly communication, and if cyberinfrastructures, open source software etc. lower the barriers to entry, then you want a flexible system that is open to innovation, new content and the emergence of new communities. Big deals and some current structures of commercial publishing are not conducive to this, but it is hard to see how a ‘public option’, with the concomitant issues of power, governance and funding, will improve the situation. Moreover, with a proliferation of open access publishing models already happening (scholarly and commercial), efforts might sensibly be directed at reflexively observing and improving these;
    – Repositories have a mixed track record. Institutional repositories, despite existing for nearly a decade and being sponsored with quite a bit of public money (including from the EU), have not impacted scholarly communication in any way. Moreover, most of them remain sparsely populated. By contrast, large subject-based repositories have made a difference (e.g. arXiv, SSRN, RePEc – sometimes with little or no public money) and, if powered by research funder mandates (e.g. PMC), may be enhanced further. In fact, it could be argued that large subject-based collections could become a research infrastructure that supports scientific excellence (faster circulation of results, new applications, easy text and data mining etc.).
    For the ERA Vision 2020 it would seem worthwhile to take a hard look at the ongoing evolution of journals and repositories and gain a clearer insight into what does not work and what does enhance scholarly communication, aid quality assurance and, possibly, help foster scientific breakthroughs.
  2. Jean-Claude Guédon Says:

    I expected this reaction. The public option is justified as follows: States support research and publishing is part of the research process. Therefore, there is nothing illegitimate in States supporting scientific communication. In fact, they do it already, as reiterated by Chris Armbruster.

    A public option is not meant to push out commercial publishers; it is simply meant to establish standards and minima, both in terms of quality of publishing – a point on which commercial publishers seem to be gradually failing – and in terms of prices – a point on which commercial publishers have reached unsustainable levels. Authors could then see which journals they prefer, public or private, but all public journals, as a matter of course, would be OA.

    Editorial autonomy is obviously required and, ideally, the support for these journals should be coming from a basket of countries to decrease the risks of interference.

    The issue of interference, in passing, is real, but not limited to public financing. Consequently, before raising this issue, supporters of private publishers should carefully study the practices of these publishers and evaluate their level of interference in the editorial process. For example, when a private publisher creates a new journal, how is the editor-in-chief chosen? Only by peers? Probably not.

  3. Stevan Harnad Says:

    Suggestion a is merely the usual “support Gold OA publishing” stance.


    And suggestion b misses the point of Green OA self-archiving — not realizing that the real problem and challenge for OA and the EU research community is (1) to elicit (through OA mandates) the EU’s missing target OA content, most of which is not yet being made OA — not (2) to “re-evaluate” the little that is being made OA by “international juries” (as a first step toward the creation of hypothetical alternative “branding” authorities, to replace closed-access elitism in science and its existing peer-reviewed journals).


    The target OA content is of course already-published, peer reviewed journal articles, which have already been evaluated by “juries” (peer review). Meanwhile the real problem of OA, and the only one — providing access to all peer-reviewed journal articles being published (by whatever economic model) today — is left by the wayside, unsolved by suggestion b. (Suggestion a alone is far too little and far too slow.)


    Suggestion b is just a suggestion to somehow create re-evaluation “juries” for what (little) OA content is already being provided currently, not a means of increasing that content to cover all the refereed research output of Europe, through Green OA mandates from all EU institutions and funders, which are what is really needed.


    Besides, what is needed for peer-reviewed publication (besides OA itself) is not yet another peer evaluation (this time “international”) but OA metrics! That’s what should be recommended to the EC for postpublication assessment, not another time-consuming refereeing exercise for already refereed articles! (That’s exactly the wasteful process that the UK’s Research Assessment Exercise is trying to get away from!)
