Back-office functions for the knowledge economy
October 19, 2009
Discussant session 1.5
Jean-Claude Guédon raises several important issues in his report.
The whole discussion of excellence versus quality is extremely relevant, but also one where it is extremely difficult to point to concrete actions. The idea of renewed peer review as part of parallel publishing is interesting, but it will, in my opinion, gain wide acceptance only with difficulty. What is the added value for the user? Is it enough for them to find the extra time worthwhile?
The introduction of the “great conversation” to characterize the research process is interesting to play with. A conversation, however, requires many learned skills, such as language and grammar. In using this “conversation” metaphor, “language” and “grammar” are important – being part of an interesting discussion also requires a common knowledge often denoted as “cultural background”. Furthermore, both language and grammar change over time – as does the cultural background. What was considered common knowledge in European society a few decades ago, such as religious icons, can today not be taken for granted, and reading old texts requires extensive notes. The conversation is therefore a product not only of place and time but is also linked to the people communicating, their network and the means by which the communication is mediated.
The introduction was not meant as an academic discussion of what constitutes “conversation”, or rather meaningful conversation, but was really meant to point to a large number of issues which need to be addressed if information is to be useful across discipline borders and over time – popularly speaking, the areas which need to be part of the back office.
The back office must also reflect new ways of doing research – just as the preprint servers radically changed the use of articles, especially in the science domain. These new methods include “living reviews”, database publishing as in the genome project, and collaborative knowledge development through the use of e.g. wikis. Reproducible results call for access to data as well as to the tools used to analyze them, and this is an important factor in the open research environment. Publishers such as Nature now require that the datasets behind an article are deposited prior to its publication.
But how can a researcher in the future return to datasets and reproduce the results – or how can the datasets be useful in other areas? An answer could encompass an infrastructure consisting not only of the articles and data but also of the programs used, the parameters and/or the (discipline-specific) implicit assumptions. It has always been part of archival practice to ensure the context information – the provenance.
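As a toy illustration of such an infrastructure record – all field names here are hypothetical, not a standard – a dataset could be deposited together with the programs, parameters and implicit assumptions behind the published result:

```python
# Hypothetical sketch: a minimal provenance record bundling a dataset with
# the programs, parameters, and tacit assumptions needed to reproduce a
# published result. Field names are illustrative only.
provenance_record = {
    "article_doi": "10.1000/example",              # the publication the data supports
    "dataset": "survey_2009.csv",                  # the deposited data file
    "programs": [
        {"name": "analyze.py", "version": "1.2"},  # analysis tool and its version
    ],
    "parameters": {"significance_level": 0.05},    # settings used in the run
    "assumptions": [
        "responses are independent",               # discipline-specific, usually tacit
    ],
}

def is_reproducible(record):
    """A record supports reproduction only if data, tools and parameters
    are all present - the context information archivists call provenance."""
    return all(record.get(key) for key in ("dataset", "programs", "parameters"))

print(is_reproducible(provenance_record))  # True
```

The point of the sketch is that reproducibility is a property of the whole record, not of the dataset alone: dropping any of the three context fields makes the data unusable to a future researcher.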
The knowledge economy requires new business models. The traditional model from the industrial society, based on the high value of goods, must be replaced by one which addresses the ease with which digital objects can be multiplied, but also one which incorporates the cost of maintaining those objects. We need to invent models which on the one hand respect intellectual property and on the other allow for fast and easy use and re-use of material.
The creative economy applied in the research arena would be a cocktail of use and re-use (mash-up), expert involvement, scientific acknowledgment of added intellectual property, and continued trust in the document; this is not easy to mix and even harder to implement. Characteristic of the above list is both the creative element, expressed in the intellectual knowledge creation, and the infrastructure element, based on standards and rules. Not only the concepts, words and grammar for the conversation need to be known to have a scientific conversation, but also the means by which they are transmitted and the means to ensure that the conversation can be interpreted in the future. One example of such new infrastructure elements is “workflow provenance”, where relevant information is captured and stored with the data.
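A minimal sketch of what capturing workflow provenance could look like in practice – the decorator, log structure and example step below are my own illustration, not a reference to any particular provenance system:

```python
import functools
import time

provenance_log = []  # provenance captured and stored alongside the data

def record_provenance(func):
    """Hypothetical sketch of workflow provenance: wrap an analysis step so
    that its inputs, parameters and output are recorded automatically, as an
    integrated part of the research process rather than an afterthought."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        result = func(*args, **kwargs)
        provenance_log.append({
            "step": func.__name__,   # which step of the workflow ran
            "inputs": args,          # the data it received
            "parameters": kwargs,    # the settings it was given
            "output": result,        # what it produced
            "timestamp": time.time(),
        })
        return result
    return wrapper

@record_provenance
def normalize(values, scale=1.0):
    """Example analysis step: scale values to fractions of their total."""
    total = sum(values)
    return [scale * v / total for v in values]

normalize([1, 1, 2], scale=2.0)
print(provenance_log[0]["step"])  # prints "normalize"
```

Because the capture happens inside the workflow machinery, the researcher does not have to remember to document each step – which is exactly what makes the tacit information survive.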
As we see more and more creative elements in research, someone needs to work on ensuring that the tacit information is captured and embedded in the data, and that relevant information is captured to allow other researchers to reproduce the results.
And to stimulate wide uptake of results across different disciplines, we need to ensure enough descriptive data to judge the validity of a given dataset and its potential usage.
Two different perspectives, but both are needed.
The main point deduced from the above is that cross-disciplinary use and meaningful future access depend on similar descriptions – and in both cases this information must be captured as an integrated part of the research process. The more open this process is, and the more people who are invited to read, judge and use, the higher the expected long-term quality of the results.
Furthermore, to ensure long-term availability, short-term issues need to be addressed. The above-mentioned use and tagging are important, but work on ensuring a proper description of the interrelations between objects, and adherence to a set of agreed standards, will also increase the chance of wider use of the results.
The main policy move suggested here is for the Commission to require a proper knowledge management plan as part of applications and, as already suggested, to stimulate mechanisms for sharing information, thereby pushing the research area into the knowledge economy domain.
Tags: Birte Christensen-Dalsgaard


