Electronic Publishing and Academic Credibility

Raymond George Siemens

Dept of English, Malaspina University-College; Electronic Textual Cultures Lab, University of Victoria


In response to a call (by the Humanities and Social Sciences Federation of Canada [HSSFC]) for targeted research, a team assembled from among the faculty at Malaspina University-College, the University of Victoria, the University of New Brunswick, McMaster University and Université de Montréal took on an exploration of the urgent issues relating to the perception -- and ultimately the use -- of electronic publication within, and well beyond, the Canadian academic community.

The research team has since provided a critical assessment of the North American and European literature surrounding the notion of credibility in electronic scholarly publication and has made recommendations that take into account both that literature and factors unique to our national context. Assessment and recommendations have been made in distinct, though interrelated, areas: Peer Review and Imprint; Copyright; and Archiving and Text Fluidity / Version Control.

Our paper presentation will highlight our findings and outline the recommendations we have made to the HSSFC. Our approach can be found in the materials attached below; a draft of our report -- still in progress as of November 2000 -- can be found at the following URL:


Peer Review and Imprint

Peer review, seen by many as the most important factor in assuaging the reluctance of scholars to publish electronically, is a process that has evolved over many generations of scholars; it has become the cornerstone of academic publication and is highly valued in all scholarly activities (among them, the pragmatics of the academic review processes on which promotion and tenure are based). It is important to recognize that peer review is necessarily, and appropriately, a conservative process, and that any new scholarly endeavor will take time to gain general acceptance.
The team -- with the leadership of Jean-Claude Guédon (U Montréal) in this area -- has located, selected, and provided a representative bibliographic overview of pertinent literature, literature that helps us identify the characteristics of peer review processes suitable for electronic publication, literature that helps us consider the implications of such processes both for reviewers and authors, and literature that recommends best practices that accord as much as possible with those already accepted by the academic community for non-electronic scholarly publication.

Like peer review, a publisher's imprimatur, or imprint, is seen to be a very important indicator of qualitative assurance in academic culture. As part of its study of peer review, the team has also treated the subject of imprimatur as related to that of peer review, and in a similar manner -- chiefly directing the reader to literature that proposes best practices aimed both at traditionally print publishers operating in the electronic medium and at newer academic publishing groups that operate solely in the electronic medium.

Recommendations on the topic of peer review and imprint remain largely implicit, but a suggestive overview of current thought is given, as are an annotated bibliography and a summary of the data gathered on these topics via the questionnaire circulated among a representative sample of publishers, scholars, reviewers for academic publishing granting agencies and for bodies that fund humanities research, university administrators, and academic authors.

Copyright

Under the direction of Michael Best and Elizabeth Grove-White (U Victoria), the team has located, selected, and provided an overview of pertinent literature that highlights critical issues affecting the ownership of online resources, focusing on those resources that are text-based. The report also provides links to sites where the discussion of copyright is ongoing.

Work in this area, the report asserts, suggests that if copyright regulation is important for the traditional print publication mission of academic research institutions, it is doubly important in the context of new digital communication media, since international and national copyright principles and regulations were developed before networked digital media transformed the information and communication marketplace.

Broadly speaking, Canadian copyright legislation recognizes that copyright ownership rights belong to the creator of a work, and that those rights inhere in a work from the moment it is fixed in a tangible medium of expression. But it is normal in the academic community for the author to cede copyright to the publisher; thus, most academic texts belong not to an individual but to a collective (a journal, a press).

Academic authors are more concerned about the "currency" of the profession -- tenure, promotion, salary -- than about royalties or ownership of their work. In those disciplines where the norm for scholarly interchange has been through monographs published by university presses, more often consulted in the library than bought individually by scholars or students, there has been little challenge to the practice of ceding the ownership of scholarly work to the press. In the digital age, however, this practice is likely in due course to reduce substantially the potential readership of scholarly works, especially of journal articles.

At the same time as academic attitudes to copyright de-emphasize ownership issues that dominate the realms of the professional writer and press, they strongly underline the importance of wide promulgation of texts and transparent access for users. Addressing copyright in the context of scholarly publication means addressing fundamental principles of academic culture, specifically the academic community's mission of advancing knowledge through creating, validating, and disseminating new knowledge, and through preserving the existing corpus of public knowledge. Academic publication is central to these activities because it is the main process through which newly discovered knowledge is refined, certified, distributed and archived. In addition to promulgating knowledge, publication provides the North American academy with the mechanisms for assessing the quality and quantity of contributions by individual faculty members that are used to determine faculty eligibility for tenure, promotion, grants and fellowships.

With the advent of the Internet and increasingly user-friendly technologies for online editing, publishing, and distributing available to faculty, an increasing number of disciplinary groups have turned to online journals and other online forums for disseminating their knowledge. While the academic community continues to have reservations about this seemingly unruly new medium and the implications for the traditional academic activities for validating, authorizing and archiving new knowledge, the push for electronic forms of scholarly communication reflects a search for more timely, convenient, and economic means of announcing and certifying new research results.

But while the Internet is bringing steady and fundamental changes to the processes of scholarly communication and publication, these changes are driven less by the potential of the new media than by economic pressures arising from the changing world of academic publishing. In response to this crisis in scholarly publishing, influential academic groups within Canada and beyond see digital publication as a strong alternative to the limitations of the current print environment and as a non-commercial alternative to traditional scholarly communication.

Because issues of copyright -- both ownership and fair use access -- have become so imbricated with perceptions of the economic potential in electronic publishing, questions about the distinctive operation, use, and unique copyright challenges posed by digital media have been poorly represented in these debates.

Both the actual legislation concerning copyright and the perception of intellectual ownership of data within the academic community are bound to evolve as the new medium both forces and encourages change. It will be a major challenge in the next decade to find ways of making knowledge more freely available in electronic format without neglecting the legitimate needs of those who create content for a living.

Archiving and Text Fluidity / Version Control

Led by members of the University of New Brunswick Electronic Text Centre -- Alan Burk, James Kerr, and Andy Pope, a group currently involved in a larger study of journal metadata, citation linking, and journal archiving -- the team has structured its review in this area to take into account a wide range of concerns associated with identifying and preserving academic work in the dynamic and evolving electronic medium.

While the Internet is being populated with texts that are functionally indistinguishable from their print equivalents, it has also become a medium for new sorts of texts that do not fit well within the standard taxonomy of scholarly publication. Academic web sites and electronic scholarly texts can incorporate traditional text, non-linear sequencing, various sorts of digital multimedia objects (image, audio, &c.), and a level of interactivity that goes far beyond the relation between reader and static print. To read them may require specific hardware and software. Given these new forms of publication, the archiving challenges are many. The ease with which individuals and groups can create or modify electronic publications, and the variety of formats -- some of them proprietary -- that carry the new electronic media, also contribute significantly to the problem.


The report asserts that there are four processes that might preserve digital objects:

1. physical conversion to an accepted archival medium;
2. digital conversion that parallels changes in the hardware and software environment;
3. physical maintenance of the appropriate hardware and software environment; or
4. virtual maintenance of the appropriate hardware and software environment.
The literature describing these four processes, or promoting one or another of them, is complex and, necessarily, speculative. The explicit speculation tends to focus on what is technically possible or administratively plausible. The social and economic components of any digital preservation program are assumed more often than they are stated or examined. No one knows what a national- or international-level digital preservation program would cost or how its implementation would be structured. However, institutions such as the National Library are trying to grapple with planning for a national-level archiving and preservation program for the humanities and social sciences, looking at such issues as funding requirements, level of preservation, criteria for selecting objects to be preserved, and working over time with electronic objects whose original format and associated software have ceased to be generally supported. Their efforts are discussed in a forthcoming summary of an interview between the Electronic Text Centre and National Library staff, one of a series of interviews with key stakeholders.

Preservation of human artifacts has been largely a societal, rather than an individual or corporate, activity. Funding for libraries, museums, and archives is implicitly contingent on their acceptance of various preservation responsibilities and the recognition by national and provincial level funding bodies of the need for digital preservation. Digital preservation is different from traditional preservation in the sense that no agency is currently being funded to provide national level digital preservation. The institutional will to create digital archives and the societal support to maintain them may develop, but if they do not, it really will not matter whether the technical problems raised by digital archives can be solved.

Adherence to standards is sometimes suggested as an additional method for digital preservation, but it is not a preservation process. Standards are only as good as their level of social acceptance. Even those that have wide acceptance may subsequently be rendered obsolete or irrelevant by changes in the external environment.

No academic or research library aims to preserve the entire corpus of print publications. Preservation of the scholarly record -- consisting of peer-reviewed publications plus the indexing and abstracting tools used to access this literature -- is generally considered essential to the role of academic and research libraries. A pressing digital preservation problem lies not in this mission but, rather, in the incredible expansion over the last decade or so of the number and variety of digital objects that are not a part of the scholarly record, per se, but are cited within the scholarly record or might be the subject of future scholarly investigation.

Text Fluidity / Version Control

The issue of text fluidity is a red herring as far as digital preservation is concerned. A fluid is not fixed, firm or stable. A text is fixed, firm and stable, at least for the duration of any reading. The conjunction of the two concepts, "text fluidity," produces one of those jarring metaphors intended to enlighten but inclined to mislead and confuse; a more neutral and clarifying phrase is "version control."

Any modifications to a text, of any sort, produce distinct versions, all related to the original text; modifications to a text do not produce a text that is fluid. Naming versions of a text and specifying the differences between the revision and the original text are the two elements of version control. Were one to consider or preserve multiple versions of a given text, every version of that text could be uniquely named and its relation to the first version (and others) could be specified, if such relational specificity were warranted given the texts or objects involved.
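The two elements of version control described above -- uniquely naming versions and specifying the differences between a revision and the original -- can be sketched in a few lines. This is an illustrative example only, not part of the report; the version names and sentences are invented.

```python
import difflib

# Two versions of a short text, each uniquely named
# (names and content are invented for illustration).
versions = {
    "article-v1.0": "Peer review is the cornerstone of academic publication.\n",
    "article-v1.1": "Peer review remains the cornerstone of scholarly publication.\n",
}

# unified_diff specifies, line by line, how the revision differs from
# the original, labelling each side with its unique version name.
diff_text = "".join(difflib.unified_diff(
    versions["article-v1.0"].splitlines(keepends=True),
    versions["article-v1.1"].splitlines(keepends=True),
    fromfile="article-v1.0",
    tofile="article-v1.1",
))
print(diff_text)
```

The printed diff names both versions and records exactly which lines changed, which is all that "version control" requires; nothing about the text itself is fluid.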

Simply because electronic publications can be changed with ease does not mean that the canons of scholarly publishing could be or should be modified to accommodate large numbers of variants on a single text. Scholarly publishing can be expected to change in response to the opportunities offered by electronic texts, but the change will be incremental.

The most likely scenario for version control remains as it is now: acceptance of variant texts will depend on the type of text and the discipline. Refereed journal articles, scholarly monographs, “grey” literature, and secondary scholarly texts form a continuum with regard to the toleration of changes to the text, with refereed articles being the least tolerant.

Web sites can incorporate material that goes far beyond the relation between reader and static print. Academics who invest their time, skills, and creativity in the development of such web sites need a reasonable expectation that their efforts will not be dismissed as just a new form of grey literature. Implementing some simple conventions for version control, particularly in conjunction with greater use of metadata by web site creators, would facilitate the archiving process for innovative scholarly web sites and would constitute a starting point for gaining academic acceptance. Attaining academic respectability, however, will likely require much more than formal version control: some mechanism for peer review, for instance, and criteria that a site or collection of digital objects must satisfy to be considered scholarly.
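As a hypothetical sketch of pairing version control with metadata, a web site creator might embed Dublin Core terms in a page: DC.identifier uniquely names this version, and DCTERMS.isVersionOf relates it back to an earlier one. The helper function, titles, identifiers, and date below are invented for illustration.

```python
def dc_meta_tags(fields):
    """Render a dict of Dublin Core fields as HTML <meta> elements."""
    return "\n".join(
        f'<meta name="{name}" content="{value}">'
        for name, value in fields.items()
    )

# Invented example record for a scholarly web edition.
tags = dc_meta_tags({
    "DC.title": "An Example Scholarly Web Edition",
    "DC.identifier": "edition-v2.0",      # unique name of this version
    "DCTERMS.isVersionOf": "edition-v1.0",  # relation to the earlier version
    "DCTERMS.issued": "2000-11-15",
})
print(tags)
```

Such machine-readable version statements would give an archiving program exactly the naming and relational information that the version-control conventions above call for.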

Methodology, Survey / Questionnaire


As noted above, the team has based its observations and recommendations on an analytical review of the growing bodies of literature, national and international, related to the areas specifically identified by the HSSFC as integral to the perceived credibility of electronic scholarly publication. That review was combined with useful materials beyond our specific mandate, with attention paid to initiatives of particular relevance to the Canadian academic community, and it takes into account data gathered by two surveys: one, discussed immediately below, which addressed issues of importance to all areas of this report; and another, discussed in the section above on Archiving and Text Fluidity / Version Control, which addressed issues specific to that section of the report and necessitating specialized expertise.

Survey / Questionnaire

As per our proposal, members of the team conducted a survey, via a questionnaire, of a representative sample of publishers, scholars, representatives of the Aid to Scholarly Publication Programme, reviewers, university administrators, and academic authors; this survey has illuminated the international literature review and has helped us to situate its results specifically within the Canadian context that this report seeks to serve.

The questionnaire was designed in consultation with people experienced in quantitative data gathering and analysis; its contents also reflect input on the full range of concerns addressed by this report. Geoffrey Rockwell (McMaster U) reports on this survey, with data analysis by Lynne Siemens (Malaspina U-C); the questionnaire itself was designed jointly by the research team, first collated and administered under the leadership of Joanne Buckley (McMaster U), then developed further and its delivery overseen by Ray Siemens (Malaspina U-C), Michael Best (U Victoria), and a team based at both the University of Victoria and Malaspina University-College.


Conference Info

In review


Hosted at New York University

New York, NY, United States

July 13, 2001 - July 16, 2001

94 works by 167 authors indexed

Series: ACH/ICCH (21), ALLC/EADH (28), ACH/ALLC (13)

Organizers: ACH, ALLC