
Technology as Agent for Transformation:
Five Case-Studies of University Libraries as Facilitators for Electronic Scholarly Publishing

Andrew Treloar

Senior Lecturer in Information Management

School of Computing and Mathematics

Deakin University, Rusden Campus



This paper was presented at the Victorian Association for Library Automation (VALA) 1998 Bi-Ennial conference. This document is the online version of the paper published in the conference proceedings. Minor changes to the layout and referencing have been made to accommodate its new online existence. The text itself has not been changed. The slides I used for my presentation are also available online. This version last modified February 24, 1998.


Libraries have always played a central role in providing access to information. They have often played a minor role in the production of information artefacts. The new electronic technologies have lowered the barriers to publishing, and a number of libraries are seizing this opportunity to redefine their roles within the information universe. This paper discusses the results of a research project which examined the role of academic libraries as facilitators for scholarly electronic publishing. The research involved initial discussions via email followed by intensive site visits to a number of representative libraries in the US, UK, and Europe. This paper considers each of the selected publishing projects in turn, looking at their mission, how they began, their current organisational structure, the publishing products they produce, the lessons other libraries can learn from their experiences, and their future prospects.


Theory and practice

It has become a truism to say that technology is changing everything. Libraries are naturally not immune to this transforming process. Other papers at this conference will discuss aspects of this technological change: the role of technology in changing internal processes within libraries and the ways libraries interact with their users. In contrast to these present changes, the predominant technology for libraries over the last three hundred years or so has been print, typically instantiated as monographs and serials. For most libraries this is still the most important technology, and will remain so for many applications into the foreseeable future.

David Kaufer and Kathleen Carley in their book Communication at a Distance [1] discuss the advent of print and its influence on the world of communication we inhabit. Their work focuses on the entire interaction cycle of communication as the unit of analysis. This shifts the emphasis from particular elements in the communicative transaction to a single communication ecology.

This communication ecology consists of agents (which may be individuals, communications artefacts or software) exchanging communicative transactions containing some content within some overall context. Ecology evokes images of richness, interdependence, and evolving complexity and moves beyond the traditional open systems model. The idea of the ecological constructural framework is to build on this simple model by regarding all these components of communication as mutually defining, co-adaptive, and co-evolving. In the context of research into new forms of communication, such a model means that attempting to consider any one component in isolation is fraught with difficulties. Changes in one area will almost certainly prompt unpredictable changes in another. Later work by Kathleen Carley [2] moves beyond the print focus of the earlier work and starts to suggest some tentative ways in which this analysis might deal with some of the new communication technologies, without actually discussing any of those technologies.

The most important set of new technologies for libraries has been the communication and computing technologies. These have provided the potential for transformations of professional practices and intermediary processes. If one considers scholarly publishing as a subset of a general communications technology, the relevant stakeholders are publishers, scholars (as both consumers and producers of content), scholarly societies (as representatives of scholars and as publishing intermediaries in their own right) and librarians. Since the advent of printing, libraries have traditionally been associated with providing access to content created by other people. This access provision has involved a range of interrelated roles: acquiring, cataloguing and archiving primarily print materials. Few libraries have seen this role extending to publishing the material themselves or becoming actively involved in such publishing. This is even the case for many of the current digital library initiatives. Papers looking at the roles of digital libraries [3] and assumptions underlying digital libraries [4] in the April 1995 Digital Libraries special issue of CACM hardly mention the library as publisher. Prior to the development of printing, monastic libraries were strongly associated with publishing. Do the new technologies allow this link between libraries and publishing to be reborn in digital scriptoria? What are the likely impacts of such a transformation on the communications ecology that libraries inhabit?

Research design

This paper looks at five projects where a university library is acting as a significant facilitator for electronic scholarly publishing, rather than just as an access point for material provided by a publisher. The intention was to select projects that differed on a range of measures: organisational structure, geographical location (still important in this increasingly wired world), and type of published product. Because of the audience for this conference, I have concentrated in this paper on how the projects came to exist, how they are organised, what they provide, their level of commercial sustainability, the lessons learned by the participants that might be applicable to similar projects, and the likely future prospects for each project.

The research methodology was to first investigate potential candidate projects via a structured literature and Web search. Once the candidates had been identified, I contacted the most suitable personnel via electronic mail, outlining my research and requesting their cooperation. I also asked if I should be talking to anyone else involved in the project. Once I had received agreement, I undertook intensive research into the project based on publicly available material. Each site was visited and senior informants (directors or assistant directors) from each project took part in a structured interview.

The Projects

The projects were selected to provide as representative a sample as possible. Details of the projects are discussed below. The geographical range of the selection ended up encompassing both coasts of the United States, the United Kingdom and Europe. No Australian projects (of which there are a number) were included in this phase; the funding arrangements require that projects studied be located outside Australia. Together the projects selected provide a good picture of the diversity in the field. Given the extremely fluid nature of the electronic publishing field, specific comments about the projects are only valid as of the time of writing of this paper (early December, 1997). The projects are discussed in alphabetical order.

Highwire Press

Highwire Press is an initiative of Stanford University Libraries/Academic Information Resources. It was selected because it is implementing leading-edge Web-based ejournal technologies and because it is commercially successful (i.e. not just a pilot). Stanford also has a number of other interesting projects, including the Digital Libraries Initiative, and I was interested in possible synergies.

The stated mission of Highwire is to:

My informant at Highwire Press was Vicky Reich, Assistant Director and Digital Librarian at Stanford University's Green Library. I also attended a talk by Mike Keller when he was visiting Australia in late 1996.

Origins and organisation

Librarians have complained for a long time about the rising cost of STM materials and the need to regain some control of the scholarly literature. Highwire grew out of that frustration. When Mike Keller came to Stanford four years ago, he was active in some committees with Robert Simoni, Professor of Biological Sciences and

Highwire currently operates as a separate cost centre within the library, with the Publisher of Highwire, Michael Keller, also being the University Librarian, and Director of Academic Information Resources. At the time of writing the Highwire Press team listed on their homepage consists of 26 people (including support staff). The more senior Highwire staff also fulfil other positions within Stanford University. The project is currently commercially sustainable, in line with Stanford's policy of extensive charge-back for services. It is not however seen as a way for the university or library to make a profit. Rather, it is viewed as a cost-recovery exercise with both tangible and intangible benefits for the university. Because of its organisational location within the library and physical location in Silicon Valley, the infrastructure was already in place to ease the startup process.


Implied in the Highwire Press mission statement is their intention to provide a model for re-engineering scholarly communication. To this end, they are working in partnership with scholarly societies to bring existing print journals online. The first of these, The Journal of Biological Chemistry, has now been joined by 24 largely biomedical publications which are available in both print and electronic form. An additional five or six dozen titles will be available online within the coming year (consult their homepage for details). For many of these, a personal connection between journal personnel is what drove the development of successive journals.

These journals are at the leading edge of Web-based journal publishing and are progressively adding a number of value-added features that are only possible in an online environment. These include:

Lessons learned

The critical factors for Highwire's continuing success were stated to be a mix of the academic setting, physical setting, technology and support of the university administration. The academic setting within Stanford provides rich connections to subject domain experts and academic staff. The involvement of library staff at all levels ensures a deep understanding of the requirements of all the participants. As librarians, it is their job to deliver information directly to university researchers and understand their needs. Highwire also helps the publishers deliver intellectual property through librarians to researchers. Finally, as librarians themselves, they understand the special needs and constraints that librarians have; they are their own clients. The physical location of Stanford within Silicon Valley facilitates a range of cooperative projects with leading-edge computer companies, as well as grapevine access to the latest technology news. The right technology mix at Stanford includes a very talented staff and an outstanding networking infrastructure, essential for ensuring adequate client access. By the same token, projects like Highwire based on the latest Web technologies need skills that are in high demand throughout Silicon Valley at present. Compared to a computer startup company, a university can't offer the promise of stock options and becoming rich. This can be a problem in recruiting staff.

Future prospects

Vicky Reich hopes that over the next five years the technology will continue to evolve but will also trickle down. Highwire Press will still be doing the leading-edge sophisticated online publishing, but making university-published literature and technical reports available online will be seen as routine. Of course, as more and more material becomes available, the challenge is to provide search tools powerful enough to present a coherent information space. Another likely outcome is the transition of existing print journals to an electronic-only existence within the next five years, thus providing a saving on distribution costs for the societies that provide these journals to their members.

Internet Library of Early Journals (ILEJ)

The Internet Library of Early Journals (ILEJ) is a joint project between the Universities of Birmingham, Leeds, Manchester and Oxford. It aims to digitise a critical mass (defined as at least 20 consecutive years) of three eighteenth-century journals (Gentleman's Magazine, The Annual Register, and Philosophical Transactions of the Royal Society) and three nineteenth-century journals (Notes and Queries, The Builder, and Blackwood's Edinburgh Magazine). While the journals are not extremely rare, only perhaps 20-25 sets of each are extant, so the digitisation will need to be done on a non-destructive basis. ILEJ was selected as an example of a digitisation project working with non-scientific and older serial material.

The project aims to explore the issues associated with making this sort of material available as well as providing access to it. The variables they are particularly interested in are image creation, indexing techniques and Web access to page images. A number of their working decisions have been made with an eye to reducing the cost of doing this sort of work as far as possible. The intention behind providing access to the material in digital form is to facilitate access by researchers (through desktop access and search mechanisms) and to reduce the need for physical handling of the originals. The project aims to mount 120,000 page images in all.

My main informant for ILEJ was Peter Leggate, Keeper of Science Books, Radcliffe Science Library, Oxford University. I also had discussions with a number of the technical staff at the Bodleian Library.

Origins and organisation

The project began as a result of a successful bid for funding from the eLib programme, set up after the 1993 Libraries Review by the UK Higher Education Funding Councils, chaired by Professor Sir Brian Follett (sometimes called the Follett Report). The Bodleian Library already had an interest in digitisation projects. They decided to focus on materials from the 18th and 19th centuries, in part because of copyright problems with newer material, and to choose materials that were heavily used. The original bid from the Bodleian included journals, newspapers and slides. The eLib programme identified a number of similar digitisation projects and suggested the eventual consortium with Birmingham, Leeds and Manchester. The project effectively commenced in late 1995.

Peter Leggate is the joint project leader with Hugh Wellesley-Smith, Deputy Librarian, Edward Boyle Library, University of Leeds. The management of the project is being coordinated from Oxford. Scanning the journals from microfilm and keyboarding the index entries also takes place at Oxford. Scanning from hard copy is being undertaken at Birmingham and Manchester. Servers to mount the images are located at Oxford and Leeds.

The project is not currently commercially sustainable, and may in fact never be, given the nature of the material being digitised. The project team are actively considering the best way to proceed once the initial funding has been allocated (see below).


Figure 1: Notes and Queries, Vol. 6, No. 140, p. 1.

At the time of writing, fifteen years of Notes and Queries have been made available. The selected journals consist of a wide mixture of intermingled article types laid out on the page in a somewhat unstructured manner, which means that providing page images is the simplest way to present the material. Each image is between 100 and 200K in size. Pages can be browsed by volume/number/page or searched directly. The input to the search database is OCR text from the page scans. This OCR is done using off-the-shelf software (designed for modern typefaces and layouts) and without manual intervention (because of the prohibitive cost of correcting errors). The result is quite 'dirty' OCR, which would normally present problems in searching. As a partial solution, the Excalibur Technologies EFS search engine is being used for full-text fuzzy matching on the dirty OCR text.

A section of one of the journal pages made available is shown in Figure 1.
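Excalibur's EFS engine uses its own proprietary adaptive pattern recognition, which the paper does not describe. Purely as a toy sketch of the general technique of fuzzy matching over noisy OCR text, the following compares character trigrams with a Dice similarity score; all function names and sample data here are invented for illustration.

```python
# Toy sketch of fuzzy search over 'dirty' OCR text using character
# trigrams. Illustrative only: this is NOT Excalibur EFS's algorithm.

def trigrams(text):
    """Return the set of character trigrams of a normalised string."""
    t = " ".join(text.lower().split())
    return {t[i:i + 3] for i in range(len(t) - 2)}

def similarity(query, passage):
    """Dice coefficient between the trigram sets of query and passage."""
    q, p = trigrams(query), trigrams(passage)
    if not q or not p:
        return 0.0
    return 2 * len(q & p) / (len(q) + len(p))

def search(query, pages, threshold=0.5):
    """Rank pages whose best-matching word window exceeds the threshold."""
    hits = []
    span = max(len(query.split()), 1)
    for page_id, text in pages.items():
        words = text.split()
        best = 0.0
        for i in range(max(len(words) - span + 1, 1)):
            window = " ".join(words[i:i + span])
            best = max(best, similarity(query, window))
        if best >= threshold:
            hits.append((page_id, round(best, 2)))
    return sorted(hits, key=lambda h: -h[1])

# OCR errors ('Gentleman' -> 'Gentlernan') still share most trigrams
# with the intended word, so the page is found despite the noise.
pages = {
    "p1": "the Gentlernan's Magazine for ianuary",  # dirty OCR
    "p2": "an unrelated page about bridges",
}
print(search("Gentleman's Magazine", pages))
```

Because most trigrams of a lightly garbled word survive intact, a clean query still scores well above the threshold against the dirty text, which is the intuition behind searching uncorrected OCR.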

Lessons learned

Critical to the success of ILEJ has been the collaboration between sites which provides a wider pool of expertise and technology to draw on. Of course, this has also meant an increased coordination load, taking the time of staff who were paid to do other jobs and who were too busy anyway. This factor was exacerbated by the decision (taken after careful analysis) not to appoint a project manager and to use the funds saved for technical staff instead.

Another difficulty encountered was delays in the availability of necessary scanning technology. This meant a consequential lag in appointing scanning staff and a lack of content in the early stages of the project. If the project had been able to acquire a stock of page images earlier, then they could have modified the servers and programmed functions in a more informed way.

Peter Leggate also made the point that ILEJ is strongly committed to offering a useful service to the scholarly community, not just doing a digitisation project. While this has been a fruitful model, it has also involved a greater spread of activities.

Future prospects

The ILEJ project has been able to extend its operations outside the originally funded period because of early delays caused by waiting for equipment. They have also been able to save some money through higher than expected throughput. At present, the original grant of £338,000 is anticipated to last until August 1, 1998. Some possible extension options for funding are:

The issues faced by ILEJ are those that will have to be faced long-term by the newly proposed hybrid libraries. Very rare and old materials will probably be digitised anyway. Current materials will increasingly become available in electronic form. ILEJ is dealing with the issue of what to do with the vast bulk of material in the middle, although it has been able to avoid the vexed issue of copyright. The most likely solution is to digitise high-demand materials first and then digitise on demand for the remainder (although funding this on-demand work may be difficult). With respect to material still covered by copyright, Peter Leggate's view is that libraries need to convince journal publishers that there is little money to be made from their older material. One suggestion he made is that after 20 years, materials could perhaps be digitised on demand with a minimal licence cost. Getting the publishers to agree to this may be problematic, of course.

Project EDUCATE

Project EDUCATE (End-user Courses in Information Access through Communication Technology) is a joint initiative of Limerick University in Ireland, the École Nationale des Ponts et Chaussées in France, the University of Barcelona in Spain, Chalmers University of Technology in Sweden, and the Imperial College of Science, Technology and Medicine and Plymouth University in the United Kingdom. The overall aim of the project is to help students, research workers and practitioners to develop their information literacy. EDUCATE was selected because it was publishing online teaching support materials (rather than journals) and because it provided a Nordic/European perspective.

My informants for Project EDUCATE were:

Origins and organisation

The initial idea came from Nancy Fjällbrant, arising from her interest in user education. DG XIII from the European Union had given some talks about possible projects, and Jan Rohlin thought it would be good to have some international projects, despite the EU jargon. Nancy Fjällbrant identified a range of weaknesses in user-education in parts of Europe and elsewhere. EDUCATE was originally targeted towards librarians, to replace paper-based materials and to assist them with user-education. The EDUCATE team worked on the idea for one year developing their own software client, but when the Web came along the decision to move was an easy one. The hyperlinking in the Web environment allowed them to move further and further away from a strictly linear product and thereby to make it more useful.

The decision to seek EU funding arose from the need to raise the amount of money required to do things properly (very hard to do on a local scale in a small country like Sweden). A successful bid for funding was made under the European Union Telematics for Libraries programme (Third Framework). The original funding was for a three-year period from January 1994 through February 1997. Chalmers University and the Library (as well as some other funding bodies) contributed an equivalent amount of money. The project was in part driven by a desire to do it anyway; the funding just allowed this to happen more effectively and sooner.

Nancy Fjällbrant was able to draw on her International Association of Technology University Libraries (IATUL) contacts to get assistance. Under the grant proposal, Limerick University was to be the overall coordinator. In practice, Chalmers University of Technology Library provides the day to day technical and administrative management. Imperial College are doing some demonstration versions and courseware. Limerick did some of the interface design and Web development. Translation (and adaptation - using and linking to different resources) is being done into French and Spanish at Barcelona and Ponts et Chaussées.

The intention was that EDUCATE should be commercially sustainable immediately. There is a need for a revenue stream for two reasons: maintenance of existing materials and development of new ones. The experience of other electronic publishing projects has been that initial funding is easier to get than maintenance funding. Jan Rohlin wanted to avoid this and has specified that new modules can only be developed with new money. The revenue stream for maintenance comes from licence fees. These have been set as low as possible consistent with getting enough funds. The total from these fees is not yet at the break even point but the trend is looking promising.


The main product of Project EDUCATE has been a series of Web-based self-paced user education courses called Into Info. These provide training in the selection and use of information tools and resources in particular subject areas (chemistry, physics, electrical and electronic engineering, and energy to date). All the Into Info modules are based on a multi-level hierarchical structure with rich internal and external hyperlinking. The first level of the hierarchy offers:

An example Pathfinders map might look like Figure 2. Each point on the diagram is a link to a corresponding Web page or set of pages.

Figure 2: Pathfinders Image Map

As well as providing users with skills in using the information resources, the Into Info modules act like an annotated bibliography of the highest-quality resources in each discipline area. In this way, users of the modules learn simultaneously how to access resources and which are the best to access.

These courses are both used within the member universities and site-licensed to other universities for use in their own user-education programs.

Lessons learned

The critical success factors for EDUCATE were identified by its director as first of all having a good idea, having the Web (and the idea of the universal browser) come along at the right time, and the importance of checking with the user community throughout the development process. EDUCATE was able to build on the reputation of Nancy Fjällbrant in the user education community and the brand name of Chalmers. The challenges (and benefits) of inter-institution and international cooperation were an issue for EDUCATE and would probably be for any similar project.

Future prospects

Jan Rohlin believes that the project will be of interest for another 3 years, but will have been transformed and reworked completely by that time. At present there are no obvious competitors for provision of this sort of material, but this may not be the case in the future. Modules for different content areas are a possibility with additional funding. The Dedicate project (for which funding is being applied at present) will be focussed on distance education into Eastern Europe and the Baltic states, based on the EDUCATE model and with a focus on information retrieval.

Project Muse

Project Muse is an initiative of the Johns Hopkins University (JHU) Press and the Milton S. Eisenhower Library at JHU. It provides worldwide networked access to the full text of over 40 of the Press's electronic journals. Project Muse was chosen because it is one of the very first Web-based electronic journal projects, and has a predominantly humanities coverage in contrast to Highwire.

My informants at Project Muse were:

Origins and organisation

Project Muse arose originally from informal discussions between Susan Lewis and Todd Kelley. They both identified the strengths of the journals published by the Press and felt that moving into electronic publication would have significant advantages. The initial development took place in a very bottom-up way, driven by the commitment of these two individuals. The original aims were to:

Both the library and the press recognised that significant seed-funding was required. They made the decision to seek grants from the National Endowment for the Humanities (NEH) and the Andrew Mellon Foundation, and to use the Web (then very new) as the technology platform. These grant applications were both successful, and development commenced.

The library helped enormously with setting up the project and with fundraising. They developed the university-wide relationships between the various players. They also had the technical expertise in online information (but none in publishing), so Todd Kelley and Susan Lewis partnered. Susan Lewis got the files from the press's production system and had them translated into Postscript. The library provided the server, organised the information, and embedded metadata into the articles to assist with searching. In effect, the Press provided the content, and the library added value through cataloguing, searchability and dissemination. The project is still a joint initiative and commercially sustainable; the grants finish at the end of 1997. The project is also starting to provide access to journals that do not come from the JHU Press.
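The paper does not say which metadata scheme Muse used to make articles searchable. As a hypothetical illustration of the general technique of embedding descriptive metadata in an article, the sketch below writes Dublin Core-style <meta> tags (a common convention of the period) into an article's HTML head; all field names and values here are invented.

```python
# Hypothetical sketch: embedding descriptive metadata into an article's
# HTML so a search engine can index fields as well as full text.
# The Dublin Core element names are a period convention; the paper does
# not specify Project Muse's actual scheme.

from html import escape

def embed_metadata(body_html, fields):
    """Wrap an article body in an HTML page whose <head> carries
    one <meta> tag per descriptive field."""
    metas = "\n".join(
        f'  <meta name="{escape(name)}" content="{escape(value)}">'
        for name, value in sorted(fields.items())
    )
    return (
        "<html>\n<head>\n"
        f"{metas}\n"
        "</head>\n<body>\n"
        f"{body_html}\n"
        "</body>\n</html>"
    )

article = embed_metadata(
    "<p>Article text...</p>",
    {
        "DC.Title": "An Example Article",     # invented sample values
        "DC.Creator": "A. Scholar",
        "DC.Date": "1997-12-01",
    },
)
print(article)
```

An indexer can then extract these named fields for fielded search (by author, title, date) while still indexing the body for full-text queries.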


Project Muse provides access to the full text of over 40 of the Press's scholarly journals (but not complete runs of each journal) in the humanities, the social sciences, and mathematics. About 3 to 4 outside journals will be added each year from now on. Most of these journals are published in print as well but some are electronic only. No matter what category the journal belongs to, all titles provide:

The preferred way for Muse to provide access is through consortia of universities. This reduces the administrative overheads for the project and enables them to provide discounts to the consortium members. Some of the Muse licences are for single universities, and from the end of 1997 they will be supporting individual user subscriptions.

Lessons learned

As one of the earliest projects, Muse is now a mature system whose success can be regarded as established. Critical in getting to this point was the support of senior management in the library (particularly important for a project like Muse that started bottom-up), and the commitment of the early project staff in making it work and dealing with obstacles aggressively. Susan Lewis commented that in the early days they probably spent too much effort on getting the product right and not enough on marketing and promotion. Other than that, both Susan Lewis and Todd Kelley commented that they got things about right.

Future prospects

Bearing in mind that Todd Kelley is no longer associated with Project Muse, when he started out he thought there were 3 stages (each about 3 years) that they would go through:

i. Startup period

During this phase, he believed that on the editorial side (the first-copy content costs) print would subsidise electronic. It has essentially worked out that way. The money coming in from Muse is not going to royalties or editorial costs.

ii. Mature system

This would be characterised by up to 500 sites (or thereabouts) licensing Muse journals. As the project moves into this stage, Muse has to pay its own way because the grant money has run out, and it can start to pay royalties for electronic access on top of print. This is essentially where the project is now. He predicted that as libraries start to drop print, some of the press's costs will rise (due to smaller print runs). As these costs change, the press should decouple the prices of print and electronic to better reflect their individual costs. This in turn drives the transition of libraries from print to electronic. Around year 4 he expects print and electronic subscription rates to cross over. One can then add more and more subscriptions to the electronic product at near-zero marginal cost, which is not the case for print.
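The economics Kelley describes can be made concrete with a small cost model using entirely invented numbers: first-copy (editorial) costs are fixed and shared by both formats, print adds a significant per-subscription cost, and electronic adds almost none, so per-copy print costs rise sharply as print runs shrink.

```python
# Illustrative cost model with invented figures, sketching why shrinking
# print runs push per-copy print costs up while electronic stays flat.

def per_copy_cost(fixed, marginal, subscribers):
    """Per-subscription cost: a share of fixed first-copy costs
    plus the per-copy marginal cost of the format."""
    return fixed / subscribers + marginal

fixed = 50_000           # invented first-copy (editorial) cost per year
print_marginal = 40      # invented printing/distribution cost per copy
electronic_marginal = 2  # near-zero marginal cost per electronic copy

for subs in (1000, 500, 250):
    p = per_copy_cost(fixed, print_marginal, subs)
    e = per_copy_cost(fixed, electronic_marginal, subs)
    print(f"{subs:4d} subscribers: print ${p:.0f}, electronic ${e:.0f}")
```

As the print run falls from 1000 to 250 copies, the fixed costs are spread over fewer subscribers and the gap between the two formats' per-copy costs stays roughly constant in absolute terms, while extra electronic subscriptions remain almost free to add.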

iii. All ties between print and electronic are cut

In this phase, print sales start to slip. This puts the Muse model and approach out front rather than tagging onto print. He hopes by end of 1998 that the electronic version will be the journal of record. Todd Kelley has been recommending that the press start to plan for print on demand for those who will still want print. He has also been recommending a document delivery service for those who want it. When I asked about the long term future of print, he indicated that he thinks print will be gone altogether by 2003.

Scholarly Communications Project

This project, based in the library at Virginia Polytechnic Institute and State University (Virginia Tech), has been working on a range of publishing activities since 1989. It was selected because of the diversity of its activities, and because it operates in a service rather than cost-recovery environment. The project:

"assists primarily Virginia Tech faculty who are editors of professional journals when they want to also make their publications available to their colleagues in distributed academic communities via the Internet. It assists traditional academic publishers adapt their publications to the Internet and access by the worldwide academic communities. It also works with a variety of units within the university to extend access to their clients locally, regionally, nationally, and internationally." [8]

My main informant at the Scholarly Communications Project (SCP) was Gail McMillan, Project Director since 1994. I also talked to James Powell, Technical Director, and Professor Ed Fox, who has been active in the Electronic Theses and Dissertations initiative.

Origins and organisation

The project was originally established in the autumn of 1989 by the then Vice President for Information Systems. It was decided early on that the most appropriate organisational base was the library. Its current director, Gail McMillan, is also in charge of Special Collections. Over time, the project has added more and more online publishing activities, and is seen by the university community as the experts in this domain. The funding for most project activities is provided as part of the library budget. Some funding for specialist activities comes from outside agencies. Gail McMillan sees her role as a service-oriented librarian: to say yes to any reasonable request from within the university and then work out how to resource it.


The digital products produced through the project now include:

This fairly eclectic portfolio reflects the willingness of the project to respond to requests for assistance from their user community.

Lessons learned

The critical success factors for the SCP include reliable technology and excellent technical support (deriving from the strong cooperation between the project and the library automation staff), realistic expectations from the journal editors, and a degree of personal freedom for the project director to pursue new initiatives. The project is free to experiment, make mistakes and take risks. One theme that kept emerging in the discussions was the importance of a service orientation and a commitment to make things happen.

Future prospects

The SCP is one of the things that Virginia Tech likes to show off to new visitors. This commitment is reflected in the allocation of a whole floor of a new building to the project. Gail McMillan has been involved in the design of this floor, which includes space for project managers, programmers and student workers, as well as a conference room. The SCP is not seen as a press that goes off-campus; it is rather seen as a facilitator and outlet for content created by faculty and students. They are also thinking about setting up ad-hoc review boards to referee individual papers that do not form part of a journal but which still receive the VT seal of approval. Overall, the future looks very bright.


Conclusion

Each of these projects has used the potential inherent in the technology in different ways, but each has been driven by a desire to provide a better service to a user community. This might range from providing features that cannot be duplicated in paper (as Highwire is doing) to providing access to rare and fragile materials (as in the case of ILEJ). Despite the differences in organisational structures, funding models and products, certain themes can be identified across all the projects.

If technology is the agent of transformation, then getting the technology right is essential. This might mean having something like the Web come along at the right time (Muse, Highwire, EDUCATE), or being prepared to cope as the technology changes every six months. The importance of having reliable technology infrastructure and support was stressed by many informants.

The second element was a willingness to take some risks and be prepared to fail, rather than follow the safer and more boring path of waiting for someone else to do it first. Each of these projects is a leader in its field, in part because of this.

Last, and perhaps most important, are the people involved. Nothing can substitute for dedicated, skilled and visionary project staff to do the work, and management to support them and run interference. Technology can do many things, but it cannot yet replace the very special people associated with each of these projects.


Acknowledgements

I would like first to thank all those from the projects discussed who gave so generously of their time to answer my questions. I was delighted with the willingness of people I had never met before to spend time discussing their work with me.

I must also acknowledge the generous and far-sighted funding for this study provided by the Victorian Association for Library Automation (VALA), the sponsors of this conference, through their travel scholarship scheme.

The interview questions were developed with valuable input from Professor Don Schauder, School of Information Management and Systems, Monash University.

Last, but certainly not least, let me thank my wife Dawn and my sons Mark and Iain for their forbearance with my necessary absences during the year to perform this research.


References

1. Kaufer, D. and Carley, K. Communication at a Distance: The Influence of Print on Socio-Cultural Organization and Change. Hillsdale, NJ: Lawrence Erlbaum and Associates, 1993.

2. Carley, K. "Communication Technologies and their Effect on Cultural Homogeneity, Consensus, and the Diffusion of New Ideas", Sociological Perspectives 38, no. 4 (Winter 1995): 547-571.

3. Marchionini, G. and Maurer, H. "The Roles of Digital Libraries in Teaching and Learning." CACM 38, 4 (April 1995), 67-75.

4. Levy, D.M. and Marshall, C.C. "Going Digital: A Look at Assumptions Underlying Digital Libraries." CACM 38, 4 (April 1995), 77-84.

5. Newman, Michael. "The Highwire Press at Stanford University: A Review of Current Features." Issues in Science and Technology Librarianship (Summer 1997).

6. Vicky Reich, Personal Communication via email, December 4, 1997.

7. "MUSE - General Introduction", in MUSE Homepage

8. "About Scholarly Communications Project VPI&SU" in Scholarly Communications Project Homepage ÿ