
Is there a bibliographic emergency?

The Bibliographic Control Working Group held its third and final public meeting on the future of bibliographic control July 9 at the Library of Congress, focusing on “The Economics and Organization of Bibliographic Data.” The group’s conclusions will come in a report to be issued in November. No dramatic changes emerged from this meeting, and public comment is invited until the end of July.

The meeting included several panels, invited speakers, and an open forum (with a public webcast). Deanna Marcum, Library of Congress associate librarian for library services, framed the discussion by saying, “Worries about MARC as an international standard make it seem like we found it on a tablet.” She went on to say, “Many catalogers believe catalogs…should be a public good, but in this world, it’s not possible to ignore economic considerations.” Marcum said no LC budget line provides cataloging records for other libraries, though the CIP program has been hugely successful.

Value for money
José-Marie Griffiths, dean of the library school at the University of North Carolina, Chapel Hill, said there are three broad areas of concern: users and uses of bibliographic data, different needs for the data, and the economics and organization of the data. “What does free cost?” she asked. “Who are the stakeholders, and how are they organizationally aligned?”

Judith Nadler, library director, University of Chicago, moderated the discussion and said the format of the meetings was based on the oral and written testimony that was used to create Section 108 of the Copyright Law. Nadler joked that “We will have authority control–if we can afford it.”

Atoms vs bits
Rick Lugg, partner, R2 Consulting, has often spoken of the need for libraries to say no before saying yes to new things. His PowerPoint-free presentation (at Marcum’s request, no speakers used slides) focused on how backlogs are invisible in the digital world. “People have difficulty managing what they have,” Lugg said. “There is a sense of a long emergency, and libraries cannot afford to support what they are doing.”

Using business language (R2 often consults for academic libraries on streamlining processes), Lugg said libraries are not taking advantage of the value chain. Competitors now challenge libraries in the area of search, even as technical services budgets tighten.

In part, Lugg credited this pressure to the basic MARC record becoming a commodity, and he estimated the cost of a record created through original cataloging at $150-200. He challenged libraries to abandon the “cult of perfection,” since “the reader isn’t going to read the wrong book.”

Another area of concern is the difficulty of maintaining three stove-piped bibliographic silos: MARC records for books, serials holdings for link resolvers, and A-Z journal lists. With separate print and electronic records, the total cost of bibliographic control is unknown, particularly across a lifecycle that includes selection, access, digitization, and storage or deaccession.

There is a real question of inventory control vs. bibliographic control, Lugg said. The opportunity cost of current processes raises the question of whether libraries are putting their effort where it yields the most benefit. With many new responsibilities coming down the pike for technical services, including special collections, rare books, finding aids, and institutional repositories, libraries are challenged to retrain catalogers to expand their roles beyond MARC into new formats like MODS, METS, and Dublin Core (a small taste of Dublin Core appears below).
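To make the contrast concrete, here is a minimal sketch of a Dublin Core record built with Python’s standard library; the title, author, and ISBN are invented for illustration, and real MODS or METS records carry considerably more structure.

```python
import xml.etree.ElementTree as ET

# Dublin Core uses fifteen optional, repeatable elements
# instead of MARC's numbered fields and subfield codes.
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

record = ET.Element("record")
for element, value in [
    ("title", "An Example Monograph"),        # invented values,
    ("creator", "Author, Ann"),               # for illustration only
    ("date", "2007"),
    ("identifier", "urn:isbn:9780000000002"),
]:
    ET.SubElement(record, f"{{{DC}}}{element}").text = value

print(ET.tostring(record, encoding="unicode"))
```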

Lugg said R2 found that non-MLS catalogers were often more rule-bound than professional staff, which raises training questions. He summarized his presentation by asking:

  1. How do we reduce our efforts and redirect our focus?
  2. How can we redirect our expertise to new metadata schemes?
  3. How can we open our systems and cultures to external support from authors, publishers, abstract and indexing (A&I) services, etc.?

The role of the consortium
Lizanne Payne, director of the WRLC, a library consortium serving DC-area universities, said that with 200 international library consortia dedicated to containing the cost of content, the economics of bibliographic data is paramount. Shared catalogs and systems date from a time when hardware and software were expensive, she noted; “IT staff is the most expensive line item now.”

Payne said storage facilities require explicit shelf locations for quick retrieval, not a relative measure like call numbers. She called for matching algorithms beyond FRBR that deduplicate unique and overlapping copies using more than OCLC or LCCN numbers.
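Payne did not specify an algorithm, but the idea can be sketched: derive a match key from normalized descriptive fields so that copies of the same title group together even when their control numbers differ. A minimal sketch follows; the field names and normalization rules are illustrative assumptions, not WRLC’s method.

```python
import re
import unicodedata

def normalize(text: str) -> str:
    """Lowercase, strip accents and punctuation, collapse whitespace."""
    text = unicodedata.normalize("NFKD", text)
    text = "".join(c for c in text if not unicodedata.combining(c))
    text = re.sub(r"[^a-z0-9 ]", " ", text.lower())
    return " ".join(text.split())

def match_key(record: dict) -> tuple:
    """Compare descriptive fields rather than relying solely on
    OCLC or LCCN control numbers."""
    return (
        normalize(record.get("title", "")),
        normalize(record.get("author", "")),
        record.get("date", ""),
    )

def dedupe(records: list) -> dict:
    """Group records that appear to describe the same item."""
    groups = {}
    for rec in records:
        groups.setdefault(match_key(rec), []).append(rec)
    return groups

# Two copies with different control numbers still collapse into one group:
copies = [
    {"title": "Moby-Dick", "author": "Melville, Herman", "date": "1851", "oclc": "123"},
    {"title": "Moby Dick", "author": "Melville, Herman", "date": "1851", "lccn": "456"},
]
print(len(dedupe(copies)))  # 1
```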

Public libraries are special
Mary Catherine Little, director of technical services, Queens Library (NY), gave a fascinating overview of her library system. With 2.2 million items circulated in 2006 in 33 languages, 45,000 visitors per day, and 75,000 titles cataloged last year, Queens is the busiest library in the United States and has 66 branches within “one mile of every resident.”

Little said the library’s ILS plans are evolving (“Heard about SirsiDynix?”). For the multilingual, growing collection, Little detailed the process: catalogers first ask whether they are the first library to touch the record, then investigate whether the ILS can function with the record “today, then tomorrow,” and whether the record can be found from an outside source. The library prefers to get records from online vendors or directly from publishers, and has 90 percent of English-language records in the catalog prior to publication.

Queens Public Library has devised a model for international vendors that revolves around monthly lists of high-demand titles, especially from Chinese publishers, and standing orders. With vendors feeling the push from the library, many have entered into partnerships with OCLC.

“Uncataloged collections are better than backlogs,” Little said, and many patrons discover high-demand titles by walking around, especially audio and video. “We’ve accepted the tradeoffs,” she said.

Little called for community tagging, word clouds, and open source, extensible catalogs, and said capturing data in non-Roman scripts remains a challenge.

“Global underpinnings are the key to the future, and Unicode must be present,” Little said. “The Library of Congress has been behind, and the future is open source software and interoperability through cooperation.”

Special libraries harmonize
Susan Fifer Canby, National Geographic Society vice president of library and information services, said her library contains proprietary data and works to harmonize taxonomies across various content management systems (CMSs), enhancing content with useful metadata to give her users a Google-like search.

Canby said this work has led to relative consistency and accuracy, which helps users bridge print and electronic sources. Though some special libraries still manage print collections, most devote serious amounts of time to digital finding aids, competitive information gathering, and future analysis to help their companies connect the dots. The library is also working to associate latitude and longitude information with content to aid mashups.

The National Geographic library uses OCLC records for books and serials, simple MARC records for maps, and more complex records for ephemera, “though [there’s] no staff to catalog everything.” The big challenge is cataloging photographs: the ratio used to be 100 shots for every published photo, and now it’s 1,000 to 1. “Photographers have been incentivized to provide keywords and metadata,” Canby said. With the rise of embedded IPTC data, photographers add terms via drop-down menus, free-text fields, and conceptual terms.

The library is buying digital content, but not yet HD content, which remains too expensive because of its large file sizes. Selling large versions of its photos through ecommerce has given the library additional funds for its special librarians to do better, Canby said.

Special libraries struggle to get their organizations to implement digital document solutions, since most people use email as a filing strategy instead of metadata-based systems. Another large challenge is that most companies view special libraries as cost centers, so just sustaining services is difficult. Since the special library’s primary role isn’t cataloging, which is outsourced and often assigned to interns, the bottom line is to develop a flexible metadata strategy, collaborating with the Library of Congress and users to make it happen.

Vendors and records
Bob Nardini, Coutts Information Services, said book vendors are major providers of MARC records and may employ as many catalogers as the Library of Congress does. Coutts relies on LC CIP records, he said, and both publishers and LC are under pressure to do more with less. Nardini advocated doing more in the early stages of a book’s life, and gave an interesting statistic on the commodity status of a Library of Congress MARC record: with an annual subscription to LC records, the effective cost is $0.06 per record.

PCC
Mechael Charbonneau, director of technical services at Indiana University Libraries, gave some history on how cataloging came under threat in 1996 because of budget crunches. The Program for Cooperative Cataloging (PCC) came about in part to extend collaboration and to find cost savings. Charbonneau said PCC records are considered equivalent to LC records, “trustworthy and authoritative.” Through its four main programs (BIBCO for bibliographic records, NACO for name authorities, SACO for subject authorities, and CONSER for serials), international participants have effectively supplemented Library of Congress records.

PCC’s strategic goals include looking at new models for non-MARC metadata, being proactive rather than reactive, responding with flexibility, achieving close working relationships with publishers, and internationalizing authority files, an effort that has begun with LC, OCLC, and the Deutsche Bibliothek.

Charbonneau said that in her role as an academic librarian, she sees the need to optimize the allocation of staff in large research libraries and to free up catalogers to do new things, starting with user needs.

Abstract and indexing
Linda Beebe, senior director of PsycINFO, said the American Psychological Association (APA) has similar goals for its database, including the creation of unique metadata and controlled vocabularies. Beebe sees linking tools as a way to give users access to content. Google gives users breadth, not precision, but partnerships to link to content using CrossRef’s DOI service have started to solve the appropriate-copy problem. Though “some access is better than none,” she cautioned that in STM fields, a little knowledge is a dangerous thing.
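To make the linking concrete: a DOI resolves to the publisher’s copy through the doi.org proxy, and CrossRef exposes bibliographic metadata for registered DOIs. A minimal sketch using CrossRef’s present-day public REST API (which postdates this meeting); the DOI shown is hypothetical.

```python
import json
import urllib.request

def doi_link(doi: str) -> str:
    """A DOI resolves to the appropriate copy via the doi.org proxy."""
    return f"https://doi.org/{doi}"

def crossref_metadata(doi: str) -> dict:
    """Fetch bibliographic metadata for a DOI from CrossRef's
    public REST API (a present-day interface, used for illustration)."""
    url = f"https://api.crossref.org/works/{doi}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["message"]

doi = "10.1037/0000000-000"  # hypothetical DOI on APA's 10.1037 prefix
print(doi_link(doi))
# print(crossref_metadata(doi)["title"])  # requires network and a real DOI
```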

Beebe said there is a continuing need for standards, but “how many, and can they be simplified and integrated?” With a dual audience of librarians and end-users, A&I providers feel the need to make the search learning curve gentle while preserving the need for advanced features that may require instruction.

A robust discussion ensued about the need for authority control for authors in A&I services. Emerging NISO standards and the Scopus author profile were discussed as possible solutions; an emerging NISO/ISO author identifier standard is being eagerly adopted by publishers as a way to pay out royalties.

Microsoft of the library world?
Karen Calhoun, OCLC vice president for WorldCat and Metadata Services, listed seven economic challenges for the working group: productivity, redundancy, value, scale, budgets, demography, and collaboration. Pointing to OCLC founder Fred Kilgour as having led libraries into an age of “enhanced productivity in cataloging,” Calhoun said new models of acquisition are the next frontier.

With libraries and end users defining quality differently, libraries must broaden the scale of bibliographic control to cover more materials; Calhoun argued that “narrowing our scope is premature.” Intense budget pressure is “not surprising,” she said, and new challenges include retirements building to full strength starting in 2010.

Since libraries cannot work alone, and cost reductions are not ends in themselves, OCLC can create new opportunities for libraries. Calhoun compared the OCLC suite of services to the electric grid, and said remixable and reusable metadata is the way of the future, coming from publishers, vendors, authors, reviewers, readers, and selectors.

“WorldCat is an unexploited resource, and OCLC can help libraries by moving selected technical services to the network,” Calhoun said. Advocating putting such services on the OCLC bill “like PayPal,” Calhoun said libraries could reduce their manpower costs.

Teri Frick, technical services librarian at the Orange County Public Library (VA), questioned Calhoun, saying her library can’t afford OCLC. Calhoun admitted, “OCLC is struggling with that,” and “I don’t think we have the answers.”

Frick pointed out that her small public library has the same needs as the largest library, and said any change to LC cataloging policy would have a large effect on her operations in southwestern Virginia. “When LC cut–and I understand why–it really hurt.”

Library of Congress reorganizes
Beacher Wiggins, Library of Congress director for acquisitions and bibliographic control, read a paper giving the LC perspective. Wiggins cited Marcum’s 2005 paper, which put the cost of cataloging at $44 million per year. LC has 400 cataloging staff (down from 750 in 1991), who cataloged 350,000 volumes last year.

The library reorganized acquisitions and cataloging into one administrative unit in 2004, but cataloger workflows will not be merged until 2008, with retraining to take place over the following 12-36 months. New job descriptions will be created, and new partners for international records (excluding authority records) are being selected. After an imbroglio over redistribution of records from Italian book dealer Casalini, Wiggins said, “For this and any future agreements, we will not agree to restrict redistribution of records we receive.”

In further questioning, library consultant Karen Coyle pointed out that the education and retraining effort would be large. Wiggins said LC is not giving up on pre-coordination, an issue raised by LC union member Thomas Mann and others, but is looking at streamlining how it is done.

Judith Cannon, Library of Congress instruction specialist, said “We don’t use the products we create, and I think there’s a disconnect there. These are all interrelated subjects.”

NLM questions business model
Dianne McCutcheon, chief of technical services at the National Library of Medicine, agreed that cataloging is a public good and that managers need to come up with an efficient cost/benefit ratio. However, McCutcheon said, “No additional benefit accrues to libraries for contributing unique records–OCLC should pay libraries for each use of a unique record.”

McCutcheon spoke in favor of incorporating publisher-supplied ONIX in place of or as a supplement to MARC, and of developing “the appropriate crosswalks.” With publishers working in electronic environments, libraries should use the available metadata to enhance records and build in further automation. Since medical publishers submit citation records directly to NLM for inclusion in Medline, the library is seeing significant cost savings, from $10 down to $1 per record. NLM’s Medical Text Indexer (MTI) is another useful tool; it assists catalogers in assigning subject headings, with 60 percent agreement.
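A crosswalk of the kind McCutcheon described maps publisher-supplied ONIX elements onto MARC fields. A minimal sketch, assuming simplified ONIX 2.1-style element names and emitting a plain dict keyed by MARC tag; production crosswalks, such as LC’s ONIX-to-MARC mapping, cover far more elements and edge cases.

```python
import xml.etree.ElementTree as ET

# A tiny ONIX-like fragment; element names follow ONIX 2.1 conventions,
# but the record itself is invented for illustration.
onix = ET.fromstring("""
<Product>
  <ProductIdentifier><ProductIDType>15</ProductIDType><IDValue>9780000000002</IDValue></ProductIdentifier>
  <Title><TitleText>An Example Monograph</TitleText></Title>
  <Contributor><PersonNameInverted>Author, Ann</PersonNameInverted></Contributor>
  <PublicationDate>2007</PublicationDate>
</Product>
""")

def crosswalk(product: ET.Element) -> dict:
    """Map a handful of ONIX elements onto MARC-style tags."""
    def text(path: str) -> str:
        el = product.find(path)
        return el.text if el is not None and el.text else ""
    return {
        "020$a": text("ProductIdentifier/IDValue"),       # ISBN-13
        "100$a": text("Contributor/PersonNameInverted"),  # main entry
        "245$a": text("Title/TitleText"),                 # title proper
        "260$c": text("PublicationDate"),                 # date of publication
    }

print(crosswalk(onix))
```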

NAL urges more collaboration
Christopher Cole, associate director of technical services at the National Agricultural Library (NAL), said that like NLM, NAL is both a library and an A&I provider. By using publisher-supplied metadata as a starting point, adding access points, and doing quality control, “quality has not suffered one bit.” Cole said the NAL thesaurus was recreated six or seven years ago, after previously relying on FAO and CAB information, and he advocated a similar reinvention in cataloging. “Use ONIX,” Cole said. “The publishers supply it.”

Tagging and privacy
Dan Chudnov, of the Library of Congress Office of Strategic Initiatives, made two points, first saying that social tagging is hard and that its value is an emergent phenomenon with no obvious rhyme or reason. Tagging happens in context, Chudnov said, referencing Tim Spalding’s talk given at LC. “The user becomes an access point, and this is incompatible with the ALA Bill of Rights on privacy that we hold dear,” he said.

Finally, Chudnov advocated including computer scientists from the wider community, perhaps in a joint meeting that also brings in vendors.

Summing up
Robert Wolven, Columbia University director of library systems and bibliographic control and a working group member, summarized the meeting by saying its purpose was to find the “cost sinks” and the “collective efficiencies,” since metadata has a long life cycle. Cautioning that there are “no free rides,” he said libraries must find ways to recoup their costs.

Marcum cited LC’s mission, “to make the world’s creativity and the world’s knowledge accessible to Congress and the American people,” and said LC’s leadership role can’t be diminished. With 100 million hidden items (including photos, videos, and more), curators in 21 reading rooms are called upon to direct users to hidden treasures. But “in the era of the Web, user expectations are expanding but funding is not. Thus, things need to be done differently, and we will be measuring success as never before,” Marcum said.
