LITA National Forum 2006

“Shift Happens”
Preservation, entertainment in the library, and integrating Library 2.0 into a Web 2.0 world dominated the Library and Information Technology Association (LITA) National Forum in Nashville, TN, October 26-29, 2006.
With 378 registered attendees from 43 states and several countries, including Sweden and Trinidad, attendance held steady with previous years, though the Internet Librarian conference, held in the same week, attracted over 1300 librarians.

Free wireless still has not made it into technology conferences, though laptops were clearly visible, and the LITA Blog faithfully kept up with sessions for librarians who were unable to attend.

The forum opened with a fascinating talk from librarians at the Country Music Hall of Fame entitled “Saving America’s Treasures.” Using Bridge Media Solutions in Nashville as a technology partner, the museum has migrated unique content from the Grand Ole Opry, including the first known radio session from October 14, 1939, and has uncovered demos by Hank Williams on acetate and glass discs. The migration project uses open source software and will generate MARC records to be submitted to OCLC.

Thom Gillespie of Indiana University described his shift from being a professor in the Library and Information Science program to launching a new program from the Telecommunications department. The MIME program for art, music, and new media has propelled students into positions at Lucas Arts, Microsoft, and other gaming companies. Gillespie said the program has practical value: “Eye candy was good but it’s about usability.” Saying that peering in is the first step but authoring citizen media is the future, he posed a provocative question: “What would happen if your library had a discussion of the game of the month?”

Integration into user environments was a big topic of discussion. Peter Webster of St. Mary’s University, Halifax, Canada, spoke about how embedded toolbars are enabling libraries to enter where users search.

Annette Bailey, digital services librarian at Virginia Tech, announced that the LibX project has received funding for two years from IMLS to expand their research toolbar into Internet Explorer as well as Firefox, and will let librarians build their own test editions of toolbars online.

Presenters from the Los Alamos National Laboratory described their work with MPEG-21, a new standard from the Moving Picture Experts Group. The standard reduces some of the ambiguities of METS and allows for unique identifiers in locally loaded content. Material from Biosis, Thomson’s Web of Science, APS, the Institute of Physics, Elsevier, and Wiley is being integrated into cataloging operations and existing local Open Archives Initiative (OAI) repositories.
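Those OAI repositories are queried over the standard OAI-PMH protocol. As a rough illustration, here is a minimal sketch of building a ListRecords harvest request; the repository URL is a placeholder, and the "didl" metadata prefix is an assumption (a common convention for MPEG-21 DIDL records), not a detail from the talk.

```python
from urllib.parse import urlencode

# Placeholder endpoint; the talk did not name a public repository URL.
REPOSITORY = "http://example.org/oai"

def list_records_url(metadata_prefix, oai_set=None):
    """Build an OAI-PMH 2.0 ListRecords request URL."""
    params = {"verb": "ListRecords", "metadataPrefix": metadata_prefix}
    if oai_set:
        params["set"] = oai_set  # optional selective-harvesting set
    return f"{REPOSITORY}?{urlencode(params)}"

# "didl" is a common metadataPrefix convention for MPEG-21 DIDL.
print(list_records_url("didl"))
```

The same request with an `oai_set` argument restricts the harvest to one collection, which is how a library might pull only, say, its physics content into the local repository.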

Tags and Maps
The University of Rochester has received funding for an open source catalog, which they are calling the eXtensible Catalog (xC). Using an export of 3 million records from their Voyager catalog, David Lindahl and Jeff Susczynski described how their team used User Centered Design to conduct field interviews with their users, sometimes in their dorm rooms. They have prototyped four different versions of the catalog, and CUPID 4 includes integration of several APIs, including Google, Amazon, Technorati, and OCLC’s xISBN. They are actively looking for partners for the next phase, and plan to work on issues with diacritics, incremental updates, and integrating holdings records, potentially using the NCIP protocol.
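As a rough sketch of how a catalog front-end like CUPID might call OCLC’s xISBN service to find related editions of a work: the endpoint, parameters, and response shape below are modeled on OCLC’s public xISBN documentation of the era and are assumptions, not details the presenters gave.

```python
from urllib.parse import quote
import xml.etree.ElementTree as ET

# Modeled on OCLC's public xISBN web service (an assumption; the xC
# prototype's actual endpoint and parameters were not described).
XISBN_BASE = "http://xisbn.worldcat.org/webservices/xid/isbn/"

def xisbn_url(isbn):
    """Build a request URL asking for all editions related to an ISBN."""
    return f"{XISBN_BASE}{quote(isbn)}?method=getEditions&format=xml"

def parse_editions(xml_text):
    """Extract ISBNs from an xISBN-style XML response."""
    root = ET.fromstring(xml_text)
    return [el.text for el in root.iter() if el.tag.endswith("isbn") and el.text]

# A made-up response in the general shape the service returned:
sample = '<rsp stat="ok"><isbn>0596002815</isbn><isbn>9780596002817</isbn></rsp>'
print(xisbn_url("0596002815"))
print(parse_editions(sample))
```

In a catalog, the returned sibling ISBNs would be used to collapse multiple editions of the same work into a single result.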

Stephen Abram, of Sirsi/Dynix and incoming SLA president, delivered the closing keynote, “Web 2.0 and Library 2.0 in our Future.” Abram and Sirsi/Dynix have conducted research on 15,000 users, which highlighted the need for community, learning, and interaction. He asked the audience, “Are you working in your comfort zone or my end user’s comfort zone?” In a somewhat controversial set of statements, Abram compared open source software to being “free like kittens” and challenged librarians about the “My OPAC sucks” meme that’s been popular this year. “Do your users want an OPAC, or do they want information?” Stating that libraries need to compete in an era when education is moving towards the distance learning model, Abram asked, “How much are we doing to serve the user when 60-80% of users are virtual?” Saying that librarians help people improve the quality of their questions, Abram said that major upcoming challenges include 50 million digitized books coming online in the next five years. “What is at risk is not the book. It’s us: librarians.”

ALA 2006: Top Tech Trends

In yet another crowded ballroom, the men (and woman) of LITA prognosticated on the future of libraries and technology.

Walt Crawford moderated the panel and spoke in absentia for Sarah Houghton. Her trends were:

  • Returning power to content owners
  • An OCLC ILS with RedLightGreen as the front-end
  • Outreach online

Karen Schneider listed four:

  • Faceted Navigation, from Endeca and others
  • eBooks–the Sophie project from the Institute for the Future of the Book
  • The Graphic Novel–Fun Home
  • Net Neutrality

Eric Lease Morgan listed several, and issued a call for a Notre Dame Perl programmer throughout his trends:

  • VoIP, which he thought would cure email abuse
  • Web pages are now blogs and wikis, which may cause preservation issues since they are dynamically generated from a database
  • Social Networking tools
  • Open Source
  • Metasearch, which he thought may be dead given its lack of de-duplication
  • Mass Digitization, and the future services libraries can provide against it
  • Growing Discontent with Library Catalogs
  • Cataloging is moving to good enough instead of complete
  • OCLC is continuing to expand and refine itself
  • LITA 40 year anniversary–Morgan mentioned how CBS is just celebrating their 55th anniversary of live color TV broadcasting

Tom Wilson noted two things: “Systems aren’t monolithic, and everything is an interim solution.”

Roy Tennant listed three trends:

  • Next generation finding tools, not just library catalogs. Though the NGC4Lib mailing list is a necessary step, metasearch still needs to be done, and it’s very difficult to do. Some vendors are introducing products, like Innovative’s Encore and Ex Libris’ Primo, which attempt to solve this problem.
  • The rise of filtering and selection. Tennant said, “The good news is everyone can be a publisher. And the bad news is, everyone can be a publisher.”
  • The rise of microcommunities, like code4lib, which give rise to ubiquitous and constant communication.

Discussion after the panelists spoke raised interesting questions, including Clifford Lynch’s recommendation of Microsoft’s Stuff I’ve Seen. Marshall Breeding recommended tagging WorldCat, not local catalogs, but Karen Schneider pointed out that the user reviews on Open WorldCat were deficient compared to Amazon’s.

When asked how to spot trends, Eric Lease Morgan responded, “Read and read and read–listservs, weblogs; Listen; Participate.” Roy Tennant said, “Look outside the library literature–Read the Wall Street Journal, Fast Company, and Business 2.0. Finally, look for patterns.”

More discussion, and better summaries:
LITA Blog » Blog Archive » Eric Lease Morgan’s Top Tech Trends for ALA 2006; Sum pontifications
LITA Blog » Blog Archive » The Annual Top 10 Trends Extravaganza
Hidden Peanuts » ALA 2006 – LITA Top Tech Trends
ALA TechSource | Tracking the Trends: LITA’s ALA Annual ’06 Session
Library Web Chic » Blog Archive » LITA Top Technology Trends

ALA 2006: Future of Search

This oversubscribed session (I sat on the floor, as did many others) featured Stephen Abram of Sirsi/Dynix, incoming SLA president, and Joe Janes of the University of Washington debating the future of search, moderated by LJ columnist Roy Tennant.

Abram asked a pointed question, which decided the debate early, “Were libraries ever about search? Search was rarely the point…unless you wanted to become a librarian.”  In Abram’s view, the current threat to libraries comes from user communities like Facebook/MySpace, since MySpace is now the 6th largest search engine. Other threats to libraries include the Google patent on quality.

Abram said the problem of the future is winnowing, and that you cannot teach people to search. “Boolean doesn’t work,” he said. Abram felt it was a given that more intelligence needs to be built into the interface.

In more sociological musings, Abram said “Facts have a half-life of 12 years,” and social networks matter since “teens and 20s live through their social networks. The world is ahead of us, and teams are contextual. People solve problems in groups.”

Joe Janes asked, “What would happen if you made WorldCat open source? Would the fortress of metadata in Dublin, OH crumble?” When asked if libraries should participate in OpenWorldCat, Abram said, “Sure, why not? Our competitor is ignorance, not access. Libraries transform lives.”

Janes pointed out that none of the current search services (Google Answers, Yahoo Answers, and the coming Microsoft Answers) have worked well, and Tennant said, “While Google and Yahoo may have the eyeballs of users, libraries have the feet of users.”

In an interesting digression from the question at hand, Abram asked why libraries aren’t creating interesting tools like LibraryThing and LibraryELF (look for a July NetConnect feature about the ELF by Liz Burns). Janes said it comes back to privacy concerns, since this is the “looking over your shoulder decade. Hi, NSA!” With the NSA and TSA examining search, banking, and phone records, library privacy ethics are being challenged as at no other recent time in history.

Roy Tennant asked if libraries should incorporate better interface design, relevance ranking, spelling suggestions, and faceted browsing. Abram said it’s already happening at North Carolina State University with the Endeca catalog project. The Grokker pilot at Stanford is another notable example, and its visual contents and tiled result sets mirror how people learn. “Since the search engines are having problems putting ads in visual search, it’s good for librarians.”

Abram got the most laughter by pointing out that the thing that killed Dialog was listening to their users. As librarian requests made Dialog even more precise, “At the end of a Dialog search, you could squeeze a diamond out of your ass.” Janes said the perfect search is “no search at all, one that has the lightest cognitive load.”

Since libraries are, in Janes’ words, “a conservation organization because the human record is at stake, the worst nightmare is that nothing changes and libraries die. The finest vision is to put Google out of business.” Abram’s view was libraries must become better at advocacy and trust users to lay paths through catalog tagging and other vendor initiatives.

The question of the future of search turned into the future of libraries, and Joe Janes concluded that “Libraries are in the business of vacations we enabled, cars we helped fix, businesses we started, and helping people move.” Abram ended with a pithy slogan for libraries, the place of “Bricks, Clicks, and Tricks.”

Other commentary here:
The Shifted Librarian: 20060624 Who Controls the Future of Search?
Library Web Chic » Blog Archive » The Ultimate Debate : Who Controls the Future of Search
LITA Blog » Blog Archive » The Ultimate Debate: Who Controls the Future of Search
AASL Weblog – The Ultimate Debate: Who Controls the Future of Search?

ALA 2006: Google Book Search

Ben Bunnell, Manager of Google Book Search and author of an upcoming Last Byte column in the July NetConnect (no link yet), described how Google cofounder Larry Page and Marissa Mayer originally conceived of the book scanning project while they were in graduate school at Stanford. Using a metronome, they estimated that a 300-page book would take 40 minutes to digitize. Though it wasn’t answered at the session, other panels mentioned that the entire University of Michigan library collection of 7 million books is slated for completion in six to seven years. Libraries will be interesting places in 2010.

Google’s intended goal is to “digitize all books,” and Bunnell said “Google is not focused on author, genre, or time period.” Lawsuits from the Authors Guild and others have slowed progress.

There are three areas of digitization: publisher agreements for recently published books (except for Elsevier; one panelist quipped that Google should buy Elsevier); books currently in the public domain (before 1923); and what Tim O’Reilly calls the “twilight zone” (75% of what has been published).

The easy part is scanning books in the public domain (before 1923 in the US). This includes Jane Austen, Charles Dickens, Emily Dickinson, and Shakespeare. Other digital projects have started with this, including Project Gutenberg, Early English Books Online, and the Making of America project. The public domain content makes up 20 percent of all available books.

Google already has agreements from all US major publishers, and they are getting digital copies directly for books in print, which are 5 percent of the total.

The controversy comes with books published from 1923-2000. Currently, Google is continuing to scan these books and display their contents in “snippet view” and in a selected number of pages. Searches currently show three snippets.

Following the presentation, discussion revealed three points. First, Google now has an agreement with the Library of Congress as well as the other five libraries in the Google Print project (University of Michigan, Oxford, Harvard, Stanford, and NYPL). Second, the Find It in a Library links are live for some books that were originally scanned via libraries, and Bunnell said they “are close to linking all books”; Google also wants users to alert them to the copyright status of a book, so it seems reasonable to expect a contact link to show up soon. Third, the link syntax of Google Books is static, so one audience member asked if it would be possible to link from a library catalog to the online copy. This is possible, but requires that the patron have a Google Account, which raises privacy concerns.
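As an illustration of that static link syntax, here is a minimal sketch of building a Google Book Search link from an ISBN in a catalog record; the ?vid=ISBN form was one pattern the service accepted at the time, though whether any given catalog would use exactly this form is an assumption.

```python
def google_books_link(isbn):
    """Build a static Google Book Search link from an ISBN.

    The ?vid=ISBN pattern is one form the service accepted;
    treat it as an illustration, not the only supported syntax.
    """
    digits = isbn.replace("-", "").replace(" ", "")
    return f"http://books.google.com/books?vid=ISBN{digits}"

print(google_books_link("0-14-303943-3"))
```

Because the link is deterministic, a catalog can generate it at display time from the MARC record alone, with no API call to Google required.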

Recommendation for Google: If you’re going to have a panel from noon to one, bring along your Googleplex chef and feed the hungry librarians.