In this first large-scale study of the effect of discovery systems on electronic resource usage, the authors present initial findings on how these systems alter online journal usage by academic library researchers. The study examines usage of content hosted by four major academic journal publishers at 24 libraries that have implemented one of the major discovery systems: EBSCO's EDS, Ex Libris' Primo, OCLC's WorldCat Local, or Serials Solutions' Summon. A statistically rigorous comparison of COUNTER-compliant journal usage at each library from the 12 months before and after implementation will determine the degree to which usage rises or falls after discovery tool implementation and address rumors that discovery tools differ in their impact on electronic resource usage.
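The abstract does not specify the statistical test used; as one hedged illustration of how such a before-and-after comparison might be run, here is a minimal Python sketch. The file name, column names, and the choice of a paired t-test on log-transformed counts are all assumptions for illustration, not the study's actual method.

```python
import numpy as np
import pandas as pd
from scipy import stats

# One row per library; column names are hypothetical illustrations.
usage = pd.read_csv("counter_usage.csv")
before = usage["fulltext_requests_12mo_before"]
after = usage["fulltext_requests_12mo_after"]

# Percent change in COUNTER full-text requests per library.
usage["pct_change"] = (after - before) / before * 100

# Paired t-test on log-transformed totals, since usage counts are
# typically right-skewed across libraries of very different sizes.
t_stat, p_value = stats.ttest_rel(np.log(after), np.log(before))

print(usage["pct_change"].describe())
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```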
Libraries have always provided videos as part of their collections, but advances in technology and bandwidth now make it possible to offer streaming media. Compared with a DVD, streaming is accessible from any networked computer, on or off campus, making the decision to seek streaming options straightforward. Streaming video, however, brings a new set of challenges for librarians: licensing standards are few or nonexistent, rights ownership is often unclear, and bandwidth limitations can constrain delivery. In this panel we will present the experiences of two academic libraries and of a streaming video provider.
Baruch College has been providing streaming media since fall 2008, with the advent of a new Film Studies minor. The films requested were feature films, often foreign films, not titles easily available from any of the nascent streaming aggregators. We will discuss how we grew from streaming a couple of films a semester to more than 50.
You’ll also hear about the experience of the University of North Carolina at Greensboro in responding to faculty needs as more courses are offered online and students express a preference for streaming options over viewing DVDs in the classroom or the library.
One source of educational content is Docuseek2, which provides educational streaming access to films from such publishers and distributors as Bullfrog Films. Representatives of these two companies will explain the technical side of working with our libraries, and they will discuss the pros and cons of self-hosting versus using third party platforms to stream video.
This panel will consider licensing issues, access and security issues, and managing course deadlines. We will also discuss what to consider when deciding whether to host one's own content. We will share what we have learned and some best practices we have developed.
Limited seating available - Registration Required
Don't miss the opportunity to join your colleagues and EBSCO for an informative discussion about how to maximize the value of your library collection and improve your end users' library experience.
The right services can help you assess the effectiveness of your library's collection development decisions and build a valuable collection while also creating a rich user experience. Learn how usage statistics can help you manage your collection and workflow, and how improved linking, widgets, APIs, and other solutions can bring the power of your collection to end users, improving their experience and creating library champions.
Join EBSCO at the 2013 Charleston Conference, where a panel of librarians and EBSCO experts will share their insights to help you implement an effective collection development strategy.
Many academic libraries are struggling with collections whose size reaches or exceeds building capacity. Meanwhile, the "21st-century library" movement calls for user-centered space. The combination of these two factors has challenged libraries to identify ways to eliminate physical collections without losing access to content.
The academic libraries in the State of Florida, including the University of Central Florida (UCF), have discussed and developed plans for a shared print repository for several years. A statewide Shared Storage Task Force was convened with representation from the state university libraries, eventually forming the FLorida Academic REpository (FLARE) under the leadership of the University of Florida.
In 2012, FLARE received the first large shipment from a participating library. After a few months of active planning, UCF implemented a project to prepare its materials for shipment to FLARE and is poised to be the next library contributing to the repository.
The UCF FLARE project requires tremendous coordination and collaboration among the multiple units in the Technical Services Division at UCF and with the external FLARE Team in Gainesville. Policies and procedures were developed with guidance from the FLARE Team, and an internal workflow was designed to ensure accurate processing. Maintaining clear communication with the Public Services Division is also critical.
This presentation will give an overview of the FLARE project and its evolution, and share UCF’s experience in selecting and processing materials for this shared storage facility.
For many libraries, particularly small to midsize academic libraries, journals place significant strains on the acquisitions budget. For fiscal year 2012/2013 the Volpe Library at Tennessee Tech University faced a significant materials budget shortfall. Rather than simply cutting titles to cover the shortfall or asking the administration for more money, we concluded that the existing system of acquiring and delivering information packaged in journals was not sustainable for us. We therefore embarked on a yearlong process to develop a different way of providing article-level information that would use our budget more efficiently. The process we have developed relies more heavily on purchasing individual articles, using the CCC product Get It Now, in an attempt to maximize the impact of our budget resources.
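The arithmetic underlying that trade-off is a simple break-even calculation. The sketch below uses hypothetical prices (real subscription and per-article costs vary by title and vendor) purely to show the reasoning:

```python
# Sketch: break-even point between a journal subscription and
# pay-per-article purchasing (e.g., via a service like Get It Now).
# All figures below are hypothetical illustrations.
subscription_cost = 1800.00   # annual subscription price for one title
per_article_fee = 35.00       # cost to buy a single article on demand

break_even = subscription_cost / per_article_fee
print(f"Per-article purchasing is cheaper below {break_even:.0f} uses/year")
# 1800 / 35 ≈ 51: a title used fewer than ~51 times a year costs less
# as individual article purchases than as a subscription.
```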
This presentation will describe the process used to prepare a plan to meet the budget challenge. It will also include a description of the final plan, its implementation, and the early results available on the operation of the new process.
There are two main objectives for this session. The first is to present a process we are implementing to meet a challenge many libraries face, in the hope that it will be of value to our colleagues and that they can learn from our experience. The second is to solicit input and ideas that will improve the process. Those who attend the session can participate by sharing their reactions and ideas.
What are academic libraries really doing to market the resources they have? This joint presentation will look at what academic libraries are doing today to promote their resources and themselves. Carol Anne Germain, Information Literacy Librarian at the University at Albany and current president of the New York Library Association, will discuss how librarians can develop, implement, and successfully evaluate marketing plans for e-resources. Nader Qaimari, Senior Vice President of Marketing for Gale, part of Cengage Learning, will delve into how libraries can solicit, create, and present impactful stories of student success, drawing on data from recent Gale studies that explore the tactics librarians employ to market their e-resources and how librarians really feel about measuring their effectiveness. Audience participation will be encouraged on the topics of assessment, budgeting, and goal setting, along with what vendors could do better in the support and services surrounding library investments.
Attendees will walk away with an understanding of how to develop, implement and evaluate marketing plans for e-resources that work for their individual needs, while also building basic visibility within the community they serve. Attendees will also learn how to better connect with their students and faculty and gather and share their successes with influential members of the community.
This session details five years of gathering and analyzing e-resource usage statistics for Appalachian State University and Western Carolina University. Gathering and analyzing these statistics is akin to assembling a jigsaw puzzle of numerous small, oddly shaped, interlocking pieces that rarely fit together, and the first challenge was methodology. Excel spreadsheets were employed to manage the discovery and compilation of hundreds of URLs, logins, and passwords.
Plunging into this brave new world of standards and protocols required understanding how each vendor reported its usage statistics. Some offered COUNTER (Counting Online Usage of NeTworked Electronic Resources) reports while others had their own systems of reporting.
When the initial puzzle was complete, the fun was over, and gathering statistics manually became tedious and monotonous. Looking for a way to automate the process, we were thrilled to learn of SUSHI (Standardized Usage Statistics Harvesting Initiative), a protocol for harvesting usage data automatically, and began to explore how to implement it. It soon became apparent that we did not have the technical support or server capability to proceed with the project on our own.
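For readers unfamiliar with the protocol, SUSHI (NISO Z39.93) is a SOAP web service for fetching COUNTER reports. The following is a minimal sketch of a report request in Python; the endpoint URL and the requestor/customer IDs are hypothetical, and element details should be verified against each vendor's SUSHI documentation.

```python
# Sketch: requesting a COUNTER JR1 report over SUSHI (NISO Z39.93).
# The endpoint and IDs below are placeholders; real values come from
# each vendor, and the schema should be checked against their docs.
import requests

SUSHI_ENDPOINT = "https://stats.example-vendor.com/sushi"  # hypothetical

envelope = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/"
               xmlns:sus="http://www.niso.org/schemas/sushi"
               xmlns:cou="http://www.niso.org/schemas/sushi/counter">
  <soap:Body>
    <cou:ReportRequest>
      <sus:Requestor><sus:ID>our-requestor-id</sus:ID></sus:Requestor>
      <sus:CustomerReference><sus:ID>our-customer-id</sus:ID></sus:CustomerReference>
      <sus:ReportDefinition Name="JR1" Release="4">
        <sus:Filters>
          <sus:UsageDateRange>
            <sus:Begin>2013-01-01</sus:Begin>
            <sus:End>2013-06-30</sus:End>
          </sus:UsageDateRange>
        </sus:Filters>
      </sus:ReportDefinition>
    </cou:ReportRequest>
  </soap:Body>
</soap:Envelope>"""

response = requests.post(
    SUSHI_ENDPOINT,
    data=envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml; charset=utf-8"},
)
response.raise_for_status()
print(response.text[:500])  # COUNTER report XML, ready for parsing
```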
Both universities now have EBSCO's Usage Consolidation product. ASU is just beginning, but WCU now has six months of experience with the product. Kristin will share details of WCU's implementation.
We intend for this case study to stimulate further discussion and research on alternative options and extensions of a collaborative model for gathering and analyzing usage statistics in other institutions or contexts.
Beth Bernhardt
Iain Hrynaszkiewicz
How do you measure the impact of research? Do you know the Impact Factor from ImpactStory, or an F1000Prime score from an Altmetric score? Understanding how research impact is measured is very important for funders, institutions, and individuals. Traditional metrics have focused on citations of papers and journals, but there is growing realization – highlighted in 2013 by the San Francisco Declaration on Research Assessment (DORA) – that the Impact Factor is inappropriate for assessing individual papers, institutions, and individual researchers. The internet has given us a wealth of new data on how much published papers (and other products of research) are read, reused, and revered by the authors' peers. Alternative ("alt") metrics can measure how many times a paper is downloaded, tweeted, or shared on social media; how many bloggers have written about it; and how many readers a paper has on social reference managers such as Mendeley. These interactions can be tracked, aggregated, measured and, to some extent, understood. F1000Prime aims to provide context to altmetrics with human-readable comments alongside numerical article scores.
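As a concrete illustration, several of these signals can be pulled programmatically. The sketch below queries Altmetric's public REST API for a single paper; the DOI is hypothetical, and the response field names reflect the API as commonly documented and should be treated as assumptions to verify.

```python
# Sketch: pulling altmetrics for one paper from Altmetric's public
# REST API (free tier, rate-limited). Field names are assumptions to
# check against Altmetric's current documentation.
import requests

doi = "10.1371/journal.pone.0000000"  # hypothetical example DOI
resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}")

if resp.status_code == 200:
    data = resp.json()
    print("Altmetric score:   ", data.get("score"))
    print("Tweeters:          ", data.get("cited_by_tweeters_count"))
    print("Blog posts:        ", data.get("cited_by_feeds_count"))
    print("Mendeley readers:  ", data.get("readers", {}).get("mendeley"))
elif resp.status_code == 404:
    print("No altmetric activity recorded for this DOI")
```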
Altmetrics tools can be used by librarians to assess the impact of their institutions' research and to give a better indication than the Impact Factor alone of which journals are publishing the best research. With the rise of remote digital access to libraries, librarians should embrace altmetrics as an educational service to researchers, who may now rarely need to visit the library in person, as well as for research assessment. This session will also discuss the limitations and benefits of Impact Factors and alternative metrics, and how some of the most important research impacts are currently immeasurable. Using a case study from a librarian "in the trenches," the value of altmetrics tools for universities and funding agencies will also be considered.
Facets and other metadata-based functionality used in library interfaces tend to be generic (subject, author, format/type, location, date, language), mimicking basic indices (such as the Z39.50 profiles and other catalogue-related standards). They provide a way to limit results and interact with content based on perceived similarities across grossly dissimilar content.
Library and library-related systems have tended to keep facets and metadata-based functionality generic in order to ensure applicability to as much content as possible in result sets and item displays. In doing this, are they truly serving end users or deceiving them?
Facets and other metadata-driven functionality do not need to be generic. They can be smart. They can provide a way to analyze results and content and answer questions about result sets and the items within them. However, to be smart, the application of facets and metadata has to be smart: algorithmic, based on knowing what the probable domain of knowledge is and what the indexing specific to that domain can offer.
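A minimal sketch of the idea, with field names invented purely for illustration: a system that knows (or infers) the probable domain of a result set can swap in domain-specific facets instead of the generic list.

```python
# Sketch: domain-aware facet selection. All field names are invented
# illustrations; a real system would map to its own domain indexes
# (e.g., MeSH headings for medicine).
GENERIC_FACETS = ["subject", "author", "format", "date", "language"]

DOMAIN_FACETS = {
    "medicine": ["mesh_heading", "study_type", "clinical_trial_phase"],
    "law": ["jurisdiction", "court", "statute_cited"],
    "music": ["composer", "instrumentation", "musical_key"],
}

def facets_for(result_set_domain: str) -> list[str]:
    """Return domain-specific facets when the domain is known,
    falling back to the generic set otherwise."""
    return DOMAIN_FACETS.get(result_set_domain, GENERIC_FACETS)

print(facets_for("medicine"))  # domain-aware facets
print(facets_for("unknown"))   # generic fallback
```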
This session focuses on examples of how metadata-based functionality can provide a more focused and navigable experience for the end user, precisely because the experience is based on the user’s subject domain.
Libraries and library consortia are buying increasing numbers of e-books through a variety of acquisition models, and analysis of previous usage can be used to help make these purchases more effective and targeted. This session provides two perspectives – a consortium of 73 academic libraries and an individual academic library – and gives practical examples of how this approach can be implemented.
The Virtual Library of Virginia (VIVA) consortium used a pilot Publisher/Subject Collection Analysis to explore ways to use print circulation data to inform future, collaborative e-book purchases. An important consideration was defining "high-circulating" books in a way that allowed member libraries of all sizes to participate, and central to this analysis was the distillation and normalization of the data. Libraries provided the ISBN, call number, and total number of circulations, among other data items, to the VIVA central office, which used the ISBN to generate normalized publisher names. Initial results of the analysis have made it easy to drill down to the top publishers for a given subject area.
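As a hedged sketch of what such a distillation might look like in code (the column names, alias table, and use of LC class letters as subject proxies are all illustrative assumptions, not VIVA's actual scripts):

```python
# Sketch: top publishers per subject area from member circulation
# exports. Columns assumed: isbn, call_number, publisher,
# total_circulations (all hypothetical).
import pandas as pd

circ = pd.read_csv("member_circulation.csv")

# Normalize publisher names. A real analysis would derive the
# publisher from the ISBN prefix via a registry; a hand-built alias
# table stands in here.
PUBLISHER_ALIASES = {"Springer-Verlag": "Springer", "Springer Nature": "Springer"}
circ["publisher"] = circ["publisher"].replace(PUBLISHER_ALIASES)

# Use the leading LC class letters as a rough subject area.
circ["subject"] = circ["call_number"].str.extract(r"^([A-Z]+)", expand=False)

top = (circ.groupby(["subject", "publisher"])["total_circulations"]
           .sum()
           .sort_values(ascending=False)
           .groupby(level="subject")
           .head(5))  # top five publishers per subject
print(top)
```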
In a similar vein, American University Library has analyzed historical approval orders to identify which publishers may be the best candidates to remove from the approval plan in favor of package and frontlist purchases made directly from the publisher. Benefits of this approach include more comprehensive coverage of a publisher's titles and easier tracking of cost per publisher. Circulation data for the books was used as a prioritization mechanism, and purchases have already been made using this method.
Attendees will learn the procedures for doing collection analyses of this kind, including the process of mapping ISBNs to publishers and the scripts used to process the data efficiently for the VIVA project. A discussion with the audience will follow the presentation.
The Library Publishing Coalition (LPC) Project is a two-year project, with seed funding from the Educopia Institute, to develop a community-defined and established Library Publishing Coalition that meets an urgent need within the information profession for more library publishing skill-building, professional development and networking opportunities, and virtual and in-person forums for sharing best practices and documentation.
In January 2013, the LPC Project, with support and participation from 51 member libraries, began working on a core set of activities defined by the community.
Sarah Lippincott, LPC Program Manager, Educopia Institute, will present on the results of the recent library publishing services survey used to populate the forthcoming Directory of Library Publishing Services. Following Sarah's presentation, three Deans and Directors of member libraries will discuss the formation of the LPC, how they envision the future coalition supporting the development of publishing partnerships (especially with the complementary programs of university presses and scholarly societies), and their perspectives on the role of library publishing services in today's research and scholarly environment.
The discussion will be followed by a question and answer period. In addition to learning about the results of the survey and hearing from different member libraries, participants can expect to be offered an opportunity to provide feedback on the kinds of services that would best meet the diverse needs of the library publishing community and to gain a better understanding of the broad range of LPC Project activities.
Open access publishing continues to be a topic of debate and discussion in the popular media, on blogs, and on listservs. A panel representing different points of view will discuss open access journals and articles from the perspective of metadata and accessibility. Open access content is the most accessible content of all, if students and researchers know how to find it and how to judge whether what they find is worthy of inclusion in their research. The discussion will focus on how to make open access publications and articles more accessible.
The questions the session will strive to answer center on how open access content is described, discovered, and judged reliable.
Attendees will be encouraged to engage in the discussion and provide input on what they see being done well and what issues they have with open access content. We hope this dynamic discussion will be thought-provoking and will foster new thinking on the importance of metadata in reliably finding content – and finding reliable content!