Workflow for Preventing the Transfer of Duplicates to LSF
Task force members: Chris Killheffer, Julie Linden, Rick Sarcia, Dajin Sun
Goal: To recommend tools, reports, or other mechanisms that will eliminate unnecessary transfer of duplicate copies to LSF.
Most libraries and units that transfer materials to LSF work from a pick list. The pick list is a report of Orbis records that meet the criteria established by that library or unit. For example, a pick list may include volumes that 1) fall within a certain call number range; 2) fall below a certain circulation threshold; and 3) were published before a certain date. Libraries and units devise criteria that meet their local needs, taking shelving space, materials type and condition, and student and researcher needs into account.
Recommended method for eliminating transfer of duplicates
A pick list that does not include duplicates is the most efficient way to prevent transfer of duplicates to LSF and to prevent labor-intensive catalog searching for duplicates during the transfer process.
To generate a pick list that eliminates duplicates, use a tool (the Collection Analysis Tool or ODBC queries) that does the following:
a) Sets the desired LSF-transfer criteria (circulation threshold, publication date threshold, location, etc.)
b) For each bibliographic record, identifies the attached holdings records for both the "home" location (the library from which the materials are to be transferred) and any/all circulating (i.e., unrestricted) LSF locations.
c) Separates the results into two reports:
Report one will include all the bib records that have an LSF holdings location. These are the likely duplicates and are therefore not candidates for transfer to LSF. The transferring library or unit can use this report as a pick list for withdrawals, or may wish to do further analysis according to the criteria described in the LSF Duplicates Policy.
Report two will include all the bib records that do not have an LSF holdings location. Because these items are not duplicated as circulating items at LSF, they can be transferred to LSF.
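The two-report split described above can be sketched in code. The following is a minimal illustration only; the record structure and location codes are invented for the example and do not reflect the actual Orbis schema:

```python
# Sketch of the two-report split: partition bib records by whether any
# attached holdings record sits in a circulating LSF location.
# Record fields and location codes below are illustrative, not Orbis data.

LSF_CIRCULATING_LOCATIONS = {"lsf", "lsfr"}  # hypothetical location codes

def split_pick_list(bib_records):
    """Return (likely_duplicates, transfer_candidates)."""
    likely_duplicates = []    # report one: already held at LSF
    transfer_candidates = []  # report two: no circulating LSF holding
    for bib in bib_records:
        locations = {h["location"] for h in bib["holdings"]}
        if locations & LSF_CIRCULATING_LOCATIONS:
            likely_duplicates.append(bib)
        else:
            transfer_candidates.append(bib)
    return likely_duplicates, transfer_candidates

# Example: bib 1 has both a home holding and an LSF holding (likely
# duplicate); bib 2 has only a home holding (transfer candidate).
records = [
    {"bib_id": 1, "holdings": [{"location": "sml"}, {"location": "lsf"}]},
    {"bib_id": 2, "holdings": [{"location": "sml"}]},
]
dups, candidates = split_pick_list(records)
```

The same partition could equally be produced by two SQL queries; the point is simply that a single pass over the candidate records, keyed on the presence of a circulating LSF holding, yields both reports at once.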
There may be instances of an on-campus holding and an LSF holding on the same bib record that are not duplicates – for example, volume 1 is at LSF and volume 2 is on campus. However, at this point it is not worth investing time in finding such instances – either through manual intervention or by devising a program – given the expected low yield and the need to efficiently identify known duplicates.
Because the resulting pick list will be filtered to eliminate duplicates, libraries and units may wish to broaden their LSF transfer criteria to ensure that the pick lists yield enough material to meet LSF transfer goals.
Libraries and units should consider options for the identified duplicates, including possible withdrawal.
The above procedures will work well for monographs with duplicates recorded on multiple MFHDs. Recent testing on selection files, conducted by Chris Killheffer and Lauren Brown, demonstrated that most duplicates are indeed recorded on multiple MFHDs attached to a single bibliographic record, and that the number of duplicates recorded on separate bibliographic records is so low that it is not worth the effort to identify them.
Identifying serials duplication is far more challenging. An on-campus serial holding may fully duplicate, partially duplicate, or not at all duplicate the holdings of that serial at LSF. While it is possible to devise an automated way to compare holdings, the results are complicated by factors such as incorrectly recorded holdings records and the coexistence of separately analyzed records with a series record. Identifying duplicate analyzed multipart monographs (MPMs) is also challenging because of inconsistencies in cataloging practice. We recommend that techniques for identifying duplicate serials and MPMs be revisited after the monograph duplicate identification process is well established.
These procedures do not apply to Mudd, which is not working from a pick list and where efficient, item-in-hand procedures for identifying duplicates have been in place since the beginning of the large-scale transfer project.
Tools and training:
The existing Collection Analysis Tools should be modified to allow for the criteria input and report output described above. The tool needs to encompass both LC call numbers and non-LC call numbers (which is currently true of the Collection Analysis Tool version 1, but not version 2).
YUL staff who already use the ODBC connection to Orbis to run reports as pick lists may need further training in writing SQL queries in order to realize the full potential of this tool and enhance the library's overall reporting and data-manipulation capabilities.
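As an illustration of the kind of query such SQL training would cover, the sketch below runs against a toy in-memory SQLite database rather than Orbis; the table and column names are invented for the example and do not reflect the actual Orbis schema:

```python
import sqlite3

# Toy database standing in for Orbis; the mfhd table and its columns are
# invented for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE mfhd (bib_id INTEGER, location TEXT);
INSERT INTO mfhd VALUES (1, 'sml'), (1, 'lsf'), (2, 'sml');
""")

# Report one: bib records with a circulating LSF holding (likely duplicates).
report_one = [row[0] for row in conn.execute(
    "SELECT DISTINCT bib_id FROM mfhd WHERE location = 'lsf'")]

# Report two: bib records with no circulating LSF holding
# (candidates for transfer to LSF).
report_two = [row[0] for row in conn.execute("""
    SELECT DISTINCT bib_id FROM mfhd
    WHERE bib_id NOT IN
        (SELECT bib_id FROM mfhd WHERE location = 'lsf')
""")]

print(report_one, report_two)  # → [1] [2]
```

In practice a staff member would issue comparable queries over the ODBC connection and would add the local transfer criteria (circulation threshold, publication date, call number range) as further WHERE clauses.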