Voyager bulk import

Latest revision as of 15:47, 23 January 2023

This page refers to the Voyager ILS, an Ex Libris product used at the Folger from 1996 to 2022.

For current documentation on using the catalog, see Category:Catalog.
For current documentation on staff use of the ILS, see Category:TIND ILS.


(See TIND batch upload from MARCXML for TIND version)

Records can be bulk uploaded directly to Hamnet by command line on the Voyager server, or via the "webadmin" web interface to the server. Use the web interface for small batches (fewer than 1,000 records) only. When using the command line bulk import program, "for optimum import performance, import 10,000 records (or less) at one time. If your record file is larger than 10,000 records, it should be broken into smaller sets of records (using the -b and -e parameters) and then imported one after the other."
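The batching advice above can be sketched as a small helper. This is a hypothetical illustration, not part of Voyager: `batch_ranges` is an invented name that computes the 1-indexed, inclusive `-b`/`-e` pairs for splitting a file into batches of at most 10,000 records.

```python
# Hypothetical helper: compute -b/-e ranges for splitting a large record
# file into batches of at most 10,000, as recommended above.
def batch_ranges(total_records, batch_size=10000):
    """Yield (begin, end) pairs, 1-indexed inclusive, for Pbulkimport -b/-e."""
    begin = 1
    while begin <= total_records:
        end = min(begin + batch_size - 1, total_records)
        yield (begin, end)
        begin = end + 1

# e.g. a 25,000-record file becomes three batches
print(list(batch_ranges(25000)))
# [(1, 10000), (10001, 20000), (20001, 25000)]
```

Each pair would then drive one Pbulkimport run, executed one after the other.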

Record sets can be bibliographic records only (Bibs); authority records only; Bibs plus holdings records (MFHDs); Bibs plus MFHDs plus Purchase Orders (POs); Bibs plus MFHDs plus Item records; or Bibs plus MFHDs plus POs plus Items.

New Bulk Import Profiles for each source must be created in the SysAdmin module. Do not apply an existing profile to a new vendor's records: there are too many variables to risk it.

File preparation

Examine the MARC file for non-Folger practice or unusual fields. Use a utility such as MarcGlobal to add, edit, and delete information as needed.

Common changes include:


Other changes might include:

  • Add 1XX/7XX $e Relator term based on the content of $4 Relator code (for Women Writers Online records, we opted not to make this change before loading: too time-consuming)
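The $4-to-$e change above amounts to a lookup from relator code to relator term. A minimal sketch, assuming a small subset of the MARC relator list and ignoring real MARC field parsing:

```python
# Illustrative subset of the MARC Code List for Relators; the real list
# is much longer, and actual edits would be made field-by-field in a
# utility such as MarcGlobal rather than with a script like this.
RELATOR_TERMS = {
    "aut": "author",
    "edt": "editor",
    "prt": "printer",
    "pbl": "publisher",
    "bsl": "bookseller",
}

def relator_term(code):
    """Return the $e term for a $4 code, or None if the code is unmapped."""
    return RELATOR_TERMS.get(code.strip().lower())

print(relator_term("aut"))  # author
```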


Bulk replacement

Web interface

Voyager Batch Upload Information when using web interface to replace Bib records with edited records having identical Bib IDs:

  • Import Code: RPL_001 (defined in SysAdmin to replace existing records with incoming records when they have the same 001, formerly called REPLACE)
  • Operator Name: Create a meaningful name in SysAdmin if necessary; Operator Name is optional, but it's useful to have something other than a blank space in the "History" tab
  • Begin record / end record: upload the first three or so as tests before doing the rest of a large batch
  • Uncheck "Show/Approve MARC display before database load?" if you've already verified the contents and integrity of the file (it takes a long time to load them into the "preview" and there's no point if you've just had the same file open in MARCView or the equivalent)
  • Voyager 8 has a bug that prevents log files from being sent as an attachment when they're too large to send in the body of an email ("--- No log file available ---" displays in the "COMPLETED" email). Workaround: View log file in WebAdmin > Report Files.

Command line interface

When importing large sets of records, run the keyword regen separately afterwards; running it as part of the load slows the load considerably.

Parameters:

-f Filename -- required (use the complete path)
-i Import code -- required (e.g. RPL_001 for "Replace on 001 match")
-o Operator name -- not required
-b Begin record -- not required
-e End record -- not required
-C Do NOT create 035 -- not required
-M Allow multiple bulk import processes -- not required (allow at least 90 seconds between processes to avoid overwhelming the system)
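The parameters above can be assembled into a command line programmatically. A hedged sketch (`build_pbulkimport` is an invented helper, not a Voyager tool); note that Voyager expects each value glued directly to its flag, as in the worked examples further down this page:

```python
# Sketch: assemble a Pbulkimport invocation from the documented
# parameters. Values are concatenated to flags (-f/path, -iRPL_001),
# matching Voyager's expected syntax.
def build_pbulkimport(filename, import_code, operator=None,
                      begin=None, end=None, no_035=False):
    args = ["Pbulkimport", f"-f{filename}", f"-i{import_code}"]
    if operator:
        args.append(f"-o{operator}")
    if begin is not None:
        args.append(f"-b{begin}")
    if end is not None:
        args.append(f"-e{end}")
    if no_035:
        args.append("-C")
    return " ".join(args)

print(build_pbulkimport("/m1/voyager/folgerdb/local/file.mrc",
                        "RPL_001", operator="TEST", begin=1, end=3,
                        no_035=True))
```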

Steps for uploading records:

  1. Rename the source file to something meaningful, e.g. BSLW-20161223-basefile.MRC
  2. Copy the source file to /m1/voyager/folgerdb/local
  3. PuTTY to server
  4. Log in
  5. Change directory to /m1/voyager/folgerdb/sbin
  6. Run Pbulkimport with the appropriate parameters

Example: Pbulkimport -f/m1/voyager/folgerdb/local/BSLW-20161223-basefile.MRC -oBSLW_base -iRPL_001 -b10001 -e20000 -C
Example: Pbulkimport -f/m1/voyager/folgerdb/local/acls-1-15.mrc -oACLS -iACLS -b1 -e3 -C

  7. Check sizes of files in /m1/voyager/folgerdb/rpt to make sure they look right (e.g. a 1KB log file indicates the process didn't run, probably because of a typo in the command line)
  8. Open the log file and scroll to the end: verify that all records were processed correctly
  9. Run keyword regen if bib or holdings records were uploaded; not applicable to authority records.
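The log-size sanity check above can be automated. This is a sketch under assumptions: the 2 KB threshold is our guess at "too small to be a real run", not a Voyager-defined value, and the function name is hypothetical.

```python
import os

# Hedged sketch: a log file of roughly 1 KB usually means the process
# never ran (e.g. a typo in the command line). The 2048-byte threshold
# is an assumption, not a documented Voyager value.
def log_looks_suspicious(path, min_bytes=2048):
    """Return True if the log file is missing or too small to be a real run."""
    return (not os.path.exists(path)) or os.path.getsize(path) < min_bytes
```

Opening the log and scrolling to the end to verify record counts still has to be done by eye.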