Author Archive for Audra



31 Aug 11

SAA Days 2 & 3: assessment, copyright, conversation

I started Wednesday with a birthday breakfast with a friend from college, then lunch with a former mentor, followed by roundtable meetings. I focused on the Archivists’ Toolkit / Archon Roundtable meeting, which is always a big draw for archivists interested in new developments with the software programs. Perhaps the biggest news came from Merrilee Proffitt of OCLC, who announced that the ArchiveGrid discovery interface for finding aids has been updated and will be freely available (no longer subscription-based) to users seeking archival collections online. A demo of the updated interface, to be released soon, was available in the Exhibit Hall. In addition, Jennifer Waxman and Nathan Stevens described their digital object workflow plug-in for Archivists’ Toolkit, which helps archivists avoid cutting and pasting digital object information. The plug-in is available online and allows archivists to map persistent identifiers to files in digital repositories, auto-create digital object handles, create tab-delimited work orders, and create a workflow from the rapid entry dropdown in AT.

On Thursday, I attended Session 109: “Engaged! Innovative Engagement and Outreach and Its Assessment.” The session was based on responses to the 2010 ARL survey on special collections (SPEC Kit 317), which found that 90% of special collections librarians are doing ongoing events, instruction sessions, and exhibits. The speakers were interested in how to assess the success of these efforts. Genya O’Meara from NC State cited Michelle McCoy’s article “The Manuscript as Question: Teaching Primary Sources in the Archives — The China Missions Project,” published in C&RL in 2010, to argue that archivists need standard metrics for assessing our outreach work. Steve MacLeod of UC Irvine explored his work with the Humanities Core Course program, which teaches writing skills over three quarters, and how he helped design course sessions with faculty to smoothly incorporate archives instruction into humanities instruction. Basic learning outcomes included the ability to answer two questions: what is a primary source? and what is the difference between a firsthand account and a primary source? He also created a LibGuide for the course and helped subject specialist reference/instruction librarians add primary source resources to their LibGuides. There were over 45 sections, through which he and his colleagues taught over 1,000 students. He suggested that these learning outcomes can help us know when our students “get it.” Florence Turcotte from UF discussed an archives internship program in which students earned course credit at UF for writing biographical notes and doing basic archival processing. I stepped out of the session in time to catch the riveting tail end of Session 105: “Pay It Forward: Interns, Volunteers, and the Development of New Archivists and the Archives Profession,” just as Lance Stuchell from the Henry Ford started speaking about the ethics of unpaid intern work. He argued that pay is a matter of morality and dignity, and that unpaid internship work should not be treated as the equivalent of professional work.

After lunch, I headed over to Session 204: “Rights, Risk, and Reality: Beyond ‘Undue Diligence’ in Rights Analysis for Digitization.” I took away a few important points: “be respectful, not afraid,” and that archivists should form communities of practice that persuade lawyers through shared peer practice, such as the TRLN guidelines and the recently endorsed SAA document on well-intentioned practice. The speakers called for risk assessment over strict compliance, encouraged reliance on the fair use defense, and recommended maintaining a liberal take-down policy for any challenges to unpublished material placed online. Perhaps most importantly, Merrilee Proffitt reminded us that no special collections library has been successfully sued for copyright infringement for posting unpublished archival material online for educational use. After looking around the Exhibit Hall, I met a former mentor for dinner and went to the UCLA MLIS alumni party, where I was inspired by colleagues and faculty to list some presentation ideas on a napkin. Ideas for next year (theme: crossing boundaries/borders) included US/Mexico archivist relations; water rights (the Hoover Dam, the Rio Grande, Mulholland, etc.); community-based archives (my area of interest); and repatriation of Native American material. Lots of great ideas floated around…

(Cross posted at ZSR Professional Development blog.)

31 Aug 11

SAA Day 1: Collecting Repositories and E-Records Workshop

On Tuesday, I arrived in rainy Chicago and headed straight for the Hotel Palomar for the AIMS Project (“Born-Digital Collections: An Inter-Institutional Model for Stewardship”) workshop on born-digital archival material in collecting repositories. The free workshop, called “CREW: Collecting Repositories and E-Records Workshop,” brought together archivists and technologists from around the world to discuss issues related to collection development, accessioning, appraisal, arrangement and description, and discovery and access of born-digital archival materials.

The workshop program started with Glynn Edwards of Stanford and Gretchen Gueguen of UVa, who discussed collection development of born-digital records. The speakers suggested that both collection development policies and donor agreements should have clear language about born-digital material, including asking donors to contribute metadata for the electronic records in their collections. The challenge, they noted, is in collaboratively developing sound guidelines and policies to help archivists/curators make decisions about what to acquire. A group discussion followed about talking to donors about their personal digital lives and about creating a “digital will,” both of which help provide important information about an individual’s work, communication, and history with technology.

Kevin Glick and Mark Matienzo from Yale and Seth Shaw from Duke discussed accessioning, the process through which a repository gains control over records and gathers information that informs other functions in the archival workflow. While many of the procedures for accessioning born-digital material are the same as for analog material, the speakers distinguished accessioning the records from accessioning the media themselves (i.e., the Word document versus the floppy disk on which it is saved). Mark described his process of “re-accessioning” material through a forensic (or bit-level) disk imaging process, whereby he write-protected accessioned files to protect data from manipulation. He used FTK Imager to create a media log with unique identifiers and physical/logical characteristics of the media, followed by BagIt to create packages with high-level information about accessions. Seth discussed Duke’s DataAccessioner program, which he created as an easy way for archivists to migrate and identify data from disks. A group discussion asked: what level of control is necessary for collections containing electronic records at your institution? And what are the most common barriers to accessioning electronic records, and how would they show up? Our table agreed that barriers include staffing (skills and time); being able to read media; software AND hardware; storage limits; and a greater need for students/interns.
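For readers unfamiliar with the BagIt step, here is a minimal sketch of what packaging an accession might look like using the Library of Congress’s bagit library for Python. The directory path, the accession metadata, and the choice of checksums are hypothetical placeholders, and this is not necessarily the exact tooling the speakers used.

```python
import bagit

# Turn an accession directory of write-protected files into a BagIt package.
# The path and the accession metadata below are illustrative placeholders.
bag = bagit.make_bag(
    "/archives/accessions/2011-042",   # hypothetical directory, bagged in place
    {
        "Source-Organization": "Example University Special Collections",
        "External-Identifier": "accession-2011-042",
        "Internal-Sender-Description": "Forensic images of 3.5-inch floppy disks",
    },
    checksums=["md5", "sha256"],        # fixity values recorded in the bag manifests
)

# The manifests can be re-verified later to confirm nothing has changed.
print(bag.is_valid())
```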

Simon Wilson from Hull, Peter Chan from Stanford, and Gabriela Redwine from the Harry Ransom Center at UT Austin discussed arrangement and description. They questioned whether archivists can appraise digital material without knowing the content therein, which conflicts with the high-level, minimal processing our field has emphasized in the past few years. Another major issue is volume: storage is cheap, but does that mean archivists shouldn’t appraise? It isn’t practical to describe every item, but how will archivists know what is sensitive or restricted? Hypatia provides an easy-to-use interface that allows drag-and-drop intellectual organization of e-records, as well as the ability to add rights and permissions information. Peter Chan described a complex method using AccessData FTK in combination with TransitSolution and Oxygen to compare checksums, find duplicate records, and do a “pattern search” for sensitive terms and numbers (such as Social Security numbers). Gabi Redwine explored her work with a hybrid collection (analog and digital records), where she learned that descriptive standards should be a learning process for staff, not students or volunteers. Her finding aids for the collection included hyperlinks to electronic content, and she advocated for disk imaging. The group discussion following this session was intense! The hot-button topic was: are the professional skills of appraisal, arrangement, and description still relevant for born-digital materials? Our group agreed that appraisal and description remain important; however, we were strongly divided about whether archivists will need to contribute to the arrangement of e-records. I believe that arrangement becomes less important as things become more searchable, as argued in David Weinberger’s Everything is Miscellaneous. Arrangement emerged before the digital realm as a way for archivists and librarians to contextualize and organize material based on topics/subjects; however, with better description, users can create their own ways of organizing e-records!
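Peter’s workflow relied on FTK’s built-in tools; purely as a rough illustration of the same two ideas (checksum-based duplicate detection and a pattern search for Social Security numbers), here is a short Python sketch. The file paths and the regular expression are assumptions for illustration, not his actual configuration.

```python
import hashlib
import re
from collections import defaultdict
from pathlib import Path

ROOT = Path("/accessions/example-collection")        # hypothetical directory of e-records
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")   # naive pattern; real screening needs more care

def sha256(path: Path) -> str:
    """Return the SHA-256 checksum of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Group files by checksum to surface exact duplicates.
by_checksum = defaultdict(list)
for path in ROOT.rglob("*"):
    if path.is_file():
        by_checksum[sha256(path)].append(path)

for checksum, paths in by_checksum.items():
    if len(paths) > 1:
        print("Possible duplicates:", [str(p) for p in paths])

# Flag plain-text files containing strings shaped like Social Security numbers.
for path in ROOT.rglob("*.txt"):
    if SSN_PATTERN.search(path.read_text(errors="ignore")):
        print("Review for sensitive data:", path)
```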

Finally, Gretchen Gueguen (UVa) and Erin O’Meara of UNC Chapel Hill discussed discovery and access. Our goals as archivists include preserving original format and order as much as possible and applying restrictions as necessary, while balancing this with our mission to make material accessible and available. Gretchen suggested Google Books’ “snippet” approach as a way to provide access without compromising privacy or restrictions on sensitive material. Her models for access to digital material include: in-person versus not; authenticated versus not; physical versus online access; and dynamic versus static. Erin described her use of Curator’s Workbench, with FOXML and Solr, to control access permissions and assign restrictions and roles to e-records. Another group discussion included chewy scenarios for dealing with born-digital materials; my table had to consider: “you are at a large public academic research library; the director brings several CD-ROMs, Zip disks, and floppy disks from a famous (secretive) professor on campus; they are backup files created over the years; the office has more paper files; the professor and his laptop are missing; no one can give further details on the files; write a one-page plan for preserving/describing the files; a working institutional repository exists.” With no donor agreement and an understanding that the faculty member was very private, we couldn’t go ahead with full access to the material.

At the end of the day, I left with a much better grasp of how I see myself as an archivist dealing with born-digital material (primarily material on optical and disk media). It seems that item-level description works best for born-digital materials, while aggregate description works best for analog materials. Digital records are best handled through collaboratively created policies and procedures for acquiring, processing, and describing them. Great stuff!

A suggested reading list was provided to help participants prepare for the course.

(Cross posted to ZSR Professional Development blog.)

*Update: all of the workshop presentations have been posted to the born digital archives blog.

15 Jun 11

Teaching digitization for C2C

Most of this post is duplicated on the Professional Development blog at my institution.

I recently volunteered to help teach a workshop entitled “Preparing for a Digitization Project” through NC Connecting to Collections (C2C), an LSTA-funded grant project administered by the North Carolina Department of Cultural Resources. This came about through an informal group of archivists, special collections librarians, and digital projects librarians interested in the future of NC ECHO and its efforts to educate staff and volunteers at cultural heritage institutions across the state about digitization. The group is loosely connected through the now-defunct North Carolina Digital Collections Collaboratory.

Late last year, Nick Graham of the North Carolina Digital Heritage Center was contacted by LeRae Umfleet of NC C2C about teaching a few regional workshops about planning digitization projects. The workshops were created as a way to teach smaller archives, libraries, and museums about planning, implementing, and sustaining digitization efforts. I volunteered to help with the workshops, which were held in January 2011 in Hickory as well as this past Monday in Wilson.

The workshops were promoted through multiple listservs and were open to staff, board members, and volunteers across the state. Each workshop cost $10 and included lunch for participants. Many of the participants reminded me of the folks at the workshops for Preserving Forsyth’s Past. The crowd was enthusiastic and curious, asking lots of questions and taking notes. Nick Graham and Maggie Dickson covered project preparation, metadata, and the NC Digital Heritage Center (and how to get involved); I discussed the project process and digital production as well as free resources for digital publishing; and Lisa Gregory from the State Archives discussed metadata and digital preservation.

I must confess that the information was so helpful, I found myself taking notes! When Nick stepped up to describe the efforts of the Digital Heritage Center, which at this time is digitizing and hosting materials from across the state at no cost, I learned that they will be seeking nominations for North Carolina historical newspapers to digitize in the near future, and that they are also interested in accepting digitized video formats. Lisa also introduced the group to NC PMDO, Preservation Metadata for Digital Objects, which includes a free preservation metadata tool. It is always a joy to help educate repositories across the state in digitization standards and processes!

05 Apr 11

Society of NC Archivists meeting: Morehead City

Most of this post is duplicated on the Professional Development blog at my institution.

While many of my colleagues were in Philadelphia for ACRL, I traveled east to the coast of North Carolina for the joint conference of the Society of North Carolina Archivists and the South Carolina Archival Association in Morehead City.

After arriving on Wednesday around dinnertime with my carpooling partner Katie (Archivist and Special Collections Librarian at Elon), we met up with Gretchen (Digital Initiatives Librarian at ECU) for dinner at a seaside restaurant and discussion about digital projects and, of course, seafood.

On Thursday, the conference kicked off with an opening plenary from two unique scholars: David Moore of the NC Maritime Museum talked about artists’ renditions of Blackbeard, Stede Bonnet, and other pirates, as well as archival research that helped contextualize these works; Ralph Wilbanks of the National Underwater and Marine Agency detailed his team’s discovery of the H.L. Hunley submarine, including the remains of its Civil War-era crew still trapped inside.

Session 1 on Thursday, succinctly titled “Digital Initiatives,” highlighted important work being done at the Avery Center for African American Research at the College of Charleston, UNC Charlotte, and ECU. Amanda Ross and Jessica Farrell from the College of Charleston described the challenges and successes of digitizing material culture, namely slave artifacts and African artwork in their collections. Of primary importance was maintaining the color and shape fidelity of 3-D objects, which they handled economically with two clamp-mounted fluorescent lights, a Nikon D80 with an 18-200 mm Quantaray lens (although they recommend a macro lens), a tripod, and a $50 roll of heavy white paper. Their makeshift lab and Dublin Core metadata project resulted in the Avery Artifact Collection within the Lowcountry Digital Library. Kristy Dixon and Katie McCormick from UNC Charlotte spoke thoughtfully about the need for strategic thinking and broad collaboration in special collections and archives today, in particular creating partnerships with systems staff and technical services staff. They noted that with the reorganization of their library, six technical services librarians and staff were added to their special collections department!

Finally, Mark Custer and Jennifer Joyner from ECU explored the future of archival description with a discussion of ECU’s implementation of EAC-CPF, essentially authority records for creators of archival materials. Mark found inspiration in SNAC, the Social Networks and Archival Context project (a project of UVa and the California Digital Library), for creating names for their archival collections. Mark extracted name data from their EAD files, used Google Refine’s cluster-and-edit feature to normalize it, grabbed URLs through VIAF and WorldCat Identities, and hopes to share their authority records with SNAC. Mark clarified the project, saying:

Firstly, we are not partnered with anyone involved in the excellent SNAC project. Instead, we decided to undertake a smaller, SNAC-like project here at ECU (i.e., we mined our EAD data in order to create EAC records). To accomplish this, I wrote an XSLT stylesheet to extract and clean up our local data. Only after working through that step did we then import this data into Google Refine. With Refine, we did a number of things, but the two things discussed in our presentation were: 1) cluster and edit our names with the well-established, advanced algorithms provided in that product 2) grab more data from databases like WorldCat Identities and VIAF without doing any extra scripting work outside of Google Refine.

Secondly, we haven’t enhanced our finding aid interface at all at this point. In fact, we’ve only put in a few weeks’ worth of work into the project so far, so none of our work is represented online yet. The HTML views of the Frances Renfrow Doak EAC record that we demonstrated were created by an XSLT stylesheet authored by Brian Tingle at the California Digital Library. He has graciously provided some of the tools that the SNAC project is using online at: https://bitbucket.org/btingle/cpf2html/.

Lastly, these authority records have stayed with us; mostly because, at this point, they’re unfinished (e.g., we still need to finish that clustering step within Refine, which requires a bit of extra work). But the ultimate goal, of course, is to share this data as widely as possible. Toward that end, I tend to think that we also need to be curating this data as collaboratively as possible.
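ECU did the extraction step Mark describes with an XSLT stylesheet; purely as an illustration of the same idea, here is a rough Python/lxml sketch that pulls personal, corporate, and family names out of a folder of EAD files so they could be clustered in a tool like Google Refine. The file paths and output format are my assumptions, not ECU’s actual workflow.

```python
from collections import Counter
from pathlib import Path

from lxml import etree

EAD_DIR = Path("/data/ead")   # hypothetical folder of EAD finding aids

names = Counter()
for ead_file in EAD_DIR.glob("*.xml"):
    tree = etree.parse(str(ead_file))
    # Collect personal, corporate, and family names wherever they appear,
    # ignoring namespaces so both namespaced and unnamespaced EAD files work.
    for element in tree.iter():
        if not isinstance(element.tag, str):
            continue  # skip comments and processing instructions
        if etree.QName(element).localname in ("persname", "corpname", "famname") and element.text:
            names[" ".join(element.text.split())] += 1

# Write a simple two-column file that can be loaded into Google Refine
# (now OpenRefine) for clustering and for reconciliation against VIAF.
with open("names.tsv", "w", encoding="utf-8") as out:
    out.write("name\tcount\n")
    for name, count in names.most_common():
        out.write(f"{name}\t{count}\n")
```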

The final session of the day was the SNCA Business Meeting, where I gave my report as the Archives Week Chair. That evening, a reception was held to celebrate the award winners for SNCA and give conference attendees the opportunity to participate in a behind-the-scenes tour of the NC Maritime Museum. Lots of fun ensued during the pirate-themed tours and I almost had enough energy to go to karaoke with some other young archivists.

On Friday, I moderated the session entitled “Statewide Digital Library Projects,” with speakers Nick Graham from the NC Digital Heritage Center and Kate Boyd from the SC Digital Library. The session highlighted interesting parallels and differences between the two statewide initiatives. Kate Boyd explained that the SCDL is a multisite project nested in multiple universities with distributed “buckets” for description and digitization. Their project uses a multi-host version of CONTENTdm, with some projects hosted and branded specifically for certain regions and institutions. Users can browse by county, institution, and date, and the site includes teacher-created lesson plans. The “About” section includes scanning and metadata guidelines; Kate mentioned that the update to CONTENTdm 6 would help with zooming and expanding/reducing views of their digital objects. Nick Graham gave a brief background on the formation of the NCDHC, including NC ECHO and its survey and digitization guidelines. He explained that the NCDHC has minimal selection criteria: items simply need a title and no copyright or privacy concerns. The NCDHC displays its digital objects through a single instance of CONTENTdm. Both programs are supported by a mix of institutional and government funding, and both speakers emphasized the value of word-of-mouth marketing and shared branding for better collaborative efforts.

Later that morning, I attended a session on “Collaboration in Records Management.” Jennifer Neal of the Catholic Diocese of Charleston Archives gave an interesting presentation about the creation of a records management policy for her institution. Among the many reasons to begin an RM program, Jennifer noted that the legal requirements, both federal and state (and, in her case, organizational rules), were likely the most important. She recommended a pilot RM program with an enthusiastic department, as well as a friendly department liaison with organizational tendencies. Jennifer came up with “RM Fridays” as a predetermined way of making time to sort, shred, organize, and inventory the materials for her pilot department. Her metrics were stunning: 135 record cartons were destroyed and 245 were organized and sent off-site. Kelly Eubank from the NC State Archives explained how the state archives uses Archive-It to harvest social media sites and websites of government agencies and officials. She then briefly explored their use of BagIt to validate geospatial (GIS) files as part of their GeoMAPP project.
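I don’t know the details of the State Archives’ setup, but as a rough sketch, re-validating an existing bag with the Library of Congress’s bagit library for Python looks something like this (the bag path is a placeholder):

```python
import bagit

# Open an existing BagIt package of geospatial data and re-check its manifests.
bag = bagit.Bag("/transfers/geomapp/county_parcels_bag")   # hypothetical bag directory

try:
    bag.validate()   # recomputes checksums and compares them against the manifests
    print("Bag is complete and all fixity values match.")
except bagit.BagValidationError as err:
    print("Validation failed:", err)
```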

It was great to meet and network with archival professionals from both Carolinas and learn about some of the innovative and creative projects happening in their institutions. Right now I am thinking about EAC, collaboration with tech services, CONTENTdm, and records management.

03 Mar 11

Writing my own job description

*Update, 4 April 2011: My position description has not yet been updated. This is a draft document. Thanks for reading!

Last week, our long-time coordinator of Baptist archives retired. At the same time, our department head asked each of us to put on our thinking caps and consider ways to restructure our team to better reflect our priorities.

Knowing that an opportunity to discuss restructuring doesn’t come along very often, I began looking critically at my current position description. As processing archivist and digital projects manager, I knew that my position was a combination of two important jobs suggested by an archival consultant months before I arrived; the two had been meshed into a single project archivist position. Over the next year, it became clearer that my position was pulling in two different directions to serve the same priority: creating access to archival materials. The consultant’s report had suggested a processing archivist and a digital projects manager as two separate roles.

I came to recognize that the consultant’s suggestions were appropriate. I spoke with my intern-turned-associate, who had recently finished library school, and she seemed to be a perfect fit for a position we started calling “access archivist,” essentially taking on the role of coordinating processing and accessioning and helping select collections for digitization. I began to think that I could serve in the role of digital projects archivist…and brought the idea to the department head.

Here is what I created (this is the second draft, incorporating suggestions from my supervisor). The role is based on job descriptions for positions such as “digital projects archivist,” “digital archivist,” and “digital projects librarian.”

POSITION SUMMARY:

The Digital Projects Archivist will coordinate a program for digitization of analog and curation of born-digital archival resources; supervise metadata creation, authority control, quality control, and workflow for digitization projects involving archival and manuscript materials; and participate in digital preservation efforts. S/he will create and improve online finding aids and lead the department to more effective and robust implementation of descriptive tools and standards. S/he will process archival and manuscript collections, focusing in particular on those collections that are unprocessed or underprocessed and are promising candidates for digital projects. The Digital Projects Archivist is responsible for maintaining equipment used for digitization. S/he will help coordinate discovery tools and interfaces for digital archival materials and collections; supervise student assistants; and provide user instruction for students, faculty, and other researchers. This is a twelve-month Visiting Assistant Librarian appointment, reporting to the Director of Special Collections and University Archives.

QUALIFICATIONS:

Education, Experience, and Training

Master’s degree in Library Science; at least two years’ experience working in an academic or special collections library, with an emphasis on archival processing and description and/or digital projects. An equivalent combination of education and experience may be accepted.

Knowledge, Skills, Ability

  • Expert knowledge of archival description software, such as Archivists’ Toolkit.
  • Knowledge of recent changes in archival practice, particularly minimal processing; coupled with the judgment and research skills to proceed beyond minimal processing when a collection merits it.
  • Experience with digital projects, preferably in a coordinating role.
  • Ability to manage a digital production lab, including a variety of scanners, scanning software, scanning techniques and best practices for a wide range of formats. High level of organization and foresight in managing multiple projects executed by multiple individuals; coupled with the ability to communicate the organization and workflow to others so that they understand rationales.
  • Attentiveness to good order and security of originals chosen for digital projects.
  • Understanding of metadata standards and description of digital collections including Dublin Core, MODS, METS, XML/XSL, EAD, MARC, LCSH, AAT, and other traditional and non-traditional schemas.
  • Working knowledge of XML, XSLT, databases, web design, and digital asset management systems.
  • Familiarity with standards and best practices for digital collections and digital preservation.
  • Understanding of best practices for rights management, copyright, and associated concepts related to digitization.
  • Excellent interpersonal and oral and written communication skills.
  • Ability to work effectively, both independently and collaboratively, within a collegial environment.
  • Ability to succeed as Library Faculty in an academic environment.
  • Evidence of ability to represent Wake Forest University within the University and to external audiences at state, regional, and national levels.

ESSENTIAL FUNCTIONS:

  • Advance the processes by which the department expands its intellectual control over its holdings through the skilled use of archival management software (such as Archivists’ Toolkit) and migration of this information to future tools. The Digital Projects Archivist will invest at least a third of his/her time in addressing the department’s archival backlog and inadequate finding aids.
  • Coordinate a digital projects advisory group and program for the creation, access, and preservation of digital archival and manuscript collections.
  • Coordinate workflow design, digital production, and quality control for digital projects in collaboration with a variety of programming and digital project-oriented librarians and staff, including the Preservation Librarian, Web Services Librarian, Digital Production Coordinator, and Access Archivist.
  • Participate in the coordination of the department’s digital production lab, including troubleshooting for scanning equipment and communicating maintenance concerns with the technology group within the Research, Instruction, and Technology Services team.
  • Supervise creation and application of metadata for digital collections, including thesauri and descriptive schemas.
  • Manage digital assets created from digitized and born-digital archival collections through collaboration with digital projects staff as well as shared coordination of the library’s institutional repository (DSpace).
  • Participate in supervision of archival processing and description according to DACS, including creation of EAD XML and MARC records for finding aids.
  • Ensure compliance with grant-funded digital projects and assist Department of Special Collections and Archives with grant proposals and grant-funded project workflows.
  • Help lead user interface changes for finding aids and digital collections in collaboration with the Department of Special Collections and Archives and the Research, Instruction, and Technology Services team.
  • Provide reference service and research assistance in Special Collections. Provide library instruction in Special Collections and in general collection for subject areas of expertise.
  • Participate in outreach, marketing, assessment, and other library initiatives. Contribute cooperatively to library initiatives. Participate in team and library-wide activities. Serve on library committees.
  • Participate in local, regional, or national professional organizations; enrich professional experience by attending conferences and continuing education opportunities.
  • Perform other duties as assigned.

Essentially, my goal is to move forward with the ongoing process of creating and formalizing digitization policies and processes. Processing duties and responsibility for the scanning equipment were added in the second draft, though I feel each of these could be a full-time position on its own. Now I wait for feedback/approval from library administration. Have you ever written your own job description?

23 Nov 10

Sharing MARC from Archivists’ Toolkit

A few weeks ago, I shared an excited tweet with the archives twitterverse announcing that I had successfully tested importing a MARC record from Archivists’ Toolkit into WorldCat. The tweet garnered more attention than I had anticipated, including a few direct messages from fellow archivists wanting to know how we came up with a solution to the MARC from AT problem. Here is what we did.

The problems with MARCXML exported from AT are few but significant. My colleague Mark Custer at ECU recently posted a question to the AT user group listserv about the fact that AT does not currently allow subfields for subject headings, so the MARC exported from AT is missing the subfield delimiters. I set up a meeting with a cataloger at my library to look at the MARCXML files being exported from AT and get her thoughts on whether the records could be considered complete. We took a look at MARC records for archival material already in WorldCat and compared them to what we exported from AT. She identified the issues that she felt would prevent proper sharing of the MARC with our local catalog and WorldCat:

  • Missing fixed fields including Ctrl, Desc, and Date (if no date range was included in the finding aid)
  • Missing subject heading subfield delimiters
  • 650 used instead of 600 field in some instances
  • Missing indicators for 245 (and 545, optional)
  • Missing cataloging source for 049 and 040

Because the MARC exported from AT is in MARCXML format and our catalogers work with the MRC format, we used MarcEdit to convert the record from MARCXML to MRC. Once these missing and erroneous elements were fixed using MarcEdit, we were ready to test-import the record. Our library’s account with OCLC Connexion accepts imported records in DAT format, so we saved the MRC file as a DAT file. We tried uploading to Connexion using local bibliographic import and were successful. We determined that it would probably be easier to edit the MARC directly in Connexion, so we will do that in the future. The cataloger and I decided to upload the file to WorldCat as an official record, which worked, as well as to our local catalog, which also worked!

One issue for my library is that our finding aids are missing subject terms and authority work that most catalogers would require for submission to WorldCat. We have started incorporating this cataloger into our processing workflow and introduced her to the Names and Subjects modules in AT so that she can finalize subject headings and names that we assign. We can also consider an automated batch update for all our exported MARCXML to include the edits listed above, incorporating help from our technology team and their knowledge of FTP and scripting. In the meantime, we will be submitting our MARC one at a time since our finding aids are incomplete.
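As a sketch of what that scripted batch step might look like, here is a hypothetical example using the open-source pymarc library (assuming pymarc 5’s API) rather than the MarcEdit workflow we actually used; the file names, the OCLC symbol placeholder, and the particular fix applied are all assumptions for illustration.

```python
from pymarc import Field, Subfield, parse_xml_to_array

# Read the MARCXML exported from Archivists' Toolkit (hypothetical file name).
records = parse_xml_to_array("at_export.xml")

with open("at_export_fixed.mrc", "wb") as out:
    for record in records:
        # Add a cataloging source if the export left the 040 field off entirely.
        if not record.get_fields("040"):
            record.add_ordered_field(
                Field(
                    tag="040",
                    indicators=[" ", " "],
                    subfields=[
                        Subfield(code="a", value="XXX"),  # replace XXX with your OCLC symbol
                        Subfield(code="c", value="XXX"),
                    ],
                )
            )
        # The other fixes listed above (indicators, 650 vs. 600, subfield
        # delimiters) would need collection-specific logic added here.
        out.write(record.as_marc())
```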

Here’s a recap of our tentative workflow, for your information:

  • Open MarcEdit, then choose Tools
  • Choose MARCXML file as input file
  • Give the program an output file name (copy and paste the input file name; change the extension to .mrc)
  • Select MARC21XML to MARC plus Translate to MARC8
  • Select Execute
  • Open OCLC Connexion
  • Import records; browse to .mrc file
  • Edit directly in OCLC Connexion
  • Update fixed fields including Ctrl, Desc, and Date
  • Change 650 to 600 when necessary
  • Add subfield delimiters to subject headings
  • Add indicators to 545, 245 as needed
  • Add cataloging source to 040 and 049
  • Save and validate
  • Log in to OCLC, then select Action > Holdings > Update Holdings to load directly to WorldCat

Thoughts, comments, ideas, and suggestions are gratefully welcomed! I am really curious to know how others approach this issue.

02 Nov 10

Creating a processing guide

I learned much about the standards of archival processing while I was a fellow at the Center for Primary Research and Training at UCLA. While there, I processed the papers of art critic Jules Langsner, the papers of activist and scholar Josephine Fowler, and the pop culture collection of Middle Eastern Americana created by Jonathan Friedlander. Perhaps most important for my professional development, however, was the training I received from CFPRT Coordinator Kelley Wolfe Bachli, who wrote a succinct and informative processing manual to train each CFPRT fellow.

I brought this training manual with me to North Carolina, and with Kelley’s permission I incorporated her work with the standards used at my institution, DACS, and the Archivists’ Toolkit User Manual. The result? The Archival Processing Guide for Staff, Students, and Volunteers. I also included the chapters about processing and the over-the-shoulder look at processing from Michael J. Fox and Peter L. Wilkerson’s Introduction to Archives, now available free online.

The guide and its rules are constantly under review, but I think it would be a great starting resource for any archives or special collections repository looking for standards for training staff, students, and volunteers in the basics of archival processing. Comments are welcome!