
31 Aug 2011

SAA Days 4 & 5: e-records, metrics, collaboration

Friday in Chicago started with coffee with Christian Dupont from Atlas Systems, followed by Session 302: “Practical Approaches to Born-Digital Records: What Works Today.” The session was packed…standing-room only (some archivists quipped that we must have broken fire codes with the number of people sitting on the floor)! Chris Prom from U Illinois, Urbana-Champaign, moderated the excellent panel on practical solutions to dealing with born-digital archival collections. Suzanne Belovari of Tufts referred to the AIMS project (which sponsored the workshop I attended on Tuesday) and the Personal Archives in Digital Media (paradigm) project, which offers an excellent “Workbook on digital private papers” and “Guidelines for creators of personal archives.” She also referenced the research of Catherine Marshall of the Center for the Study of Digital Libraries at Texas A&M, who has posted her research and papers regarding personal digital archives on her website. All of the speakers referred to Chris Prom’s Practical E-Records blog, which includes lots of guidelines and tools for archivists to deal with born digital material.

Ben Goldman of U Wyoming, who wrote an excellent piece in RB&M entitled “Bridging the Gap: Taking Practical Steps Toward Managing Born-Digital Collections in Manuscript Repositories,” talked about basic steps for dealing with electronic records, including network storage, virus checking, format information, generating checksums, and capturing descriptive metadata. He uses Enterprise Checker for virus checking, Duke’s DataAccessioner to generate checksums, and a Word doc or spreadsheet to track actions taken on individual files. Melissa Salrin of U Illinois, Urbana-Champaign spoke about her use of a program called Firefly to detect social security numbers in files, TreeSize Pro to identify file types, and a process through which she ensures that files are read-only when moved. She urged the audience to remember to document every step of the transfer process, and noted that “people use and create files electronically as inefficiently as analog.” Laura Carroll, formerly of Emory, talked about the famous Salman Rushdie digital archives, noting that donor restrictions are what shaped their workflow for dealing with Rushdie’s born-digital material. The material is now available in a secure Fedora repository. Seth Shaw from Duke spoke about DataAccessioner (see previous posts) but mostly spoke eloquently, in what promises to be an historic speech, about the need to “do something, even if it isn’t perfect.”
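The basic steps Goldman describes (checksums, format information, read-only files, a log of actions taken) lend themselves to simple scripting. The sketch below is my own illustration of the idea, not any speaker’s actual tool: the CSV log columns and directory layout are hypothetical, and a real workflow would lean on dedicated software like DataAccessioner.

```python
import csv
import hashlib
import os
import stat
from datetime import datetime, timezone

def accession_files(source_dir, log_path):
    """Walk an accession directory, record a checksum and basic
    format information for each file, mark the files read-only,
    and write every action to a CSV log."""
    with open(log_path, "w", newline="") as log:
        writer = csv.writer(log)
        writer.writerow(["path", "bytes", "sha256", "extension", "logged_at"])
        for root, _dirs, files in os.walk(source_dir):
            for name in files:
                path = os.path.join(root, name)
                size = os.path.getsize(path)
                # Hash in chunks so large files don't exhaust memory.
                digest = hashlib.sha256()
                with open(path, "rb") as fh:
                    for chunk in iter(lambda: fh.read(65536), b""):
                        digest.update(chunk)
                # Drop write permission so the originals stay read-only.
                os.chmod(path, stat.S_IREAD)
                writer.writerow([path, size, digest.hexdigest(),
                                 os.path.splitext(name)[1].lower(),
                                 datetime.now(timezone.utc).isoformat()])
```

The CSV log doubles as the action documentation Salrin urged: rerunning the checksum later and comparing it against the logged value verifies that nothing changed in transfer.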

After lunch, I attended Session 410: “The Archivists’ Toolkit: Innovative Uses and Collaborations.” The session highlighted interesting collaborations and experiments with AT; the most interesting was by Adrianna Del Collo of the Met, who found a way to convert folder-level inventories into XML for import into AT. Following the session, I was invited last-minute to a meeting of the “Processing Metrics Collaborative,” led by Emily Novak Gustainis of Harvard. The small group included two brief presentations by Emily Walters of NC State and Adrienne Pruitt of the Free Library of Philadelphia, both of whom have experimented with Gustainis’ Processing Metrics Database, an exciting tool to help archivists track statistical information about the timing and costs of archival processing. Walters also mentioned NC State’s new tool, Steady, which allows archivists to take container list spreadsheets and easily convert them into XML stub documents for import into AT. Walters used the PMD to track supply costs and time, while Pruitt used the database to help with grant applications. Everyone noted that metrics should be used to compare collections, processing levels, and collection needs, taking special care to note that metrics should NOT be used to compare people. The average processing rate at NC State for their architectural material was 4 linear feet per hour, while it was 2 linear feet per hour for folder lists at Princeton (as noted by meeting participant Christie Petersen).
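The appeal of a tool like Steady is that it turns a spreadsheet archivists already keep into importable XML. As a rough sketch of the concept only (the column names and the minimal EAD-style stub below are my own assumptions, not Steady’s actual output format):

```python
import csv
import xml.etree.ElementTree as ET

def container_csv_to_stub(csv_path, out_path):
    """Convert a container-list spreadsheet (CSV with box, folder,
    title, date columns) into a minimal EAD-style <dsc> stub."""
    dsc = ET.Element("dsc")
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            c = ET.SubElement(dsc, "c", level="file")
            did = ET.SubElement(c, "did")
            ET.SubElement(did, "container", type="box").text = row["box"]
            ET.SubElement(did, "container", type="folder").text = row["folder"]
            ET.SubElement(did, "unittitle").text = row["title"]
            ET.SubElement(did, "unitdate").text = row.get("date", "")
    # encoding="unicode" writes a plain-text XML file.
    ET.ElementTree(dsc).write(out_path, encoding="unicode")
```

Each spreadsheet row becomes one file-level component, which is exactly the kind of stub record a collection management tool can flesh out after import.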

On Saturday morning I woke up early to prepare for my session, Session 503: “Exposing Hidden Collections Through Consortia and Collaboration.” I was honored and proud to chair the session with distinguished speakers Holly Mengel of the Philadelphia Area Consortium of Special Collections Libraries, Nick Graham of the North Carolina Digital Heritage Center, and Sherri Berger of the California Digital Library. The panelists defined and explored the exposure of hidden collections, from local/practical projects to regional/service-based projects. Each spoke about levels of “hidden-ness,” and the decision-making process of choosing partners and service recipients. It was a joy to listen to and facilitate presentations by archivists with such inspirational projects.

After my session, I attended Session 605: “Acquiring Organizational Records in a Social Media World: Documentation Strategies in the Facebook Era.” The focus on documenting student groups is very appealing, since documenting student life is one of the greatest challenges for university archivists. Most of the speakers recommended web archiving for Twitter and Facebook, which were not new ideas to me. However, Jackie Esposito of Penn State suggested a new strategy for documenting student organizations, which focuses on capture/recapture of social media sites and direct conversations with student groups, including the requirement that every group have a student archivist or historian. Jackie taught an “Archives 101” class for these students on weeknights after 7 pm early in the fall, and made sure to follow up with student groups before graduation.

After lunch, I went to Session 702: “Return on Investment: Metadata, Metrics, and Management.” All I can say about the session is…wow. Joyce Chapman of TRLN (formerly an NC State Library Fellow) spoke about her research into ROI (return on investment) for manual metadata enhancement and a project to understand researcher expectations of finding aids. The first project addressed the challenge of measuring value in a nonprofit (which cannot measure value via sales the way for-profit organizations can) through A/B testing of enhancements made to photographic metadata by cataloging staff. Her testing found that page views for enhanced metadata records were quadruple those of unenhanced records, a staggering statistic. Web analytics found that 28% of search strings for their photographs included names, which were only added to enhanced records. In terms of cataloger time, the goal was 5 minutes per image, but the average was 7 minutes of metadata work per image. Her project documentation is available online. Her other study examined discovery success within finding aids by academic researchers, using behavior, perception, and rank information. In order from most to least useful for researchers were: collection inventory, abstract, subjects, scope and contents, and biography/history. The abstract was looked at first in 60% of user tests. Users did not know the difference between the abstract and the scope and contents note; in fact, 64% of users did not read the scope at all after reading the abstract! Researchers explained that they ignored the biography/history note out of a lack of trust in the information, since biographies/histories tend not to include footnotes and the notes are impossible to cite.
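Chapman’s two headline numbers (quadrupled page views for enhanced records, names in 28% of searches) are the kind of thing any repository could compute from its own analytics exports. A minimal sketch, assuming a simple mapping of record IDs to page-view counts and a list of raw search strings; the data shapes here are my own assumptions, not her project’s actual format:

```python
from statistics import mean

def ab_pageview_ratio(views_by_record, enhanced_ids):
    """Ratio of mean page views for enhanced records vs. the
    unenhanced control group."""
    enhanced = [v for r, v in views_by_record.items() if r in enhanced_ids]
    control = [v for r, v in views_by_record.items() if r not in enhanced_ids]
    return mean(enhanced) / mean(control)

def share_with_names(search_strings, known_names):
    """Fraction of search strings mentioning any known personal name
    (case-insensitive substring match)."""
    hits = sum(1 for s in search_strings
               if any(n.lower() in s.lower() for n in known_names))
    return hits / len(search_strings)
```

A ratio of 4.0 from the first function would correspond to the “quadruple” result she reported; the second function yields the share of queries that could only have matched the enhanced records.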

Emily Novak Gustainis from Harvard talked about her processing metrics database, as mentioned in the paragraph about the “Processing Metrics Collaborative” session. Her reasoning behind metrics was simple: it is hard to change something until you know what you are doing. Her database tracks 38 aspects of archival processing, including timing and processing levels. She repeated that you cannot compare people, only collections; however, an employee report showed that a permanent processing archivist was spending only 20% of his time processing, so her team was able to use that finding to better align staff responsibilities.

Adrian Turner from the California Digital Library talked about the Uncovering California Environmental Collections (UCEC) project, a CLIR-funded grant project to help process environmental collections across the state. While metrics were not built into the project, the group agreed that they would have been beneficial. In another project, the UC Next Generation Technical Services initiative found 71,000 linear feet in backlogs and developed tactics for collection-level records in EAD and Archivists’ Toolkit using minimal processing techniques. Through information gathering in a Google Docs spreadsheet, they found no discernible difference between date ranges, personal papers, and record groups processed through their project. They found processing rates of 1 linear foot per hour for series-level arrangement and description and 4-6 linear feet per hour for folder-level arrangement and description. He recommended formally incorporating metrics into project plans and creating a shared methodology for processing levels.
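Turner’s recommendation to build metrics into project plans invites some back-of-the-envelope arithmetic. Using the reported rates, a rough effort estimate for a backlog might look like the sketch below; the hours-per-FTE-year figure is my own assumption, not one given in the session:

```python
def processing_effort(linear_feet, feet_per_hour, hours_per_fte_year=1800):
    """Rough effort estimate for a processing backlog: total hours
    and FTE-years at a constant linear-feet-per-hour rate."""
    hours = linear_feet / feet_per_hour
    return hours, hours / hours_per_fte_year
```

At the faster folder-level rate of 4 linear feet per hour, the 71,000-foot backlog works out to 17,750 hours, which makes the case for minimal processing and collection-level records rather vividly.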

I had to head out for Midway before Q&A started to get on the train in time for my return flight, which thankfully wasn’t canceled due to Hurricane Irene. As the train passed through Chicago, I found myself thinking about the energizing and inspiring projects, tools, and theory that come from attending SAA…and how much I look forward to SAA 2012.

(Cross posted to ZSR Professional Development blog.)

15 Jun 2011

Teaching digitization for C2C

Most of this post is duplicated on the Professional Development blog at my institution.
I recently volunteered to help teach a workshop entitled “Preparing for a Digitization Project” through NC Connecting to Collections (C2C), an LSTA-funded grant project administered by the North Carolina Department of Cultural Resources. This came about as part of an informal group of archivists, special collections librarians, and digital projects librarians interested in the future of NC ECHO and its efforts to educate staff and volunteers in the cultural heritage institutions across the state about digitization. The group is loosely connected through the now-defunct North Carolina Digital Collections Collaboratory.

Late last year, Nick Graham of the North Carolina Digital Heritage Center was contacted by LeRae Umfleet of NC C2C about teaching a few regional workshops about planning digitization projects. The workshops were created as a way to teach smaller archives, libraries, and museums about planning, implementing, and sustaining digitization efforts. I volunteered to help with the workshops, which were held in January 2011 in Hickory as well as this past Monday in Wilson.

The workshops were promoted through multiple listservs and were open to staff, board members, and volunteers across the state. Each workshop cost $10 and included lunch for participants. Many of the participants reminded me of the folks at the workshops for Preserving Forsyth’s Past. The crowd was enthusiastic and curious, asking lots of questions and taking notes. Nick Graham and Maggie Dickson covered project preparation, metadata, and the NC Digital Heritage Center (and how to get involved); I discussed the project process and digital production as well as free resources for digital publishing; and Lisa Gregory from the State Archives discussed metadata and digital preservation.

I must confess that the information was so helpful, I found myself taking notes! When Nick stepped up to describe the efforts of the Digital Heritage Center, which at this time is digitizing and hosting materials from across the state at no cost, I learned that they will be seeking nominations for North Carolina historical newspapers to digitize in the near future, and that they are also interested in accepting digitized video formats. Lisa also introduced the group to NC PMDO, Preservation Metadata for Digital Objects, which includes a free preservation metadata tool. It is always a joy to help educate repositories across the state in digitization standards and processes!

17 Aug 2010

Reflections: SAA 2010 in Washington DC

*Portions of this post are duplicated at the WFU ZSR Professional Development blog.

This has been my favorite SAA of the three I have attended, mostly because I felt like I had a purpose and specific topics to explore there. The TwapperKeeper archive for #saa10 is available and includes a ton of great resources. I also got the chance to have my curriculum vitae reviewed at the Career Center not once, but twice! I loved every moment of being in DC and will definitely be attending more of the receptions/socials next time!

Tuesday, August 10 was the Research Forum, of which I was a part as a poster presenter. My poster featured the LSTA outreach grant given to my library and the local public library and explored outreach and instruction to these “citizen archivists.” I got a lot of encouraging feedback and questions about our project, including an introduction to the California Digital Library’s hosted instances of Archivists’ Toolkit and Archon, which they use for smaller repositories in the state to post their finding aids.

Wednesday, August 11 consisted primarily of round table meetings, including the highly-anticipated meeting of the Archivists’ Toolkit/Archon Round Table. The development of ArchivesSpace, the next-generation archives management tool to replace AT and Archon, was discussed. Development of the tool is planned to begin in early 2011. Jackie Dooley from OCLC announced that results from a survey of academic and research libraries’ special collections departments will be released. A few interesting findings:

  • Of the 275 institutions surveyed, about 1/3 use Archivists’ Toolkit; 11% use Archon
  • 70% have used EAD for their finding aids
  • About 75% use word processing software for their finding aids
  • Less than 50% of institutions’ finding aids are online

A handful of brief presentations from AT users followed, including Nancy Enneking from the Getty. Nancy demonstrated the use of reports in AT for creating useful statistics to demonstrate processing, accessioning, and other features of staff work with special collections. She mentioned that AT can be linked to Access with MySQL for another way to work with statistics in AT. Corey Nimer from BYU discussed the use of plug-ins to supplement AT, which I have not yet used and hope to implement.

Perhaps more interestingly, Marissa Hudspeth from the Rockefeller and Sibyl Shaefer from the University of Vermont introduced their development of a reference module in AT, which would allow patron registration, use tracking, duplication requests, personal user accounts, et cetera. Although there is much debate in the archives community about whether this is a good use of AT (since it was originally designed for description/content management of archives), parts of the module should be released in Fall 2010. They said they’d post a formal announcement on the ATUG listserv soon.

On Thursday, August 12, sessions began bright and early. I started the day with Session 102: “Structured Data Is Essential for Effective Archival Description and Discovery: True or False?” Overall summary: usability studies, tabbed finding aids, and photos in finding aids are great! While the panel concluded that structured data is not essential for archival description and discovery due to search tools, Noah Huffman from Duke demonstrated how incorporating more EAD into MARC as part of their library’s discovery layer resulted in increased discovery of archival materials.

Session 201 included a panel of law professors and copyright experts, who gave an update on intellectual property legislation. Peter Jaszi introduced the best practice and fair use project at the Center for Social Media, a 5-year effort to analyze best practice for fair use. Their guidelines for documentary filmmakers could be used as an example for research libraries. In addition, the organization also created a statement of best practices for fair use of dance materials, hosted at the Dance Heritage Center. Mr. Jaszi argued that Section 1201 does not equal copyright, but what he called “para-copyright law” that can be maneuvered around by cultural heritage institutions for fair use. I was also introduced to Peter Hirtle’s book about copyright (and a free download) entitled Copyright and Cultural Institutions: Guidelines for Digitization for U.S. Libraries, Archives, and Museums, which I have started to read.

I wandered out of Session 201 into Session 209, “Archivist or Educator? Meet Your Institution’s Goals by Being Both,” which featured archivists who teach. The speakers emphasized the study of how students learn as the core of becoming a good teacher. One recommendation included attending a history or social sciences course in order to see how faculty/teachers teach and how students respond. I was inspired to consider faculty themes, focuses, and specialties when thinking about how to reach out to students.

Around 5:30 pm, the Exhibit Hall opened along with the presentation of the graduate student poster session. I always enjoy seeing the work of emerging scholars in the archival field, and this year was no different. One poster featured the Philadelphia Area Consortium of Special Collections Libraries in a CLIR-funded project to process hidden collections in the Philadelphia region — not those within larger repositories, but within smaller repositories without the resources or means to process and make available their materials. The graduate student who created the poster served as a processor, traveling to local repositories and communicating her progress and plan to a project manager. This is an exciting concept, since outreach grants tend to focus on digitization or instruction, not the act of physically processing the archival materials or creating finding aids.

On Friday, August 13, I started the morning with Session 308, “Making Digital Archives a Pleasure to Use,” which ended up focusing on user-centered design. User studies at the National Archives and WGBH Boston found that users preferred annotation tools, faceted searching, and filtered searching. Emphasis was placed on an iterative approach to design: prototype, feedback, refinement.

I headed afterward to Session 410, “Beyond the Ivory Tower: Archival Collaboration, Community Partnerships, and Access Issues in Building Women’s Collections.” The panel, while focused on women’s collections, explored collaborative projects in a universally applicable way. L. Rebecca Johnson Melvin from the University of Delaware described the library’s oral history project to record Afra-Latina experiences in Delaware. They found the Library of Congress’ Veterans’ History Project documentation useful for the creation of their project in order to reach out to the Hispanic community of Delaware. T-Kay Sangwand from the University of Texas, Austin, described how the June L. Mazer Lesbian Archives were processed and digitized, then stored at UCLA. Ms. Sangwand suggested that successful collaborations build trust and transparency, articulate expectations from both sides, include stakeholders from diverse groups, and integrate the community into the preservation process. One speaker noted that collaborative projects are “a lot like donor relations” in the sense that you have to incorporate trust, communications, and contracts in order to create a mutually-beneficial result.

On Saturday, August 14, I sat in on Session 502, “Not on Google? It Doesn’t Exist,” which focused on search engine optimization and findability of archival materials. One thing to remember: Java is evil for cultural heritage because it cannot be searched. The session was a bit introductory in nature, but I did learn about a new resource called Linkypedia, which shows how Wikipedia and social media interact with cultural heritage websites.

Then I headed to Session 601, “Balancing Public Services with Technical Services in the Age of Basic Processing,” which featured the use of More Product, Less Process, aka “basic processing,” in order to best serve patrons. After a few minutes I decided to head over to Session 604, “Bibliographic Control of Archival Materials.” The release of RDA and the RDA Toolkit (available free until August 30) has opened up the bibliographic control world to the archival world in new ways. While much of the discussion was outside of my area of knowledge (much was discussed about MARC fields), I learned that even places like Harvard have issues with cross-referencing different types of resources that use different descriptive schemas.

My last session at SAA was 705, “The Real Reference Revolution,” which was an engaging exploration of reference approaches for archivists. Multiple institutions use Google Calendar for student hours, research appointments, and special hours. One panelist suggested having a blog where students could describe their work experience. Rachel Donahue described what she called “proactive reference tools” such as Zotero groups to add new materials from your collection and share those with interested researchers, and Google Feedburner.

It was a whirlwind experience and I left feeling invigorated and ready to tackle new challenges and ideas. Whew!

09 Jun 2010

The NC Digital Heritage Center is (Finally) Here: Reflections

This morning, Nick Graham sent out a message to the North Carolina Library Association announcing DigitalNC.org, the new digital repository for primary resources across the state digitized at UNC Chapel Hill.  Nick, formerly of NC Maps, is the newly-appointed coordinator for the North Carolina Digital Heritage Center, a development which I have followed closely here at Touchable Archives. The focus of the NC Digital Heritage Center and its matching website, according to the site:

“The North Carolina Digital Heritage Center is a statewide digitization and digital publishing program housed in the North Carolina Collection at the University of North Carolina at Chapel Hill. The Digital Heritage Center works with cultural heritage institutions across North Carolina to digitize and publish historic materials online. Through its free or low-cost digitization and online hosting services, the Digital Heritage Center provides libraries, archives, museums, historic sites, and other cultural heritage institutions with the opportunity to publicize and share their rare and unique collections online. The Center operates in conjunction with the State Library of North Carolina’s NC ECHO (North Carolina Exploring Cultural Heritage Online) project. It is supported by the State Library of North Carolina with funds from the Institute of Museum and Library Services under the provisions of the Library Services and Technology Act.”

Some of you who are familiar with North Carolina may wonder, “what happened to NC ECHO?” Based on discussions with colleagues across the state, it looks as though NC ECHO no longer exists as it originated*. (*Since I am relatively new to the state as a librarchivist, I am still unclear about the original purpose of the NC ECHO Project. Two of the largest deliverables from NC ECHO include its survey and institutional directory and its LSTA digitization grant funding program.) The preservation and emergency response focus of NC ECHO has become NC Connecting to Collections and NC SHRAB’s Traveling Archivist program, as well as possible regional emergency response networks like MACREN. The digitization planning and project funding aspect of NC ECHO appears to have joined with UNC Chapel Hill to form the NC Digital Heritage Center.

In previous posts, I have been excited about this Digital Heritage Center being North Carolina’s version of the California Digital Library’s Calisphere. I originally thought that the CDL was a statewide initiative of the state library, but recently realized that it is, like the NCDHC, an initiative of a university system. The CDL is not a resource provided by the state library of California. It is a project of the University of California system. This is what the digital collections portal of the California State Library looks like; this is what the State Library of North Carolina’s digital repository looks like. Why do the statewide library and archives systems for these states have such limited digital resources, while academic libraries in these states carry digital collections technology and access into the future? Wouldn’t it make more sense for the state library to be the digital repository, instead of providing funding for it?

The obvious answer is that the state library does not have the technological resources or expertise to make this happen. Academic libraries and archives are research-oriented, so they are able to do more experimentation and use the knowledge of systems librarians and programmers to create new and innovative resources. Perhaps most importantly, the state library supports academic libraries that make these resources accessible, which is possibly the only reason I am willing to overlook the potential conflict of interest of having UNC and the state library so closely intertwined.

The NC Digital Heritage Center arrives at an exciting moment in the history of digital libraries and digital collections. The team and advisory board exist to provide project management, digitization, and web hosting to smaller and less-funded institutions in the state in order to create access to primary resources across the state. I hope that institutions both large and small can participate in this effort to create a statewide digital repository. In this way, resources from community-based institutions and repositories holding the history of underrepresented groups can be made available for research and review like never before. I continue to follow closely the development of the Center.

25 Nov 2009

Preservation and digitization for all

First off, a few words of gratitude in this season of thanks-giving. I am thankful for my job, where I learn every day about public service and local history and get to use my skills as an archivist. I am grateful that our county finally decided to upgrade our outdated county website (including the public library) to CSS, and that it will be coming out in early 2010. Finally, I am grateful for the grants my department has received, most recently the NC SHRAB’s Traveling Archivist Program.

Speaking of grants, my library (in partnership with Wake Forest University) recently received an outreach grant from the State Library that provides digitization equipment and preservation training in locations throughout our county. This grant is unique to North Carolina and is being watched carefully by the State Library due to its somewhat unusual concept. Put simply, we are putting expensive scanners “out there” for the general public and providing preservation education for nonprofit groups and individuals.

This Saturday was our first workshop, which was focused on local nonprofit organizations. From genealogy clubs to food banks, churches to social clubs, we sent emails and postcards to as many groups as we could find. Our workshop’s limited RSVP list was filled within a week, and I began hearing from groups that I know I had not yet invited! We are having three more rounds of workshops in 2010.

On Saturday, we brought in Rachel Hoff, preservation expert from UNC Chapel Hill, as well as Barry Davis, multimedia coordinator at Wake Forest, to teach our community partners about preservation, repair, and digitization of their organizations’ archives. The enthusiasm of our participants was absolutely contagious. Not only were they fully engaged from 10 am to 5 pm, but they were thrilled to learn about book repair, archival housing, and the steps to use our VHS-to-digital and cassette-to-digital converters, slide scanner, and flatbed scanner!

We need to get all of the public library staff involved with the equipment to the point where they are comfortable showing a customer how to use the scanners. At a small public library branch with a few full-time staff, it is hard enough to get the staff trained on the equipment, let alone ask them to spend time with a customer who is just getting started! So we’ve decided to expand our training on the digitization equipment to become part of our regular computer training classes, allowing for small seminars.

While it sounds simple, the grant is compelling in its implications. This equipment will be open to the public. There are no restrictions as to what can be digitized, and no requirements that digital objects be shared with our libraries or hosted on a designated server. It is empowering for community-based archives to be provided with training and resources to preserve their history their way. I will post more in the future as our project develops.

In related news: the NC Digital Heritage Center is coming…!

12 Nov 2009

Forming iDEALS for tomorrow’s information professionals

On Monday, I participated in the first Information, Diversity, Engagement, Access and Libraries (iDEAL) Summit in the Department of Library and Information Studies at the University of North Carolina at Greensboro. The innovative summit was the brainchild of new department chair Dr. Clara Chu and was meant to create a “community approach to discussing and identifying strategies to address information, diversity, engagement, access and libraries (iDEAL) in our education, research, practice and community building.”

The event format appeared to be modeled partially from the 2006 UCLA Diversity Recruitment Summit, which incorporated small group discussion, brainstorming, and reflection as a larger group. While the UCLA event focused on ways to bring diversity to the field, iDEALS attempted to address ways to better prepare future information professionals for “relevant, appropriate and effective services in and with diverse, globalized and technological communities.”

Perhaps what made the program so unique was the diversity of participants. Faculty, students, and practitioners were invited to participate in the discussion, creating an intellectual potpourri. Small groups were sent into sessions where they discussed real world experience, education and professional development, research, and community, as they relate to LIS education, research, and practice.

I was part of a group led by Dr. Nora Bird, who further divided our group to discuss the specific topics listed above. She avoided letting participants of the same type sit together (i.e. no two students sitting together). My small group was asked to focus on education and professional development, something with which I have recent experience!

As we brainstormed skills and knowledge for graduates with respect to diversity, engagement, and access, the conversation kept returning to a lack of opportunities for LIS students to feel truly engaged with the local community, as well as opportunities for students to gain valuable professional training (read: not shelving books). Desired skills and knowledge: empathy, ability to listen to others, openness, exposure to different types of communities and cultures, ability to TEACH, being an advocate, and being knowledgeable about existing and new resources. There were a lot more suggestions, but we crystallized our discussion into two main points: mentoring and service learning.

We concluded that today’s LIS students need mentoring from a variety of sources. The student government can arrange 2nd-year/1st-year mentorships; alumni can provide networking and mentoring opportunities at the local level; NCLA/SNCA can continue and expand their mentor programs; and of course, there is always the national level. Mentoring does not just provide networking opportunities; it also creates professional development that cannot happen in the classroom. Professional skills can be learned simply by watching and listening to an active practitioner. Finally, professional organizations should encourage research at the graduate level by providing student poster sessions (especially at the state level) and by supplementing or converting merit-based scholarships into research funding.

Perhaps most importantly, we felt that service learning (as opposed to internships/practicums) offered the greatest opportunity for education and professional development for LIS students. By “learning by doing,” students are able to take classroom knowledge and apply it to a real-life situation. In particular, service learning projects with community-based organizations push developing information professionals into a new role as resident “expert,” where he or she must make decisions about how to deal with challenging situations. Service learners must teach and share knowledge — in effect, becoming advocates. Service learning provides a variety of experiences for a budding information professional in a short period, along with the chance to experience different communities. We felt that student organizations and LIS departments have a responsibility to help create community organization projects for students, with clear learning objectives and goals. These projects must be mutually beneficial.

I must admit my influence in this discussion was based on my experience with service learning at UCLA. I chose to work with Visual Communications, an Asian-American nonprofit film/media organization. Because the organization had no archivist, my peers and I were seen as archivists by default, and we found ourselves using newly learned techniques and approaches to arrange, preserve, and provide access to their archives. This could not have happened in an established archive, where our work would have been more limited and, perhaps, at a paraprofessional level. The challenges of a limited budget and an overworked staff remain familiar to me today. My peers and I also learned about the information needs of a diverse and underserved community.

While the iDEAL Summit was focused on ways to improve UNCG’s program, nearly everyone I spoke to felt that this method could be replicated on other campuses and in other communities. It was inspiring to see three types of information professionals — students, faculty, and practitioners — in the same room, asking for the same transformation. I wonder how many other LIS programs incorporate service learning into the curriculum. In the near future, I hope to see more service learning as well as more practitioners who mentor.

03
Sep
09

THATCamp Austin reflections

With THATCamp Pacific Northwest coming up next month, it’s about time I posted about my experiences at THATCamp Austin. I think I’ve been delaying this post for a while out of simultaneous excitement that I got to participate and fear that I’ll be exposed as a big groupie of all the amazing folks who participated in THATCamp.

This year was the first regional session of the original THATCamp, or “The Humanities and Technology Camp,” first held by the Center for History and New Media at George Mason University. As a user-generated “unconference” consisting of discussion groups, training sessions, and “dork shorts” demonstrating new projects, THATCamp is an ideal kind of spontaneous, creative outlet for newbie archivists/digital humanists/historians. Lisa Grimm was one of the archivists in attendance in June and wrote this inspiring post about the potential for THATCamp in Austin.

A few weeks later, THATCamp Austin was born (care of Lisa Grimm, Ben Brumfield, Peter Keane, and Jeanne Kramer-Smyth). As I read the excited tweets about the program and the encouraging news that anyone interested in digital humanities could apply, my hesitation about being a public library archivist/special collections librarian among digital humanities folks began to subside. I applied, and my idea to discuss redefining the boundaries of memory institutions was accepted!

From the start, I could sense that the environment at THATCamp would be supportive, energetic, and a lot of fun. My enthusiasm grew as I arrived at the UT-Austin lecture hall where our event would be held. A narrow hallway was filled with smiling faces, free pizza, and free t-shirts, thanks to some angel sponsors and a few incredibly hardworking organizers.

We settled ourselves in an auditorium in the basement of the building, with live tweets popping up on the overhead screen. Open discussion, creativity, and freedom of thought were the order of the evening — I was overjoyed! We shouted out our potential topics and organized ourselves around loosely related themes. I chose to participate in the session on crowdsourcing in digital projects and was a discussion leader for the session on “web x.x and diversity and community.”

I didn’t take notes. For the first time in my career, my ubiquitous notebook sits devoid of scribbled entries, doodles, or quotes. Perhaps it’s because I found it faster to type than to write…so most of my remarks, in reverse chronological order, can be seen via my tweets:

Perhaps the best thing about THATCamp was being given the opportunity to speak freely about new concepts with intelligent, creative folks in a non-competitive, relatively unstructured environment. No one had to submit a proposal a year in advance (many of these projects and ideas will have morphed multiple times within a few months). I relished the chance to meet some of the emerging contributors to my field and have conversations with my colleagues without the constraints of a formal panel. I am so grateful to have been there and cannot wait to see what concepts and innovations come out of future THATCamps!