Grateful Med: Personal Computing and User-Friendly Design

Grateful Med home screen
It is that time again: I’ve written another Circulating Now blog post. This post is the final one, since I’m quickly approaching the end of my residency at the National Library of Medicine. I hope you enjoy it.

This one is about Grateful Med, perhaps NLM’s most well-known piece of home-grown software. In this post, I discuss how Grateful Med fits into the rest of NLM’s infrastructure, its place in personal computing, and the importance of user-friendly design. These posts have been a treat to write.

Innovation, Control, Context: Part 1 of my notes on “The Maintainers”

About two weeks ago, I attended “The Maintainers,” a conference on technology history that emphasized the necessity and difficulty of maintenance. You can read more about it here.

The biggest point I want to make about the conference as a whole is that it was conceived and organized in response to the overwhelming emphasis on “innovation” in popular histories and conceptions of technology. The term “innovation” is used so frequently that it has nearly lost any sense of meaning, as this event demonstrates. There are issues with defining an event like “The Maintainers” in opposition to a term that has proven so flexible and amorphous that it can mean many things to many different people. Throughout the conference, the exact parameters of “maintenance” came up frequently, with, for example, some disagreement about how repair relates to maintenance.

However, even if maintenance was never precisely defined for the group, the breadth of speakers was impressive. With historians, activists, and artists speaking, defining “maintenance” in opposition to “innovation” seemed to allow more voices into what could have been an overly academic conference. Without strict parameters for “maintenance,” people were able to speak about everything from internet governance to community stool libraries. It was refreshing.

Now, I have two responses to the conference. The first I will cover in this blog post, and the second (about data management, research reproducibility, and scholarly communications) I will cover in a later blog post.

Innovation, Control, and Context

When one begins to look at the ways in which different start-ups have tried to “innovate” office communication and organization, the line between innovation and maintenance becomes unclear. Slack, for example, has been heralded as an innovative solution for office communication and has been covered by news organizations like the Wall Street Journal and Fortune Magazine. Yet one of the talks, given by Ellan Spero and titled “‘A Card for Everything, Miss Whittle!’ – A Maintainer’s Approach to the Organization of Academic-Industrial Research at the Mellon Institute for Industrial Research,” describes the practice of office communication as “maintenance.” What is so different about these two scenarios that they are viewed and defined so differently?

First, let me give some background on Miss Whittle and the Mellon Institute for Industrial Research, as taken from my notes on Spero’s talk. Lois Whittle worked in the office of the Mellon Institute for Industrial Research before it merged with the Carnegie Institute of Technology to become Carnegie Mellon University. She devised a system for organizing the fellowship agreements at the Mellon Institute that enabled efficient communication across a complex organization. Effectively, Miss Whittle was doing the same type of work as Slack, and she published on it in the Journal of Industrial Chemistry.

There are several obvious differences between these two situations. The first is the historical time period. Miss Whittle worked mostly in the early part of the 20th century, with different tools than many people use now. The second is gender. Miss Whittle was a woman, and Slack’s founder and CEO is a man. I don’t want to downplay either of these differences, and there is a lot to be said about the term “innovation” and how it relates to race, gender, and class.

However, what I see as the main demarcation between maintenance and innovation in this instance is control: control over the shape and future of the company and the technologies being created. The entire purpose of Slack is the streamlining of office communication, while Miss Whittle’s work was ancillary to the work of the Mellon Institute. Although both situations deal with office communication and organization, Slack is in control of its product and how it can be used. Miss Whittle responded to a need within a company, and she did not have control over how, or whether, her technique would be used over time. Slack is able to put out its technology as a product to be bought and sold, and it has control over that product. Control over the technology makes it appear more “innovative” because one can decide how it is marketed: it is the purpose of the company rather than a by-product of its work.

Of course, with control over a product, one has the ability to sell that product. In some ways, “innovation” may be best understood as a term connected to the particular shape of capital and technology today, much like “disruption,” as you can read more about in this New Yorker article. Miss Whittle had nothing to sell. Can this be tied to the difference in how these two technologies are perceived?

In any case, innovation and maintenance (insofar as one defines the latter in opposition to innovation) seem to depend on context more than on the nature of the technology created. Whether a piece of technology is deemed innovation or maintenance has little to do with the technology itself and much more to do with the structures in which it was created and is used.


NLM and Early Networked Computing

Yvonne Scott dials into MEDLINE, ca. 1971. Available at NLM Digital Collections, NLMUID 101446025

I’ve written another post at Circulating Now, the NLM blog on the history of medicine. This one is titled “MEDLARS II: MEDLINE & Instantaneous Search.”

Before the Internet, there were other computer networks, and I’ve highlighted how NLM was able to provide for online bibliographic searches, starting in 1970.

You can view it here:


That Long-Promised White Paper on Software Preservation

So, this paper was finished in the early fall, and I’m only now getting around to making it public. Regrettably, that means some of the paper is out of date. Several important projects and papers, like the Software Preservation Network and the recent Cornell paper on emulating digital art objects, are not included.

That said, I still think the paper is worthwhile. It considers software preservation in a broad context, including strategies like recording users interacting with software. While not all of the strategies discussed preserve software in a way that future users can interact with, they do document software as a cultural object. Institutions with collecting mandates that include software-related materials may find this research useful. Although emulation is becoming a more feasible option for many institutions, it is important to consider a wide array of preservation and documentation techniques that could prove useful to future researchers.

Software is a part of day-to-day life, and while it is important to preserve software, it is also important to consider what a given piece of software does. The impetus for preserving an MMORPG and the impetus for preserving an algorithm from a scientific experiment may be entirely different, and the ways that institutions approach software should reflect that difference.

If I missed a project or if you have any comments or questions, please don’t hesitate to contact me.

And here it is: the long-promised white paper.

Starting is Half the Battle: Collecting as the First Step to Software Preservation

Quick note: This blog post will also be posted on The Future of Information Alliance (FIA) website.

Example of the types of software developed at the National Library of Medicine

As the National Digital Stewardship Resident (NDSR) at the National Library of Medicine, I am currently devising a software preservation pilot strategy. This strategy entails the repository ingest of software materials held on obsolete media, the description of those materials, and the creation or digitization of contextual materials. Complicating the project is the fact that there has not been a comprehensive collection strategy for software at NLM, and many documents and copies of software have been lost over time. With this in mind, the first, and perhaps most important, step is for institutions to include software and software documentation as part of their pre-existing collection strategies.

Software preservation is, quite simply, the attempt to make software usable many years in the future. Although it is possible to save the bytes of a piece of software, providing access to it and making it usable is more difficult. Because software relies on complex technical infrastructures in order to operate properly, future users may not be able to interact with software in a meaningful way if an institution only saves the bytes. For a software program to function, it needs to be installed on the correct operating system, on the correct hardware, and with any necessary ancillary programs or code libraries also installed. For a preservationist, this can be a nightmare.
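
To make those dependencies concrete, one lightweight practice is to record them in a small, machine-readable manifest that travels with the preserved bytes. The sketch below is purely illustrative: the field names and example values are hypothetical, and it describes neither NLM’s workflow nor any community standard.

```python
# Illustrative sketch only: record the environment a preserved program
# needs as a JSON "dependency manifest" stored alongside the bytes.
# Field names and values are hypothetical, not an NLM or community standard.
import json

manifest = {
    "title": "Example bibliographic search client",            # hypothetical item
    "operating_system": "MS-DOS 5.0",                          # OS the software expects
    "hardware": "IBM PC compatible, 640 KB RAM",                # minimum hardware
    "ancillary_software": ["Hayes-compatible modem driver"],    # other programs required
    "source_media": "5.25-inch floppy disk",
    "related_documentation": ["user manual (print)", "installation card"],
}

# Write the manifest next to the preserved files so future staff (or an
# emulation service) know what the program needs in order to run.
with open("dependency_manifest.json", "w") as f:
    json.dump(manifest, f, indent=2)
```

Even a simple record like this gives a future archivist, or an emulation service, a starting point for reconstructing the environment the software expects.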

Emulation is one way to deal with these complex dependencies, and recent efforts to make emulation an easier option for libraries and archives have made immense gains. A variety of services, such as EaaS and The Olive Archive, help people emulate computing environments and assist libraries and museums with the vast technical dependencies of software programs. While this technology is a big step forward for the field and for access to software-based materials, it does not constitute a complete response to the need for software preservation. Before getting to the point of emulating materials, an institution needs to address how it will collect software.

The process of devising and implementing a collection strategy for software and related materials can be daunting, even for institutions whose collections are already closely aligned with software, computing, and technology history. Regardless, a comprehensive collection strategy is the first and most important step to preserving software. Without one, it is more likely than not that software will be lost before an institution has managed to devise and implement larger strategies for future users to access it, whether through emulation or another tool.

A proper collection strategy for software-based materials should reflect the larger collection goals of the institution. It does not make sense for an art library to begin collecting scientific software, but it does make sense for an art library to collect software that artists created or used as part of their creative process. Just as adding A/V materials to a collection strategy does not mean that an institution needs to collect all A/V materials, collecting software does not mean collecting all software. The same care and attention paid to the wider collection strategy needs to be applied when considering software acquisitions.

Presenting about software preservation at the National Library of Medicine; Photo by Ben Petersen

Part of this collection strategy should include contemporaneous documentation and manuals. Throughout my NDSR project, I’ve relied on manuals and documentation in order to get software running and to understand what the software is meant to do. Without this documentation, having copies of the software would prove of limited value. Furthermore, the marketing and packaging material for software can have historical importance in its own right. Any collection strategy for born-digital materials should consider what analog material also needs to be collected so that future historians, archivists, and researchers can properly contextualize and assign meaning to a piece of software.

It is important that the collection strategy is well documented and added to the wider collection strategy documents at the institution. Communicating the importance of collecting software or software-based materials is an essential part of beginning a software preservation program. An institution relies on many employees to acquire, care for, and provide access to its materials. Creating accessible documentation about software collecting and engaging in an open dialogue about collecting software are important aspects of creating a sustainable program.

After software is added to an institution’s collection strategy as applicable, there are many other questions and issues to attend to. Moving forward, it will be vital for institutions to get software into a digital repository and off of volatile tangible media like floppy disks and CD-ROMs. If the first step is to collect software, the second step is to save the bytes. Simply adding software to the collection strategy, however, will ensure that the institution has materials to preserve and showcase in whatever manner best suits its resources, audiences, and needs. Without a comprehensive collection strategy, future actions, projects, and programs will be severely limited.
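
As a rough illustration of what “saving the bytes” involves in practice, here is a minimal sketch that records a checksum for every file copied off a piece of media, so later copies can be verified against the originals. The directory and file names are hypothetical, and a real workflow would more likely rely on an established specification such as BagIt rather than a home-grown manifest.

```python
# Minimal sketch of "saving the bytes": after copying files off volatile
# media, record a SHA-256 checksum for each file so future copies can be
# verified against the originals. Paths here are hypothetical.
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(source_dir: Path, manifest_path: Path) -> None:
    """Write one 'checksum  relative/path' line for every file under source_dir."""
    with manifest_path.open("w") as out:
        for file_path in sorted(source_dir.rglob("*")):
            if file_path.is_file():
                out.write(f"{sha256_of(file_path)}  {file_path.relative_to(source_dir)}\n")

if __name__ == "__main__":
    # e.g., files copied off a floppy disk into a transfer directory
    write_manifest(Path("floppy_transfer"), Path("manifest-sha256.txt"))
```

A manifest like this is not preservation on its own, but it gives a repository something concrete to check against every time the bytes are moved or copied.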