Help the Digital Preservation Q&A at StackExchange


I’ve recently committed to the Digital Preservation Q&A proposal at StackExchange. This is a resource I really hope comes to fruition, as there are few sites that support the exchange of strategies and advice among people involved in digital preservation, or that field questions from those just becoming familiar with the practice.

This latter audience has been on my mind particularly since leaving the DPOE program last year. Although we have fielded questions on an email listserv, that venue has a few significant weaknesses:

  • It’s difficult to bookmark or refer back to advice or information within a thread.
  • Email bodies and threads are not friendly to text formatting, links, and other elements that would make information more readable, digestible, and inclusive.
  • The information is unstructured: one cannot apply tags, mark a topic as a favorite, vote up a discussion, or track edits in any systematic way.

By contrast, the StackExchange approach is a mix between a question-and-answer site and Wikipedia, with reward elements that provide incentive for good contributions. The network covers a host of topics, from gardening to LEGO to electrical engineering. It also hosts Area 51, a staging area for all the currently proposed topics that users are interested in but that are not yet formal sites. There’s a lot there, and you’d likely find a few of interest.

Why StackExchange? It offers all the means of structuring information described above. I really can’t imagine a better format (at least, not one already set up and sorted out) for building a knowledge base in digital preservation, one that can adjust over time. Digital preservation is a practice that will change immensely in the coming years, and its questions and procedures will range from obscure rescue efforts to large-scale, contemporary migration processes.

As part of the state archives here in Mississippi, I do a good bit of training for state employees on electronic records management and preservation. Required retention periods for born-digital objects can range from three to fifteen or more years, and many records are marked for permanent retention and will be deposited here at the archives. Thoughtful planning for digital content comes up repeatedly, and a single good resource to point people to would be very welcome.

Consider committing if the topic interests you. It’s especially helpful if you’re already engaged with other StackExchange sites, and as noted there are plenty of topics to join, so there’s ample opportunity to get involved. Any interest helps!

DPOE National Calendar

I want to give a brief shout-out to the DPOE National Calendar, brand spanking new as of June 2011.

The idea is to have a single, general purpose calendar that covers digital preservation workshops, talks, etc., across the country. If you’re giving a talk or workshop, no matter how small the audience, consider submitting it here. And of course you can check the calendar to attend events, whether online or local to your area.

A longer post on DPOE is still forthcoming.

Next Week: Digital Preservation in D.C.

Next week I’ll be attending a train-the-trainer workshop hosted by the Library of Congress in D.C. I’m thrilled to be attending and I’m really looking forward to meeting the other participants.

The Digital Preservation Outreach and Education (DPOE) program is a recent initiative by LOC to “foster national outreach and education to encourage individuals and organizations to actively preserve their digital content.”

Since attendees are coming from a variety of institutions, it’s going to be really interesting to discuss the different contexts in which digital preservation can be introduced. Audiences and clients can make a big difference in how you articulate a subject – and identifying the core issues within those variations is a (perhaps lofty) goal of mine for this workshop.

That, along with feedback on training and workshop execution (of which my position requires a good deal), would be most welcome!

I hope to have a post or two on the workshop during or shortly after.

Book Review: Racing the Beam [re-post]

A re-post from the Preserving Games blog, February 12, 2010.

Montfort, N., & Bogost, I. (2009). Racing the Beam: The Atari Video Computer System. Platform Studies. Cambridge, Massachusetts: MIT Press.

Racing the Beam: The Atari Video Computer System (book cover)

Just want to give a brief rundown of a really great read I’ve come across. MIT Press has started a “Platform Studies” series of books, the idea being to examine a platform and its technologies to understand how they inform the creative work done on the platform. A platform could be a gaming console, a programming language, an operating system, or even the Web itself, when that is the platform on which creative work is made. The platform in this case is the Atari Video Computer System, Atari’s first cartridge-based home console, later referred to as the Atari 2600 in the wake of the newer Atari 5200.

The authors examine the Atari VCS as a computing system and take care to elaborate on the unique (really, exceptionally odd) constraints found there. Six games are investigated in chronological order, giving the reader a sense of the programming community’s advancing skill and knowledge of the system: Combat (1977), Adventure (1980), Yars’ Revenge (1981), Pac-Man (1982), Pitfall! (1982), and Star Wars: The Empire Strikes Back (1982).

The most prominent technical details are explained in the first few chapters, and they illuminate each game’s construction as an exceptional act of engineering and ingenuity. To give an idea of the unique affordances of the Atari VCS, here are a few of the most characteristic details:

  • The custom sound and graphics chip, the Television Interface Adapter (TIA), is designed to work directly with a TV’s CRT electron beam. The beam sprays electrons onto the inside of the screen, left to right, one horizontal scan line at a time, taking a brief break at the end of each line (a “horizontal blank”) and a longer break after the bottom line, before resetting to the top and starting over (a “vertical blank”). A programmer has only those tiny breaks to send instructions to the TIA, and really only the vertical blank provides enough time to run any game logic.
  • Game logic had to run during these breaks because the Atari VCS had no room for a video buffer. There was no way to store an image of the next frame of the game; all graphics instructions were written in real time (sound instructions had to be dropped in during one of the breaks). A designer or programmer could choose to restrict the visual field of the game in exchange for more time to run game logic. Pitfall! is an example of this.
  • This means there are no pixels on the Atari VCS. Pixels require horizontal and vertical planes, but the Atari VCS knows only horizontal scan lines; there is no logical vertical division at all in the computational system. As the beam travels across the screen, a programmer can send a signal to one of the TIA’s registers to change the color. Thus the “pixels” are really a measure of time (the clock counts of the processor), not space; the sketch after this list makes the arithmetic concrete.
  • Sprites, such as they existed on the Atari VCS, were hard-wired into the system’s TIA chip. Programmers had five: two player sprites, two missiles, and one ball. Reworking that setup (clearly designed for Pong and the like) into something like Adventure, Pitfall!, or even the Pac-Man port is an amazing feat.
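
To make the “pixels are a measure of time” point concrete, here is a quick back-of-the-envelope sketch (Python for readability; actual VCS code was 6502 assembly). The NTSC figures are the documented ones; the register-write cost is the standard 6502 store timing:

    # "Racing the beam" arithmetic: the CPU time one scan line affords.
    TIA_CLOCKS_PER_LINE = 228  # color clocks per scan line (160 visible + 68 blank)
    CPU_DIVIDER = 3            # the 6502 runs at one third of the TIA clock
    LINES_PER_FRAME = 262      # NTSC scan lines per frame (~192 typically visible)

    cpu_cycles_per_line = TIA_CLOCKS_PER_LINE // CPU_DIVIDER
    print(cpu_cycles_per_line)  # 76 CPU cycles to do everything for one line

    # A single 6502 store to a TIA register takes at least 3 CPU cycles, so the
    # beam sweeps 3 * 3 = 9 color clocks (roughly 9 "pixels" of screen width)
    # while one register write lands. Changing a color mid-line costs real estate.
    STORE_CYCLES = 3
    print(STORE_CYCLES * CPU_DIVIDER)  # 9 color clocks of beam travel per write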

The book doesn’t shy away from the technical. I could have used even more elaboration than is presented, but past a certain point the book would turn into an academic or technical tome (not that there’s anything wrong with that), so I appreciate the fine line walked here. The authors succeed in illuminating the technical constraints well enough for the general reader to appreciate the quality of the engineering solutions described. They also leave room to discuss the cultural significance of the platform and to reflect on how the mechanics and aesthetics of these Atari titles have informed genres and gameplay ever since.

Attending IDCC ’10

I’m happy to break the months-long silence here just to say I’ll be heading to the 6th International Digital Curation Conference in Chicago, Monday to Wednesday of this week. I’ll be manning the poster for Dr. Winget’s Preserving Games research project, explaining to all willing passersby our findings regarding record creation in video game development and some key implications for the curation of these records.

I’ll be able to catch a few talks on Tuesday and Wednesday before heading out. Just a few I’m interested in hearing:

  • “Idiosyncrasy at Scale: Data Curation in the Humanities.” John Unsworth, Dean & Professor, Graduate School of Library and Information Science, and Director, Illinois Informatics Institute, University of Illinois at Urbana-Champaign.
  • “Linking to Scientific Data: Identity Problems of Unruly and Poorly Bounded Digital Objects.” Laura Wynholds, University of California, Los Angeles.
  • “DataStaR: Using the Semantic Web approach for Data Curation.” Huda Khan, Brian Caruso, Brian Lowe, Jon Corson-Rikert, Diane Dietrich & Gail Steinhart, Cornell University.
  • “Dependency Analysis of Legacy Digital Materials to Support Emulation Based Preservation.” Aaron Hsu & Geoffrey Brown, Indiana University.
  • “What constitutes successful format conversion? Towards a formalisation of ‘intellectual content.’” C. M. Sperberg-McQueen, Black Mesa Technologies LLC.
  • “Assessing the preservation condition of large and heterogeneous electronic records collections with visualizations.” Maria Esteva, Weijia Xu, Suyog Dutt Jain & Jennifer Lee, University of Texas at Austin.

DCC seems to be pretty serious about “amplifying” the conference to non-attendees and attendees alike. There’s a Twitter account (@idcc10) and a Netvibes dashboard, which will host all manner of media and feeds for the conference.

Here’s the ‘Minute Madness’ slide, which accompanies (appropriately enough) a one-minute rundown of the project:

Winget-Sampon IDCC '10 Minute Madness slide

Here’s the PowerPoint slide.

What Went Wrong? A Survey of Problems in Game Development [re-post]

A re-post from the Preserving Games blog, October 19, 2009.

Fábio Petrillo et al., “What went wrong? A survey of problems in game development,” Computers in Entertainment 7, no. 1 (February 2009): 1-22.

This February 2009 article from ACM’s Computers in Entertainment takes a look at the game industry and compares its difficulties to those of the larger software industry. Specifically, the authors analyze twenty postmortems from the archives of Gamasutra.com to characterize the problems that plague game development. I believe Gamasutra has discontinued this series, but postmortems are still published by its sister publication, Game Developer.

A postmortem “designates a document that summarizes the project development experience, with a strong emphasis on the positive and negative aspects of the development cycle.” After reviewing the literature on problems in the software industry, the authors analyze the problems described in the postmortems. The games covered, along with the problems identified and quantified, appear in a table listing the number of occurrences and overall frequency of each problem. Note: sometime in the future (the Web 3.0 future?) I would provide a link to the actual dataset rather than a .PNG showing you a picture of it.

What Went Wrong Interview Table

The authors’ categories provide helpful navigation through the issues that arise in a game development project. As they note, the most-cited problems in this study are unreal or ambitious scope and feature creep, both at a 75% frequency. Notable for game archivists is the 40% frequency of the lack-of-documentation problem as well. The authors note low frequencies (25%) for crunch time and going over budget, both “said to be ‘universal.'” It’s difficult, however, to draw expansive conclusions from such a small dataset. Moreover, the postmortems were not team projects or collaboratively written; rather, a single participant is responsible for each one. The authors usefully provide other limitations to put the data in context.
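
Incidentally, the percentages above line up neatly with counts out of the twenty postmortems, which suggests the table’s “frequency” column is simply occurrences divided by the number of postmortems. A quick sketch (the counts here are back-calculated from the percentages cited above, not read from the table itself):

    # Assumed relation between the table's "occurrences" and "frequency" columns:
    # frequency = occurrences / number of postmortems analyzed (20).
    POSTMORTEMS = 20

    occurrences = {                       # counts back-calculated from cited percentages
        "unreal or ambitious scope": 15,  # 75%
        "feature creep": 15,              # 75%
        "lack of documentation": 8,       # 40%
        "crunch time": 5,                 # 25%
        "over budget": 5,                 # 25%
    }

    for problem, count in occurrences.items():
        print(f"{problem}: {count}/{POSTMORTEMS} = {count / POSTMORTEMS:.0%}")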

The authors conclude that the electronic games industry does indeed suffer from the problems of the larger software industry (overly ambitious plans and poor requirements analysis), as well as woes peculiar to itself: being first to experiment with new technologies, tool problems, and collaboration among disparate professionals, among others.

On a final note, the postmortems are still available at Gamasutra, and they are really fascinating reads. It becomes clear just how young an engineering and creative discipline digital game-making is, and how much fluctuation there is in how a game turns out. There are some great examples and stories there; the authors of this article cite quite a few of them.