Just a little post to say I’ll be speaking at the NAGARA e-Records Conference this year in Austin, Texas. I’ll be describing the efforts of CoSA’s State Electronic Records Initiative (SERI) over the past few years – specifically our educational efforts, and the upcoming electronic records training workshops this year and next. These workshops will collectively be attended by every state and territorial archives and records program in the country.
Mississippi Department of Archives and History on Flickr
I’m happy to report that the Mississippi Department of Archives and History now has a Flickr page for our archival material. This is in addition to the Digital Archives we already host, along with numerous other scans scattered throughout the catalog that are not otherwise exhibited.
I’m optimistic that Flickr will add something important to our online presentations. Along with user feedback in the form of comments and tags, Flickr allows us to more quickly highlight and share material not already exhibited or which exists as a single item outside of a collection. We also have our eye on joining The Commons at Flickr once we’ve managed the account for a while.
Some Thoughts on Flickr
So, it’s been a while since Flickr was the new hotness. Instagram, Pinterest, Facebook, Twitter and a handful of other platforms have established themselves as the preferred way for individuals to share photos. There are also a few articles describing Yahoo’s mismanagement and costly misunderstanding of Flickr’s value and purpose.
(And yes, Flickr missed a few boats – for instance, amplifying its social network. Check out the vestigial “Singleness” option on your profile: Single, Taken, Open, “Rather Not Say” (distinct from simply not filling in the option at all, of course). I won’t be sorry to see that one go.)
I remain convinced, however, that there is simply no better social media platform for a cultural institution to share its photos on than Flickr. Despite some rough years, Flickr still offers the best space for showcasing this type of material.
It gives the photos adequate space for descriptive and technical metadata.
It manages and displays high-resolution photos very well.
Its grouping mechanism of sets and collections aligns well with how archives, museums and libraries organize material.
It has built-in support for Creative Commons licenses, along with a license designation well suited to archival material: “No known copyright restrictions.”
And there has been an uptick in activity from the Flickr camp of late – a splendid new uploader and an organizer built on HTML5, to name two examples. Flickr still has immense value.
I am especially interested to see how user contributions turn out. This has been a subject that cultural institutions on Flickr have discussed before – see this post by Larry Cebula and the discussion it generated on Flickr. The issue discussed in those links is how valuable user contributions are, given the signal-to-noise ratio of great contributions to unhelpful ones.
I can’t help but feel that Flickr could benefit from a filtering or ranking system that elevates and highlights valuable comments and lowers or hides less valuable or incorrect contributions – a solution suggested in the aforementioned Flickr thread. Wikipedia does this through editing. Reddit does this through voting. Stack Exchange does this through voting and a point-based reputation system linked to site privileges. All are potentially valid ways of emphasizing the good over the not-so-great. Flickr could provide purpose and direction to its social network and the resulting content through systems like these (and finally gain the confidence to drop the “Singleness” option from its profile pages).
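To make the idea concrete, here is a minimal sketch of the simplest of those approaches – Reddit-style net voting – applied to a hypothetical comment feed. The `Comment` class, the vote counts, and the hide threshold are all my own illustrative assumptions, not anything Flickr or the other sites actually implement:

```python
# Hypothetical sketch: rank user comments by net votes and bury
# anything that falls below a visibility threshold (Reddit-style).
from dataclasses import dataclass

@dataclass
class Comment:
    text: str
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        # Net votes: the simplest possible quality signal.
        return self.upvotes - self.downvotes

def rank_comments(comments, hide_below=-2):
    """Return visible comments, best first; hide low-scoring ones."""
    visible = [c for c in comments if c.score > hide_below]
    return sorted(visible, key=lambda c: c.score, reverse=True)

comments = [
    Comment("This photo shows the 1927 flood levee at Greenville.", 12, 1),
    Comment("nice pic", 1, 0),
    Comment("Obvious spam link", 0, 5),
]
for c in rank_comments(comments):
    print(c.score, c.text)
```

A real system would want something subtler than raw net votes (early votes dominate, and new comments start buried), which is part of why Stack Exchange layers reputation and privileges on top – but even this crude filter changes what a reader sees first.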
There are naturally any number of wonderful contributions, and any number of trivial or silly ones; it’s that ratio that is the deciding factor. As I say, I’m optimistic that we can get a good community going, and I’m really looking forward to more engagement with patrons and interested persons through the platform.
I’ve recently committed to the Digital Preservation Q&A proposal at StackExchange. This is a resource I really hope comes to fruition, as there’s a lack of sites supporting the exchange of strategies and advice among people involved in digital preservation, as well as fielding questions from those familiarizing themselves with the practice.
This latter audience has been on my mind particularly since leaving the DPOE program last year. Although we have fielded questions over an email listserv, this venue has a few significant weaknesses:
It’s difficult to bookmark or reference back to advice or information within a thread.
Email bodies and threads are not friendly to links, rich text, and other formatting that would make information more readable, digestible and accessible.
The information is unstructured – one cannot apply tags, mark a topic as a favorite, vote up a discussion, or track edits in any systematic way.
By contrast, the StackExchange approach is a mix between a question-and-answer site and Wikipedia, with some reward elements to provide incentive for good contributions. There are a host of topics covered across the network, from gardening to LEGOs to electrical engineering. The network also hosts an Area 51 site, which tracks all the currently proposed topics that users are interested in but which are not yet formal sites. There’s a lot there, and you’d likely be interested in a few.
Why StackExchange? It features all the methods of structuring information I described above. I really can’t imagine a better format (at least, not one already set up and sorted out) for building up a knowledge base in digital preservation, and one that can adjust with time. Digital preservation is a practice that will change immensely, and there will be an assortment of questions and procedures, ranging from obscure rescue efforts to large-scale, contemporary migration processes.
As part of the state archives here in Mississippi, I do a good bit of training for state employees on electronic records management and preservation. Required retention periods for born-digital objects can range from three to fifteen or more years, while many records are marked for permanent retention and will be deposited here at the archives. Careful planning for digital content comes up repeatedly, and a single good resource to point people to would be very welcome.
Consider committing if the topic interests you. It’s especially helpful if you’re already engaged with other StackExchange sites, and as noted there are a whole lot of topics to join, so there’s ample opportunity to get involved. Any interest helps!
The idea is to have a single, general purpose calendar that covers digital preservation workshops, talks, etc., across the country. If you’re giving a talk or workshop, no matter how small the audience, consider submitting it here. And of course you can check the calendar to attend events, whether online or local to your area.
Next week I’ll be attending a train-the-trainer workshop hosted by the Library of Congress in D.C. I’m thrilled to be attending and I’m really looking forward to meeting the other participants.
The Digital Preservation Outreach and Education (DPOE) program is a recent initiative by LOC to “foster national outreach and education to encourage individuals and organizations to actively preserve their digital content.”
Since attendees are coming from a variety of institutions, it’s going to be really interesting to discuss the different contexts in which digital preservation can be introduced. Audiences and clients can make a big difference in how you articulate a subject – and identifying the core issues within those variations is a (perhaps lofty) goal of mine for this workshop.
That, along with feedback on training and workshop execution (which my position requires a good deal of), could not be more welcome!
I hope to have a post or two on the workshop during or shortly after.
I’ve been reading a series of posts by David Rosenthal on his blog analyzing the issue of format obsolescence. Traditionally, and at least in my education, format obsolescence has been treated as one of the great bugaboos of digital preservation. In response, a number of tools and resources have been developed focusing on format identification and validation (DROID, JHOVE, FITS, PRONOM and the upcoming UDFR to name a few prominent ones). Looking at the preservation landscape, it’s clear that format sustainability has been forefront in the collective effort.
Rosenthal, however, makes a convincing argument that this placement of effort is misguided and is not providing the best ROI for the digital preservation community. I won’t repeat his arguments, except to say that Rosenthal places the format obsolescence issue in a historical context, suggesting that much has changed in computing since then, and points to other, much-overlooked areas (bit fixity, storage costs and hardware quality) that are shaping up to be problematic indeed. Here’s a starter to his posts:
That should get you started, although there are many, many posts on the subject. Given those dates, I’m pretty late to the party, but I feel this is required reading for digital preservationists, whether you agree with him or not.
After a few reads you may find yourself running for the nearest self-healing, mirrored ZFS volume, waking up in cold sweats and mumbling about silent data corruption. Scary.
Montfort, N., & Bogost, I. (2009). Racing the Beam: The Atari Video Computer System. Platform Studies. Cambridge, Massachusetts: MIT Press.
Just want to give a brief rundown on a really great read I’ve come across. MIT has started a “Platform Studies” series of books, the idea being to examine a platform and its technologies to understand how they inform the creative work done on the platform. Platforms can range from a gaming console to a programming language to an operating system, or even the Web itself, if that is the platform on which creative work is being made. The platform in this case is the Atari Video Computer System, the first Atari home system, later renamed the Atari 2600 in the wake of the newer Atari 5200.
The authors examine the Atari VCS as a computing system and take care to elaborate on the unique (really, exceptionally odd) constraints found there. Six games are investigated in chronological order, giving the reader a sense of the programming community’s advancing skill and knowledge of the system: Combat (1977), Adventure (1980), Yars’ Revenge (1981), Pac-Man (1982), Pitfall! (1982), and Star Wars: The Empire Strikes Back (1982).
The most prominent technical details are explained in the first few chapters, and they illuminate each game’s construction as an exceptional act of engineering and ingenuity. Just to give an idea of the unique affordances of the Atari VCS, here are a few of the most characteristic details:
The custom sound and graphics chip, the Television Interface Adapter (TIA), was specifically designed to work with a TV’s CRT ray. The ray sprays electrons onto the inside of the TV screen, left to right, one horizontal scan line at a time, taking a brief break at the end of each line (a “horizontal blank”) and a longer break after the bottom line, before resetting to the top and starting over again (a “vertical blank”). A programmer only had those tiny breaks to send instructions to the TIA, and really only the vertical blank provided enough time to send any game logic to the system.
It was imperative that game logic be sent during these breaks because the Atari VCS had no room for a video buffer. There was no way to store an image of the next frame of the game; all graphics instructions had to be written in real time (and sound instructions dropped in during one of the breaks). A designer or programmer could choose to restrict the visual field of the game in exchange for more time to send game logic instructions; Pitfall! is an example of this.
This means there are no pixels on the Atari VCS. Pixels require horizontal and vertical planes, but for the Atari VCS there are only horizontal scan lines; the computational system has no logical vertical division at all. As the beam travels across the screen, a programmer can send a signal to one of the TIA’s registers to change the color. The “pixels” are thus really a measure of time (the clock counts of the processor) rather than space.
Sprites, such as they existed on the Atari VCS, were hard-wired into the system’s hardware. Programmers had five: two player sprites, two missiles, and one ball. Reworking that setup (clearly designed for Pong and the like) into something like Adventure, Pitfall!, or even the Pac-Man port is an amazing feat.
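To make the “pixels are time” point concrete, here is a back-of-envelope sketch using commonly cited NTSC figures for the Atari VCS. Treat the specific numbers as my own assumptions from general Atari 2600 documentation, not figures quoted from the book:

```python
# Back-of-envelope NTSC timing for the Atari VCS, using commonly
# cited figures (assumptions, not drawn from Racing the Beam itself).
TIA_CLOCK_HZ = 3_579_545        # TIA runs at the NTSC color clock
COLOR_CLOCKS_PER_LINE = 228     # 160 of these draw the visible picture
CPU_DIVIDER = 3                 # the 6507 CPU ticks once per 3 color clocks

cpu_cycles_per_line = COLOR_CLOCKS_PER_LINE // CPU_DIVIDER   # 76 cycles

# A standard NTSC frame layout: vertical sync, vertical blank,
# visible picture, then overscan.
VSYNC, VBLANK, VISIBLE, OVERSCAN = 3, 37, 192, 30
lines_per_frame = VSYNC + VBLANK + VISIBLE + OVERSCAN        # 262 lines

# The vertical blank is where the bulk of game logic has to fit:
vblank_budget = VBLANK * cpu_cycles_per_line

# A "pixel" is really a position in time: one color clock.
# Horizontal beam position after n CPU cycles into a scan line:
def beam_color_clock(cpu_cycles_elapsed: int) -> int:
    return cpu_cycles_elapsed * CPU_DIVIDER

print(f"{cpu_cycles_per_line} CPU cycles per scan line")
print(f"{lines_per_frame} scan lines per frame")
print(f"{vblank_budget} CPU cycles of vertical-blank budget per frame")
```

The arithmetic is the whole story: with only 76 CPU cycles per scan line and a few thousand cycles of vertical blank per frame, every instruction the programmer issues is also a horizontal distance the beam has already traveled.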
The book doesn’t shy away from the technical. I could have used even more elaboration than what is presented, but after a certain point the book would turn into an academic or technical tome (not that there’s anything wrong with that), so I appreciate the fine line walked here. The authors succeed in illuminating the technical constraints well enough for the general reader to appreciate the quality of the engineering solutions being described. Moreover, the authors leave room to discuss the cultural significance of the platform, and to reflect on how the mechanics and aesthetics of these Atari titles have informed the genres and gameplay of today.