Puzzle Games for Software?

I just read Robert Patrick’s essay on eMuseums, hosted at Paul McJones’ excellent Dusty Decks blog. It’s a great read and addresses some of the problems of presenting computer history in an effective, and extensible, fashion.

I was specifically interested in Mr. Patrick’s thoughts on presenting software history. Hardware is a more intuitive museum subject in significant ways (its object-ness among them), but of course museums convey many subjects quite well even without a direct corresponding object to display (the way one can, say, see the actual clothes of a Civil Rights victim). Still, software remains especially difficult to present in an interesting way.

Mr. Patrick states that software’s workings are opaque to users, and suggests a multithreaded approach to software history, documenting the different software types (applications, subroutines, operating systems, etc.) as separate threads as each emerges, ascends, or recedes over time.

Along with this, I am specifically interested in conveying to the museum goer the architecture, engineering, and writing of software. There is no better way to communicate the human labor, ingenuity, and yes, the toil, that goes into software making. Quoting industry numbers does not tell the museum goer that software is frequently an epic engineering project with considerable drama, not just externally (between departments, coders, and investors), but internally as well (in engineering problem solving). How to convey this drama?

Software is both an engineering and creative endeavor, and it exercises a rich figurative language that suggests physical play and work: variables are passed, objects are created, something is trimmed, cleaned or scrubbed, a request is made, an exception is thrown, a thread stops and starts, etc.

I think this language indicates a way to illustrate the software engineering problem space abstracted away from specific commands and syntaxes. For example, museum visitors could manipulate some system (either a physical system or video game-type piece) with certain constraints emulating those of the coder. Come to think of it, puzzle games do a fine job of such demonstration already (perhaps more Portal than Braid). They could likely be much better demonstrations, of course, if they were directed toward this specific purpose.

I would love to see the day when some of the problems, solutions, tricks, etc. of software engineering are conveyed as well as those of medieval cathedrals or the Giza pyramids.

Modular Emulation and Modular Description

Diagram from 'Modular emulation as a long-term preservation strategy for digital objects'

For work this week I have focused on adding content to the catalog and moving the local instance live. On the latter, it’s almost ready! There are a couple of hitches presently but everything has migrated correctly. For the former, I have been finishing up details on the Apple IIe, and adding an Osborne 1 to the site.

For the Apple IIe, I’ve scanned some documents Matt has kept through the years: packing lists, warranties, business reply forms, manual errata, etc. These add a good deal of use context to the machine. For instance, the Apple IIe came with a wrench and nut plate for adding and swapping expansion cards. The computer was really meant to be modified and expanded upon by the user. It is really a very open device. Not only does one not need screws to access the motherboard and cards, one doesn’t even need to turn the machine on its side or upside down. It opens in its regular orientation, sitting on the desk. Besides this, a printer registration card from Star Micronics provides a list of popular computing magazines, on which the purchaser can indicate which ones he or she reads. These range from Apple Orchard and 80 Microcomputing to Dr. Dobb’s Journal.

The Osborne 1 is clearly a less openable device, but it’s providing a good test of how flexible the modeling we have used so far really is. Like the Apple IIe, it is a fully-functioning system, but unlike that machine, there is no physical computer case to base components around. The Osborne’s form factor prevents this sort of distinction since it’s a single containing unit. Still, the system has a motherboard (which hosts all the connections and software), and it does have component pieces, such as a 300 baud modem, the 5″ CRT display (dwarfed between two Fujitsu floppy disk drives), the microprocessor, etc.

A video game preservation paper has been making the rounds of late. Dave tweeted about its discussion on Slashdot, then it showed up at Ars. It’s a good paper, and I was particularly interested in one of its citations, a 2005 paper from the National Library of the Netherlands that proposes modular emulation as a new tack on the emulation front. The authors, Jeffrey van der Hoeven and Hilde van Wijngaarden, describe some common emulation woes such as stack emulation (the rabbit hole of emulators emulating emulators and so on to persist the particular emulator of interest through future platforms), emulator migration (rewriting the emulator over and over to persist the emulator through future platforms), and the present limitations of Lorie’s UVC for behaviorally complex data with intense I/O requirements (like software).

Modular emulation proposes breaking down emulation to component parts in the interest of reusing those component parts in different and new configurations:

Emulation of a hardware environment by emulating the components of the hardware architecture as individual emulators and interconnecting them in order to create a full emulation process. In this, each distinct module is a small emulator that reproduces the functional behaviour of its related hardware component, forming part of the total emulation process.

This makes a lot of sense to me, and it maps perfectly to the modeling we are investigating here. For example, instead of concerning ourselves with writing an emulator for the Apple IIe (which as an actual and specific machine is always going to vary in expansion cards and internal peripherals, etc.), we instead focus on a solid emulator for the MOS 6502 8-bit microprocessor that handles the system’s computations. That processor appears in many, many machines, so having that emulation software is much more useful than the Apple IIe as an unbreakable whole. It needs to be combined with other emulator-components, of course.
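As a rough illustration of the idea, here is a minimal sketch in Python. Everything in it is my own invention for demonstration purposes: a toy three-opcode instruction set rather than a real 6502, and hypothetical `Bus`, `Memory`, and `ToyCPU` interfaces, not any actual emulation framework’s API.

```python
# A minimal sketch of modular emulation: each hardware component is its
# own small emulator, and components are interconnected (via a bus) to
# form the full system. Toy instruction set, NOT a real 6502.

class Memory:
    """Emulates a RAM component: a flat array of bytes."""
    def __init__(self, size):
        self.cells = [0] * size

    def read(self, addr):
        return self.cells[addr]

    def write(self, addr, value):
        self.cells[addr] = value & 0xFF

class Bus:
    """Routes CPU reads/writes to whichever component owns the address."""
    def __init__(self, memory):
        self.memory = memory

    def read(self, addr):
        return self.memory.read(addr)

    def write(self, addr, value):
        self.memory.write(addr, value)

class ToyCPU:
    """A toy processor with three opcodes: LOAD, STORE, HALT."""
    LOAD, STORE, HALT = 1, 2, 0

    def __init__(self, bus):
        self.bus = bus   # the CPU talks only to the bus,
        self.acc = 0     # never directly to other components
        self.pc = 0

    def step(self):
        op = self.bus.read(self.pc)
        if op == self.LOAD:      # LOAD addr -> accumulator
            self.acc = self.bus.read(self.bus.read(self.pc + 1))
            self.pc += 2
        elif op == self.STORE:   # STORE accumulator -> addr
            self.bus.write(self.bus.read(self.pc + 1), self.acc)
            self.pc += 2
        return op != self.HALT

# Wire the components together and run a tiny program:
# copy the byte at address 8 to address 9, then halt.
ram = Memory(16)
cpu = ToyCPU(Bus(ram))
program = [ToyCPU.LOAD, 8, ToyCPU.STORE, 9, ToyCPU.HALT]
for i, b in enumerate(program):
    ram.write(i, b)
ram.write(8, 42)
while cpu.step():
    pass
print(ram.read(9))  # -> 42
```

The design point is the seam between components: a faithful 6502 module could be dropped in where `ToyCPU` sits and then reused across every system built on that chip, while the memory and bus modules persist unchanged.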

The benefit of modeling and describing systems by components is that if done consistently by a large number of persons, one begins to generate a collective database of parts and pieces. This can facilitate recognition of similarities across platforms (useful for platform studies endeavors), easier groupings of system properties, etc., and ideally, more expedient and cheaper emulation. It also strikes me that persisting these independent emulation pieces would be infinitely easier than managing a more monolithic systemwide emulation piece. And finally, this incremental approach to emulation is simply closer to the true internals of the machines, and that better accuracy of description is educational.

Space Invaders

A re-post, slightly reworked, from the Preserving Games blog, June 2, 2009. In retrospect, I wish I’d compared two versions of the game with significant gameplay differences. There are any number of versions with alterations in speed, enemies’ health, scoring, etc. Ah well.

Space Invaders Unit Art

Space Invaders is iconic. You need only look at UT’s own Videogame Archive logo to get a feel for the pervasiveness of its visuals and the sort of shorthand it’s become for videogames in general. Back in 1977 game developer Tomohiro Nishikado began work on Space Invaders, creating by hand the hardware necessary to program the game. What happened after its release in 1978 is now gaming history, so I decided to take some time to find and play the most “original” version of Space Invaders I could find.

Of course here we enter shaky ground. The most original version (earliest version?) might be a Japanese black-and-white cocktail table unit with woodgrain sides, largely unadorned, with a two-way joystick for control. But any American audience will likely consider an upright unit with movement buttons and two cellophane strips (for coloring sections of the screen) to be original enough. A bibliography of Space Invaders‘ complete gamut of ports, bootlegs and revisions would be a very considerable undertaking.

I suppose some empirical metric can be brought to the discussion by insisting that an item is more original if it is earlier than another iteration of that object. But here (if not for Space Invaders specifically, then for many other games) it can be difficult to distinguish between a prototype of the game and the first complete version of a game, especially when games are not published per se, but just iterated upon. The software may hit version 1.0, but this is not necessarily a more complete or even more accomplished version of the game. Adventure‘s most favored and most familiar version is one which came after Will Crowther’s initial solo release.

In any case, back to Space Invaders. For the curious person not willing to seek out original units, emulation is the natural way to go. The best general site I’ve located for getting a handle on the multiple versions of the game is CAESAR, which will often sport all kinds of image captures (logos, units, screenshots) of various versions of a game. The best general emulator for arcade games is MAME. Though presently only at release 0.138u3, it’s very functional and well-polished, and is available for Windows and Linux users. I used an OS X port of the project, MAME OS X.

The other bit of shaky ground is the images, or ROMs, themselves. ROMs are full data captures of read-only memory chips, in our case arcade memory chips. These files are what emulators read to generate a copy of the game on a personal computer. Arcade ROMs are available at many sites like ROM World, but it should be noted that copyright restrictions are still in effect for most properties.

To sum up: search for the ROM you want at ROM World or a similar site, run MAME and load this ROM, and if you like, go to CAESAR to try and determine exactly which version you’re playing.

A word of caution: playing ROMs with emulators is presently in the realm of the hobbyist, and as such not all the technical kinks are ironed out. You may run into problems trying to play certain ROMs. The ROM, which contains several different files, may be missing some critical ones needed by the emulator. This is because another version of the game, or another game entirely that uses some of the same instructions as the ROM you’re attempting to load, will have those files instead. In that case you’ll need to download those files as well. If you encounter this problem you may consider downloading all the versions of the game, or going to an emulation forum to try to determine which ROM package will have the files you need. Arcade@Home, an excellent resource itself, has very active forums.

Space Invaders, Logitec Version
Space Invaders, CV Version

On to Space Invaders. I played two different versions: Logitec’s 1978 bootleg of the title, and the “CV” version (though I still haven’t found what that stands for), also 1978. Both versions use actually-colored pixels to emulate the cellophane-colored screens of the original units, and as you can see, the CV version emulates more strips of cellophane than the bootleg, though I prefer Logitec’s sparser color scheme. I imagine it to be closer to the real thing.

Gameplay has been described as simplistic by today’s standard, though I don’t find this to be entirely true. While Space Invaders certainly features simple dynamics and play, these aren’t necessarily any simpler than other shoot ‘em ups or first-person shooters today. There’s strategic use of cover, nimble placement of your fighter to hit the sides of the invaders rather than the center (opposing shots cancel each other), and careful picking off of invaders at the extreme left and right to increase across-screen travel time for the invaders.

It’s the slower pace and lower visual stimulation, as well as the less frequent positive reinforcement Space Invaders provides, that elicits the simplistic descriptor. Witness the debilitating addiction afforded by PopCap’s recent release Plants vs. Zombies to see how keenly some game developers understand the elemental appeal of well-polished and finely honed reward systems. In comparison, Space Invaders is sparse in the extreme.

But along with the pleasant explosion-graphic of the aliens, Space Invaders has the high score reward, the only indication you have that you were ever there, and the only way to “win” the game. Space Invaders tugs at what must be a basic need to hold out as long as possible against impossible odds. The story ends the same way every time: the invasion is a success, and you’ve perished in a futile struggle. But for the few seconds that Space Invaders captures your attention and you’re still alive, you are wholly in its web.

Developer Harvey Smith noted in a talk that the arcade genre distills games to the most basic and demanding mechanics. In place of compelling characters, story, writing, acting, and perhaps graphics as well, are the fundamental draws of satisfying character control and play incentives. An arcade game’s world, its limitations and the role the player has within it must be intuitively grasped. Space Invaders still succeeds on all these points, and its 30+ year-old mechanics are still solid, cohesive and rewarding.

Spawn Labs and Mobile Gaming [re-post]

This is a re-post from my original on the Preserving Games blog, April 5, 2009. How fast the game industry moves. OnLive must surely be considerable competition for Spawn Labs’ HD-720, and Vircion is apparently no more.

When I think of mobile gaming, whether it’s on a laptop, iPhone or other device, I think of games sensitive to the processing constraints of those platforms: web-based items like Bejeweled or Tower Defense, a retro-graphics piece like Battle for Wesnoth, etc. The latest games on the PS3, 360 or Wii do not come to mind.

Peter Walker of Spawn Labs and Vircion Inc., based out of Austin, gave a talk last week for the Texas Advanced Computing Center that explained the company’s plan to break this mold in the realm of console gaming. Their ambition is to allow gamers to play their console games on any computer at any location. The central idea behind this technology is that remote servers will handle the processor-intensive rendering of graphics and other game computation, sending an audio/video stream of the processed results to the user’s computer. That would allow the client’s computer to strictly handle those AV streams, rather than be responsible for the serious number-crunching. Gamers would essentially be playing the AV streams of the processed results, which would be dictated by whatever input the player sent to the server.

Walker identified some trends in gaming: gigabyte requirements increase, as do CPU and GPU processing requirements. Moore’s Law seems to be in full effect. But Walker points out that cooling power can’t keep up. An increasing percentage of battery power is already spent just cooling the processing chips. As a result, mobile platforms like laptops simply cannot pack the cooling power necessary to run resource-intensive games (at least not without burning your lap). Walker points out the success of smaller-scale games, but notes that these constitute a gaming experience of a different order: casual and less time-intensive than the likes of Call of Duty 4 or Mass Effect. Spawn’s research is aimed at bringing the gaming experience of the most graphically intense console works to a mobile community.

This is achieved by utilizing the increasing pervasiveness of broadband and the efficiency of audio and video codecs, specifically the H.264 standard. Key to this standard is Scalable Video Coding (SVC), which allows the client computer to select whichever particular bitstream it is set up to decode, among the many a gaming server would offer. That lets a gaming server encode and transmit only once while simultaneously supporting a range of client machines with different decoding capabilities.

So what could all this mean for game preservation? Well, in a certain sense it will make the job much more challenging. Technology like this continues the trend of client computers handling less and less of the actual content. In this model, gamers are essentially reacting to, and playing in, a movie (the AV stream) that their input directs. Very little of the game’s actual binary content resides on the gamer’s computer. Compare this to a classic like Ultima 4, where a considerable preservation step is accomplished by possessing uncorrupted copies of the original 3.5” or 5.25” disks. Move ahead to World of Warcraft: the client-side CD contains a lot of code and information, but a very large part of the game itself is not to be found there. In the model being developed here, gamers could possess even less of the binary makeup of the game, and might simply purchase a license to play. That leaves the individual’s personal game material as a minor component of preservation.

At the same time, it’s a fascinating model for more flexible, platform-agnostic gaming, and it could point the way to interesting methods of preserving games. After all, if gamers could be happy interacting with a dynamic video stream, perhaps a preservation effort could employ a similar approach. If nothing else, the success of a model like this might open up the possibility of remote access to actively preserved games.

Flippy Disks, Continued

Loadstar magazine, Issue 142, Disk 1

Here’s an update on the flippy disk problem for this week. In retrospect, it was obvious that the reed switch was operating correctly: the wiring to the photo sensor had been cut and rerouted to the reed switch, and the regular side of the disks was being imaged successfully. Otherwise, the drive would error out for lack of a pulse for the index hole. (Slapping forehead.)

Instead the problem likely lies in the second, user-created write-unprotect notch. For the set of C64 disks at hand, these are circular and were probably made with a hole punch. Using a disk with a larger unprotect notch (a straight-edged cut more similar to the manufacturer’s) did allow the FC5025 to read the flip side.

Manufacturer's write-unprotect notch

It seems then that the drive must detect a write-unprotect notch, or it will not be able to read any tracks. I have not located anything in the FC5025 documentation indicating that it is unable to image write-protected disks, so it may be that rewiring the drive has introduced some logic insisting that tracks cannot be read when no write-unprotect notch is detected.

Custom hole-punched write-unprotect notch

There is also a collection of Loadstar disks here. For the disks that have a second write-unprotect notch (all but a few out of about fifty disks), both sides have been imaged fine by the FC5025 in D64 image format. However, we are considering taking G64 images of these disks as well. That format may capture data used in copy-protection schemes that the D64 format would otherwise overlook. As I’ve been learning at the C64 Preservation Project and ShadowM’s Commodore 64 site, the Commodore 1541 drive has its own I/O, ROM, RAM and CPU. Code can therefore be sent to the drive, and this is the basis of a very wide variety of copy-protection schemes embedded in the Group Code Recording encoding of the disks. These may range from data stored in the normally unused upper five tracks of C64 disks (tracks 36-40) to strange header and gap data and custom formats.
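To give a sense of what the sector-level D64 format does capture, here is a small sketch of my own (not code from the FC5025 tools) of the 1541’s zoned track layout, which a D64 image flattens into a plain sector-by-sector byte stream:

```python
# The 1541 records more sectors on the longer outer tracks than on the
# shorter inner ones, in four speed "zones." A standard 35-track D64
# image is just these sectors concatenated in track order, 256 bytes each.

SECTORS_PER_TRACK = (
    [21] * 17 +   # tracks 1-17
    [19] * 7 +    # tracks 18-24
    [18] * 6 +    # tracks 25-30
    [17] * 5      # tracks 31-35
)
SECTOR_SIZE = 256

def d64_offset(track, sector):
    """Byte offset of (track, sector) within a D64 image (tracks are 1-based)."""
    if not 1 <= track <= 35 or not 0 <= sector < SECTORS_PER_TRACK[track - 1]:
        raise ValueError("no such track/sector")
    preceding = sum(SECTORS_PER_TRACK[: track - 1])
    return (preceding + sector) * SECTOR_SIZE

# The directory lives on track 18; a standard D64 holds 683 sectors.
print(d64_offset(18, 0))                      # -> 91392
print(sum(SECTORS_PER_TRACK) * SECTOR_SIZE)   # -> 174848
```

A G64 image, by contrast, stores the low-level GCR bitstream per track, which is why it can retain the nonstandard headers, gaps, and extra tracks that a purely sector-level D64 discards.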


Flippy Disks and Reed Switches

5.25" Drive Outfitted with Magnet, and Photodiode Wires Rerouted

My first week at MITH has been marked by a firsthand encounter with a tradition of 1980s personal computing and data storage: the flippy disk. Before I get into that though I thought I would briefly introduce myself.

I’m a graduate student at UT Austin’s School of Information. Although I originally set out to study book and manuscript preservation, my interests soon turned to digital media preservation and curation. Specifically, newer media that use unique digital affordances are of interest: video games, interactive narratives, etc. Not surprisingly, legacy media and vintage hardware can play into this quite a bit.

My background is in the humanities, studying literature. But like a lot of folks here, I have at least some technical aptitude when it comes to computing. I fondly remember trying to program an adventure game in QBASIC, using nothing but text prompts, if…else conditions, and GOTO. It was unashamed spaghetti code, but fun.

On to flippy disks. The majority of 5.25″ disks were designed to be used on a single side only, but users could produce another notch on the edge opposite the manufacturer’s own. When the user inserted the disk (upside-down) into the drive, the new write-unprotect notch allowed the floppy disk drive to treat that side as writable, thus doubling the user’s storage capacity.


Notes on the Open Archival Information System (OAIS)

Back in 2002 the Consultative Committee for Space Data Systems made a recommendation to the ISO for an Open Archival Information System. The recommendation has found broad acceptance, and digital repository software packages like DSpace usually elaborate on their varying levels of compliance with it. Since we want our archive to have a future as a federated or cooperating (OAIS terms) archive, and since the terminologies and concepts created in this document are widespread, I decided to take some notes on the recommendation as they relate to potential metadata elements we’ll employ.

The recommendation mostly concerns itself with the long-term preservation of digital objects, although the framework incorporates metadata for physical objects as well. Broadly, OAIS defines an Information Object as a Data Object coupled with its Representation Information. The Representation Information allows a person to understand how the bits in the Data Object are to be interpreted. An example would be a TIFF file (Data Object) coupled with an ASCII document (Representation Information) detailing its headers, its compression method, etc., as in the TIFF description at Digital Preservation (The Library of Congress). Of course, one might also want Representation Information for the ASCII file, to explain how characters are interpreted in that format. OAIS terms this phenomenon recursive Representation Information, and one might eventually accrue a Representation Network of such digital objects. One stops when the Knowledge Base of the Designated Community has the requisite knowledge to understand the top-most piece of Representation Information.
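These relationships are easy to sketch in code. The class and field names below are my own inventions for illustration; OAIS specifies concepts and responsibilities, not data structures:

```python
# Illustrative model of OAIS recursive Representation Information:
# an Information Object pairs a Data Object with Representation
# Information, which may itself be an Information Object needing
# further representation, until the Designated Community's Knowledge
# Base can take over (modeled here as representation=None).

from dataclasses import dataclass
from typing import Optional

@dataclass
class InformationObject:
    data_object: str  # e.g. a filename standing in for the bit sequence
    representation: Optional["InformationObject"]  # None = understood
                                                   # directly by the
                                                   # Designated Community

def representation_network(obj: InformationObject) -> list:
    """Walk the chain of Representation Information until the
    Designated Community's Knowledge Base suffices."""
    chain = []
    current: Optional[InformationObject] = obj
    while current is not None:
        chain.append(current.data_object)
        current = current.representation
    return chain

# A TIFF whose format is described by an ASCII document, which in turn
# is understood directly by the Designated Community.
ascii_spec = InformationObject("ascii-charset-description.txt", None)
tiff_spec = InformationObject("tiff-6.0-spec.txt", ascii_spec)
scan = InformationObject("page-scan.tif", tiff_spec)

print(representation_network(scan))
# -> ['page-scan.tif', 'tiff-6.0-spec.txt', 'ascii-charset-description.txt']
```

A real Representation Network would branch rather than form a simple chain (Structure and Semantic Information, multiple dependencies per object), but the stopping condition is the same: recursion ends at the Designated Community’s Knowledge Base.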

OAIS defines two types of Representation Information: Structure and Semantic. Structure Information describes the data format applied to the bit sequence to derive more meaningful values like characters, pixels, numbers, etc. Semantic Information describes the social meaning behind these higher values (for example that the text characters are English).

OAIS discourages using software that can access and use Data Objects as a replacement for comprehensive Representation Information. Although that would serve the end user well enough for a time, the software itself naturally poses its own obsolescence problem. Of course, the digital media we would like to preserve is mostly software itself. We may have datasets, images, scans, etc., but the majority of digital assets we hold are complete software packages. This includes operating systems, office suites, computer games, console games (on cartridges) and so on. Retrieving Representation Information for all these types of software will be a considerable and ongoing task, as most software will consist of multiple file types.


Mechanisms: An Annotation

Kirschenbaum, M. (2008). Mechanisms: New media and the forensic imagination. Cambridge: MIT Press.

Matthew Kirschenbaum, Associate Professor of English and Associate Director at the Maryland Institute for Technology in the Humanities (MITH), here examines digital media in the context of traditional textual studies and bibliography. Kirschenbaum presents to the reader forensic techniques for data recovery and investigation that reveal how digital media, typically assigned attributes like ephemeralness, repeatability and variability (what he terms a traditional “screen essentialism” attitude about digital media), actually fulfills traditional bibliographic requirements of individualism, provenance and inscription.

Central to understanding these qualities of new digital media is an understanding of the affordances and technical mechanics of the dominant storage device of the last twenty or so years: the magnetic hard disk drive. Kirschenbaum reveals how data inscription on these devices (the magnetic fluxes inscribed on the drive’s multiple platters) can identify past events and previous inscriptions in a discrete spatial territory, much like the clues traditionally sought by textual scholars. The author makes a distinction between this forensic materiality and the more familiar formal materiality of digital media: the carefully controlled and highly engineered behavior we see on the screen. The author elaborates on how software engineering and extensive error checking at every level of the computer work to translate magnetic fluxes into actual human-readable documents on the screen. Even at the formal materiality level, many bibliographic and textual details are overlooked for lack of close inspection: multiple versions, multiple operating environments, actual textual differences between works, etc.

Three case studies illuminate these topics: a forensic and textual analysis of a Mystery House disk image, a bibliographic and historic look at the multiple versions of Afternoon: A Story by Michael Joyce, and a look at the social and textual transmissions of William Gibson’s “Agrippa.”

Kirschenbaum’s central argument is that the traditional characterizations of electronic texts and media (fluid, repeatable, identical, ephemeral) are insufficient for bibliographic, preservationist, and textual purposes, and that the media itself, upon closer examination, supports none of these characterizations.