2

Pascal Spoken Here

“Hear how Tommy went from knowing nothing about code to building one of Time’s 50 Best Websites.” So reads ad copy from Codecademy, one of many online, self-paced tools for learning to write software. Whether online or in institutions, “learn to code” has become the lazy mantra of today’s educational aspiration.

Lost in debates about whether and how to learn to code is the fact that learning to program has become harder rather than easier, even as the personal computer has become more prevalent and influential. Issues of access and diversity notwithstanding, in the early days of the microcomputer, many of us learned to program because that’s what one did with a personal computer. It sounds unlikely today, I know, but advertisements from the early days of the PC drive this point home. Consider a two-page spread from 1977, the first year of the Apple II. It ran in Scientific American, among other publications.

ump-bogost-fig01a
ump-bogost-fig01b

On the verso, a turtlenecked man sits at the kitchen table, working on some kind of chart portrayed nonchalantly on the display to his side. The gendered stereotype of the male computer programmer–operator is unfortunately reinforced by the woman who stands at the sink in the background, but the implication is clear: computing is a kind of work that one can do at home—that one might need to do there.

On the recto, comprehensive ad copy flanks detailed specifications, including a glamour shot of the motherboard instead of the casing. “But you don’t even need to know a RAM from a ROM to use and enjoy Apple II,” the text reads. “It’s the first personal computer with a fast version of BASIC—the English-like programming language—permanently built in. That means you can begin running your Apple II the first evening, entering your own instructions and watching them work, even if you’ve had no previous computer experience.”

The message is clear: you’re going to bring home your new $1,298 Apple II, set it up on your kitchen table, and start writing some programs. And not just because you’re a dork who reads Scientific American; no, the ad goes on to explain that everyone in your house can learn to do it:

But the biggest benefit—no matter how you use Apple II—is that you and your family increase familiarity with the computer itself. The more you experiment with it, the more you discover about its potential.

What’s more, the ad suggests an organic oscillation between using the computer and programming it, the one influencing the other:

Start by playing PONG. Then invent your own games using the input keyboard, game paddles and built-in speaker. As you experiment you’ll acquire new programming skills which will open up new ways to use your Apple II. You’ll learn to “paint” dazzling color displays using the unique color graphics commands in Apple BASIC, and write programs to create beautiful kaleidoscopic designs. As you master Apple BASIC, you’ll be able to organize, index and store data on household finances, income tax, recipes, and record collections. You can learn to chart your biorhythms, balance your checking account, even control your home environment.

Finally, there’s a clear implication that becoming more fluent in computing involves becoming a more agile and determined programmer, not merely a more adept user:

Best of all, Apple II is designed to grow with you. As your skill and experience with computing increase, you may want to add new Apple peripherals. For example, a refined, more sophisticated BASIC language is being developed for advanced scientific and mathematical applications. And in addition to the built-in audio, video and game interfaces, there’s room for eight plug-in options such as a prototyping board for experimenting with interfaces to other equipment.
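The ad’s “color graphics commands” were not copywriter fancy. The BASIC built into the machine really did include one-word instructions for painting the low-resolution display, and a first-evening program could run only a handful of lines. The listing below is my own minimal sketch rather than anything printed in the ad, but it sticks to commands that shipped with the Apple II’s BASIC: GR switches to the forty-by-forty, sixteen-color graphics mode, COLOR= picks one of the sixteen hues, and VLIN draws a vertical stripe.

10 REM PAINT SIXTEEN COLORED STRIPES
20 GR
30 FOR C = 0 TO 15
40 COLOR= C
50 VLIN 0,39 AT 2 * C
60 VLIN 0,39 AT 2 * C + 1
70 NEXT C
80 END

Type it in, enter RUN, and the screen fills with bands of color; change the loop bounds or the column arithmetic and the picture changes with it. That is precisely the oscillation between using the computer and programming it that the ad promises.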

ump-bogost-fig02

Another, undated ad targets educators, presumably for institutional purchases. It too depicts programming as an inevitable rather than a possible use of the machine: “Apple engages student interest with sound and color video. In fact, your students will be able to write programs and create high-resolution graphics.” Later on, the ad expresses a sentiment almost entirely alien from the perspective of Apple in the twenty-first century. Instead of assuming that a computer is a device for consuming media, including software programs as apps, the educator’s Apple II reprises the 1977 ad’s implication that programming is an inevitable consequence of owning a computer:

Once you’ve unlocked the power of the desk-top computer, you’ll be using Apple in ways you never dreamed of. You don’t want to be limited by the availability of pre-programmed cartridges. You’ll want a computer, like Apple, that you can also program yourself. . . . The more you and your students learn about computers, the more your imagination will demand. So you’ll want a computer that can grow with you as your skills and experience grow.

This copy underscores many of the differences between computing in the late 1970s and early 1980s and computing today. For one thing, a computer was an investment, more like an appliance than a consumable. Like a pet of a different sort, in fact: a fixture in the home for work and for play that would remain a companion as indefinitely as its mortality would allow. The Apple II was unique in its facility for user expansion and customization in both software and hardware. But for another, straight out of the box, anyone could make the computer do something. And then, soon enough, anyone with a little patience and interest could make the computer do anything it was capable of.

So fundamental was programming to the experience of computing that it even found its way into the advertising itself. A 1979 print ad promotes the availability of a Pascal development environment for the Apple II. It’s an amazing spread, partly because it includes a reverse-printed, iron-on decal that would produce a T-shirt with the message “Pascal Spoken Here.” But the ad also makes good on the promise made in the 1977 “Introducing Apple II” ad by offering more advanced tools to help improve an owner’s ability to program the machine:

With Pascal, programs can be written, debugged and executed in just one-third the time required for equivalent BASIC programs. With just one-third the memory. On top of that, Pascal is easy to understand, elegant and able to handle advanced applications. It allows one programmer to pick up where another left off with minimal chance of foul up.

ump-bogost-fig03a
ump-bogost-fig03b

The mention of foul-ups conjures those lovely yet horrifying BASIC magazine listings, one of the standard ways to distribute and share programs at the time. Pascal, as it happens, went on to become the native development environment for the Apple Lisa and the original Macintosh. It was still possible to program the Mac in Object Pascal through System 6 and 7, and Apple supported the language up until the IBM PowerPC architecture switch in 1994.

These examples underscore the good fortune that blessed those of us who started using computers at or near the start of the microcomputer era. Learning a new language or environment was a far less frequent and more specific affair, and yet a far more familiar and comfortable one. Why? Because it was possible to learn to program computers in time with their very evolution.

When we advocate for “learning to code” today, we fail to remember that contexts like those of the era of the Apple II no longer exist. The problem is not just that coding has become more complex and more difficult, as common wisdom surrounding the “genius coder” might suggest. Rather, the machines themselves have changed. Once, not so long ago even, they were devices meant to be customized and added onto by their users, who were all assumed to be latent, potential programmers, much the way woodworkers were (and still are) expected to fashion jigs for their table saws, with their table saws. But today, computers are not just devices for everyone but devices meant to be sold to everyone.

Perhaps it was inevitable that computer users would lose touch with the process of crafting software as their ranks swelled from the thousands to the millions to the billions. But it’s equally possible that the architecture, construction, marketing, and use of computers have contributed just as much to the decline of computational literacy among ordinary computer users, in favor of the young, spry specialist.

In some cases, the weird, accidental material conditions of software development shape the sort of practice they facilitate. For example, the interruptions created by long compiles in the days of applications (rather than apps) were invitations to dig deep into the design and operation of a programming language in order to avoid unnecessary delays and failures due to compilation errors. And before the Internet, programming environments were both better and more centrally documented, in print volumes or secondary texts that could be studied away from the computer.

Today, documentation is often renounced in favor of “crowdsourced” solutions, such as the coding help forums at Stackoverflow.com, a website where programmers can ask questions of one another and answer them. On one hand, many more people are available to assist with programming challenges. On the other hand, one already has to possess so much literacy to address those challenges that access is restricted rather than expanded. Back in the early 1980s, when a computer like the Apple IIe came with a BASIC manual, the beginning programmer at least found himself safely ensconced within the sandbox of a well-documented yet still powerful system. Today, computers are glass and aluminum mysteries off of whose surfaces computational curiosity slips like mercury.

I don’t mean to invoke nostalgia for better, simpler times. Rather, I mean to acknowledge that the deliberate or accidental conditions for creativity have an influence on the ways we carry out creative practices. Ironically, as computers have become more popular and more diverse, as “learning to code” has become more desirable and marketable, the diversity of those practices may have declined more than proliferated.

When Apple moved to the App Store model for software distribution, it also introduced a submission and approval process for developers seeking to publish their apps. In addition to charging developers a ninety-nine-dollar annual fee just for the privilege of using the system, Apple vets every app offered for sale, and its approval process has proven questionable at best and arbitrary at worst. For example, Apple has a tendency to deny publication to politically controversial apps. (Among them is Molleindustria’s 2011 title Phone Story, a mobile game about the political consequences of mobile electronics manufacture, including Congolese coltan mining and Chinese electronics factory labor.) But another restriction imposed by Apple’s centralization of publishing and distribution relates to the tools with which software is created in the first place.

Apple has always tested for and denied publication to apps that use “private” application programming interfaces (APIs)—that is, apps that make use of portions of the operating system that Apple has created for internal use only. But at one point, Apple also attempted restrictions that dictated the programming languages and environments that could be used to write software for iOS:

Applications must be originally written in Objective-C, C, C++, or JavaScript as executed by the iPhone OS WebKit engine, and only code written in C, C++, and Objective-C may compile and directly link against the Documented APIs (e.g., Applications that link to Documented APIs through an intermediary translation or compatibility layer or tool are prohibited).

The policy prevented apps from running interpreted code—that is, from loading and running software within an app. Emulators for older computers, such as the Commodore 64 and (ironically) the Apple II, were prohibited, as was the children’s educational programming environment Scratch.

But the policy seems to have been launched as a frontal assault in Steve Jobs’s unlikely war against Adobe’s Flash tool kit, the cross-platform animation and programming environment first made popular on the Web in the 1990s. Adobe had created a Flash exporter for iPhone, which it scrapped in the wake of the prohibition, only to reinstate it after the dust had cleared. The Federal Trade Commission and the Department of Justice reportedly considered launching an antitrust investigation into Apple to determine whether the policy would unlawfully foreclose competition on rival platforms—an unthinkable outcome in 1980, even if the Apple II was as successful then, relatively speaking, as the iPhone was thirty years later.

The computational ecosystem is burgeoning. We have more platforms today than ever before, from mobile devices to microcomputers to game consoles to specialized embedded systems. Yet a prevailing attitude about computational creativity longs for uniformity: game engines that target multiple platforms to produce the same plain-vanilla experience; authoring tools that export to every popular device at the lowest common denominator; and, of course, the tyranny of the Web, where everything that once worked well on a particular platform is remade to work poorly everywhere (just think of Google Docs, which took Microsoft’s bloated desktop office suite and refashioned its components into terrible online renditions). After Apple’s prohibition of nonnative development tools, Flash developers responded with understandable fury. But ignorance, rather than antitrust idealism, probably motivated their anxiety: many such developers simply weren’t adept with iOS-native tools like Objective-C, and they feared being locked out of Apple’s popular platform.

From our vantage point, Apple’s 1977 version of learning and deploying programming might seem quaint, even if also appealing. Today we have offloaded such facilities into concepts like “hacker” and “maker,” lifestyle activities that already imply that one has managed to “learn to code” by other means, and in which the act of programming or designing circuits becomes an end practiced for its own sake, often as an affectation.

But it is not entirely impossible to imagine recuperating such a situation, one in which hardware ships with the tools to make software for it—for personal use as much as for producing the latest hit app or billion-dollar start-up. Apple itself made gestures toward such a possible future in 2014, when it introduced a new programming language called Swift, meant to make iOS and OS X programming easier and more rapid. Still, programming has become so decoupled from computer ownership and use that a new language is hardly sufficient to return us to those salad days of the late 1970s. Back then, owning a computer entailed being a programmer, much like owning a sewing machine entailed being a seamster or a seamstress.