A Blog by Jonathan Low

 

Oct 20, 2019

The Lines of Code That Changed Everything

The often obscure and hidden influences that have shaped our lives. JL

Slate reports:

Back in 2009, Facebook launched a world-changing piece of code—the “like” button. “Like” was the brainchild of several programmers and designers, including Leah Pearlman and Justin Rosenstein. They’d hypothesized that Facebook users were often too busy to leave comments on their friends’ posts—but if there were a simple button to push, boom: It would unlock a ton of uplifting affirmations. “Friends could validate each other with that much more frequency and ease,” as Pearlman later said.
It worked—maybe a little too well. By making “like” a frictionless gesture, by 2012 we’d mashed it more than 1 trillion times, and it really did unlock a flood of validation. But it had unsettling side effects, too. We’d post a photo, then sit there refreshing the page anxiously, waiting for the “likes” to increase. We’d wonder why someone else was getting more likes. So we began amping up the voltage in our daily online behavior: trying to be funnier, more caustic, more glamorous, more extreme.
Code shapes our lives. As the venture capitalist Marc Andreessen has written, “software is eating the world,” though at this point it’s probably more accurate to say software is digesting it.
Culturally, code exists in a nether zone. We can feel its gnostic effects on our everyday reality, but we rarely see it, and it’s quite inscrutable to non-initiates. (The folks in Silicon Valley like it that way; it helps them self-mythologize as wizards.) We construct top-10 lists for movies, games, TV—pieces of work that shape our souls. But we don’t sit around compiling lists of the world’s most consequential bits of code, even though they arguably inform the zeitgeist just as much.
So Slate decided to do precisely that. To shed light on the software that has tilted the world on its axis, the editors polled computer scientists, software developers, historians, policymakers, and journalists. They were asked to pick: Which pieces of code had a huge influence? Which ones warped our lives? About 75 responded with all sorts of ideas, and Slate has selected 36. It’s not a comprehensive list—it couldn’t be, given the massive welter of influential code that’s been written. (One fave of mine that didn’t make the cut: “Quicksort”! Or maybe Ada Lovelace’s Bernoulli algorithm.) Like all lists, it’s meant to provoke thought—to help us ponder anew how code undergirds our lives and how decisions made by programmers ripple into the future.
There’s code you’ve probably heard of, like HTML. Other code is powerful (like Monte Carlo simulations, which are used to model probabilities) but totally foreign to civilians. Some code contains deadly mistakes, like the flaw in the Boeing 737 Max. And some is flat-out creepy, like the tracking pixel that lets marketers know whether you’ve opened an email.
One clear trend illustrated here: The most consequential code often creates new behaviors by removing friction. When software makes it easier to do something, we do more of it. The 1988 code that first created “Internet Relay Chat” allowed the denizens of the early internet to text-chat with one another in real time. Now real-time text is everywhere, from eye-glazingly infinite workplace Slack confabs to the riot of trolling and countertrolling in a Twitch livestream.
It’s not always clear at first when some code will become epoch-defining. Oftentimes it starts off as a weird experiment, a trial balloon. Back in 1961, Spacewar!, the first virally popular video game, might have seemed a pretty frivolous way to use a cabinet-size computer that cost, at the time, $120,000. (That’s more than $1 million in 2019 dollars.) But it pioneered many of the concepts that helped computers go mainstream: representing data as icons, allowing users to manipulate those icons with handheld controllers.
Code’s effects can surprise everyone, including the coders. —Clive Thompson, author of Coders: The Making of a New Tribe and the Remaking of the World. 

Binary Punch Cards
Date: 1725
The first code

Binary programming long predates what we think of as computers. Basile Bouchon is believed to be the first person to punch holes into paper and use it to control a machine: In 1725, he invented a loom that wove its patterns based on the instructions provided in the perforated paper it was fed. A punched hole is the “one,” and the absence of a punched hole is the “zero.” As much as things have changed since then, the essential building block of code has not. —Elena Botella, Slate

The First Modern Code Executed
Date: 1948
Ushered in both the use of computer code and the computer models of nuclear devastation that shaped the Cold War arms race

The Electronic Numerical Integrator and Computer was the first programmable electronic computer. Completed in 1945, it was configured for each new problem by wiring connections between its many components. When one task, such as an addition, finished, a pulse triggered the next. But a few years later, Klára Dán von Neumann and Los Alamos scientist Nicholas Metropolis wired ENIAC to run the first modern code ever executed on any computer: hundreds of numerical instructions executed from an addressable read-only memory (ENIAC’s function table switches). They simulated the explosion of several atomic bomb designs being evaluated at Los Alamos National Lab in New Mexico, using the Monte Carlo technique, in which a complex system is simulated, step by virtual step, over many repeated trials to map the probability distribution of possible outcomes. Von Neumann and Metropolis sent more than 20,000 cards back to the nuclear scientists at Los Alamos, tracing the progress of simulated neutrons through detonating warheads. The distant descendants of this code are still in use at Los Alamos today. —Thomas Haigh, co-author of ENIAC in Action: Making and Remaking the Modern Computer
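The Monte Carlo idea is easier to grasp in miniature. The sketch below is not the ENIAC program—that one traced neutrons through a detonating warhead—but a toy C example of the same principle: sample at random, tally the outcomes, and let the answer emerge from the distribution. Here, random points estimate the value of pi.

/* Monte Carlo in miniature: estimate pi by sampling random points
   and counting how many land inside a quarter circle. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    const int trials = 1000000;
    int hits = 0;
    srand(1948);                                /* fixed seed, repeatable run */
    for (int i = 0; i < trials; i++) {
        double x = (double)rand() / RAND_MAX;   /* random point in the unit square */
        double y = (double)rand() / RAND_MAX;
        if (x * x + y * y <= 1.0)               /* inside the quarter circle? */
            hits++;
    }
    printf("pi is approximately %f\n", 4.0 * hits / trials);
    return 0;
}

The more trials, the tighter the estimate—the same statistical trade-off the Los Alamos team faced.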

Grace Hopper’s Compiler
Date: 1952
Made it possible for computers to process words

Grace Hopper was programming an early computer when she decided to make the whole thing easier by rooting it in human language. Hopper, who enlisted in the U.S. Naval Reserve during World War II, knew that people like her superiors in the military struggled to understand binary code. If programming languages could be English-based, the work would be less prone to errors and more accessible to those who didn’t have a Ph.D. in mathematics.
Some scoffed at the idea, but by the early 1950s she had devised a compiler—a set of instructions that converts a more intelligible kind of code into the lower-level code directly processed by the machine. With that tool, she and her lab developed FLOW-MATIC, the first programming language to be built on English words.

Spacewar!
Date: 1961
The first distributed video game
In late 1961 a group of young MIT employees, students, and associates (many of them members of the Tech Model Railroad Club) gained late-night access to a recently donated DEC PDP-1 computer. The leading edge of nonmilitary computing, the PDP-1 sold for $120,000 (that would be a bit more than $1 million today), featured 18-bit word length, and used paper tape for program storage. Over the course of five months, these programmers created a game in which two players control spaceships—the needle and the wedge—that engage in a one-on-one space battle while avoiding the gravity well of a star at center screen.
Spacewar! spread quickly across the early “hacker” community. It was later distributed by DEC with each PDP-1, preloaded in the core memory and ready to demonstrate when installed. The program significantly influenced the small coding community of the 1960s and inspired generations of video game creators. It lives on in emulations and is demonstrated regularly at the Computer History Museum on the last operational PDP-1. Steve Russell, the lead coder, said at a 2018 Smithsonian panel, “It’s more than 50 years old. There are no outstanding user complaints. There are no crash reports. And support is still available.”
The Origins of Email
Date: 1965
Come on. It’s email.

In 1961, Massachusetts Institute of Technology hackers created a system to let multiple users log into the same computer, and they began leaving little messages for each other. In 1965, a group of coders decided to create a formal command system for sending, receiving, and displaying these little digital missives. Higher-ups resisted the “MAIL” command at first, thinking it was a bit frivolous, but its usage took off—so much so that by 1971, MIT even saw the first piece of spam: an anti–Vietnam War message.
The Police Beat Algorithm
Date: 1968
The start of modern predictive policing—and computerized racial profiling

When President Lyndon Johnson formed the President’s Commission on Law Enforcement and Administration of Justice in 1965, he asked it to examine how computers could help us solve the nation’s “crime problem”—a problem he and the commission defined as both “urban” and “black.” The answer to this call was the Police Beat Algorithm, which aimed to solve planning problems like how many officers should patrol a given area of a city. By combining the PBA with a crime database, police officials could produce automated suspect profiles based on the racial demographics of police beats and deploy resources (officers, weapons, and other equipment) accordingly—before any crime was actually committed. Today’s predictive policing practices disproportionately surveil and criminalize black and brown people. As the story of the PBA reminds us, this is not the result of an unforeseen technological glitch: It is the perfection of the technology’s 50-year design.
The Apollo 11 Lunar Module’s BAILOUT Code
Date: 1969
The code that kept the lunar module’s computer from running out of space in space 
The Apollo Guidance Computer was a marvel: As Poppy Northcutt, who calculated Apollo’s return-to-Earth trajectories, told me, the AGC had less computing power than the greeting cards today that record a personal message. Yet it worked.
That limited power and storage space meant that tasks had to be carefully managed so the AGC was always focused on the most important jobs. If it ran out of space to perform them, vital work might not get done. The AGC software team also knew there were eventualities they couldn’t plan for. So they created BAILOUT. When the computer was at risk of running out of space (or “overflow”), the AGC triggered BAILOUT to shed less important data and operations so it could keep the vital ones up and running.
As the Eagle lander descended toward the moon’s surface, at 30,000 feet the AGC flashed a “1202” alarm, which neither Neil Armstrong nor the flight controller in Houston immediately recognized. But in less than 30 seconds, the computer experts in Mission Control relayed that the AGC software was doing just what it was supposed to: drop lower-priority work and restart the important jobs (so quickly that it was imperceptible to the crew). Armstrong and Buzz Aldrin would continue to get what they absolutely needed from the AGC to keep on the path to touchdown.
Overflow alarms would sound three more times before Armstrong uttered “the Eagle has landed,” but always because things worked as intended. The word “bailout” normally signals the failed end of a mission, but here it helped make humanity’s highest achievement a reality.
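A loose C sketch of the BAILOUT idea—not the AGC’s actual executive, and with invented task names—looks like this: when the schedule overflows, shed the least important jobs and keep the vital ones running.

/* Sketch of priority-based load shedding. Task names are hypothetical;
   the real AGC's executive and BAILOUT logic were far more intricate. */
#include <stdio.h>

typedef struct {
    const char *name;
    int priority;                 /* higher number = more vital */
} Task;

#define CAPACITY 3                /* pretend the computer can hold only 3 jobs */

void bailout(Task *tasks, int *count) {
    while (*count > CAPACITY) {   /* overflow: shed the lowest-priority job */
        int lowest = 0;
        for (int i = 1; i < *count; i++)
            if (tasks[i].priority < tasks[lowest].priority)
                lowest = i;
        printf("BAILOUT: shedding '%s'\n", tasks[lowest].name);
        tasks[lowest] = tasks[--(*count)];    /* replace it with the last task */
    }
}

int main(void) {
    Task tasks[] = {
        {"guidance", 10},
        {"engine control", 9},
        {"altitude display", 3},
        {"rendezvous radar", 1},  /* extra radar data flooded the real AGC */
    };
    int count = 4;
    bailout(tasks, &count);       /* overflow: keep only the vital three */
    for (int i = 0; i < count; i++)
        printf("still running: %s\n", tasks[i].name);
    return 0;
}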

Hello, World!
Date: 1972 or earlier
The phrase that has introduced generations to code

main( ) { printf("hello, world\n"); }
When you sit down to learn a new programming language, the first thing the tutorial has you do is get the computer to display the phrase “Hello, world!” Perhaps the most famous early example comes from a Bell Laboratories memorandum called “Programming in C—A Tutorial,” written in 1974, though it was also found in a 1972 manual for another language, B, and may go back even earlier than that. Hello, World! is a beautiful bit of pedagogy. It’s a small, achievable task that offers an early sense of accomplishment. It’s a standard, so it helps illustrate the differences between different programming languages. It’s also a quick and easy way for advanced programmers to make sure everything is working correctly after installing a new environment. (Sometimes programmers use “time to ‘hello world’ ” as a speed test to compare languages and environments.) Perhaps most importantly, “Hello, world!” is doe-eyed, friendly, and helps convey the scale at which the new programmer’s code can have an effect. That is, the world.

The Null-Terminated String
Date: 1972
The most catastrophic design bug in the history of computing

char yellow[26] = {'y', 'e', 'l', 'l', 'o', 'w', '\0'};
In 1972, Dennis Ritchie made a fateful decision: to represent text in his new language with something called a null-terminated string. The concept had been around earlier, but he enshrined it in his new language, which he called C, and the legacy of that decision has been with us ever since.
There are two primary ways that programming languages represent a piece of text: It can have an intrinsic, explicit length—“I contain exactly 10 characters and no more.” Or it can be null-terminated—“Here are a bunch of characters, keep going until you hit the zero-byte at the end, good luck!”
An extremely common mistake in C code is to copy a long string into a shorter string and overflow the end, meaning you are destroying other data that just happened to be nearby. It’s like scribbling past the edge of a whiteboard.
Besides merely causing the program to malfunction, such bugs can be exploited to change a program’s behavior by convincing it to overwrite something with specific, carefully crafted data. These are the buffer overflow attacks. Very nearly every security exploit you’ve ever heard of starts here, beginning with the Morris Worm in 1988. You can code carefully in C to avoid these kinds of bugs, but the language makes this class of mistake easy to make and hard to detect. Nearly every modern language eschews the null-terminated string, but C and C++ still form the substrate of the world, from your router to your “smart” lightbulbs. So we’re still playing whack-a-mole with this class of bug nearly 50 years later.
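The mistake is easy to reproduce. Here is a minimal C sketch, with the unsafe call left commented out and a bounded copy in its place:

/* Copying a long string into a short buffer. strcpy() would keep writing
   until it found the terminating zero-byte, trampling adjacent memory. */
#include <stdio.h>
#include <string.h>

int main(void) {
    char buffer[8];            /* room for 7 characters plus '\0' */
    const char *input = "this string is far longer than eight bytes";

    /* strcpy(buffer, input);     the bug: writes far past the end of buffer */
    strncpy(buffer, input, sizeof buffer - 1);   /* bounded copy instead */
    buffer[sizeof buffer - 1] = '\0';            /* terminate explicitly */

    printf("%s\n", buffer);
    return 0;
}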

Telenet
Date: 1975
The first public data network built on packet-switching, the backbone of today’s internet

Before there was the internet, there was ARPANET, a computer network for researchers at the Advanced Research Projects Agency (now DARPA) to trade data between machines. As ARPANET expanded within the government, its creators realized the technology could be valuable to the general public—and how much money could be made in the game. In August 1975, the commercial version of ARPANET, Telenet, went online in seven cities, allowing its earliest customers—mostly computer or database companies—to dial in with their telephones to upload and download data, like proto email messages, or remotely access code stored on a central computer. While ARPANET is often credited with being the earliest version of the modern internet, it might be more accurate to say that Telenet, a service designed for public consumption, is really the web’s precursor. In fact, one of Telenet’s biggest customers in the 1980s was Quantum Link—which later became AOL.
The Vancouver Stock Exchange’s Rounding Error
Date: 1982
A minor decimal distinction with a major cost
In early 1982, the Vancouver Stock Exchange unveiled an electronic stock index initially pegged to a value of 1,000 points. In two years it dropped to half its original value—a confusing trend amid the bull market of the early 1980s. An investigation revealed that the calculations of the index were wrong in just one command, using floor() rather than round(). This command meant that instead of rounding to the third decimal place, the value was being truncated. (Digital computers necessarily have finite resolution, which necessitates rounding or truncation.) So if the index was calculated as 532.7528, it was stored as 532.752, rather than rounded up to 532.753. Over the course of thousands of calculations a day, this seemingly minor difference—essentially rounding down every single time—amounted to a dramatic loss in value. The programming mistake was finally fixed in November 1983, after the index closed around 500 on a Friday. It reopened on Monday at over 1,000, its lost value restored. —Lav Varshney, assistant professor, University of Illinois at Urbana-Champaign
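A small simulation, with made-up price movements, shows how the bug bleeds value. Two indexes start at 1,000 and apply identical random changes; one truncates with floor(), the other rounds:

/* Truncating to three decimals loses up to 0.001 on every update;
   rounding is unbiased. Over thousands of updates the gap is dramatic. */
#include <math.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    double truncated = 1000.0, rounded = 1000.0;
    srand(1982);
    for (int i = 0; i < 3000 * 250; i++) {   /* ~3,000 updates a day, ~250 trading days */
        double change = ((double)rand() / RAND_MAX - 0.5) * 0.2;
        truncated = floor((truncated + change) * 1000.0) / 1000.0;  /* the bug */
        rounded   = round((rounded   + change) * 1000.0) / 1000.0;  /* the fix */
    }
    printf("with floor(): %8.3f\n", truncated);  /* drifts far below 1,000 */
    printf("with round(): %8.3f\n", rounded);    /* stays near 1,000 */
    return 0;
}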

Therac-25
Date: 1985–1987
Proved that overconfidence kills

The headlines, when the truth emerged, gave a warning about modernity. “SOFTWARE BUGS TURNING DEADLY IN COMPLEX ERA,” the Los Angeles Times proclaimed. A machine meant to treat cancer had, at several medical facilities, blasted six patients with massive overdoses of radiation, killing at least three.
As investigators would discover, the Therac-25 had been programmed to allow for a fatal mistake. The machine offered low-power and high-power types of therapy, the latter of which required a metal device to filter the beam. But because of a bug in the software, an operator could accidentally trigger the high-power mode without the necessary metal device in place.
The Therac-25 had been designed as an “improvement” to the Therac-20, and the software was considered foolproof enough that it didn’t need external safety checks. The result: several deaths blamed on overconfident engineers who failed to account for the possibility of a mistake. —Molly Olmstead

Internet Relay Chat
Date: 1988
The original online hang

Internet Relay Chat, better known as IRC, began before most people could even tell you what an internet is. It was the first popular way to chat in real time with other people in a group channel. Early users logged on to share news, for example, during the 1991 coup d’état attempt in the Soviet Union during a media blackout. Chatting itself required a sort of code: To join a channel, you type “/join #[channel name].” (This will look familiar to today’s Slack addicts.) If you wanted to signal something about yourself, you’d type “/me is so tired,” and the channel would display your name and the words “is so tired,” set off by an asterisk. It’s basic, but for many, it was the first brush with using a command on a computer. And it was a ticket to join the conversation. —April Glaser, Slate

The Morris Worm
Date: 1988
A cold awakening to how large the internet had become

Both Robert Morris and the internet itself were young when Morris, a 23-year-old Cornell graduate student, released the “Morris Worm,” launching what has been called the “first major attack on the Internet.” Approximately 10 percent of the 60,000 computers connected to the internet were hit, causing millions of dollars of damage and leading the New York Times to print the words “the Internet” for the first time. Even tech-savvy people were surprised to see how extensive the worm’s reach was. Morris, who says he never intended to cause so much damage, became the first person indicted under the Computer Fraud and Abuse Act. After being sentenced to three years of probation, he went on to become a co-founder of celebrated startup incubator Y Combinator and an assistant professor of computer science at MIT. —Elena Botella

The One-Line Virus
Date: Circa 1990s
The power of a single line of code—and the ever-surprising fragility of computers

:(){ :|:& };:
What you see above is a one-line virus of sorts called a fork bomb. It takes some specific conditions to work (including an older, susceptible copy of the Unix operating system). But given those, if you type this command in Bash, it replicates itself over and over until it consumes all available memory in the computer and causes it to crash.
What makes it beautiful is not the danger it represents relative to its size, but that it uses a colon for a function name. Most functions (reusable lines of code) are named descriptively, like “Print” or “isThisEmailValid,” but there’s no rule saying they have to be. In most computing languages, you can’t use a colon as a function name, but you can in Bash.
I first encountered this line as an exhibit in an art museum, the Museum Angewandte Kunst in Frankfurt, Germany, back in 2002. There aren’t many bits of code that wind up displayed in museums.

The HTML Hyperlink
Date: 1990
The tool that let us connect everything to anything, even the unimaginable

<a href = "https://www.slate.com">Slate</a>
Tim Berners-Lee changed the world when he introduced the hyperlink, a snippet of code that lets anyone jump across the World Wide Web. The concept of linking information was not especially new. What was new was the URL itself: punctuation cobbled together from various computer-system conventions to arrive at the colon-slash-slash format, which could name any resource on the network. But while Berners-Lee was concerned with backward compatibility, the hyperlink-anything concept made the idea future-proof. Berners-Lee’s hyperlink was free to become a Buy It Now button, a like vote, a retweet, and much more. Those unexpected use cases should be a reminder that, when standing at the cusp of a technological revolution, the hardest thing to see is what comes next. —Charles Duan, director of technology and innovation, R Street

Introduction of the JPEG
Date: 1992
Forever changed our relationship to photography

We take it for granted that we can fill our cameras with massive numbers of photos. But images used to require huge amounts of data. In 1992, the Joint Photographic Experts Group published specifications for a standard—the JPEG—to make image files smaller. Though other compression formats were available at the time, the JPEG became the worldwide standard, in part because it was royalty-free. JPEGs take advantage of lossy compression, a process that removes aspects of a picture undetectable to the human eye, such as slight variations in color. Lossy compression was also essential to the invention of something else introduced in 1992: the MP3, an audio file format made possible by discarding bits of data undetectable to the human ear. —Aaron Mak, Slate
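The core of lossy compression is quantization: divide a value by a step size, round, and store the small integer. The C sketch below is not the JPEG codec—real JPEG quantizes the frequency coefficients of 8-by-8 pixel blocks after a discrete cosine transform—but it shows the essential trade: slight variations collapse into identical values and can never be recovered, while the numbers get much cheaper to store.

/* Quantization in miniature: near-identical samples become identical,
   and the data compresses better for it. Illustrative only. */
#include <math.h>
#include <stdio.h>

int main(void) {
    double samples[8] = {52.1, 51.8, 52.3, 52.0, 51.9, 52.2, 180.0, 52.1};
    double step = 4.0;                           /* bigger step: smaller file, more loss */
    for (int i = 0; i < 8; i++) {
        int q = (int)round(samples[i] / step);   /* "compress" */
        double restored = q * step;              /* "decompress" */
        printf("%6.1f -> %4d -> %6.1f\n", samples[i], q, restored);
    }
    return 0;
}

The slight variations vanish, but the big jump to 180.0 survives—much as JPEG discards color detail the eye can’t see while preserving the edges it can.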

The Mosaic Browser
Date: 1993
The birth of the web as we see it 
Previous browsers had been clunky affairs, rendering text pretty well but forcing you to view images in a separate window. The Mosaic hackers, led by Marc Andreessen, wanted a browser that laid out images and text side by side. They made the web look familiar, as if it were a cool digital zine or newspaper. It also prompted HTML standards to begin evolving in overdrive as webmasters worldwide began demanding ever more tags to make sites look cool. (Frames seemed like a super-cool idea back then.) 

The Tracking Pixel
Date: 1993
Modern data collection started with these invisible images.

A tracking pixel is a tiny, invisible image summoned by an HTML snippet like <img src="https://tracker.example.com/pixel.gif" width="1" height="1"> (the URL here is illustrative). These snippets don’t look like much, but they’re the bedrock of digital advertising, putting them at the center of so many modern problems: surveillance, media consolidation, even misinformation.
Back in the 1990s, web designers used transparent one-pixel images to adjust page layouts. But a computer has to download every image on a webpage—even one imperceptible pixel. In 1993, companies started capitalizing on this: By tracking pixel downloads, they learned who and where you were, and triggered a cookie to be downloaded to your browser. That cookie enabled advertisers to follow you across multiple sites.
Pixel tracking’s success led directly to the Facebook “like” button, which tracks you across every website where it’s embedded. This massive data collection enabled the hyper-targeting that made Facebook ads so successful, shifting billions in revenues away from media companies. As journalism flailed, targeted misinformation thrived—and surveillance-based business models proliferated. (More on the like button below.)
Robots.txt
Date: 1994
A tiny tool with huge implications for search and beyond

If you’ve ever run a Google search, there’s a chance you’ve bumped into a result that says, “A description for this result is not available because of this site’s robots.txt.” Not everyone wants their website to be indexed by a search engine, which is one reason a robots.txt file can be added to a website to ask the bots that catalog the web—sometimes called spiders or crawlers—to move along rather than access that site. Its unusual role of mediating access to website content puts robots.txt among the most litigated bits of code, factoring into more than a dozen cases involving copyright, hacking, trespass, tort law, and even a 2009 judicial misconduct inquiry involving former 9th Circuit Chief Judge Alex Kozinski.
The Wiki
Date: 1994
Paved the way for Wikipedia
Ward Cunningham first invented the Wiki with his site WikiWikiWeb, which he imagined as the simplest possible way to share information. He used a basic markup language, which involves brackets, stringing words together without spaces, and apostrophes around text, for editors to update and organize information linked across pages—a system still in wide use today on wikis, including Wikipedia, which launched in 2001. The accessible format has made the wiki a tool for some of the most important forms of active collaboration online, from tracking security bugs to taking notes. But as with anything editable online, wikis are vulnerable to vandalism and fierce arguments about what should and shouldn’t be posted, which is why Wikipedia includes talk pages and rules that govern how editors can add new information. —April Glaser

The First Pop-Up Ad
Date: Mid-’90s
The scourge of the internet

window.open('https://www.slate.com/')
The basic code to open a new window with a given URL—in this case, Slate’s.
I’ve got my tombstone all picked out. It says, “Click here to win millions!”
More than 20 years ago, I wrote a scrap of JavaScript code that opened a second, small web browser window while opening the page you’d requested. This new window contained an ad—the dreaded pop-up ad. Over the next few years, I watched with horror as pop-up ads spread across the web, adopted by the worst, most intrusive advertisers.
The pop-up ad was designed to solve a real problem: My company, Tripod, let people put whatever content they wanted on a free home page. To subsidize the service, we sold ads. But advertisers didn’t always like the content of the page they featured on, so we decided to separate the ad from the user’s content. Hence, the pop-up ad.
I knew the pop-up ad wasn’t a good solution when I implemented it. Watching it spread across the web was like fixing your car with duct tape and watching everyone else on the road rip off a few strips of the silvery stuff to join in the fun.
Since I launched this unholy beast on the world, I’ve written books, launched companies, taught at universities, but the pop-up is what I will be remembered for. I still expect to get hate mail when this article appears. —Ethan Zuckerman, director of the MIT Center for Civic Media

The Code That Made a T-Shirt Illegal
Date: Circa 1995
Language: Perl
One of the earliest examples of code as activism

#!/bin/perl -s-- -export-a-crypto-system-sig -RSA-3-lines-PERL
$m=unpack(H.$w,$m."\0"x$w),$_=`echo "16do$w 2+4Oi0$d*-^1[d2%Sa
2/d0<X+d*La1=z\U$n%0]SX$k"[$m*]\EszlXx++p|dc`,s/^.|\W//g,print
pack('H*',$_)while read(STDIN,$m,($w=2*$d-1+length($n)&~1)/2)
“WARNING: This shirt is classified as a munition and may not be exported from the United States, or shown to a foreign national,” the shirt warned. For a time, the United States government treated strong encryption like surface-to-air missiles: too dangerous to fall into the hands of America’s foes. The idea made a kind of sense when encryption lived in heavy, expensive devices, but a lot less sense when the State Department tried to tell cryptography researchers in the 1990s they couldn’t post their code on the internet. But the RSA encryption algorithm—one of the basic building blocks of modern cryptography—is elegant enough that it can be written out in just four dense lines of Perl code … short enough to fit on a T-shirt. The original shirts are now collector’s items; the export controls, although not completely gone, have been substantially pared back. —James Grimmelmann, professor of law at Cornell Tech and Cornell Law School

Google’s PageRank Algorithm
Date: 1996
Revolutionizing the way we organize knowledge

Before PageRank, search engines tried to find info based on whether our query words matched the words in the document. But with PageRank, Larry Page and Sergey Brin had a brilliant idea: Knowledge was social—search should be as well. They created an algorithm that ranked the prominence of a page based on how many other pages online linked to it. That single insight is responsible for the mammoth power Google enjoys today. —Clive Thompson
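The published idea fits in a few dozen lines. Below is a minimal C sketch of PageRank on a hypothetical four-page web, using the standard power-iteration method and the 0.85 damping factor from Page and Brin’s paper—not Google’s code, just the insight in its simplest form.

/* PageRank by power iteration on a made-up link graph. A page's rank
   is built from the ranks of the pages that link to it. */
#include <stdio.h>

#define N 4

int main(void) {
    int link[N][N] = {            /* link[i][j] = 1 if page i links to page j */
        {0, 1, 1, 0},
        {0, 0, 1, 0},
        {1, 0, 0, 0},
        {0, 0, 1, 0},
    };
    double rank[N], next[N];
    for (int i = 0; i < N; i++) rank[i] = 1.0 / N;

    for (int iter = 0; iter < 50; iter++) {
        for (int j = 0; j < N; j++) next[j] = (1.0 - 0.85) / N;
        for (int i = 0; i < N; i++) {
            int outlinks = 0;
            for (int j = 0; j < N; j++) outlinks += link[i][j];
            if (outlinks == 0) continue;          /* dangling page: simplified away */
            for (int j = 0; j < N; j++)
                if (link[i][j])                   /* i shares its rank with pages it links to */
                    next[j] += 0.85 * rank[i] / outlinks;
        }
        for (int j = 0; j < N; j++) rank[j] = next[j];
    }
    for (int j = 0; j < N; j++)
        printf("page %d: %.3f\n", j, rank[j]);    /* page 2, the most linked-to, ranks highest */
    return 0;
}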

GeoCities Mouse Trails
Date: Mid-1990s
It could make even the most mundane elements of your site sparkle. 

Douglas Engelbart and Bill English’s invention of the mouse in the late 1960s transformed the way we could communicate with computers. While using a mouse was intuitive, display technology at the time often couldn’t keep up with the quick movements of the mouse. Operating system developers added the mouse trail—the momentary, shadowy images that showed a cursor’s previous position, making it easier to follow and find.
In the mid-1990s, GeoCities was the first service to offer users a free and easy way to create their own web content. GeoCities’ WYSIWYG editor allowed creators to easily drag and drop content, and to add snippets of customization code—including code to customize the mouse trail visitors saw. On a GeoCities page, your cursor might leave trails of fairy dust, bubbles, or tiny Halloween bats in its wake. These flashy, glittery pixels epitomized a moment in which we were able to make our “web spaces” our own.

RSS
Date: 1999
It syndicated the decentralized web.

By bringing news stories, blogs, blawgs, podcasts, and other forms of web publishing into a standard format, RSS (which stands for Rich Site Summary or Really Simple Syndication) code lets you consume information published by a variety of sources in a single simple, effective, and efficient manner. At its apex, from roughly the launch of Google Reader in 2005 until the tragic death in 2013 of Aaron Swartz, the internet freedom activist who worked on the development of RSS 1.0, RSS was synonymous with publishing on the decentralized internet. Though Google Reader is no more, RSS remains at the forefront of the usable internet, from news aggregators to podcast applications. —David S. Levine, associate professor, Elon University School of Law

The Lost Mars Climate Orbiter
Date: 1999
A mission thwarted by a math mistake

On Sept. 23, 1999, NASA scientists lost communication with the $125 million Mars Climate Orbiter. An investigation later determined the cause: A contractor had written a program for the orbiter using imperial units, as is standard in the U.S., but NASA’s software used the metric system. A simple miscommunication between two pieces of code had sent the orbiter flying off to places unknown.
It’s easy to write off this metric-imperial error as a fluke, but it points to just how tenuous the world of interconnected software is today. All connected technologies—phones, spacecraft, robot juicers, what have you—depend on interfaces to define how to communicate with others. The smallest discrepancies can lead to chaos. —Charles Duan
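The failure mode is mundane to reproduce. A C sketch with hypothetical function names: one module speaks pound-force, the other assumes newtons, and nothing in the interface catches the mismatch.

/* Hypothetical illustration of the orbiter's unit mismatch. */
#include <stdio.h>

/* contractor's module: reports thruster impulse in pound-force seconds */
double thruster_impulse(void) { return 100.0; }

/* navigation module: expects newton-seconds (1 lbf = 4.448 N) */
void update_trajectory(double newton_seconds) {
    printf("correcting course with %.1f N*s\n", newton_seconds);
}

int main(void) {
    double impulse = thruster_impulse();
    update_trajectory(impulse);             /* the bug: off by a factor of 4.448 */
    update_trajectory(impulse * 4.448);     /* with the conversion it needed */
    return 0;
}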

The Code to Hook a Hellfire Missile to a Drone
Date: Circa 2000–2001
It ushered in drone warfare.

The weaponization of the early version of the Predator drone was a signature moment in not just technology history, but also military and political history. Unmanned systems now proliferate across the battlefield, reshaping how soldiers fight and even where from. The weaponization of the Predator also launched the U.S. into an age of “drone wars.” And, given all the issues raised by increasingly autonomous and armed robotics, we are only at the start. That one simple program may have opened up legal, ethical, and yes, maybe even existential questions. —P.W. Singer, author of Wired for War: The Robotics Revolution and Conflict in the 21st Century

The Roomba’s Guidance System
Date: 2002
Established a new way for technology (and cats) to move in the physical world 

You never forget your first Roomba: Mine was nearly 17 years ago, at a friend’s. I sat on a couch giggling as it whirred its way around the room in fits and starts. It was the dawn of a robotic revolution, one both very silly and deeply serious. (It is truly amazing that a robot sharing DNA with bomb-sweeping machines is vacuuming millions of homes.) Its success has also been hard to replicate since then. The Roomba proved that while our attention gravitates toward hardware—such as freaky backward-kneed, door-opening dogs—software might be even more important for a product’s wide adoption. The Asimovian-named iRobot did not create the first robotic vacuum, but the Roomba became a niche must-have not because of how well it sucks, but because of how well it navigated a room. As a thousand cat videos can attest, in the modern computing age, little has been as viscerally satisfying (and disarming) as watching a Roomba bump into a table leg, rotate itself, and continue on. —Lowen Liu, Slate

Proportional Fair Scheduling for Wireless Networks
Date: Circa 2003
The solution that makes cellphone networks possible

At any given moment in a given area, there are often many more cellphones than there are base station towers. Unmediated, all of these transmissions would interfere with one another and prevent information from being received reliably. So the towers have a prioritization problem to solve: making sure all users can complete their calls, while taking into account the fact that users in noisier places need to be given more resources to receive the same quality of service. The solution? A compromise between the needs of individual users and the overall performance of the entire network. Proportional fair scheduling ensures all users have at least a minimal level of service while maximizing total network throughput. This is done by giving lower priority to users that are anticipated to require more resources. Just three lines of code that make all 3G and 4G cellular networks around the world work. —Lav Varshney
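A C sketch of the idea, with made-up channel numbers: each time slot goes to the user with the highest ratio of currently achievable rate to long-run average throughput, so users with strong signals don’t permanently starve users with weak ones.

/* Proportional fair scheduling in miniature. All rates are invented. */
#include <stdio.h>

#define USERS 3
#define SLOTS 6

int main(void) {
    double rate[SLOTS][USERS] = {   /* achievable rate per user, varying by slot */
        {5.0, 1.0, 2.0}, {4.0, 1.2, 2.1}, {5.5, 0.8, 1.9},
        {4.8, 1.1, 2.2}, {5.2, 0.9, 2.0}, {4.9, 1.0, 2.1},
    };
    double avg[USERS] = {1.0, 1.0, 1.0};    /* running average throughput */

    for (int t = 0; t < SLOTS; t++) {
        int best = 0;                       /* the core rule: maximize rate/average */
        for (int u = 1; u < USERS; u++)
            if (rate[t][u] / avg[u] > rate[t][best] / avg[best])
                best = u;
        for (int u = 0; u < USERS; u++)     /* exponential moving average update */
            avg[u] = 0.9 * avg[u] + 0.1 * (u == best ? rate[t][u] : 0.0);
        printf("slot %d -> user %d\n", t, best);
    }
    return 0;
}

Run it and user 0, who has the best channel, wins most slots—but the decaying averages steadily raise the priority of the others, which is the “proportional fair” compromise between total throughput and fairness.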

Bitcoin
Date: 2008
The code that inspired confidence in a type of currency that wouldn’t exist without it

Whether you’re a Bitcoin evangelist, a skeptic, or not totally sure what it is, you probably know that it’s a big deal. Bitcoin itself has accrued hundreds of billions of dollars in direct investment, but perhaps more importantly, the underlying technological principle, the blockchain, has been researched for seemingly endless applications, from securing democratic elections to ending nonconsensual sexual encounters.
It all began in 2008, when the pseudonymous Satoshi Nakamoto published a white paper announcing the launch of Bitcoin. It included these lines of code, which calculate the infinitesimally small likelihood that an attacker could take over the Bitcoin blockchain. The math convinced the world that a system made out of untrustworthy people could nevertheless be trusted, paving the way for the creation of at least 2,777 other cryptocurrencies.
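The routine in question—reproduced here from the C code in the whitepaper’s “Calculations” section—treats the attacker’s progress as a Poisson process: q is the attacker’s share of computing power, z is how many blocks deep the honest chain has buried a transaction, and the return value is the probability the attacker ever catches up.

#include <math.h>
double AttackerSuccessProbability(double q, int z)
{
    double p = 1.0 - q;
    double lambda = z * (q / p);
    double sum = 1.0;
    int i, k;
    for (k = 0; k <= z; k++)
    {
        double poisson = exp(-lambda);
        for (i = 1; i <= k; i++)
            poisson *= lambda / i;
        sum -= poisson * (1 - pow(q / p, z - k));
    }
    return sum;
}

With an attacker holding 10 percent of the network’s power and six confirmations, the probability falls well below 0.1 percent—one reason “six confirmations” became the informal standard for treating a payment as final.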

Conficker Worm
Date: October 2008 into 2009
Turning infected computers into an army of malicious bots, while sparing others

; BOOL __cdecl HasUkrainianLocale()
                push    ebx
                mov     ebx, ds:GetKeyboardLayoutList
                push    ebp
                push    esi
                xor     ebp, ebp
                push    ebp
                push    ebp
                call    ebx
                mov     esi, eax
                cmp     esi, ebp
                jz      short loc_37680A
Security researchers Tillmann Werner and Felix Leder wrote this code and tested it for functional equivalence in their efforts to understand and fight the Conficker worm.
A decade ago, as many as 15 million computers were infected with Conficker, a virus exploiting weaknesses in Windows operating systems. The virus was feared but also revered for its sophistication: It enlisted each computer as part of a giant bot army, awaiting orders, and it prevented infected computers from opening security programs or downloading patches that could clear the virus. Its earliest version also had an interesting and potentially telling quirk: It self-destructed within any systems using a Ukrainian keyboard or Ukrainian IP address. Years later, authorities and researchers who reverse-engineered the virus concluded that several Conficker creators were indeed Ukrainian and had designed the virus to avoid breaking their own country’s laws. Luckily, the hackers never deployed their botnet army for ill, and as of 2018, an estimated 350,000 computers were still infected with the virus, a reminder of how easily skilled programmers could unleash international attacks, selectively wreaking havoc upon users. —Jane C. Hu

The Like Button
Date: 2009
It catalyzed the surveillance economy.

Facebook sold the “like” button as a way to show the world we liked The Simpsons or curly fries. But in reality, it took advantage of our cognitive biases and the power of design to goad us into sharing even more information. It followed us around the internet—thanks to the tracking pixel Sara Wachter-Boettcher described above—collecting data on our browsing habits. Facebook then took that information and sold its behavioral targeting algorithm to advertisers. If an outdoor products company wanted to advertise, Facebook knew to target those who had previously “liked” posts about hiking, visited camping websites, and had outdoorsy friends. And when those users “liked” a company’s advertisement, that information was fed back into the targeting algorithm. And so the cycle of surveillance and commercial manipulation continued. All because of a tiny blue thumbs up. —Ari Ezra Waldman, professor, New York Law School

HTTP Strict Transport Security
Date: Circa 2009
Protects your data by routing it to sites through secure channels by default

Strict-Transport-Security: max-age=31536000; includeSubDomains
When you send information over plain-old HTTP to a website, it’s leaky—someone could intercept it and eavesdrop on your credit card number, your health information, your pet name for your partner. HTTPS encrypts your traffic from prying eyes, but for a long time, using the more advanced protocol was optional. Enter HTTP Strict Transport Security, which ensures all web traffic sent to and from a site that adopts it is encrypted from the start. If you try to go to http://google.com, it will automatically direct you to https://google.com. That’s HSTS in action. (In the header above, max-age=31536000 tells your browser to remember the rule for a year—31,536,000 seconds.)
HSTS still isn’t widespread: Just an estimated 11.1 percent of websites use it. (Slate is one of them.) But one important moment came in the spring of 2015, when the federal government and industry partners implemented HSTS for 19 government domains—including Whitehouse.gov, AIDS.gov, and donotcall.gov. Soon after, all federal agencies were directed to adopt the standard. —Rusty D. Pickens, former acting new media director for the Obama White House

Heartbleed
Date: Written 2012, discovered 2014
One of computing history’s most widespread and pernicious security vulnerabilities

In 2014, security researchers discovered a vulnerability in OpenSSL, a hugely popular open-source library used by roughly two-thirds of websites—including Dropbox, Twitter, Yahoo, and GitHub—for online communication between two computers. Heartbleed could have allowed perpetrators to steal unencrypted secret information, including credentials and encryption keys, through a buffer over-read vulnerability that affected millions of devices. It drew attention to the risks associated with relying on open-source software for crucial security functions, as well as the challenges of identifying vulnerabilities in code that seems to be working perfectly for years and years. On a more positive note, the discovery of Heartbleed also triggered a prompt and largely effective global response, including coordinated worldwide publicity and remediation efforts that went well beyond many previous such campaigns for earlier vulnerabilities.
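The bug followed a familiar C pattern: trusting a length field supplied by the other side. This sketch is not OpenSSL’s code, just the shape of the flaw—the server echoes back as many bytes as the request claims to contain, so memory beyond the request leaks into the reply.

/* Sketch of a heartbeat-style buffer over-read. Contents are invented. */
#include <stdio.h>
#include <string.h>

char server_memory[64] = "PING\0secret_key=hunter2";  /* request, then unrelated secrets */

void heartbeat(size_t claimed_len, char *reply) {
    /* the flaw: trust claimed_len instead of the request's real size (4) */
    memcpy(reply, server_memory, claimed_len);
}

int main(void) {
    char reply[64] = {0};
    heartbeat(24, reply);               /* client lies: "I sent 24 bytes" */
    printf("leaked: %s\n", reply + 5);  /* prints secret_key=hunter2 */
    return 0;
}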

Boeing 737 Max
Date: Released 2017
A software error, compounded by corporate greed, led to hundreds of deaths and the grounding of a fleet of planes.

In October 2018, Lion Air Flight 610 dove into the sea shortly after takeoff in a seemingly freak accident. Boeing assured the public that the planes were safe, saying more pilot training and a “software upgrade” were all that was needed. But just four months later, the pilots of an Ethiopian Airlines flight struggled to pull the nose up 20 times while the plane’s automated system tried to push it down. Within minutes after takeoff, everyone aboard was dead. In response, aviation authorities worldwide grounded the planes. Investigations revealed the crashes were caused by the 737 Max’s design, particularly little-known and poorly understood software that could force the plane into repeated nosedives.

