Saturday, May 31, 2014

The Rise of Nintendo: A Story in 8 Bits

From ‘Donkey Kong’ to the NES — how a Japanese company took over the American living room. An exclusive first serial from ‘Console Wars.’

On September 23, 1889, just weeks before his thirtieth birthday, an entrepreneur named Fusajiro Yamauchi opened a small, rickety-looking shop in the heart of Kyoto. To attract the attention of passing rickshaws and wealthy denizens, he inscribed the name of his new enterprise on the storefront window: Nintendo, which had been selected by combining the kanji characters nin, ten, and do. Taken together, they meant roughly “leave luck to heaven” — though, like most successful entrepreneurs, Yamauchi found success by making his own luck. In an era where most businessmen were content to survive off the modest returns of regional mainstays such as sake, silk, and tea, he decided it was time to try something new. So instead of selling a conventional product, Fusajiro Yamauchi opted for a controversial one, a product that the Japanese government had legalized only five years earlier: playing cards.


In the heart of Kyoto, he and a small team of employees crafted paper from the bark of mulberry trees, mixed with soft clay, and then added the hanafuda card designs with inks made from berries and flower petals. Nintendo’s cards, particularly a series called Daitoryo (which featured an outline of Napoleon Bonaparte on the package), became the most successful in all of Kyoto.
After several decades of staggering success, Fusajiro Yamauchi retired in 1929 and was succeeded by his son-in-law Sekiryo Yamauchi, who ran Nintendo efficiently for nineteen years, but in 1948 he had a stroke and was forced to retire. With no male children, he offered Nintendo’s presidency to his grandson, Hiroshi, who was twenty-one and studying law at Waseda University. It didn’t take long for Hiroshi Yamauchi to make his presence known. He fired every manager that had been appointed by his grandfather and replaced them with young go-getters who he believed could usher Nintendo beyond its conservative past.

With innovation on his mind, Yamauchi branched out into a number of other, less lucrative endeavors, including an instant-rice company and a pay-by-the-hour “love hotel.” These disappointments led Yamauchi to the conclusion that Nintendo’s greatest asset was the meticulous distribution system that it had built over decades of selling playing cards. With such an intricate and expansive pipeline already in place, he narrowed his entrepreneurial scope to products that could be sold in toy and department stores and settled upon a new category called “videogames.”

Yamauchi wanted Nintendo to aggressively get into the videogame business, which was really two separate businesses: home consoles and coin-operated arcade games. He saw the potential in these industries and took the necessary steps for Nintendo to enter both. Despite middling results from titles like Wild Gunman and Battle Shark, Yamauchi remained committed to his new vision and continued to allocate a vast amount of resources toward videogames. In 1977 Nintendo released a shoebox-sized orange console called the Color TV-Game 6, which played six slightly different versions of electronic tennis and was met with a mixed reception. Though the console managed to sell one million units, it ultimately lost money. Nevertheless, Yamauchi remained undeterred. Nintendo continued to put out arcade games (striking out with duds like Monkey Magic and Block Fever) and also continued to release home consoles (like the Color TV-Game 15, which offered fifteen slightly different versions of electronic tennis).

By this point, Nintendo already had penetrated most of Japan, and so Yamauchi set his sights on the place where the videogame frenzy had started: America.

[Image: Minoru Arakawa]


1. Arakawa
Yamauchi had dipped a toe into this red, white, and blue pool a few years earlier and was encouraged by the results. In the late 1970s Nintendo had begun working with a trading company that would export arcade cabinets to American distributors, who would in turn sell these games to vendors in the United States. Though the profits from this arrangement were minimal, Yamauchi believed that if he could cut out the trading companies and send over someone he trusted to grow Nintendo’s business organically, then there was a lot of money to be made.

The American market would be risky, tricky, and perpetually persnickety. There appeared to be only one man properly equipped for the challenge: Minoru Arakawa, a frustratingly shy but brilliant thirty-four-year-old MIT graduate. Not only did Arakawa possess the insight and intellect to open a U.S. division of Nintendo, but he was already living in North America and had an incalculable fondness for America (some of his most cherished memories had come from a post-college cross-country trip he’d taken in a used Volkswagen bus).

In every way, Minoru Arakawa appeared to be the perfect candidate … except that he happened to be married to Yamauchi’s daughter, Yoko, who blamed Nintendo for turning her father callous. She simply refused to let her husband join Nintendo, as she did not want to watch history repeat itself.

Yamauchi initially proposed the idea to Arakawa in early 1980. Following a pleasant family dinner, Yamauchi spent two hours discussing his plans for the expansion of Nintendo and concluded by stating that the success of his plan hinged on Arakawa. Anticipating his daughter’s reluctance, Yamauchi explained that this American division would be a completely independent subsidiary. Arakawa wrestled with the decision as well as with the objections of his wife, who cautioned him that no matter what he accomplished, he would always be perceived as nothing more than the son-in-law. Arakawa decided that the opportunity was too good to pass up, and in May 1980 he and his wife left Vancouver to start Nintendo of America (NOA).

Arakawa and his wife spent their days at the office and their nights observing games and players at local arcades. They learned a lot this way, but no amount of knowledge could make up for the fact that for Nintendo to gain a foothold, they needed a strong sales network. So Arakawa set up a meeting with a couple of guys who he thought might be able to help: Al Stone and Ron Judy.

2. Stone and Judy
Al Stone and Ron Judy were old friends from the University of Washington, where they had lived in the same frat house and had been known to embark on promising get-rich-quick schemes together (like buying soon-to-be-discarded local wine cheaply and then reselling it to their college brethren with less sophisticated palates). After graduating, they started a trucking business in Seattle called Chase Express. Chase struggled, and while they continued to invest in it and hope for a turnaround, they also began looking for alternative business opportunities.

They eventually found their answer, though it still involved big rigs. Through a friend in Hawaii, Ron Judy had been informed that a Japanese trading company was seeking a distributor to sell some arcade games made by Nintendo. Intrigued, he agreed to test the waters and received a crate containing a few arcade cabinets of Nintendo’s Space Fever. Though the game was little more than a shameless rip-off of Taito’s Space Invaders, Judy got his brother-in-law to place the games in some of the taverns he owned in south Seattle. Much to his delight, the machines were quickly overrun with quarters, which convinced him and Stone that this was their future. They formed a distribution company called Far East Video and used their assets from the trucking industry to travel around the country and sell Nintendo games to bars, arcades, hotel lounges, and pizza parlors.

[Image: Donkey Kong]

Space Fever was followed by Space Launcher (underwhelming), which was followed by Space Firebird (disappointing), which was followed by a slew of unsuccessful non-space-themed games. After this string of mediocre misfires, Stone and Judy were ready to quit, and Arakawa couldn’t help but reconsider his new vocation. More than ever, Nintendo of America needed a megahit, like Pong or Pac-Man. And just when it appeared time was running out, Arakawa believed that he’d found what he needed: Radarscope.

At first glance, Radarscope may have appeared to be just another shoot-’em-up space game, but it distinguished itself with incredibly sharp graphics and an innovative 3-D perspective. After receiving positive feedback from test locations around the Seattle area, Arakawa invested much of NOA’s remaining resources in three thousand units. But a few weeks later, before the rest of the arcade cabinets even arrived, Arakawa felt an ominous chill upon revisiting the test locations, where he noticed that nobody was playing his crucial new game. That foreboding was validated after the three thousand units finally arrived and Stone and Judy found that operators had little interest. Radarscope was fun at first, the consensus appeared to be, but it lacked replay value.

3. Miyamoto
With so much invested in this game, the last remaining hope was for a designer in Japan to quickly create a game and send over processors with that new game to America, where NOA employees could swap out the motherboard and then repaint the arcade cabinets. This task was given to Shigeru Miyamoto, a floppy-haired first-time designer who believed that videogames should be treated with the same respect given to books, movies, and television shows. His efforts to elevate the art form were given a boost when he was informed that Nintendo was close to finalizing a licensing deal with King Features, enabling him to develop his game around the popular cartoon series Popeye the Sailor Man. Using those characters, he began crafting a game where Popeye must rescue his beloved Olive Oyl by hopping over obstacles tossed in his way by his obese archenemy, Bluto.

Shipments containing the code for Miyamoto’s new game began to arrive. Due to last-minute negotiation issues with King Features, Nintendo had lost the rights to Popeye, which forced Miyamoto to come up with something else. As a result, Arakawa, Stone, Judy, and a handful of warehouse employees didn’t know what to expect. They inserted the new processor into one of the thousands of unsold Radarscope machines and then watched the lights flicker as the words “Donkey Kong” came to life on the arcade screen. The initial impression was that this was a silly game with an even sillier name. Who would possibly want to play a game where a tiny red plumber must rescue his beloved princess by hopping over obstacles tossed in his way by an obese gorilla? Yet, with no remaining options, Stone and Judy set out across the country to sell it.

Never before had there been a quarter magnet quite like Donkey Kong. It was so successful, in fact, that it eventually attracted the attention of a major Hollywood studio, whose high-priced legal team believed that the game violated the studio’s copyrights and threatened to crush Nintendo. To avoid this potentially crippling blow, Arakawa turned to the only lawyer he knew in Seattle: Howard Lincoln, an elegant, imposing former naval attorney whose only claim to fame was having modeled for Norman Rockwell’s painting The Scoutmaster when he was a child.

4. Lincoln
Lincoln had first crossed paths with Arakawa about one year earlier, when Stone and Judy needed him to review their contract with Nintendo of America. After that, Lincoln slowly but surely took on the role of Arakawa’s consigliere, weighing in on any matter with legal ramifications. As Nintendo of America grew, Lincoln drew up new employment agreements, looked at various business deals, and handled some tough matters (like siccing the U.S. marshals on Donkey Kong counterfeiters). Through it all, Lincoln and Arakawa forged an unshakable friendship. Which is why Lincoln was the first person Arakawa contacted when, in April 1982, MCA Universal sent a telex to Nintendo Co. Ltd. (NCL), the Japanese parent company, explaining that Nintendo had forty-eight hours to hand over all profits from Donkey Kong due to the game’s copyright infringement on its 1933 classic King Kong.

[Image: Donkey Kong]


It didn’t take long for them to realize that this was a high-stakes shakedown. Though never explicit, Universal’s ultimatum was simple: Settle or we’ll make life at Nintendo so difficult that the company will fold. The prudent thing to do was pay the ransom. But Lincoln had an ace tucked up his sleeve: in all of his research, there didn’t appear to be a single document indicating that Universal had trademarked King Kong, which would place the gorilla in the public domain. And in early 1983, Judge Robert W. Sweet sided with Nintendo. He concluded that they had not infringed and, as Lincoln had predicted earlier, he awarded Nintendo over $1 million in legal fees and damages.

The Donkey Kong fiasco-turned-feat caused many ripples, but three waves in particular were instrumental in creating the eventual tsunami that would be Nintendo. First, Lincoln became NOA’s senior vice president. Second, the countersuit set the tone for the litigious stance that many would say later defined the company. And third — and most importantly — the verdict kept the Donkey Kong cash flowing, providing Nintendo with a war chest at what would soon prove to be a crucial moment.

5. Borofsky and Associates
By the early 1980s, the videogame bonanza had become so lucrative that everyone wanted in. This included companies that had no business entering the market (like Purina), companies that didn’t understand the market (like Dunhill Electronics, whose Tax Avoiders allowed players to jockey past a maze of evil accountants and onerous IRS agents), and lowbrow outfits that polarized the market (like Mystique, whose flair for pornographic titles was highlighted by their 1982 anticlassic Custer’s Revenge, which follows a naked cowboy on his quest to rape Native American women). The marketplace was overrun by a glut of smut, muck, and mediocrity.

And just like that, the North American videogame industry ground to a halt. Hardware companies (like Atari) went bankrupt, software companies (like Sega) were sold for pennies on the dollar, and retailers (like Sears) vowed never to go into the business again. Meanwhile, Nintendo quietly glided through the bloody waters on a gorilla-shaped raft. The continuing cash flow from Donkey Kong enabled Arakawa, Stone, Judy, and Lincoln to dream of a new world order, one where NOA miraculously resurrected the industry and Nintendo reigned supreme. Not now, perhaps, but one day soon.

In Japan, however, that time had already come. Yamauchi’s large investment in R&D had paid off in the Family Computer. The Famicom, as it was commonly called, was an 8-bit console that stood head and shoulders above anything that had ever come before. It was released in July 1983 along with three games: Donkey Kong, Donkey Kong Jr., and Popeye, which Miyamoto ended up designing after licensing negotiations got back on track. The Famicom stumbled out of the gate but was soon rescued by heavy advertising and the September 1985 release of Super Mario Bros. (another Miyamoto brainchild).

[Image: Nintendo]


As sales soared to staggering heights, Yamauchi pressured his son-in-law to introduce the Famicom in America. Arakawa resisted, exercising patience. The U.S. market was still licking its wounds from the videogame crash, and releasing the right console at the wrong time would be a recipe for disaster. For this reason, he continued to rebuff the suggestion until 1984, when he was finally willing to consider the notion — but only if the console Nintendo of America sold looked nothing like a console at all.

This wolf-in-sheep’s-clothing logic led to the Advanced Video System (AVS). Though the guts of this machine were nearly identical to the Famicom’s, the AVS hardly resembled its foreign relative. It came with a computer keyboard, a musical keyboard, and a cassette recorder; and aesthetically, it was slim and sleek, with a subdued gray coloring that contrasted sharply with the Famicom’s peppy red and white palette. Nintendo’s AVS, the non-console console, was first introduced at the Winter Consumer Electronics Show in January 1985. Arakawa simply wanted to gauge the market reaction, which was distressing: nothing but scoffs, sighs, and sob stories. Nobody there wanted anything to do with the AVS, except for a tanned man with piercing blue eyes who stared at the Advanced Video System as if it were the sword in the stone. He then introduced himself with an understated sureness that would have made even King Arthur jealous. His name was Sam Borofsky.

Borofsky ran Sam Borofsky Associates, a marketing and sales representative firm based in Manhattan. Back in the late seventies, they became one of the first firms to represent videogames, and at the height of the boom they had been responsible for over 30 percent of Atari’s sales. If Nintendo of America ever wanted retailers to reopen their doors, then these were the guys who ought to do the knocking. From Borofsky’s end, the attraction was equally strong. Ever since Atari had imploded, he’d been scouring the country in search of the next big thing, and as he reviewed what Nintendo had to offer, he believed he had found it.

Arakawa, however, still needed convincing, which Borofsky was happy to provide. He spent months detailing the reasons for Atari’s downfall and outlining plans for a proposed launch. Meanwhile, Nintendo of America put another new costume on the Famicom, dressing it up as an all-in-one entertainment center for kids. The result of the rebranding effort was a clunky gray lunchbox-like contraption and, along with that, a new lexicon to differentiate it from its predecessors: cartridges were now dubbed Game Paks, the hardware was dubbed the Control Deck, and the entire videogame console was rechristened the Nintendo Entertainment System (NES). And to round out the renovation, the NES came with a pair of groundbreaking peripherals: a slick light-zapper gun and an amiable interactive robot named R.O.B.

6. The Launch
Focus groups suggested that the NES would be a colossal flop, R.O.B. kept malfunctioning during pitches, and the press showed no interest. Arakawa, however, remained undeterred. He temporarily relocated a handful of employees to the East Coast after leasing a warehouse in Hackensack, New Jersey, where Nintendo could house inventory, build in-store displays, and, most important, resemble a legitimate company to still-skeptical retailers. To keep tabs on the progress, Ron Judy made frequent visits to New York, often accompanied by Bruce Lowry, NOA’s bright and blustery VP of sales. Lowry was occasionally able to help Borofsky persuade the big toy chains. At the top of their wish list was Toys “R” Us, whose eventual decision to stock the NES provided Nintendo with much-needed momentum going into the launch.

On the morning of the big day, the Nintendo of America team gathered at FAO Schwarz, where Nintendo had paid for an elaborate window display and an attractive floor space that featured a small mountain of televisions with game footage playing. The moment of truth had finally arrived, and within moments of the store’s opening an excited customer eagerly approached the display, grabbing an NES and all fifteen of its games. The NOA team looked on, watching everything they had been working toward so suddenly come to fruition. It was a dream come true — until they were snapped back to reality upon learning that Customer #1 was actually just a competitor doing due diligence.

That Christmas, the NES was available in over five hundred stores. Though no staggering success, Nintendo managed to sell half of the 100,000 units they’d stocked, which effectively proved to the world that the videogame industry was not dead but had simply been hibernating.

Heading into 1987, what NOA really needed was someone to help roll out the NES nationwide and ensure that at the end of this roller-coaster ride, Nintendo would wind up on top. Someone to prove that the NES was more than just this year’s Christmas fad. Someone who could exploit the potential for expansion and transform Nintendo from a niche sensation into a global juggernaut.

7. Main
That someone turned out to be Peter Main, though at the time he was dealing with matters much more pressing than corporate expansion: beef dip sandwiches and garlic butter buns. As the president of White Spot, a Canadian fast-food chain, Main was used to eating, sleeping, and breathing burgers, but in the summer of 1985 his mind went into overdrive when an outbreak of botulism swept through Vancouver. Health officials alleged that improper refrigeration of garlic oil concentrate was the likely cause and that Main’s restaurants were responsible for the epidemic. Following this horrifying news, he spent much of the next year doing damage control, ensuring that the issue had been resolved and defending the integrity of his beef dip sandwich. When the public outcry finally died down and White Spot’s reputation was restored, he stepped down from his post and took a long vacation to decide what he’d like to do next. That’s when Arakawa called and asked Main to join Nintendo of America.

[Image: NES]


Before Peter Main and Minoru Arakawa were ever colleagues they were friends, and ever since Arakawa had left Vancouver to start NOA, he had been trying to recruit Main. For years Main declined these job offers. Videogames were a far cry from hamburgers, but even though he turned down Arakawa, he often provided friendly advice on Nintendo of America’s strange yet profitable forays into the restaurant business (Arakawa had gone ahead and bought the British Columbia franchise rights to Chuck E. Cheese as well as a pair of seafood bistros in Vancouver). Main’s expertise as a restaurateur only fueled Arakawa’s desire to rope him in, but time after time Main declined the overtures — until that fateful call in late 1986. This time Main was open to a major life change, and it didn’t hurt that Ron Judy was planning to relocate to Europe, which would effectively make Main NOA’s number three.

Though Main lacked any videogame experience, his outsider mentality allowed him to look at the business as something novel and spectacular. To spread this new gospel, he choreographed what he would later describe as Nintendo’s “storming of Normandy,” a full-out advertising, promotion, and distribution blitz that accompanied the rollout of the NES into stores nationwide. Meanwhile, Main provided a trustworthy-looking corporate (and Caucasian) face to a company that many in the outside world still viewed as a foreign curiosity. Main was an expert charmer. And that charm, that talent for cultivating friendships, gained the company credibility with Wall Street, trust from retailers, and respect from parents wanting to know what they were buying.

Month after month, Nintendo of America grew stronger. They sold 2.3 million consoles in 1987 and 6.1 million in 1988. As staggering as these numbers were, sales of the hardware were nothing compared to the software: the company unloaded 10 million games in 1987, and 33 million more in 1988. With numbers like these, it didn’t take Main long to realize that, at the end of the day, the console was just the movie theater, but it was the movies that kept people coming back for more. This personal revelation led to a Hollywood-like title-driven business strategy, and his coining of the phrase “the name of the game is the game.”

Main’s approach to sales and marketing coincided with Arakawa’s overarching philosophy of “quality over quantity.” As Nintendo exploded, there were plenty of opportunities to make a quick buck (hardware upgrades, unnecessary peripherals), exploit the company’s beloved characters (movies, theme parks), or dilute the brand by trying to attract an audience older than Nintendo’s six-to-fourteen-year-olds. But these kinds of things didn’t interest Arakawa. What propelled him was a desire to continually provide Nintendo’s customers with a unique user experience. He set up a toll-free telephone line where Nintendo “Game Counselors” were available all day to help players get through difficult levels, and he initiated the Nintendo Fun Club, which sent a free newsletter to any customer who had sent in a warranty card. Both programs were very costly and could have been offset by charging small fees or obtaining sponsorship, but Arakawa believed that doing so would compromise Nintendo’s mission. And to further safeguard Nintendo from the dangers of impurity, he and his team put into place a series of controversial measures:
1. The Nintendo Seal of Quality: Ron Judy had the novel idea of mandating that all games pass a stringent series of tests to be deemed Nintendo-worthy, ensuring high-caliber product and making software developers beholden to Nintendo’s approval.
2. Third-party licensing program: Howard Lincoln’s strict licensing agreement enabled software designers to make games for the NES but restricted the quantity they could make (five titles per year), required full payment up front (months before revenue from a game would be seen), and charged a hefty royalty (around 10 percent). In addition to these stringent terms, all game makers needed to purchase their cartridges directly from Nintendo.
3. Inventory management: Heeding Sam Borofsky’s suggestion, Peter Main devised an incredibly rigid distribution strategy that purposefully provided licensees and retailers with only a fraction of the products they requested. The goal of this technique was twofold: to create a frenzy for whatever products were available, and to protect overeager industry players from themselves.

Though NOA’s methods drew ire from retailers, anger from software developers, and eventually allegations of antitrust violations from the U.S. government, there was no denying that whatever Nintendo was doing was working — so well, in fact, that Peter Main needed additional reinforcements as he inflicted Nintendo-mania on his adopted homeland.

8. Nintendo Power
Help came in the form of Bill White, a straitlaced marketing whiz whose smallish eyes and oversized, round-rimmed glasses emitted a boyish vibe. Though he was only thirty years old (and, after a haircut, could have passed for thirteen), he spoke about brand recognition, market analysis, and strategic alliances with the expertise of someone twice his age. Part of that precocious nature was due to an almost religious belief in the power of marketing, part was due to his father’s history as a Madison Avenue ad man, and part was due to a chronic insecurity that could only be quieted by winning at everything he did. Peter Main saw the potential in White and hired him in April 1988 to become Nintendo’s first director of advertising and public relations.

When White joined NOA, the marketing department consisted of just three people: himself, Main, and Gail Tilden, an exceptionally smart woman with an encyclopedic memory. The lack of manpower forced White to wear many hats, but his most important responsibility was to forge corporate partnerships. Though Nintendo continued to take the videogame world by storm, the rest of the world still didn’t know what a Nintendo was. To build the brand, White courted Fortune 500 companies, resulting in pivotal promotions, like Pepsi placing a Nintendo ad on over 2 billion cans of soda and Tide featuring Mario on the detergent maker’s giant in-store displays. His biggest coup came with the release of Super Mario Bros. 3, when he negotiated for McDonald’s to not only make a Mario-themed Happy Meal but also produce a series of commercials centered around the game. By virtue of his efforts, White became Main’s right-hand man, something of a protégé. But as Main fed White’s ambitions and the young marketer swallowed up more and more responsibility, Tilden, the other member of the marketing team, was left with less and less to do. This displeased Arakawa, who set out to find a better way to utilize one of NOA’s most dynamic employees.

Tilden was at home, nursing her six-week-old son, when Arakawa called and asked her to come into the office the next day for an important meeting. So the following day, after dropping off her son with some trusted coworkers, she went into a meeting with Arakawa. The appetite for Nintendo tips, hints, and supplemental information was insatiable, so Arakawa decided that a full-length magazine would be a better way to deliver exactly what his players wanted.

[Image: Nintendo Power]


Tilden was put in charge of bringing this idea to life. She didn’t know much about creating, launching, and distributing a magazine, but, as with everything that had come before, she would figure it out. What she was unlikely to figure out, however, was how to become an inside-and-out expert on Nintendo’s games. She played, yes, but she couldn’t close her eyes and tell you which bush to burn in The Legend of Zelda or King Hippo’s fatal flaw in Mike Tyson’s Punch-Out!! For that kind of intel, there was no one better than Nintendo’s resident expert gamer, Howard Phillips, an always-smiling, freckle-faced videogame prodigy.

Technically, Phillips was NOA’s warehouse manager, but along the way he revealed a preternatural talent for playing, testing, and evaluating games. After earning Arakawa’s trust as a tastemaker, he would scour the arcade scene and write detailed assessments that would go to Japan. Sometimes his advice was implemented, sometimes it was ignored, but in the best-case scenarios he would find something hot, such as the 1982 hit Joust, alert Japan’s R&D to it, and watch it result in a similar Nintendo title — in this case a 1983 Joust-like game called Mario Bros. As Nintendo grew, Phillips’s ill-defined role continued to expand, though he officially remained the warehouse manager. That all changed, however, when he was selected to be the lieutenant for Tilden’s new endeavor.

In July 1988, Nintendo of America shipped out the first issue of Nintendo Power to the 3.4 million members of the Nintendo Fun Club. Over 30 percent of the recipients immediately bought an annual subscription, marking the fastest that a magazine had ever reached one million paid subscribers. And as the magazine’s audience grew, so did the influence of Howard Phillips.

If Nintendo Power gave kids a chance to step inside the candy factory, then he was their Willy Wonka, magically and eccentrically showing them how the sweets were made. Though Mario was Nintendo’s mascot, Phillips became the face of Nintendo. Peter Main took advantage of this, sending Phillips all over the country for press events and on-camera interviews. Main appointed Phillips Nintendo’s first Game Master. Shortly thereafter, Howard Phillips became a national celebrity, boasting a Q score higher than Madonna, Pee-wee Herman, and the Incredible Hulk.

The rise of the Game Master was just the latest sign of Nintendo’s unprecedented success. By 1990, Nintendo of America had sold nearly thirty million consoles, resulting in an NES in one out of every three homes. Videogames were now a $5 billion industry, and Nintendo owned at least 90 percent of that. The numbers were astounding, but Nintendo’s triumph went beyond that. Arakawa had proven that he was more than just the son-in-law, Lincoln had proven that he could take on anyone, and Main had proven that he could swim with the sharks.

Together, they had single-handedly resurrected an industry. And they did it all with only 8 bits. Imagine what they could do with 16 …  

Blake J. Harris is a writer and filmmaker based in New York. He is currently codirecting the documentary based on his book, which is being produced by Scott Rudin, Seth Rogen, and Evan Goldberg. He will also serve as an executive producer on Sony’s feature-film adaptation of Console Wars.


Source:
http://grantland.com/features/the-rise-of-nintendo-video-games-history/

This Augmented Reality Motorcycle Helmet Could Save Your Life



"I was in early morning traffic when an inattentive driver hit me from behind and I was thrown from my motorcycle." Ryan Shearman, founder and CEO of FUSAR Technologies, tells me at Augmented World Expo 2014. It's the kind of thing that could happen to any motorcyclist -- and indeed, happens too often -- but it also served as a spark of inspiration. "It started the wheels turning in my head: how can I make motorcycle riding safer?" His answer was to build a better helmet.

Shearman calls his smart helmet the Guardian; it's a DOT-approved brain bucket with an Android board, two wide-angle cameras and bits and pieces from an Epson Moverio headset. "What it is, is a fully integrated and connected device that motorcyclists can wear that allows them to have a full cognitive awareness of what's going on around them." The Epson optics provide the rider with a detailed heads-up display, filling their view with data from the rear-view camera, their current speed and navigation information.
"It also has video recording capabilities," he told me. "Everything is recorded passively, but in the event that an accident is detected, that footage is locked into hard memory. It effectively has a black box feature built in." It sounds like a great feature, but Shearman obviously wants it to be used as infrequently as possible. The helmet's software package is being designed to monitor the cameras for potential hazards -- if all goes well, the final product will be able to alert the driver of a dangerous situation before it becomes an accident. He also plans to add voice control to future prototypes.



The models Shearman brought to AWE were pretty beat up, but with good reason: the entrepreneur has been riding up and down California's Pacific Coast Highway all week, shooting footage for his company's upcoming Kickstarter video. The company plans to launch its crowdfunding initiative by mid-July, and wants to have the Guardian on the market in about a year. Hungry for a safer helmet? Check out the firm's website at the source link below.


Source: 
http://www.engadget.com/2014/05/30/augmented-reality-motorcycle-helmet/?ncid=rss_truncated

Report: Oculus partnering with Samsung for phone-compatible VR device

Image: Engadget/Daniel Cooper

Virtual reality company Oculus sure has been palling around with some high-profile outfits lately. First there was the company's acquisition by Facebook, and now, according to a report by our friends over at Engadget, Oculus is schmoozing it up with Samsung to develop another virtual reality headset. The kicker is, this one will supposedly use your phone.

Citing sources "close to both companies," Engadget has posted a slew of details regarding the unannounced device. Allegedly, the device will connect to your phone via an existing port, and turn your phone's screen into the virtual space your eyes see. Engadget reports that the current version of the headset features buttons familiar to Android users - Home, Back, Recent Apps - and also has a built-in accelerometer. Engadget created the mock-up you see above, if you want an idea of what the device might look like.

While Oculus' Rift headset is a VR device primarily made for PC gaming, Engadget expects Samsung to tout this device as a media experience as well. Ever watch Netflix on your phone? Well, what if you could watch Netflix on your phone on your face?

Since this device is still unofficial, there's unfortunately nothing by way of release dates or prices to be found. Still, if Oculus is partnering with Samsung, that's quite the powerful company to keep. And here we were, just reminiscing about the days when Oculus first took to Kickstarter. They grow up so fast!


Source: http://www.joystiq.com/2014/05/30/report-oculus-partnering-with-samsung-for-phone-compatible-vr-d/

Monday, May 19, 2014

Fifty Years of BASIC, the Programming Language That Made Computers Personal

Knowing how to program a computer is good for you, and it’s a shame more people don’t learn to do it.

For years now, that’s been a hugely popular stance. It’s led to educational initiatives as effortless-sounding as the Hour of Code (offered by Code.org) and as obviously ambitious as Code Year (spearheaded by Codecademy).

Even President Obama has chimed in. Last December, he issued a YouTube video in which he urged young people to take up programming, declaring that “learning these skills isn’t just important for your future, it’s important for our country’s future.”


I find the “everybody should learn to code” movement laudable. And yet it also leaves me wistful, even melancholy. Once upon a time, knowing how to use a computer was virtually synonymous with knowing how to program one. And the thing that made it possible was a programming language called BASIC.

John Kemeny
John Kemeny shows off his vanity license plate in 1967
Adrian N. Bouchard / Dartmouth College

Invented by John G. Kemeny and Thomas E. Kurtz of Dartmouth College in Hanover, New Hampshire, BASIC was first successfully used to run programs on the school’s General Electric computer system 50 years ago this week–at 4 a.m. on May 1, 1964, to be precise.

The two math professors deeply believed that computer literacy would be essential in the years to come, and designed the language–its name stood for “Beginner’s All-Purpose Symbolic Instruction Code”–to be as approachable as possible. It worked: at first at Dartmouth, then at other schools.

In the 1970s and early 1980s, when home computers came along, BASIC did as much as anything else to make them useful. Especially the multiple versions of the language produced by a small company named Microsoft. That’s when I was introduced to the language; when I was in high school, I was more proficient in it than I was in written English, because it mattered more to me. (I happen to have been born less than a month before BASIC was, which may or may not have anything to do with my affinity for it.)

BASIC wasn’t designed to change the world. “We were thinking only of Dartmouth,” says Kurtz, its surviving co-creator. (Kemeny died in 1992.) “We needed a language that could be ‘taught’ to virtually all students (and faculty) without their having to take a course.”

Train Basic Everyday
A pro-BASIC sign, as seen in a Russian school computer lab in the mid-1980s

Wikipedia

Their brainchild quickly became the standard way that people everywhere learned to program computers, and remained so for many years. But thinking of its invention as a major moment only in the history of computer languages dramatically understates its significance.

In the mid-1960s, using a computer was generally like playing chess by mail: You used a keypunch to enter a program on cards, turned them over to a trained operator and then waited for a printout of the results, which might not arrive until the next day. BASIC and the platform it ran on, the Dartmouth Time Sharing System, both sped up the process and demystified it. You told the computer to do something by typing words and math statements, and it did it, right away.

Today, we expect computers–and phones, and tablets and an array of other intelligent devices–to respond to our instructions and requests as fast as we can make them. In many ways, that era of instant gratification began with what Kemeny and Kurtz created. Moreover, their work reached the public long before the equally vital breakthroughs of such 1960s pioneers as Douglas Engelbart, inventor of the mouse and other concepts still with us in modern user interfaces.

You might assume that a programming language whose primary purpose was to help almost anybody become computer-literate would be uncontroversial—maybe even universally beloved. You’d be wrong. BASIC always had its critics among serious computer science types, who accused it of promoting bad habits. Even its creators became disgruntled with the variations on their original idea that proliferated in the 1970s and 1980s.

And eventually, BASIC went away, at least as a staple of computing in homes and schools. Nobody conspired to get rid of it; no one factor explains its gradual disappearance from the scene. But some of us miss it terribly.

When it comes to technology, I don’t feel like a grumpy old man. Nearly always, I believe that the best of times is now. But I don’t mind saying this: The world was a better place when almost everybody who used PCs at least dabbled in BASIC.

BASIC Beginnings
Sooner or later, it was inevitable that someone would come up with a programming language aimed at beginners. But BASIC as it came to be was profoundly influenced by the fact that it was created at a liberal arts college with a forward-thinking mathematics program. Dartmouth became that place largely because of the vision of its math department chairman, John Kemeny.

Born in Budapest in 1926 and Jewish, Kemeny came to the United States in 1940 along with the rest of his family to flee the Nazis. He attended Princeton, where he took a year off to contribute to the Manhattan Project and was inspired by a lecture about computers by the pioneering mathematician and physicist John von Neumann.

John Kemeny
John Kemeny teaches BASIC to students at Dartmouth (not yet a co-ed institution)
Dartmouth College

Kemeny worked as Albert Einstein’s mathematical assistant before arriving at Dartmouth as a professor in 1953, where he was named chairman of the mathematics department two years later at the age of 29. He became known for his inventive approach to the teaching of math: When the Alfred P. Sloan Foundation gave the school a $500,000 grant to build a new home for the department in 1959, TIME noted the news and said it was mostly due to Kemeny’s reputation.

The thinking that led to the creation of BASIC sprang from “a general belief on Kemeny’s part that liberal arts education was important, and should include some serious and significant mathematics–but math not disconnected from the general goals of liberal arts education,” says Dan Rockmore, the current chairman of Dartmouth’s math department and one of the producers of a new documentary on BASIC’s birth. (It’s premiering at Dartmouth’s celebration of BASIC’s 50th anniversary this Wednesday.)

“Our vision was that every student on campus should have access to a computer”

In the early 1960s, average citizens–even those who happened to be students at Ivy League schools with computing centers–had never encountered a computer in person. The machines were kept “behind locked doors, where only guys–and, once in a while, a woman–in white coats were able to access them,” Rockmore says.

Kemeny believed that these electronic brains would play an increasingly important role in everyday life, and that everyone at Dartmouth should be introduced to them. “Our vision was that every student on campus should have access to a computer, and any faculty member should be able to use a computer in the classroom whenever appropriate,” he said in a 1991 video interview. “It was as simple as that.”

Of course, Dartmouth couldn’t give a computer to every student and faculty member: Computers were a pricey shared resource, normally capable of performing only one task at a time. That’s why you typically handed your program over on punch cards and waited your turn.

Tom Kurtz, who had joined Dartmouth’s math department in 1956, proposed using a relatively new concept called time-sharing. It would divvy up one system’s processing power to serve multiple people at a time. With what came to be known as the Dartmouth Time-Sharing System, or DTSS, a user sitting at a terminal would be able to compose programs and run them immediately.

DTSS
A schematic of Dartmouth’s time-sharing system, as shown in an October 1964 manual
Dartmouth College

“If you’re trying to get a student interested in the idea of computing, you need some immediacy in the turnaround,” says Rockmore. “You don’t want to ship a 10-line program off to a computer center before you know if you’ve got it right.”

But what sort of programs? In the past, Kemeny and Kurtz had made two unsuccessful stabs at creating computer languages for beginners: Darsimco (Dartmouth Simplified Code) and DOPE (Dartmouth Oversimplified Programming Experiment). But this time they considered modifying an existing language.

Tom Kurtz
Tom Kurtz (standing) works with Michael Busch, co-programmer of the DTSS, with the GE-225 mainframe in the background
Adrian N. Bouchard / Dartmouth College

“I tried, briefly, to develop simple subsets of Fortran and ALGOL, but found quickly that such could not be done,” Kurtz says. Even the most common of tasks could be tricky in Fortran, which had an “almost impossible-to-memorize convention for specifying a loop: ‘DO 100, I = 1, 10, 2’. Is it ‘1, 10, 2’ or ‘1, 2, 10’, and is the comma after the line number required or not?”

“Fortran and ALGOL were too complex,” says John McGeachie, who, as a Dartmouth undergraduate, was the co-author of the DTSS software. “Anything that required days and days of training would have defeated the purpose. It certainly would have curtailed its widespread popularity.”

So Kemeny and Kurtz decided to create something so straightforward that it almost didn’t involve memorization at all. “We wanted the syntax of the language to consist of common words, and to have those words have a more-or-less obvious meaning,” says Kurtz. “It is a slight stretch, but isn’t it simpler to use HELLO and GOODBYE in place of LOGON and LOGOFF?”
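
The difference is plain in a worked example. Kurtz’s dreaded DO statement steps a counter from 1 to 10 by twos; here is the same loop as it might read in Dartmouth BASIC (an illustrative sketch rather than a verbatim period listing), with every keyword an ordinary English word:

  10 REM COUNT FROM 1 TO 10 BY TWOS
  20 FOR I = 1 TO 10 STEP 2
  30 PRINT I
  40 NEXT I
  50 END

There is no statement label to remember and no doubt about which number is the starting value, which the limit, and which the step.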

“If you were writing a very simple program, you’d get your answer in a second or so.” 

BASIC was primarily Kemeny’s idea, and he wrote the first version himself. Starting in September 1963, he and Kurtz began the overarching effort to get the language and the DTSS up and running. They led a team of a dozen undergraduate students–young men who were still in the process of learning about computers themselves. (Dartmouth was a male-only institution at the time: Kemeny himself took it co-ed in 1972 as president of the college, a position he held from 1970 to 1981.)

“We used to work all night and then go to sleep,” remembers McGeachie. “Kemeny would work with us, and then go teach math to undergraduates.”

Kemeny and Kurtz
Tom Kurtz and John Kemeny examine a brochure for the GE-225 mainframe computer that powered the initial version of Dartmouth’s time-sharing system
Adrian N. Bouchard / Dartmouth College

A $300,000 grant from the National Science Foundation helped fund the undertaking, which required not one but two powerful computers, both from General Electric. A GE-225 mainframe (quickly replaced with a faster GE-235) did the heavy lifting of performing floating-point math, while a smaller Datanet-30 coordinated communications with Teletype machines–essentially glorified typewriters–which students would use to do their programming.

“We were not working under much constraints,” Kurtz says. “We had 16K of 20-bit words to work with.” Though a rounding error by today’s standards, that was enough memory to write a capable version of BASIC: Years later, when others adapted the language for PCs, they sometimes had to cram it into as little as 3K of 8-bit memory, resulting in cut-down, ungainly implementations that Kemeny and Kurtz disowned.

Dartmouth BASIC
An early Dartmouth BASIC program, performing a simple math exercise, as run on a simulator of the college’s time-sharing system. The DTSS’s Teletypes used yellow paper–and printed far more slowly than this recreation
Harry McCracken / TIME

Unlike many BASICs to come, Dartmouth BASIC was a compiler, which meant that it converted your entire program in one fell swoop into machine code that the computer could understand, rather than line by line every time you ran the program. It performed that task rapidly, especially by the leisurely standards of 1960s computing: “If you were writing a very simple program, you’d get your answer in a second or so,” McGeachie says. “It might take longer to print it out, because the Teletypes could only do 10 characters a second.”

The historic moment at Dartmouth on May 1, 1964 at 4 a.m. was actually two historic moments. Not one brief BASIC program but two or three of them–accounts vary–ran simultaneously, proving both that BASIC worked and that the Dartmouth Time-Sharing System was capable of dealing with more than one user at a time.

In June 1964, they became generally available to Dartmouth students, initially on 11 Teletype machines. The first version of BASIC had 14 commands, all with straightforward names and syntax that made sense (a short sample program follows this list):  
  • PRINT output text and numbers to the Teletype (and, later on, displayed it on the screens of time-sharing terminals and PCs);
  • LET told the computer to perform calculations and assign the result to a variable, in statements such as LET C = (A*2.5)+B;
  • IF and THEN let the program determine if a statement was true, vital for anything involving decision-making;
  • FOR and NEXT let a program run in loops;
  • GOTO let a program branch to another numbered line within itself;
  • END, which was required in Dartmouth BASIC, told the computer that it had reached the program’s conclusion.
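
Those few commands were enough for complete programs. Here is a minimal sketch written in the spirit of early Dartmouth BASIC (an illustration for this article, not a recovered 1964 listing), using LET, FOR/NEXT, PRINT and END to total the odd numbers from 1 to 19:

  10 REM ADD UP THE ODD NUMBERS FROM 1 TO 19
  20 LET S = 0
  30 FOR I = 1 TO 19 STEP 2
  40 LET S = S + I
  50 NEXT I
  60 PRINT "THE SUM IS", S
  70 END

Run it, and a moment later the Teletype would hammer out the total: 100.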

Then there was INPUT, a command that let a BASIC program accept alphanumeric characters typed in by a user. It wasn’t among the initial 14, arriving only in the third revision of the language in 1966. But when it did, it made it possible to write far more interactive programs.

Without INPUT, BASIC was mostly for solving math problems and doing simple simulations; with it, the language could do almost anything. Including play games, which many people came to consider as the language’s defining purpose.
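
Just how little code a game needed is easy to show. The sketch below is hypothetical (the secret number is fixed at 7 to keep it short; a real game would pick one at random), but it is the kind of program INPUT suddenly made possible:

  10 REM A BARE-BONES GUESSING GAME
  20 LET N = 7
  30 PRINT "GUESS MY NUMBER FROM 1 TO 10"
  40 INPUT G
  50 IF G = N THEN 80
  60 PRINT "NO, TRY AGAIN"
  70 GOTO 40
  80 PRINT "YOU GOT IT"
  90 END

Before INPUT, a program ran start to finish on its own. This one stops at line 40 and waits, mid-program, for a human to type something.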

Jennifer Kemeny
John Kemeny checks out program written by his daughter Jennifer on the family’s “home computer,” a terminal on the Dartmouth Time-Sharing System
Adrian N. Bouchard / Dartmouth College

You could write a fairly sophisticated program in Dartmouth BASIC. (An early manual stated the maximum program length as “about two feet of teletype paper.”) But you could also make the computer do something interesting and useful with just a few lines of simple code, shortly after you’d encountered the language for the first time. That was the whole point.

It mattered to Kemeny and Kurtz that access to BASIC and the DTSS be as open as possible. “Any student can enter the Library, browse among the books or take some back to his room. No one asks him why he wants the book, and he does not need anyone’s permission,” Kemeny wrote in a brochure for the college’s new computer center, which opened in 1966. “Similarly, any student may walk into the Kiewit Computation Center, sit down at a console, and use the time-sharing system. No one will ask if he is solving a serious research problem, doing his homework the easy way, playing a game of football, or writing a letter to his girlfriend.”

What Kemeny was describing in the Kiewit brochure was personal computing. It’s just that the term hadn’t been invented yet. Even the concept was still audacious.

Glee Club
Members of Dartmouth’s Glee Club perform a very early form of computer dating with women in California via the Dartmouth Time-Sharing System
Adrian N. Bouchard / Dartmouth College

Dartmouth BASIC did everything that Kemeny and Kurtz hoped it would, and more. In a triumphant 1967 report, they said that by the end of that academic year, 2000 Dartmouth students–representing 80 percent of the three incoming freshman classes who had arrived since BASIC’s invention–would have learned about computers by writing and debugging their own programs. Many continued to do so after completing the BASIC classwork that was a mandatory part of the school’s math program. Forty percent of faculty members–not just math and science teachers–also used the system.

“Anyone who tries to convince a Dartmouth undergraduate either that computers are to be feared or that they are of little use, will be met with well-founded scorn,” the report said. “The Dartmouth student knows better–and knows it from personal experience.”

GE BASIC
The cover of a General Electric brochure touting its version of Dartmouth BASIC
Computer History Museum

Dartmouth provided access to the DTSS over telephone lines to other East Coast schools, including Harvard and Princeton, as well as to some high schools. It also helped other institutions implement time-sharing systems and BASIC, while General Electric commercialized the DTSS and Dartmouth BASIC and sold them to business customers. Other computer companies such as Digital Equipment Corporation and HP introduced their own BASICs.

Dartmouth’s effort to democratize computing was, in short, a huge success. “Qualitatively, I was right on the impact,” Kemeny said in the 1991 interview. “Quantitatively, I vastly underestimated it. That is, it had impact on many, many more courses than I thought, and the amount of impact was much greater–courses being totally changed because of the availability of computers. I also underestimated, of course, how far educational computing would spread all over the world.”

Bashing BASIC
Not everybody was happy with the way the language put computing within reach of mere mortals. Its most articulate and vociferous opponent was Edsger Dijkstra (1930-2002), an influential computer scientist. “It is practically impossible to teach good programming to students that have had a prior exposure to BASIC,” he groused in a 1975 essay titled “How Do We Tell Truths That Might Hurt?” “As potential programmers they are mentally mutilated beyond hope of regeneration.”

Edsger Dijkstra
Edsger Dijkstra, the influential computer scientist and BASIC critic, in 2002
Wikipedia

Now, it’s possible that Dijkstra was exaggerating for dramatic effect. BASIC wasn’t his only bête noire among programming languages: He also spewed bile in the direction of FORTRAN (an “infantile disorder”), PL/1 (“fatal disease”) and COBOL (“criminal offense”).

Still, despite Dijkstra’s dire assessment of BASIC, countless programmers who started out with the language went on to have thriving careers. And the thing is, some of the characteristics that have given BASIC a bad reputation are precisely the same ones that made it so easy to learn.

For instance, BASIC offered GOTO, a command that let you jump from anywhere in your program to anywhere else in it—a practice that could result in messy “spaghetti code.” (In 1968, Dijkstra devoted an entire essay to his contempt for the command, “Go To Statement Considered Harmful.”) A thoughtful BASIC programmer could indeed compose fastidious code that didn’t use GOTO. But insisting that liberal arts students obsess about tidy programming techniques from the get-go was hardly a way to make computers less threatening. For them, GOTO was a godsend.
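
The appeal is easy to demonstrate. A beginner who knew nothing but IF and GOTO could improvise any control flow at all. This sketch (illustrative, not drawn from the original manuals) rebuilds the counting loop that FOR and NEXT handle in two lines:

  10 REM COUNT TO 5 USING ONLY IF AND GOTO
  20 LET I = 1
  30 PRINT I
  40 LET I = I + 1
  50 IF I > 5 THEN 70
  60 GOTO 30
  70 END

At six lines the jumps are easy to follow; the spaghetti Dijkstra dreaded appears when a six-hundred-line program is stitched together the same way.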

“It is practically impossible to teach good programming to students that have had a prior exposure to BASIC”

In its classic form, BASIC also made you organize your programs with line numbers—such as the 10 in 10 PRINT “HELLO”—a convention that was arguably superfluous and eventually fell by the wayside. But line numbers helped emphasize the sequential nature of computer programs, which, regardless of the language in question, consist of a task broken down into steps.

In “How Do We Tell Truths That Might Hurt?,” Dijkstra tips his hand by calling programming “one of the most difficult branches of applied mathematics” and suggesting that less talented mathematicians should not even bother with it. If that was his take in 1975, he couldn’t simultaneously approve of BASIC. Either programming a computer was exceptionally hard and should be left to the experts, or it was something that should be democratized, as BASIC had already done. Not both.

Dartmouth students
Dartmouth students at work in a remote computer center in 1965
Adrian N. Bouchard / Dartmouth College

Today, Kurtz is blunt about criticism of the language he co-created as being insufficiently serious or a dangerous way to begin learning about computer programming. “It’s B.S.,” he says.

“I’ll go out on a limb and suggest the degrading of BASIC by the professionals was just a little bit of jealousy–after all, it took years for us to develop our skill; how is it that complete idiots can write programs with just a few hours of skill?”

BASIC may not have made sense to people like Edsger Dijkstra. That was O.K.—it wasn’t meant for them. It made plenty of sense to newbies who simply wanted to teach computers to do useful things from almost the moment they started to learn about programming. And in 1975, as Dijkstra was accusing it of mutilating minds, there were about to be far more of those people than ever before.

Enter the PC
By letting non-computer scientists use BASIC running on the DTSS, Kemeny, Kurtz and their collaborators had invented something that was arguably the first real form of personal computing. But it didn’t yet involve personal computers. That revolution got jump-started a decade later, when a New Mexico model rocket company called MITS launched the Altair 8800, the $497 build-it-yourself microcomputer ($621 assembled) that launched the PC revolution.

It was huge news among the small number of people who could be called computer nerds at the time—people like Paul Allen, who was working as a programmer for Honeywell in Boston.

MITS’ Altair 8800B (1976), one of the first microcomputers to run Microsoft BASIC
Science & Society Picture Library / Getty Images

When he bought a copy of the January 1975 issue of Popular Electronics at the Out of Town newsstand in Harvard Square, with the Altair on the cover, he and an old friend—a Harvard sophomore named Bill Gates—got excited. Immediately, they knew they wanted to try to make the Altair run BASIC, a language they’d both learned in its original timeshared-via-Teletype form at the Lakeside School in Seattle.

Actually, Allen had been ruminating about the possibility of building his own BASIC even before he knew about the Altair. “There hadn’t been attempts to write a full-blown programming language for a microprocessor,” he explains. “But when the chips leading up to the 8080 processor became available, I realized we could write a program for it that would be powerful enough to run BASIC.”

“I realized we could write a program for it that would be powerful enough to run BASIC”

Famously, they wrote the first draft of Altair BASIC without having access to an Altair, using a simulator they ran on a Digital Equipment PDP-10 minicomputer. “Monte Davidoff, who helped me and Bill write BASIC for the Altair, once said programming was like writing a novel,” Allen says. “That’s how we approached BASIC. We started with a plot of sorts to know the general approach we were taking. We broke the big ideas into sections, or chapters, and then we’d edit and re-edit and keep editing until we had preserved the big picture and fixed all the bugs. It was the hardest but most enjoyable work I’ve ever done.”

An original paper-tape copy of Paul Allen, Bill Gates and Monte Davidoff’s Altair BASIC, as exhibited at the New Mexico Museum of Natural History and Science
Michael Holley / Wikipedia
Allen and Gates’ BASIC partnership became a company, known at first as Micro-Soft. They licensed the language to MITS, which sold it for a hefty $500, marked down to $75 if you bought it bundled with Altair hardware. Or you could get it for free by pirating it—something that so many early microcomputer owners did that Gates wrote a legendary open letter bemoaning the rampant theft of Micro-Soft’s intellectual property.

Setting a high price for BASIC and using it as an incentive to buy hardware “was a strategy that backfired in a big way,” says David Bunnell, who was in charge of publications at MITS. Ed Roberts, MITS’ president, “was just sort of small-minded in that way. He couldn’t see the big picture.”

Even so, BASIC being available on a microcomputer was a big deal. “There are two keys to the new computer revolution,” declared an unsigned article–Bunnell says he almost certainly wrote it–announcing Altair BASIC in the April 1975 issue of MITS’ Computer Notes newsletter. “One is computers must be inexpensive and the other is computers must be understandable. With the Altair 8800 and Altair BASIC, both of these criteria have been met.”

Radio Shack’s TRS-80 (1977), one of the first PCs to come with BASIC as standard equipment
Science & Society Picture Library / Getty Images
That was true, but it was only the beginning of the story. The Altair and its earliest rivals catered to hobbyists who were comfortable flipping switches and wielding soldering guns. In 1977, a new wave of PCs arrived that were aimed at a far broader swath of consumers, including Apple’s Apple II, Commodore’s PET 2001 and Radio Shack’s TRS-80. And one of the things that made them consumer PCs was that they came with BASIC.

The PET offered a variant of Microsoft BASIC from the start. Apple and Radio Shack started out with rudimentary variants of the language—Apple’s was written by cofounder Steve Wozniak himself—before acquiring the rights to distribute Microsoft’s version. For the next few years, nearly every microcomputer of any significance had a version of Microsoft BASIC, including models from companies such as Atari and Texas Instruments.

Paul Allen (left) and Bill Gates in 1981, surrounded by some of the computers that ran their version of BASIC
Microsoft
Tech pundits like to talk about killer apps–software so useful that you’d buy a computer just to run it. 1979’s VisiCalc is often identified as the first such program. But before that, Microsoft BASIC itself was the PC’s killer app. Lots of people bought a computer so they could learn how to program it.
There were other microcomputer BASICs besides Microsoft’s. In the early days, the best-known of them was CBASIC, created by a naval officer named Gordon Eubanks. His version was particularly popular among people developing commercial programs–which, at the time, were as likely to be written in BASIC as in any other language.

“Microsoft’s BASIC was much more fundamental than CBASIC,” says Eubanks. “It appeared on every single computer. If you got your TRS-80, you could fire it up and write a little program to print ‘HELLO.’ And it worked. I was focused on a narrower thing, how to develop commercial applications…The end result is, Bill did a little better.”

(Though he may not have reached Gates-ian levels of success, Eubanks did end up doing rather well for himself, eventually becoming the longtime CEO of Symantec, an enduringly important software company in its own right.)

Eubanks mentions a key reason for Microsoft BASIC’s importance: It was unavoidable. When you turned on an early microcomputer such as the TRS-80, it dumped you directly into the language. You could load a program off a cassette tape or floppy disk if you chose, but you could also simply begin typing a new one. These computers begged to be programmed in a way that their descendants do not.
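
From memory, and so only approximately, the first moments with a TRS-80 went something like this, the machine answering you line by line, no loading or launching required:

READY
>PRINT 2+2
 4
>10 PRINT "HELLO"
>RUN
HELLO
READY
>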

This very early Apple II clone of Atari’s Breakout–later known as Brick Out and Little Brick Out–was written by Apple co-founder Steve Wozniak himself, in Integer BASIC, which he also wrote
Harry McCracken / TIME
Unlike both Dartmouth BASIC and CBASIC, Microsoft BASIC was an interpreted language: rather than being compiled into machine code ahead of time, each line of a program was translated and executed on the fly, every time you ran it. That made it sluggish, especially if you didn’t know any hacks for speeding up stuff like animated graphics.

But it also meant that anybody could examine the source code of anything written in Microsoft BASIC, including commercial software distributed on cassette or floppy disk. You could learn from it, tweak it or steal snippets of it for your own projects. It had many of the virtues of open-source software long before the concept had that name.

My Computer Likes Me
Bob Albrecht’s 1972 BASIC manual was as lovably quirky on the inside as it was on the outside
Dymax

If you needed help learning BASIC’s lingo, there was a near-infinite amount of reference material, all of which included programs you could type in and customize. Like the language itself, the works documenting BASIC tended to be rather informal by computer-science standards.

In 1972, for instance, Bob Albrecht of the People’s Computer Company in Menlo Park, California wrote a primer with a wonderfully evocative title: My Computer Likes Me When I Speak in BASIC. You couldn’t imagine a similar title being applied to a volume about FORTRAN (or, today, one about Objective-C or Java).

The single most influential book of the BASIC era was not a textbook–at least not officially. It was 101 BASIC Computer Games, later known as BASIC Computer Games and edited, in both versions, by David H. Ahl.

The tome grew out of Ahl’s work as educational marketing manager for Digital Equipment Corporation’s PDP-8 line of minicomputers. Around 1971, he created BASIC versions of two games that had originally been written in DEC’s FOCAL language, Hammurabi (in which the player governed ancient Sumeria) and Lunar Lander (exactly what it sounds like). “I wrote them from scratch more to demonstrate the machine than anything else,” he says.

Their popularity led him to include BASIC games in EDU, a DEC newsletter he edited. Many of them were contributed by readers–especially high school students–and in 1973, DEC published an anthology, 101 BASIC Computer Games.

BASIC Computer Games
David Ahl’s seminal anthology of BASIC games, in its original form as published by Digital Equipment Corp. and a later version rejiggered for PCs
David Ahl
The book quickly went into a second printing, for a total of 10,000 copies sold. “That was far more books than there were computers around, so people were buying three, four, five of them for each computer,” Ahl remembers.

Shortly after Ahl left DEC in 1974, he started a magazine, Creative Computing, devoting a great deal of its space to BASIC. Then he acquired the rights to publish 101 BASIC Computer Games under the title BASIC Computer Games.

As with BASIC itself, BASIC Computer Games predated the PC revolution but helped propel it forward. It got updated, was translated into six languages, inspired multiple sequels and became the first computer book to have sold a million copies. Like folk songs, its programs felt like part of a shared cultural heritage. They were passed around, mutating into multiple variants as they did so.

By the time I got my hands on the book–circa 1978 when my father brought home a TRS-80–I was aware that the games it contained were, well, basic. Because they had their roots in the Teletype era, most of them involved no graphics whatsoever. Instead, they were resolutely text-oriented, like a golf game that had you type in your swing as a number from 1 to 100.

In theory, I shouldn’t have found the book very valuable. After all, I could get games written for the TRS-80s I used at home and school on floppy disks and load them in seconds, rather than laboriously entering them from a book and swatting the bugs I’d introduced in the form of typos. But I cherished it. I also typed in plenty of other programs from magazines such as Creative Computing, 80 Microcomputing, SoftSide, and the most extravagantly programming-centric of the major monthlies, Compute.

The March 1982 issue of Compute–stuffed, as always, with programs in BASIC
Internet Archive
The best BASIC programs published in computer magazines were surprisingly professional, in part because the bar of professionalism was easy to clear. “When I first came to Compute in 1983, the software that was sold, it was all handcrafted kinds of stuff in a Baggie, with a sheet or two of instructions,” says Gregg Keizer, who eventually became the magazine’s editor.

That made learning how to program all the more tempting: If you didn’t look at much of the commercial software of the time and think to yourself “I can learn to do that,” you weren’t trying.

“Software was handcrafted stuff in a Baggie, with a sheet or two of instructions.”

(Sure, there were people who used PCs without making any attempt to program them. But in the computer lab at my high school, in the late 1970s and early 1980s, we looked on them with pity. They were illiterate, and didn’t seem to care.)

Typing in programs from listings was an intellectual exercise rather than mere rote effort, in part because you often ended up adapting them for your computer’s version of Microsoft BASIC. The language had splintered into dialects as the companies that licensed it adapted it for their computers, stuffing it into whatever memory was available and improvising functions for machine-specific capabilities such as graphics and sound. It didn’t look that much like Dartmouth BASIC in the first place, and it became a lingua sorta franca at best.

For instance, there’s a famous one-line BASIC program for the Commodore 64 computer:
10 PRINT CHR$(205.5+RND(1)); : GOTO 10

That generates a random, maze-like pattern that goes on forever, or at least until you interrupt it. The trick is that RND(1) nudges the argument of CHR$ up to either 205 or 206, the codes for the Commodore’s two diagonal-stroke graphics characters, so the program prints one or the other at random, over and over. It’s so hypnotic and iconic that it inspired an entire book of essays in 2012, titled, appropriately, 10 PRINT CHR$(205.5+RND(1)); : GOTO 10. But it won’t run on any non-Commodore computer, because its clever technique depends on the way the Commodore 64 handles graphics.
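
If you wanted something similar on another machine, you had to fake it with characters every computer had. Here’s a rough equivalent of my own, a hedged sketch using plain ASCII slashes, which captures the idea if not the Commodore’s seamless look:

10 C$ = "/" : IF RND(1) < .5 THEN C$ = "\"
20 PRINT C$; : GOTO 10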

This one-line Commodore 64 program, which prints a never-ending, maze-like pattern, inspired an entire book of essays in 2012
Harry McCracken / TIME

Kemeny and Kurtz were exceptionally disappointed with what others had done to their creation. In 1985 they published a book, Back to BASIC, which bemoaned the crudeness and inconsistency of Microsoft BASIC and other variants available for microcomputers. They titled the chapter on PC-based BASICs “What Went Wrong?” and called them “street BASICs,” a moniker meant to sting.

BASIC creators Thomas Kurtz (left) and John Kemeny in the mid-1980s, showing their True BASIC running on a Mac and an IBM PC
Dartmouth College
BASIC’s creators didn’t just complain about what had happened to the language. They also founded a company with a meaningful name–True BASIC–which produced a version that added new features while preserving Dartmouth BASIC’s original vision. Unlike Microsoft BASIC, True BASIC was also designed to be the same language, no matter what computer you ran it on.

“In 1983 or so, the microcomputer versions of BASIC were all different,” Kurtz says. “We thought we could do some good by implementing the same language (literally) on the different computers. We were wrong. The different computers came up so thick and fast that we, a small company, couldn’t keep up.”
“We all know the result: books with titles like BASIC for the XYZ Computer. Ad nauseum.”

As someone who grew up on street BASIC, I’m just as glad that I didn’t know about Back to BASIC when it was published. Although most of Kemeny and Kurtz’s beefs make perfect sense to me today, they could have used the BASIC programs I was writing at the time as case studies typifying everything that made them unhappy with what had happened to their language.

The BASIC programs I wrote for the TRS-80 would run only on a TRS-80; later, when I got an Atari 400, I wrote programs that would run only on an Atari. Like many Microsoft BASIC programs, mine did weird, undocumented things to work around the language’s limitations–most of all its sluggishness, another thing Kemeny and Kurtz weren’t happy about.

My programs were not elegant; oftentimes, they weren’t even intelligible to anyone except for me and the TRS-80. But they were mine, and they let me tell the computer to do exactly what I wanted it to do.

David Plotkin’s Munchkin Attack, an Atari game published as a type-in in SoftSide magazine in 1982, may remind you of a more famous arcade game. Being written in interpreted BASIC, it can just barely manage to move two on-screen characters at once
Harry McCracken / TIME

BASIC was so approachable that you could toss off little improvisational programs with barely any effort at all. I probably wrote hundreds of them in my high school’s computer lab—games, utilities, practical jokes to play on my classmates. Most were meant to be disposable, and unless any of them show up on forgotten floppy disks at my parents’ house, almost all of them are long gone.

SLOT/BAS, the TRS-80 game I wrote circa 1980
Harry McCracken / TIME
But I was happy enough with one—a slot machine simulation I wrote despite never having used a slot machine—to upload it to a local online bulletin-board system. Approximately 34 years later, I discovered that it was among the programs in the vast collection of Ira Goldklang, whose website is a veritable Smithsonian of all things TRS-80.

Ira reunited me with SLOT/BAS, which I loaded up on my MacBook Air via a TRS-80 emulator. Perusing the code I’d written decades ago and then playing the game was a Proustian experience that anyone who was ever obsessed with programming will understand.

Back when I was hacking BASIC, my programming idol was a man named Leo Christopherson. A junior high school math teacher who had bought a TRS-80 after one of his students brought in a BASIC programming manual, he did wonderful things with animation and sound effects in games such as Android Nim, Snake Eggs and Dancing Demon. It was all the more impressive given that he plied his trade primarily on the TRS-80, a computer with crude graphics and no official support for audio at all. (In recent years, he’s recreated some of his games in modern editions for Windows PCs and Macs.)

Leo Christopherson’s TRS-80 masterwork Android Nim (1978). Trust me: These graphics were amazing for the time
Harry McCracken / TIME
Christopherson’s techniques would have horrified Kemeny and Kurtz. His programs were rife with tricks deeply tied to the TRS-80’s particular software and hardware, and in a quest for more speed than Microsoft BASIC was capable of delivering, he wrote more and more of his code in machine language for the computer’s Z-80 microprocessor, which appeared as lines of gibberish in his BASIC programs.

“I never felt troubled by BASIC’s limitations since I could use Z-80 code too,” Christopherson explains. “The real limitations were imposed by the TRS-80 itself. I used to spend hours racking my brain to figure out smaller routines to do various tasks since that 16K RAM was used up pretty fast. Most anything I’ve wanted to do was do-able in BASIC if the computer itself was up to it.”
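
The general shape of the trick looked something like this sketch of mine, not anything from Christopherson’s actual listings: the program carries the machine-language bytes in DATA statements, POKEs them into memory and then calls the routine. The byte and the vector addresses (16526 and 16527, the Level II TRS-80 convention, as I recall it) are illustrative placeholders; real programs poked in hundreds of bytes this way, or hid them inside strings.

10 REM STASH A ONE-BYTE Z-80 ROUTINE IN MEMORY, THEN CALL IT FROM BASIC
20 READ B : POKE 32000, B : REM 201 IS Z-80 FOR RET, A DO-NOTHING ROUTINE
30 POKE 16526, 0 : POKE 16527, 125 : REM AIM THE USR VECTOR AT 32000 (0 + 125*256)
40 X = USR(0) : REM JUMP INTO THE MACHINE CODE AND COME STRAIGHT BACK
50 DATA 201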

Exiting the Spotlight
By the time Kemeny and Kurtz were unhappy enough with what had happened to BASIC to create True BASIC, the language’s influence had peaked. It was no longer the default tool chosen by schools to teach programming to beginners: When I was in college in the mid-1980s, the language du jour was Pascal, especially among those who prized good programming practices. (I’m too embarrassed to tell you what my grade was for the Pascal course I took circa 1985; let’s just say that Edsger Dijkstra would have had every right to be smug.)

BASIC had also become less of a fundamental ingredient of PC computing. It wasn’t that an even easier language had taken its place. It’s just that it had become increasingly possible to have a deep and meaningful relationship with a computer without ever learning to program it.

Lotus 1-2-3, one of the applications that helped render BASIC less essential on PCs
Harry McCracken / TIME
In part, that was because of the rich and powerful applications that had come along for PCs. If you just wanted to do math calculations, a spreadsheet such as Lotus 1-2-3 would do the trick even more easily than BASIC. If you wanted to wrangle vast amounts of data, a database like dBASE was built for the task.

Microsoft saw which way the software winds were blowing. “We worked early on to consolidate our dominance in languages,” says Allen. “But it soon became clear that applications as tools for word processing, etc. would overshadow languages and we needed to build a word processor, database and spreadsheet. So that’s how we ended up doing Word and other applications.”

The debut issue of PC magazine, February-March 1982
Internet Archive
“When I started PC magazine, I declared that we would not have computer code on the pages of the magazine,” says David Bunnell, who founded that publication in 1981, PC World in 1982 and Macworld in 1983. “It was ugly. I wanted it to be more of a consumer magazine.”

When Apple shipped the first Macintosh in 1984, it didn’t come with BASIC–an omission that seems striking to me now, although there was so much else that was interesting about the Mac that I’m not sure if anybody noticed at the time. Apple did intend to introduce a version of the language soon after the Mac’s debut, but it was delayed and then eventually killed as part of an agreement with Microsoft, which released a version of Microsoft BASIC for the machine.

A 1987 ad for Microsoft’s QuickBASIC, a compiler version of the language aimed at advanced programmers
Google Books
One moment that feels like a watershed to me happened with the publication of the May 1988 issue of Compute, the magazine that took type-in programs more seriously, over more time, than any other. In his “Editorial License” column, Editor Gregg Keizer announced that the magazine would no longer include such listings, a move akin to Playboy declaring that it would stop running photographs of naked women.

Keizer says that the decision was driven by a number of factors, including the increasing difficulty that type-in programs in a magazine had competing with slick and ambitious boxed software. “I think in the end, it was a question of how best can the magazine continue to support readers? Things were so different in 1988 than they had been in 1983 in terms of the software available and the sweat equity that people were willing to put into typing it in.”

When Microsoft introduced Windows 3.0 in 1990, it became clear that the days were numbered for the text-based, command-line environment in which classic BASIC flourished. But even though the company introduced a Windows version of BASIC for professional developers called Visual Basic–one that became extremely popular, and eventually led to a still-extant version, Visual Basic .NET–it never bundled any version of it with Windows. Instead, it continued to ship its operating systems with an MS-DOS version of BASIC called GW-BASIC, later replaced by one called QBasic.

DONKEY.BAS shipped with the original IBM PC in 1981. The most notable thing about it is its co-author, a fellow by the name of Bill Gates
Windows Me, released in 2000, was the last Microsoft operating system to come with QBasic. By that point, it was vestigial; it only let you write MS-DOS programs, in an era when virtually nobody wanted to write MS-DOS programs.

By then, I was a former BASIC programmer myself, having started to slack off when I bought a Commodore Amiga in 1987. The only full-blown BASIC program I’ve written in this century was a quickie I cobbled together to automate a tiresome task in Microsoft Word. And that was seven or eight years ago.

BASIC Forever
I began this article by saying that I admire the modern-day movement to teach more people to code. I said that I miss BASIC. Which makes this question unavoidable: If everyone should learn to code, should everyone–or anyone–learn to do it with BASIC?

It’s a question that’s been asked before and has been debated endlessly. In 2006, Salon published “Why Johnny Can’t Code,” astrophysicist and science-fiction author David Brin’s paean to BASIC’s virtues as a teaching tool. He mourned its disappearance from personal computers, noting that his 14-year-old son Ben had no way to run the brief BASIC programs in his math textbooks.

“BASIC became obsolete. New innovative tools for new times”

The case against Brin’s plea–made by many of the people who responded to it–is that BASIC is a quaint antique. A budding programmer learning it would be like an aspiring auto mechanic starting by learning to service a Model T.

“Eskimo has several hundred words for snow,” says technologist and entrepreneur Philippe Kahn, whose first major company, Borland International, was a major purveyor of programming languages in the 1980s, including a version of BASIC. “But if you move away from Alaska it’s probably not a very expressive language. In a similar way, programming saw a paradigm shift from building software for simple ‘Disk Operating Systems’ to designing for object-oriented platforms and BASIC became obsolete. New innovative tools for new times.”

For additional guidance, I asked someone who’s never far from my thoughts when I think about BASIC: Charles Forsythe. Charles happens to have been the guy I sat next to in my high school’s computer lab, banging out games in BASIC circa 1980–the only one in the place whose skill I envied. Unlike me, he stuck with programming; today, he’s a Java systems engineer at SAIC.

BASIC is “simple and (usually) interactive,” he says. “This makes it pretty good for its original purpose: teaching beginners the basics of programming. There are other good intro languages, but with BASIC you don’t have to say things like, ‘We’ll learn how to define a method later,’ or ‘we’ll be learning what an object is later.’ Trying to explain a lambda expression (a.k.a. closure, a.k.a. function pointer) to a newbie could just get them completely confused.”

Still, it doesn’t sound like he’s rooting for the language to reassert itself: “Once you’ve learned about variables and branching, BASIC stops teaching you anything that useful in today’s programming world.”

Brin says that “Why Johnny Can’t Code” wasn’t, ultimately, a plea for the return of BASIC. What he laments is that standard-issue modern computers offer no way at all for a beginner to turn one on and immediately begin programming it, with a minimum of instruction.

“For a decade, BASIC was so universally available that textbook manufacturers put simple programming exercises in most standard math and science texts,” he says. “And teachers assigned them. And thus, a far higher fraction of students gained a little experience fiddling with 12-line programs that would make a pixel move…and thus knew, in their gut, that every dot on every screen obeyed an algorithm.”

Quite Basic is a contemporary, entirely browser-based implementation of the language
Quite Basic
In the years since Brin’s essay was published, BASIC has made at least a modest comeback. He praises Nikko Ström’s Quite Basic, which runs entirely in a web browser, letting you write and run BASIC programs without installing anything on your computer. As its name suggests, Lyle Kopnicky’s Vintage BASIC aims to recreate the feel of the classic BASICs of his youth–with permission, he’s adapted David Ahl’s BASIC games to run on it.

Even Microsoft is back in the business of making BASIC useful for newbies. In 2008, it introduced Small Basic, a free, simplified version of Visual Basic designed for kids and other amateurs. It has 14 commands–the same number as Dartmouth’s original version–but has wandered far from the basics of BASIC. (What you once would have accomplished with 10 PRINT “HELLO” now requires TextWindow.WriteLine(“Hello”).)

Microsoft’s modern BASIC for beginners, Small Basic, proving that it can run an admirable version of Tetris
There are multiple BASICs available for iPhones, iPads and Android devices, too. Bottom line: If you’re interested in trying out BASIC, you can, on virtually any computing device you’ve got. I wouldn’t be surprised if that’s still true another half-century from now.

None of these BASICs address Brin’s original complaint, though: Unlike the great BASICs of the past, they don’t stare every PC user in the face. “Remember, even one step between turning on the computer and actually typing the program will lose you 30% of the students,” he says. “Add another step and you lose another 30%. Downloading and fiddling is not an option.”

I agree with Brin. And I’m still glad that I learned about computers when that meant learning BASIC. However, I feel better about where we are now when I remind myself: BASIC was never really about BASIC, or even about programming.

Tom Kurtz, magnetic tape in hand, poses in Dartmouth’s computing center in the early days of BASIC
Adrian N. Bouchard / Dartmouth College
“The goal of Kemeny and Kurtz was to make these great, new and interesting machines available to a much broader group of minds,” says Dartmouth’s Rockmore. “They succeeded. Looking around at people staring at their cell phones, you could argue that they succeeded too well.”

Even Kurtz seems to be at peace with the fact that few people learn BASIC these days, calling it a sign of progress: “Many of the uses of BASIC are now easily done by spreadsheets, or by specific applications. Now, practically all the functions of a modern computer can be accomplished by poking a finger at certain spots on the screen.”

No, BASIC isn’t a pervasive part of personal computing anymore. But the grand, improbable idea that started at Dartmouth ended up changing our culture forever–and it’s tough to think of a greater legacy than that.

Dartmouth students study a program at the school’s Kiewit Computation Center in 1969
Adrian N. Bouchard / Dartmouth College
