
Charles Babbage

More than a century before the first electronic digital computers were invented, British mathematician and inventor Charles Babbage conceived and designed a mechanical “engine” that had most of the key features of modern computers. Although Babbage’s computer was never built, it remains a testament to the remarkable power of the human imagination.
Charles Babbage was born on December 26, 1791, in London, to a well-to-do banking family that was able to provide him with a first-class private education. Even as a young child Babbage was fascinated by mechanisms of all kinds, asking endless questions and often dissecting the objects in search of answers. He became a star student, particularly in math, at boarding school, where he and some intellectually inclined friends stayed up late to study.
In 1810 Babbage entered Trinity College, Cambridge, where he studied advanced calculus and helped found the Analytical Society, an organization dedicated to reforming British mathematics along the lines of the newer calculus notation used on the Continent. By 1815, Babbage had made such an impact in mathematics and science that he was elected a fellow of the prestigious British Royal Society. His reputation continued to grow, and in 1828 he was appointed Lucasian Professor of Mathematics at Cambridge, occupying the chair once held by Isaac Newton.

What was becoming a distinguished career in mathematics then took a different turn, one in keeping with the times. By the early 19th century, the role of mathematics and science in European society was beginning to change. Britain, in particular, was a leader in the Industrial Revolution, when steam power, automated weaving, and large-scale manufacturing were rapidly changing the economy and people’s daily lives. In this new economy, “hard numbers”—the mathematical tables needed by engineers, bankers, and insurance companies—were becoming increasingly necessary. All such tables, however, had to be calculated slowly and painstakingly by hand, resulting in numerous errors.
One day, when poring over a table of logarithms, Babbage fell asleep and was roused by a friend. When asked what he had been dreaming about, Babbage replied, “I am thinking that all these tables might be calculated by machines.” The idea of mechanical computation was perhaps not so surprising. Already the automatic loom invented by Joseph-Marie Jacquard was being controlled by chains of punched cards containing weaving patterns. The idea of controlled, repetitive motion was at the heart of the new industry. Babbage was in essence applying industrial methods to the creation of the information that an industrial society increasingly required for further progress. But although his idea of industrializing mathematics was logical, Babbage was entering uncharted technological territory. From 1820 to 1822, Babbage constructed a small calculator that he called a difference engine. The calculator exploited the method of finite differences, which generates a table of numbers and their squares by repeated simple addition and subtraction. When the demonstration model successfully generated numbers up to about eight decimal places, Babbage undertook to build a larger-scale version, which he called Difference Engine Number One. This machine would have around 25,000 gears and other moving parts and could handle numbers with up to 20 decimal places. The machine was even to have a printer that could generate the final tables directly, avoiding transcription errors.
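The principle behind the difference engine can be sketched in a few lines of modern code. The following Python fragment illustrates the method of finite differences, not Babbage’s mechanism itself: once the first value and the constant second difference are set, every further entry in a table of squares requires nothing but addition.

```python
# A minimal sketch of the method the Difference Engine mechanized:
# tabulating squares using only addition. For f(n) = n^2, the second
# difference is the constant 2, so each new entry costs two additions,
# with no multiplication at all.

def difference_table(entries: int) -> list[int]:
    value = 0        # f(0) = 0
    first_diff = 1   # f(1) - f(0) = 1
    second_diff = 2  # constant second difference for squares
    table = [value]
    for _ in range(entries - 1):
        value += first_diff        # next table value by addition
        first_diff += second_diff  # update the running difference
        table.append(value)
    return table

print(difference_table(10))  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```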
By 1830, work was well under way, supported by both government grants and Babbage’s own funds. However, Babbage soon became bogged down with problems. Fundamentally, the parts for the Difference Engine required a tolerance and uniformity that went beyond anything found in the rough-hewn industry of the time, requiring new tools and production methods. At the same time, Babbage was a far better inventor than manager, and in 1833 labor disputes virtually halted the work. The big Difference Engine would never be finished.
By 1836, however, Babbage, undaunted, had developed a far bolder conception. He wrote in his notebook, “This day I had for the first time a general . . . conception of making an engine work out algebraic developments. . . . My notion is that the cards (Jacquards) of the calc. engine direct a series of operations and then recommence with the first, so it might be possible to cause the same cards to punch others equivalent to any number of repetitions.”
As Babbage worked out the details, he decided that the new machine (which he called the Analytical Engine) would be fed by two stacks of punched cards. One stack would contain instructions that specified the operation (such as addition or multiplication), while the other would contain the data numbers or variables. In other words, the instruction cards would program the machine to carry out the operations automatically using the data. The required arithmetic would be carried out in a series of gear-driven calculation units called the mill, while temporary results and variable values would be stored in a series of mechanical registers called the store. The final results could be either printed or punched onto a set of cards for future use. The Analytical Engine thus had many of the features of a modern computer: a central processor (the mill), a memory (the store), as well as input and output mechanisms. One feature it lacked, as revealed in Babbage’s journal entry, was the ability to store programs themselves in memory. That is why a repetition (or loop) could be carried out only by repeatedly punching the required cards.
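The division of labor Babbage envisioned can be suggested with a deliberately toy-scale sketch. The card formats, operation names, and variable labels below are invented for illustration and do not reproduce Babbage’s actual design; they simply show instruction cards directing a “mill” that calculates and a “store” that holds variables.

```python
# A toy illustration (hypothetical encoding, not Babbage's card format) of
# the Analytical Engine's organization: operation cards name the steps,
# data cards supply values, the "store" holds variables, and the "mill"
# performs the arithmetic.

operation_cards = [("load", "V1"), ("load", "V2"), ("add",), ("save", "V3")]
data_cards = {"V1": 6, "V2": 7}

store = dict(data_cards)  # mechanical registers holding variables
mill = []                 # the calculating unit (modeled here as a stack)

for card in operation_cards:
    if card[0] == "load":
        mill.append(store[card[1]])
    elif card[0] == "add":
        mill.append(mill.pop() + mill.pop())
    elif card[0] == "save":
        store[card[1]] = mill.pop()

print(store["V3"])  # 13
```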
The new machine would be a massive and expensive undertaking. Babbage’s own funds were far from sufficient, and the British government had become disillusioned by his failure to complete the Difference Engine. Babbage therefore began to use his contacts in the international mathematical community to try to raise support for the project. He was aided by L. F. Menabrea, an Italian mathematician who wrote an account of the Analytical Engine in French. He was further aided by ADA LOVELACE (the daughter of the poet George Gordon, Lord Byron). She not only translated Menabrea’s work into English, but greatly expanded it, including her own example programs and suggestions for applications for the device.
However, like the Difference Engine, the Analytical Engine was not to be. A contemporary wrote that Babbage was “frequently and almost notoriously incoherent when he spoke in public.” His impatience and “prickliness” also made a bad impression on some of the people he had to persuade to support the new machine. Funding was not found, and Babbage was only able to construct demonstration models of a few of its components. As he moved toward old age, Babbage continued to write incredibly detailed engineering drawings and notes for the Analytical Engine as well as plans for improved versions of the earlier Difference Engine. But he became reclusive and even more irritable. Babbage became notorious for his hatred of street musicians, as chronicled in his 1864 pamphlet Observations of Street Nuisances. Neighbors who supported the musicians often taunted Babbage, sometimes organizing bands to play deliberately mistuned instruments in front of his house.
After Babbage’s death on October 18, 1871, his remarkable computer ideas faded into obscurity, and he was remembered mainly for his contributions to economic and social statistics, another field that was emerging into importance by the mid-19th century. Babbage therefore had little direct influence on the resurgence of interest in automatic calculation and computing that would begin in the late 1930s (many of his notes were not unearthed until the 1970s). However, computer scientists today honor Charles Babbage as their spiritual father.

John Vincent Atanasoff

It is difficult to credit the invention of the first digital computer to any single individual. Although the 1944 ENIAC, designed by J. PRESPER ECKERT and JOHN MAUCHLY, is widely considered to be the first fully functional electronic digital computer, John Vincent Atanasoff and his graduate assistant Clifford Berry created a machine five years earlier that had many of the features of modern computers. Indeed, a court would eventually declare that the Atanasoff-Berry Computer (ABC) was the first true electronic computer.
Atanasoff was born October 4, 1903, in Hamilton, New York. His father was an electrical engineer, his mother a teacher, and both parents encouraged him in his scientific interests. In particular, the young boy was fascinated by his father’s slide rule. He learned about logarithms so he could understand how the slide rule worked. He also shared his father’s aptitude for electrical matters: when he was nine years old, he discovered that some wiring in their house was faulty, and fixed it.
John blazed through high school in only two years, making straight A’s. By then, he had decided to become a theoretical physicist. When he entered the University of Florida, however, he majored in engineering and mathematics because the school lacked a physics major. Offered a number of graduate fellowships, Atanasoff opted for Iowa State College because he liked its programs in physics and engineering. He earned his master’s degree in mathematics in 1926.
Atanasoff continued on to the University of Wisconsin, where he earned his doctorate in physics in 1930. He then returned to Iowa State, where he remained as a professor of mathematics and physics for the next decade. Like HOWARD AIKEN, Atanasoff discovered that modern physics was encountering an increasing burden of calculation that was becoming harder and harder to meet using manual methods, the slide rule, or even the electromechanical calculators being used by business.
One alternative in development at the time was the analog computer, which used the changing relationships between gears, cams, and other mechanical components to represent quantities manipulated in equations. While analog computers such as the differential analyzer built by VANNEVAR BUSH achieved success in tackling some problems, they tended to break down or produce errors because of the very exacting mechanical tolerances and alignments they required. Also, these machines were specialized and hard to adapt to different kinds of problems.

Atanasoff made a bold decision. He would build an automatic, digital electronic calculator. Instead of the decimal numbers used by ordinary calculators, he decided to use binary numbers, which could be represented by different amounts of electrical current or charge. The binary logic first developed by GEORGE BOOLE could also be manipulated to perform arithmetic directly. Equally important, at a time when electric motors and switches drove mechanical calculators, Atanasoff decided to design a machine that would be electronic rather than merely electrical. It would use the direct manipulation of electrons in vacuum tubes, which is thousands of times faster than electromechanical switching.
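Atanasoff’s insight that Boolean logic could “perform arithmetic directly” is easy to demonstrate. In the Python sketch below, a full adder built from nothing but AND, OR, and XOR operations adds binary numbers bit by bit; in spirit, this is what the ABC’s vacuum-tube logic circuits did electronically.

```python
# A minimal sketch: Boolean logic alone performs binary arithmetic.
# A full adder built from AND (&), OR (|), and XOR (^) adds one bit
# position; chaining adders over the bits adds whole binary numbers.

def full_adder(a: int, b: int, carry: int) -> tuple[int, int]:
    total = a ^ b ^ carry                    # XOR yields the sum bit
    carry_out = (a & b) | (carry & (a ^ b))  # AND/OR yield the carry
    return total, carry_out

def add_binary(x: list[int], y: list[int]) -> list[int]:
    """Add two little-endian bit lists of equal length."""
    result, carry = [], 0
    for a, b in zip(x, y):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    result.append(carry)
    return result

# 6 (bits 0,1,1 little-endian) + 7 (1,1,1) = 13 (1,0,1,1)
print(add_binary([0, 1, 1], [1, 1, 1]))  # [1, 0, 1, 1]
```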
Atanasoff obtained a modest $650 grant from Iowa State and hired Clifford Berry, a talented graduate student, to help him. In December 1939, they introduced a working model of the Atanasoff-Berry Computer (ABC). The machine used vacuum tubes for all logical and arithmetic operations. Numbers were input from punched cards, while the working storage (equivalent to today’s random-access memory, or RAM) consisted of two rotating drums that stored the numbers as electrical charges on tiny capacitors. The ABC, however, was not a truly general purpose computer: It was designed to solve sets of equations by systematically eliminating unknown quantities. Because of problems with the capacitor-charge memory system, Atanasoff and Berry were never able to solve more than five equations at a time.

As the United States entered World War II, Atanasoff had to increasingly divide his time between working on the ABC and his duties at the National Ordnance Laboratory in Washington, D.C., where he headed the acoustics division and worked on designing a computer for naval use. Eventually the ABC project petered out, and the machine never became fully operational.

After the war, Atanasoff gradually became disillusioned with computing. The mainstream of the new field went in a different direction, toward the general-purpose machines typified by ENIAC, a large vacuum tube computer. In 1950 Atanasoff discovered that Iowa State had dismantled and partly discarded the ABC. He spent the remainder of his career as a consultant and entrepreneur. In 1952, he and his former student David Beecher founded a defense company, Ordnance Engineering Corporation. In 1961, he became a consultant working on industrial automation and cofounded a company called Cybernetics with his son.
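The ABC’s approach of solving sets of equations by systematically eliminating unknowns corresponds to what is now called Gaussian elimination. The Python sketch below shows that idea in modern form; it is an illustration only, since the ABC’s actual pairwise, drum-based procedure differed in detail.

```python
# A sketch of the kind of problem the ABC automated: eliminating unknowns
# from simultaneous linear equations, then back-substituting.

def solve(a, b):
    """Gaussian elimination; a is an n x n matrix, b a length-n vector."""
    n = len(b)
    for i in range(n):
        # Scale and subtract row i to eliminate unknown i from later rows.
        for j in range(i + 1, n):
            factor = a[j][i] / a[i][i]
            a[j] = [aj - factor * ai for aj, ai in zip(a[j], a[i])]
            b[j] -= factor * b[i]
    x = [0.0] * n
    for i in reversed(range(n)):  # back-substitute from the last row up
        x[i] = (b[i] - sum(a[i][k] * x[k] for k in range(i + 1, n))) / a[i][i]
    return x

# 2x + y = 5 and x + 3y = 10 have the solution x = 1, y = 3.
print(solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))  # [1.0, 3.0]
```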
In 1971, however, Atanasoff and the ABC became part of a momentous patent dispute. Mauchly and Eckert had patented many of the fundamental mechanisms of the digital computer on the strength of their 1944 ENIAC machine. Sperry Univac, which now controlled the patents, demanded high licensing fees from other computer companies. A lawyer for one of these rivals, Honeywell, had heard of Atanasoff’s work and decided that he could challenge the Mauchly-Eckert patents. The heart of his case was that in June 1941 Mauchly had stayed at Atanasoff’s home and had been treated to an extensive demonstration of the ABC. Honeywell claimed that Mauchly had obtained the key idea of using vacuum tubes and electronic circuits from Atanasoff. If so, the Atanasoff machine would be “prior art,” and the Mauchly-Eckert patents would be invalid. In 1973, the federal court agreed, declaring that Mauchly and Eckert “did not themselves invent the automatic electronic digital computer, but instead derived that subject matter from one Dr. John Vincent Atanasoff.”
The decision was not appealed. Despite the definitive legal ruling, the controversy among computer experts and historians grew. Defending his work in public for the first time, Atanasoff stressed the importance of the ideas that Mauchly and Eckert had obtained from him, including the use of vacuum tubes and binary logic circuits. Defenders of the ENIAC inventors, however, pointed out that the ABC was a specialized machine that was never a fully working general-purpose computer like ENIAC. While the dispute may never be resolved, it did serve to give Atanasoff belated recognition for his achievements. On October 21, 1983, Iowa State University held a special conference celebrating Atanasoff’s work, and later built a working replica of the ABC. By the time Atanasoff died in 1995 at the age of 91, he had been honored with many awards, including the Computer Pioneer Medal from the Institute of Electrical and Electronics Engineers in 1984 and the National Medal of Technology in 1990.

Marc Andreessen

Marc Andreessen brought the World Wide Web and its wealth of information, graphics, and services to the desktop, setting the stage for the “e-commerce” revolution of the later 1990s. As cofounder of Netscape, Andreessen also created the first big “dot-com,” as companies doing business on the Internet came to be called.
By the early 1990s, the World Wide Web (created by TIM BERNERS-LEE) was poised to change the way information and services were delivered to users. However, early Web browsers ran mainly on machines using UNIX, a somewhat esoteric operating system used primarily by students and scientists on college campuses and at research institutes (Berners-Lee had been working at CERN, the European nuclear physics laboratory). The early Web generally consisted only of linked pages of text, without the graphics and interactive features that adorn webpages today. Besides looking boring, early webpages were hard for inexperienced people to navigate. Marc Andreessen would change all that.

Marc Andreessen was born on July 9, 1971, in New Lisbon, Wisconsin. That made him part of a generation that would grow up with personal computers, computer games, and computer graphics. Indeed, when Marc was only nine years old he learned the BASIC computer language from a book in his school’s library, and then proceeded to write a program to help him with his math homework. Unfortunately, he did not have a floppy disk to save the program on, so it disappeared when the school’s janitor turned off the machine.
Marc got his own personal computer in seventh grade, and tinkered with many sorts of programs through high school. He then studied computer science at the University of Illinois at Urbana-Champaign. Despite his devotion to programming, he impressed his fellow students as a Renaissance man. One of them recalled in an interview that “A conversation with Andreessen jumps across a whole range of ungeekish subjects, including classical music, history, philosophy, the media, and business strategy. It’s as if he has a hypertext brain.”
Andreessen encountered the Web shortly after it was introduced in 1991 by Tim Berners-Lee. He was impressed by the power of the new medium, which enabled many kinds of information to be accessed using the existing Internet, but became determined to make it more accessible to ordinary people. In 1993, while still an undergraduate, he won an internship at the National Center for Supercomputing Applications (NCSA). Given the opportunity to write a better Web browser, Andreessen, together with colleague Eric Bina and other helpers, set to work on what became known as the Mosaic web browser. Since their work was paid for by the government, Mosaic was offered free to users over the Internet. Mosaic could show pictures as well as text, and users could follow Web links simply by clicking on them with the mouse. The user-friendly program became immensely popular, with more than 10 million users by 1995.

After earning his B.S. degree in computer science, Andreessen left NCSA, having battled with its managers over the future of Web browsing software. He went to the area south of San Francisco Bay, a hotbed of startup companies known as Silicon Valley, which had become a magnet for venture capital in the 1990s. There he met Jim Clark, an older entrepreneur who had been chief executive officer (CEO) of Silicon Graphics. Clark liked Andreessen and agreed to help him build a business based on the Web. They founded Netscape Corporation in 1994, using $4 million in seed capital provided by Clark. Andreessen recruited many of his former colleagues at NCSA to help him write a new Web browser, which became known as Netscape Navigator. Navigator was faster and more graphically attractive than Mosaic. Most important, Netscape added a secure encrypted facility that people could use to send their credit card numbers to online merchants. This was part of a two-pronged strategy: first, attract the lion’s share of Web users to the new browser, then sell businesses the software they would need to create effective Web pages for selling products and services to users.
By the end of 1994, Navigator had gained 70 percent of the Web browser market. Time magazine named the browser one of the 10 best products of the year, and Netscape was soon selling custom software to companies that wanted a presence on the Web. The e-commerce boom of the later 1990s had begun, and Marc Andreessen was one of its brightest stars. When Netscape offered its stock to the public in summer 1995, the company reached a market value of $2.3 billion, more than that of many traditional blue-chip industrial companies. Andreessen’s own shares were worth $55 million.
Microsoft under BILL GATES had been slow to recognize the growing importance of the Web.
However, as users began to spend more and more time interacting with the Netscape window, Microsoft began to worry that its dominance of the desktop market was in jeopardy. Navigator could run not only on Microsoft Windows PCs, but also on Macintoshes and even on machines running versions of UNIX. Further, a new programming language called Java, developed by JAMES GOSLING, made it possible to write programs that users could run from Web pages without being limited to Windows or any other operating system. If such applications became ubiquitous, then the combination of Navigator (and other Netscape software) plus Java could in effect replace the Windows desktop.
Microsoft responded by creating its own Web browser, called Internet Explorer. Although technical reviewers generally considered the Microsoft product to be inferior to Netscape’s, it gradually improved. Most significantly, Microsoft included Explorer with its new Windows 95 operating system. This “bundling” meant that PC makers and consumers had little interest in paying for Navigator when they already had a “free” browser from Microsoft. In response to this move, Netscape and other Microsoft competitors helped promote the antitrust case against Microsoft that in 2001 resulted in some of the company’s practices being declared an unlawful use of monopoly power. Andreessen also responded to Microsoft by focusing on the added value of software for Web servers, while making Navigator “open source,” meaning that anyone was allowed to access and modify the program’s code. He hoped that a vigorous community of programmers might help keep Navigator technically superior to Internet Explorer. However, Netscape’s revenues began to decline steadily. In 1999 America Online (AOL) bought Netscape, seeking to add its technical assets and Webcenter online portal to its own offerings.
After a brief stint with AOL as its “principal technical visionary,” Andreessen decided to start his own company, called LoudCloud. The company provided website development, management, and custom software (including e-commerce “shopping basket” systems) for corporations with large, complex websites. Through 2001, Andreessen vigorously promoted the company, seeking to raise enough operating capital to continue after the crash of the Internet industry. However, after a continuing decline in profitability, Andreessen sold LoudCloud’s Web management business to Texas-based Electronic Data Systems (EDS), retaining the smaller (software) side of the business under a new name, Opsware.
While the future of his recent ventures remains uncertain, Marc Andreessen’s place as one of the key pioneers of the Web and e-commerce is assured. His inventiveness, technical insight, and business acumen made him a model for a new generation of Internet entrepreneurs. Andreessen was named one of the Top 50 People Under the Age of 40 by Time magazine (1994) and has received the Computerworld/Smithsonian Award for Leadership (1995) and the W. Wallace McDowell Award of the Institute of Electrical and Electronics Engineers (IEEE) Computer Society (1997).

Gene Myron Amdahl

In a long and fruitful career as a computer designer, Gene Myron Amdahl created many innovations and refinements in the design of mainframe computers, the hefty workhorses of the data processing industry from the 1950s through the 1970s.

Amdahl was born on November 16, 1922, in Flandreau, South Dakota. He did his college work in electrical engineering and physics. When his studies were interrupted by World War II, he served as a physics instructor for an army special training program and then joined the navy, where he taught electronics until 1946. He then returned to school, receiving his B.S. degree from South Dakota State University in 1948 and his doctorate in physics from the University of Wisconsin in 1952.

As a graduate student, Amdahl worked on a problem involving the forces binding together parts of a simple atomic nucleus. He and two fellow students spent a month performing the necessary computations with calculators and slide rules. Amdahl realized that if physicists were going to be able to move on to more complex problems they would need greater computing resources. He therefore designed a computer called the WISC (Wisconsin Integrally Synchronized Computer). This computer used a sophisticated procedure to break calculations into parts that could be carried out on separate processors, making it one of the earliest examples of the parallel computing techniques found in today’s computer processors.
In 1952, Amdahl went to work for IBM, which was beginning the effort that would lead to its dominating the business computer industry by the end of the decade. Amdahl worked with the team that designed the IBM 704. The 704 improved upon the 701, the company’s first successful mainframe, by adding many new internal programming instructions, including the ability to perform floating point calculations (involving numbers that have decimal points). The machine also included a fast, high-capacity magnetic core memory that let the machine retrieve data more quickly during calculations. In November 1953, Amdahl became the chief project engineer for the 704.
On the heels of that accomplishment, new opportunities seemed to be just around the corner. Although IBM had made its reputation in business machines, it was also interested in the market for specialized computers for scientists. Amdahl helped design the IBM 709, an extension of the 704 designed for scientific applications. When IBM proposed extending the technology by building a powerful new scientific computer called STRETCH, Amdahl eagerly applied to head the new project. However, he ended up on the losing side of a corporate power struggle, and did not receive the post. He left IBM at the end of 1955.
Amdahl then worked for several small data processing companies. He helped design the RW440, a minicomputer used for industrial process control. This period gave Amdahl some experience in dealing with the problems of startup businesses, experience he would call upon later when he started his own company.

In 1960, Amdahl rejoined IBM and soon was involved in several design projects. The one with the most lasting importance was the IBM System/360, which would become the most ubiquitous and successful mainframe computer of all time. In this project, Amdahl further refined his ideas about making a computer’s central processing unit more efficient. He designed logic circuits that enabled the processor to analyze the instructions waiting to be executed (the “pipeline”) and determine which instructions could be executed immediately and which would have to wait for the results of other instructions. He also used a cache, a special memory area in which the instructions that would be needed next could be stored ahead of time so they could be retrieved quickly from high-speed storage. Today’s desktop personal computers (PCs) use these same ideas to get the most out of their chips’ capabilities.

The problem of parallel computing is partly a problem of designing appropriate hardware and partly a problem of writing (or rewriting) software so its instructions can be executed simultaneously. It is often difficult to predict how much a parallel computing arrangement will improve upon using a single processor and conventional software. Amdahl created a formula, now called Amdahl’s law, that attempts to answer that question. In simple terms, Amdahl’s law says that the advantage gained from using more processors gradually declines as more processors are added. The amount of improvement is also proportional to how much of the calculation can be broken down into parts that can be run in parallel. As a result, some kinds of programs can run much faster with several processors being used simultaneously, while other programs may show little improvement.
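In its usual modern statement (a standard formulation rather than a quotation from Amdahl), the law says that if a fraction p of a program’s work can be spread across n processors, the overall speedup is 1/((1 - p) + p/n); the serial fraction 1 - p therefore sets a hard ceiling of 1/(1 - p) no matter how many processors are added. A few lines of Python make the diminishing returns concrete.

```python
# Amdahl's law: overall speedup from running the parallelizable
# fraction p of a program on n processors.

def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# With 95% of the work parallelizable, the ceiling is 1/(1 - 0.95) = 20x,
# and each doubling of processor count buys less than the one before.
for n in (2, 8, 64, 1024):
    print(f"{n:5d} processors -> speedup {amdahl_speedup(0.95, n):.2f}")
```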
As a designer, Amdahl coupled hard work with the ability to respond to sudden bursts of intuition. “Sometimes,” he recalled to author Robert Slater, “I wake up in the middle of the night and I’ll be going 60 miles an hour on the way to a solution. I see a mental picture of what is going on and I dynamically operate that in my mind.” In 1965, Amdahl was awarded a five-year IBM fellowship that allowed him to study whatever problems interested him. He also helped establish IBM’s Advanced Computing Systems Laboratory in Menlo Park, California, which he directed. However, Amdahl became increasingly frustrated with what he thought was IBM’s too-rigid approach to designing and marketing computers. IBM insisted on basing the price of a new computer not on how much it cost to produce, but on how fast it could calculate.
Amdahl wanted to build much more powerful computers—what would soon be called “supercomputers.” But if IBM’s policy were followed, these machines would be so expensive that virtually no one would be able to afford them. Thus at a time when increasing miniaturization was leading to the possibility of much more powerful machines, IBM did not seem to be interested in building them. The computer giant seemed to be content to gradually build upon its financially successful 360 line (which would become the IBM 370 in the 1970s). Amdahl therefore left IBM in 1970, later recalling to Slater that he left IBM that second time “because I wanted to work in large computers. . . . I’d have had to change my career if I stayed at IBM—for I wanted personal satisfaction.”
To that end, in 1970 he founded the Amdahl Corporation. Amdahl resolved to make computers that were more powerful than IBM’s machines, but would be “plug compatible” with them, allowing them to use existing hardware and software. Business users who had already invested heavily in IBM equipment could thus buy Amdahl’s machines without fear of incompatibility. Since IBM was known as “Big Blue,” Amdahl decided to become “Big Red,” painting his machines accordingly.
Amdahl would later recall his great satisfaction in “getting those first computers built and really making a difference, seeing it completely shattering the control of the market that IBM had, causing pricing to come back to realistic levels.” Amdahl’s critics sometimes accused him of having unfairly used the techniques and knowledge that he had developed at IBM, but he has responded by pointing to his later technical innovations. In particular, he was able to take advantage of the early developments in integrated electronics to put more circuits on a chip without making the chips too small, and thus too crowded for placing the transistors. After it was introduced in 1975, the Amdahl 470 series of machines doubled in sales in each of its first three years. Thanks to the use of larger-scale circuit integration, Amdahl could sell machines with superior technology to that of the IBM 360 or even the new IBM 370, and at a lower price. IBM responded belatedly to the competition, making more compact and faster processors, but Amdahl met each new IBM product with a faster, cheaper alternative. However, IBM also countered by using a sales technique that opponents called FUD—fear, uncertainty, and doubt. IBM salespersons promised customers that IBM would soon be coming out with much more powerful and economical alternatives to Amdahl’s machines. As a result, many potential customers were persuaded to postpone purchasing decisions and stay with IBM. Amdahl Corporation began to falter, and Gene Amdahl gradually sold his stock and left the company in 1980.
Amdahl then tried to repeat his early success by starting a new company called Trilogy. The company promised to build much faster and cheaper computers than those offered by IBM or Amdahl. He believed he could accomplish this by using the new, very-large-scale integrated silicon wafer technology, in which circuits were deposited in layers on a single large wafer rather than being distributed among separate chips on a printed circuit board. However, the problem of dealing with the electrical characteristics of such dense circuitry, as well as some design errors, somewhat crippled the new computer design. Amdahl also found that the aluminum substrate that connected the wafers on the circuit board was causing short circuits. Even weather, in the form of a torrential rainstorm, conspired to add to Amdahl’s problems by flooding a chip-building plant and contaminating millions of dollars’ worth of chips. Amdahl was forced to repeatedly delay the introduction of the new machine, from 1984 to 1985 to 1987. He attempted to infuse new technology into his company by buying Elxsi, a minicomputer company, but Trilogy never recovered.
After the failure of Trilogy, Amdahl undertook new ventures in the late 1980s and 1990s, including Andor International, an unsuccessful developer of minicomputers, and Commercial Data Servers (CDS), which sought to compete with IBM at the low-priced end of the mainframe market.
Amdahl has received many industry awards, including “Data Processing Man of the Year” from the Data Processing Management Association (1976) and the Harry Goode Memorial Award from the American Federation of Information Processing Societies.

Howard Hathaway Aiken

Howard Hathaway Aiken was a pioneer in the development of automatic calculating machines. Born on March 8, 1900, in Hoboken, New Jersey, he grew up in Indianapolis, Indiana. He pursued his interest in electrical engineering by working at a utility company while in high school. Aiken then earned a B.A. degree in electrical engineering in 1923 at the University of Wisconsin.

By 1935, Aiken was working on the physics of how electric charges were conducted in vacuum tubes—an important question for the new technology of electronics. This work required tedious, error-prone hand calculation. Aiken therefore began to investigate the possibility of building a large-scale, programmable, automatic computing device. As a doctoral student at Harvard, Aiken aroused considerable interest in his ideas, particularly from THOMAS J. WATSON SR., head of International Business Machines (IBM). In 1939, IBM agreed to underwrite the building of Aiken’s first calculator, the Automatic Sequence Controlled Calculator (ASCC), which became known as the Harvard Mark I.
Mechanical and electromechanical calculators were nothing new: indeed, machines from IBM, Burroughs, and others were being increasingly used in business settings. However, ordinary calculators required that operators manually set up and run each operation step by step in the complete sequence needed to solve a problem. Aiken wanted a calculator that could be programmed to carry out the sequence automatically, storing the results of each calculation for use by the next. He wanted a general-purpose programmable machine rather than an assembly of special-purpose arithmetic units. Earlier complex calculators (such as the Analytical Engine that CHARLES BABBAGE had proposed a century earlier) were very difficult to implement because of the precise tolerances needed for the intricate assembly of mechanical parts. Aiken, however, had access to a variety of tested, reliable components, including card punches, readers, and electric typewriters from IBM, and the electromagnetic relays used for automatic switching in the telephone industry.
Aiken’s Mark I calculator used decimal numbers (23 digits and a sign) rather than the binary numbers of the majority of later computers. Sixty registers held whatever constant data numbers were needed to solve a particular problem. The operator turned a rotary dial to enter each digit of each constant number required for the calculation. Variable data and program instructions were entered from punched paper tape. Calculations had to be broken down into specific instruction codes similar to those in later low-level programming languages, such as “store this number in this register” or “add this number to the number in that register.” The results (usually tables of mathematical function values) could be printed by an electric typewriter or output on punched cards.
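The flavor of such instruction codes can be conveyed with a small interpreter; the encoding below is a hypothetical stand-in written in Python, not the Mark I’s actual paper-tape format.

```python
# A toy rendering (hypothetical operation names and encoding, not the
# Mark I's real tape codes) of the instruction style described above:
# each step either stores a number in a register or adds one register
# into another.

registers = [0.0] * 60  # the Mark I held constants in 60 registers

program = [
    ("store", 3, 1.5),  # put 1.5 in register 3
    ("store", 4, 2.5),  # put 2.5 in register 4
    ("add", 3, 4),      # add register 3 into register 4
]

for op, a, b in program:
    if op == "store":
        registers[a] = b
    elif op == "add":
        registers[b] += registers[a]

print(registers[4])  # 4.0
```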
The Mark I was built at IBM’s factory in Endicott, New York. It underwent its first full-scale test on Christmas Day 1943, illustrating the urgency of work under wartime conditions. The bus-sized machine (about eight feet high by 51 feet long) was then painstakingly disassembled and shipped to Harvard University, where it was up and running by March 1944. Although relatively slow in comparison with the vacuum tube-based computers that would soon be designed, the Mark I was a very reliable machine. A New York Times article enthused, “At the dictation of a mathematician, it will solve in a matter of hours equations never before solved because of their intricacy and the enormous time and personnel which would be required to work them out on ordinary office calculators.”
Aiken then went to work for the U.S. Navy (and was given the rank of commander), where his team included another famous computer pioneer, the future admiral GRACE MURRAY HOPPER. The Mark I worked 24 hours a day on a variety of problems, ranging from solving equations used in lens design and radar to the ultrasecret design for the implosive core of the atomic bomb. Unlike many engineers, Aiken was comfortable managing fast-paced projects. He once quipped, “Don’t worry about people stealing an idea. If it’s original, you’ll have to ram it down their throats.” Aiken completed an improved model, the Mark II, in 1947. The Mark III of 1950 and Mark IV of 1952 were electronic rather than electromechanical, replacing relays with vacuum tubes. The Mark III used a magnetic core memory (analogous to modern RAM, or random-access memory) that could store and retrieve numbers relatively quickly, as well as a magnetic drum that served the function of a modern hard disk.
Compared to slightly later digital computers such as ENIAC and Univac, the sequential calculator, as its name suggests, could only perform operations in the order specified, rather than, for example, being able to loop repeatedly. (After all, the program as a whole was not stored in any sort of memory, and so previous instructions could not be reaccessed.) Yet although Aiken’s machines soon slipped out of the mainstream of computer development, they did include the modern feature of parallel processing, because different calculation units could work on different instructions at the same time. Further, Aiken recognized the value of maintaining a library of frequently needed routines that could be reused in new programs—another fundamental of modern software engineering. Aiken’s work demonstrated the value of large-scale automatic computation and the use of reliable, available technology. Computer pioneers from around the world came to Aiken’s Harvard computation lab to debate many issues that would become staples of the new discipline of computer science. By the early 1950s Aiken had retired from computer work and became a Florida business entrepreneur, enjoying the challenge of rescuing ailing businesses.
The recipient of many awards, including the Edison Medal of the Institute of Electrical and Electronics Engineers and the Franklin Institute’s John Price Wetherill Medal, Howard Aiken died on March 14, 1973, in St. Louis, Missouri.