COMPUTER AND INTERNET ORIGIN

Personal Computers: History and Development


Overview

The personal computer (PC) has revolutionized business and personal activities and even the way people talk and think; however, its development has been less of a revolution than an evolution and convergence of three critical elements - thought, hardware, and software. Although the PC traces its lineage to the mainframe and minicomputers of the 1950s and 1960s, the conventional thought that was prevalent during the first thirty years of the computer age saw no value in a small computer that could be used by individuals.

A PC is a microcomputer, so named because it is smaller than a minicomputer, which in turn is smaller than a mainframe computer. While early mainframes and their peripheral devices often took up the floor space of a house, minicomputers are about the size of a refrigerator and stove. The microcomputer, whose modern development traces back to the early 1970s, fits on a desk.

From the start, the creation of the computer was centered around the concept that a single unit would be used to perform complex calculations with greater speed and accuracy than humans could achieve.

The Transistor

On December 23, 1947, one of the most far-reaching technologies of the 20th century was developed at Bell Laboratories by John Bardeen, Walter Brattain, and William Shockley - the transistor. But the transistor wasn't available to U.S. manufacturers until 1956, when a seven-year-old antitrust lawsuit against AT&T, the owner of Bell Labs, was settled. The judgment required that AT&T license the transistor to American companies. Following this decision, the transistor was used to replace thousands of vacuum tubes in computers and began the miniaturization of electronics. Because it drastically reduced the size and heat considerations of the large vacuum tubes, the transistor enabled the computer to become a viable tool for business and government.

The Computer Mystique

From the beginning, computers baffled the populace with their capability. In corporate and government offices and on university campuses, information processing departments sprouted up to serve the computer. The IBM 701, introduced in 1952, was composed of several units that could be shipped and connected at a customer's location, rather than the earlier massive units that had to be assembled on site. In 1953, IBM began shipping the first mass-produced computer, the IBM 650. IBM introduced its first solid-state (transistorized) computer, the IBM 7090, in 1959. Then in 1964, IBM capped over $1 billion in research when it brought out the System/360 series of computers. Unlike other mainframes, the System/360 computers were compatible with each other.

By 1960, the computer was king. Companies hired armies of technicians and programmers to write operating programs and software, fix the machines, and allocate the precious computer time. The capability of the machines was more than a mere mortal could fathom, but gathering raw data and "keying" it in so the computer could "crunch the numbers" was a complicated and time-consuming task.

Frustrations abounded, computer errors were called "glitches," and the phrases "garbage in/garbage out," "It's a computer mistake," and "Sorry, the computer's down and we can't do anything," were introduced into the lexicon.

On college campuses in the 1960s, students carried bundles of computer cards to and from class, hoping that their share of the valuable computer time would not be bumped or allocated to someone else. The phrase "Do not fold, spindle, or mutilate" was coined so people wouldn't disable the process of feeding the punched computer cards into card readers, where the intricate patterns of holes were decoded.

The computer mystique was reinforced in people every time they heard of some new accomplishment. In 1961, a computer calculated the value of pi to 100,000 decimal places. A computer could play checkers, and in 1967 a chess-playing computer program was made an honorary member of the United States Chess Federation. Banks began printing checks with magnetic ink so they could be processed by the computers.

A Small Change in Thought

Until 1971, nobody even thought of a computer as anything but a big, fast, electronic brain that resided in a climate-controlled room and consumed data and electricity in massive quantities.

In 1971, the Intel 4004 chip, which packed some 2,300 transistors onto a single sliver of silicon, was programmed to perform complex mathematical calculations; the hand-held calculator was born. Suddenly, scientists and engineers could carry the computational power of a computer with them to job sites, classrooms, and laboratories; but the hand-held calculator, like the ENIAC before it, was not yet a computer. The microprocessor had been developed at Intel, the company co-founded by Robert Noyce, one of the inventors of the integrated circuit, and it brought with it a change in the way people worked.

New Technologies and New Ideas

Small, hand-held calculators had provided an idea, or at least a "what if," to some people. Still, in the early 1970s, computers were used for number crunching and printing out long streams of green-and-white paper. IBM Selectric typewriters were the top-of-the-line "word processors," and Xerox copiers churned out photocopies. Most people never imagined that a computer could process data in real time, be used to write letters, or fit on a desk.

In 1972, Intel brought out its 8008 chip, capable of processing 8 bits of data, enough to convey numbers and the letters of the alphabet. In that same year, Xerox began working on a personal computer at their Palo Alto Research Center. For the next several years, a team of Xerox scientists worked on the "Alto," a small computer that would have become the first PC if only the development team had been able to convince someone of its usefulness.
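
As an aside on what 8 bits buys: a byte can take 2**8 = 256 distinct values, more than enough to give every letter, digit, and punctuation mark its own code (the later ASCII scheme used 128 of them). A minimal sketch in Python, used here purely for illustration; the 8008 itself was of course programmed in assembly:

    # With 8 bits, a processor can represent 2**8 = 256 distinct values --
    # more than enough for the 128-code ASCII character set.
    print(2 ** 8)                            # 256

    # Each letter or digit maps to one byte-sized code:
    for ch in "PC":
        code = ord(ch)                       # numeric code of the character
        print(ch, code, format(code, "08b")) # e.g. P 80 01010000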

Likewise, in 1972 Digital Equipment Corporation (DEC), a minicomputer manufacturing company headed by Kenneth Olsen, had a group of product engineers developing the DEC Datacenter. This PC incorporated not only the computer hardware but the desk as well. The DEC Datacenter could have put tremendous computing capability in the home or at work, but management saw no value in the product and halted its development.

In the end, none of the giant companies whose names had been synonymous with computers would introduce the PC to the world. There seemed to be no future in an inexpensive product that would replace the million-dollar "Big Iron" that they were selling as fast as they could make it.

The people who eventually introduced the PC were rebels. Many had spent time in the bowels of the big companies and were frustrated by the lack of vision they encountered. They retreated into their own garages and attended meetings with other "computer nuts" who saw a much different future than the one laid out over the previous 30 years by the giants of the computer industry.

The PC is Born

In 1975, Rubik's Cube was making its way toward store shelves, where it would prove to many that the human brain was incapable of complex problem solving. But a ray of hope also appeared that year: the first PC was introduced. Micro Instrumentation and Telemetry Systems, Inc. (MITS) sold a kit for the MITS Altair 8800 that enabled computer hobbyists to assemble their own computers. It had no monitor, no keyboard, no printer, and couldn't store data, but the demand for it, like that for Rubik's Cube, was overwhelming.

The Altair proved that a PC was both possible and popular, but only with those people who would spend hours in their basements with soldering irons and wire strippers. The Altair, which looked like a control panel for a sprinkler system, didn't last, but it helped launch one of the largest companies in the computer world and gave a couple of young software programmers a start. In 1975, Bill Gates and Paul Allen wrote a version of BASIC for the Altair and started a company called Microsoft Corporation.

In 1976, another computer kit was sold to hobbyists - the Apple I. Stephen Wozniak sold his Volkswagen and Steve Jobs sold his programmable calculator to get enough money to start Apple. In 1977, they introduced the Apple II, a pre-assembled PC with a color monitor, sound, and graphics. It was popular, but everyone knew that a serious computer didn't need any of this. The kits were just a hobby and the Apple II was seen as a toy. Even the Apple name wasn't a serious, corporate sounding name like IBM, Digital Equipment Corporation, or Control Data.

But 1977 also brought competition. The Zilog Z-80 microprocessor, introduced in 1976, was used in the Tandy Radio Shack TRS-80, affectionately called the "Trash 80." Apple, Commodore, and Tandy dominated the PC marketplace. The Apple II had 16K bytes of RAM and 16K bytes of ROM; Commodore Business Machines' Personal Electronic Transactor (PET) included 4K RAM and 14K ROM; and the TRS-80 had 4K RAM and 4K ROM.

Also in 1977, the CP/M (Control Program for Microcomputers) operating system was developed by Gary Kildall's Digital Research. From its introduction until 1980, CP/M was used in most PCs, but even that did not guarantee that a program or document could be written on one machine and read on another, because each manufacturer used different floppy disk drives.

Apple introduced its floppy disk drive, the Disk II, in 1978, allowing Apple II users to store data on something other than the cumbersome and unreliable tape cassettes that had been used up to that point.

But despite the popularity of the three PCs, non-computer people still saw little reason to buy an expensive calculator when there were other ways to do the same things. In 1979, that all changed.

When VisiCalc was introduced for the Apple II, non-computer people suddenly saw a reason to buy a computer. VisiCalc, a spreadsheet program created by Dan Bricklin and Bob Frankston, allowed people to change one number in a budget and watch the effect it had on the entire budget. It was something new and valuable that could only be done with a computer. For thousands of people, the toy, the computer few could find a use for, had been transformed into a device that could actually do something worthwhile.
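
The insight is easy to show in miniature. The sketch below is a toy model, not VisiCalc's actual design: one cell holds a formula over the others, so changing a single input re-derives the dependent value on the next lookup.

    # A toy spreadsheet: a cell holds either a number or a formula
    # (a function of the sheet). Changing one input updates every
    # dependent value -- the insight that sold VisiCalc.
    def get(sheet, name):
        """Return a cell's value, evaluating formulas recursively."""
        cell = sheet[name]
        return cell(sheet) if callable(cell) else cell

    budget = {
        "income":  1600,
        "rent":    950,
        "food":    400,
        "savings": lambda s: get(s, "income") - get(s, "rent") - get(s, "food"),
    }

    print(get(budget, "savings"))   # 250
    budget["rent"] = 1100           # change one number...
    print(get(budget, "savings"))   # ...and the bottom line updates: 100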

Microprocessors and high-tech gadgets were gradually worming their way into people's lives. Sony had introduced the Betamax video cassette recorder in 1975, JVC's VHS format followed a year later, and in 1979 came the Sony Walkman. And to remind everyone of how far we had to go, Star Trek: The Motion Picture came to theaters in 1979.

The Sinclair ZX-80 PC, which hit the market in 1980, used the same Z-80 family chip as the Tandy TRS-80. The ZX-80 had 1K RAM and 4K ROM. Developed by British entrepreneur Clive Sinclair, the ZX-80 meant that people could enter the computer revolution for under $200. Its small size and price attracted people who had never thought about owning a PC.

The Commodore VIC-20, also introduced in 1980, featured color graphics and would eventually become the first PC to sell more than one million units.

Even with all of the success the early PC manufacturers had in the late 1970s and early 1980s, the advances in microprocessor speeds, and the creation of software, the PC was still not seen as a serious business tool. Unknown to everyone in the computer industry, however, a huge oak tree was about to drop an acorn that would fall close to the tree and change everything.

Validation of the PC

Two events occurred in 1981 that would have a tremendous impact on the future of the PC. In 1980, IBM had started a secret project in Boca Raton, Florida called "Acorn." Thirteen months later, in 1981, IBM introduced the IBM PC, a product that validated the PC as a legitimate business tool. For many people, even those who prided themselves on being able to operate the "Big Iron," if IBM was making PCs then the small desk-top units were worthy of respect.

When the IBM PC hit the market, it was a complete system. Secretly, IBM had provided software developers with prototypes of their PC so they could develop an array of programs that would be available when the machine hit the streets. IBM also developed printers, monitors, and expansion cards for the PC and made it an open system so other manufacturers could develop peripherals for it.

The IBM PC used an Intel 8088 microprocessor, had 16K of RAM, was expandable to 256K, came with one 5.25-inch disk drive and room for a second, and was available with a choice of operating systems: CP/M-86 or IBM PC-DOS, which had been developed by Microsoft.

The second major event of 1981 was the introduction of the first luggable computer, the Osborne 1. This self-contained, suitcase-sized PC, developed by Adam Osborne, was not only the first portable PC, but also the first to be sold with bundled software. The Osborne 1 came with BASIC, CBASIC, WordStar for word processing, and the SuperCalc spreadsheet program. Over the next two years, the Osborne Computer Corporation would go from nothing to a company with $70 million in annual revenue and then into bankruptcy.

Prior to 1980, the most common method of storing data was to connect an audio tape recorder to the PC and dump data to standard tape cassettes. Large word processors and some PCs had 8-inch disk drives, but in 1980 Al Shugart's Seagate Technology introduced the ST-506, a 5.25-inch Winchester hard-disk drive sized for PCs.

The Race was On

Now that the PC had been validated, it began appearing on desk-tops in large and small companies to produce work schedules and payrolls, write letters and memos, and generate budgets. Software enabled people to do more in less time and business was promised the "paperless office" as an added benefit of the PC.

Managers attended classes and began writing memos and letters, but many felt that the work they could now do themselves on a PC was demeaning; it was the work that secretaries and clerks had always done. For some, having a PC on the desk meant that they now had to do the work, not just delegate it, and for others it meant they no longer supervised a person, but a machine.

There was also a strong fear factor. The PCs were expensive, and many people were afraid they would damage the units or erase everything in one keystroke. People who had always worked with things they could see and understand were suddenly putting their faith in chips and hard drives that they not only couldn't see or touch, but also didn't understand. Suddenly it was permissible to make a mistake in spelling or grammar; it could be changed and rewritten until it was correct. The whole thought process didn't sit well with some; for others, it freed them from the drudgery of using white correction fluid to cover up mistakes on printed documents.

The early 1980s were a time of furious change in the computer industry. More than 100 companies were manufacturing PCs, each with its own unique features, each with its own software. When IBM entered the market in 1981, software companies knew that writing IBM-compatible software would be profitable. Software for the Apple II had exploded to 16,000 titles, and the IBM PC would soon match it. New software in the 1980s included WordStar, Lotus 1-2-3, Microsoft Word, and WordPerfect.

In 1981, Hayes brought the MOdulator/DEModulator (modem) to the PC market with its Smartmodem. The modem had been developed at AT&T Bell Labs around 1960 to connect mainframes and minicomputers. Hayes' modem allowed PCs to communicate with each other and access CompuServe and The Source, the online services that started up in 1979. CompuServe showed people what to do with their 300 baud modems by offering them an array of services and databases to connect with.

In 1982, Compaq introduced the first IBM-compatible machine. Until Compaq, most manufacturers feared IBM and would not bring out a machine that was compatible with the PC. Later the compatibles would be termed "clones."

Also in 1982, Tandy brought out the TRS-80 Model 16, which was based on the Motorola 68000 and Z-80 microprocessors. The Model 16 retailed for $5,000 and included 128K RAM, an 8-inch floppy disk drive, as well as the Xenix operating system, a derivative of UNIX.

In January 1983, Time magazine anointed the PC as its "Machine of the Year," a designation by the editors that the computer had been the most influential newsmaker of 1982. The magazine estimated 80 million PCs would be in use by the end of the century. Industry leaders included Texas Instruments, Timex, Commodore, Atari, Apple, IBM, and Tandy, with Osborne leading the way in the portable market.

The individuals pushing the PC into the future were John Opel at IBM, Adam Osborne of Osborne Computers, VisiCalc creator Dan Bricklin, Jack Tramiel of Commodore, and Clive Sinclair, who founded Sinclair Research.

The leading products of 1982 and their sales figures included the Timex/Sinclair 1000 - 600,000; Commodore VIC-20 - over 600,000; Atari 400 and Atari 800 - 600,000; Texas Instruments 99/4A - 530,000; TRS-80 Model III - 300,000; Apple II Plus - 270,000; IBM PC - 200,000; and Osborne 1 - 55,000. These computers ranged in price from the $99 Timex/Sinclair to the Osborne 1 at $1,795 with bundled software. In the opinion of Time, computers priced over $2,000 would appeal to a market of "…growing small businesses and big corporate clients…" Manufacturers of these higher-end PCs included Altos, Corvus, Cromemco, Control Data, Digital Equipment, Hewlett-Packard, North Star, Olivetti, TeleVideo, Toshiba, Xerox, and Zenith.

But in 1983 there was once again a wind of change blowing across the PC landscape.

The Mac Attack

In 1983, Apple brought out a machine that failed to sell but nonetheless showed consumers and manufacturers a new direction for the PC. The Lisa, an expensive PC with a graphical user interface (GUI), hit the market with a thud. At $10,000, it had few friends and even fewer buyers.

Also in 1983, IBM introduced the IBM XT with a 10MB hard drive, three additional expansion slots, 128K RAM, and a 360K floppy drive. To many buyers, the 10MB storage capacity seemed large enough to last a lifetime.

Immediately after the failure of the Lisa, Steve Jobs rethought the machine, and in 1984 out came the Macintosh. The Macintosh was powered by Motorola's 68000 processor and came with 128K of RAM. It was so radically different from any other PC that it split the PC world into two halves that would not be rejoined for another decade. In addition to the GUI that made the computer an "intuitive" extension of the user, the "Mac" had its own operating system, incompatible with the MS-DOS used on the IBM PC and its clones. Suddenly PC meant DOS-based and IBM compatible, and Mac meant GUI and mouse.

The Mac was introduced to the world in an extravagant television commercial that was shown nationally only once, during the NFL Super Bowl. The commercial changed the advertising industry almost as much as the Mac changed computing.

Suffering from the failure of the Apple III and Lisa, Apple was literally saved by the Mac. People who hated computers loved the simplicity of Mac. The GUI allowed the user to click a mouse button on an icon to launch a program, print a document, or copy a file. No longer did users have to know combinations of keys or special codes to get the computer to do what they wanted it to do. The Mac was "user friendly."

Although not the first PC with a mouse or GUI (that distinction went to Xerox's $50,000 Star that came out in 1981 and immediately failed), the Mac did set the computer world on its ear because of its ease of operation and its operating system.

When Apple came out with the Apple LaserWriter in 1985, it came with Adobe Systems' PostScript page description language. By 1986, with its what-you-see-is-what-you-get (WYSIWYG) display and printing, desk-top publishing was born. WYSIWYG meant that a person could format a document with special fonts and spacing and be assured that what came out of the printer would look like what they had created on the screen.

Adobe, founded in 1982 by John Warnock and Charles Geschke, turned the printed page into a graphic image. Bit mapping made each pixel on the screen a definable element that could be moved and changed without the limitations of a standard text format. PostScript changed the way people thought about fonts, page layout, and the visual impact of the documents they produced with their PC. Printers like the Apple LaserWriter and the Hewlett-Packard HP LaserJet made every document look like it had been professionally typeset and printed.

In 1985, the Commodore Amiga 1000, which featured multitasking, graphics, sound, and video in a windowing operating system, exposed people to multimedia. At the same time Toshiba came out with the T1100 laptop, Tandy introduced the Tandy 200 laptop, and AT&T introduced the UNIX PC. Intel took the microprocessor to a new level when it brought out the 386 microprocessor in 1985, proving that PCs were not only getting better, they were getting faster.

The 1980s were very active times for hardware manufacturers and software producers. Small software companies locked in with either IBM or Macintosh, but large companies like Microsoft were able to create new applications for both operating systems. While Aldus brought out PageMaker, and Lotus introduced Jazz, Microsoft announced Excel for the Mac, C 3.0, and finally shipped a long-awaited program called Windows.

Bill Gates, a founder of Microsoft, tried three times to interest IBM in Windows but was turned down each time. Although the Mac operating system had changed the interface between users and their PCs, many DOS users continued to hang on to their command line-driven MS-DOS operating system, and it would be several more years until the Windows concept caught on.

With the availability of hundreds of software programs, hard disk space became valuable real estate. The 10MB hard disk on the IBM XT began to fill up so hard drive manufacturers started the process of doubling their capacity.

As modems proliferated and the Hayes Smartmodem was accepted as the standard, just about everyone either knew someone they could get online with, subscribed to an online service such as CompuServe, or wanted to access the roughly 1,000 hosts then on the Internet.

But PCs that were connected to the outside world were also vulnerable to a new phenomenon called viruses. Once downloaded, these programs could attach themselves without warning to a PC's hard drive and gradually or in the blink of an eye destroy or overwrite files. Virus checkers became the rage for anyone who received data over telephone lines.

By 1987 enough people were writing their own software and sharing it that the Association of Shareware Professionals was formed to market and protect the inexpensive software. Around the same time, the new C++ language was stimulating the growth of object-oriented programming (OOP).
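
Object-oriented programming organizes code around classes that bundle data with the operations on that data, and lets related types share an interface. C++ brought the style into the mainstream; the sketch below shows the flavor in Python for brevity rather than in C++:

    # Object-oriented programming bundles data with behavior in classes;
    # subclasses share an interface but specialize it.
    class Document:
        def __init__(self, title):
            self.title = title

        def render(self):
            raise NotImplementedError

    class Letter(Document):
        def render(self):
            return "Dear reader: " + self.title

    class Memo(Document):
        def render(self):
            return "MEMO: " + self.title

    # Code that works on any Document without knowing its concrete type:
    for doc in [Letter("Greetings"), Memo("Budget review")]:
        print(doc.render())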

Out of the Box and Obsolete

For consumers, the late 1980s were a time of frustration. No sooner had they learned to run their new PCs and Macs than a new, better, larger, faster model was on the shelf. New versions of software, printers, and modems made it impossible to have the latest of anything.

In 1990, Intel's 386 and Motorola's 68030 microprocessors were at the top; then in 1991 Intel brought out the i486SX 20MHz chip and Motorola introduced the 68040. Less than a year later Intel introduced the 50MHz 486 chip and Tandy brought out its $400 CD-ROM drive for PCs. Then, just to make everyone wonder what was going on, in 1991 Apple and IBM agreed to share technology by integrating the Mac into IBM's systems and using the IBM PowerPC chip.

In 1991, Apple brought out the Apple PowerBook, a laptop that made everyone wonder just how small a full-function computer could get. Two years later everyone knew the answer when Apple introduced the Newton Personal Digital Assistant (PDA). The Newton was supposed to be able to recognize hand-written notes, and Apple sold 50,000 of them in 10 weeks.

In 1993, Intel introduced the 60MHz Pentium chip, the next generation of chips. The Pentium, however, had a nasty mathematical bug in its floating-point unit, and its acceptance was slowed. Apple discontinued the workhorse of its fleet, the Apple II, which, despite the mind-boggling changes in the industry, had lasted 17 years.

Not only were hardware and software becoming obsolete; people were also getting caught up in their own obsolescence. For years, employers had included operating system and software names in their advertising for clerical and secretarial positions. As companies used more temporary workers and included both IBM clones and Macintoshes in their operations, proficiency with only one slammed the door on employment opportunities.

Many people enrolled in classes to learn the latest software or update their computer skills. A good, well-rounded employee needed to know desktop publishing, two or more word processing programs, at least one spreadsheet program, and a graphics package. They had to be able to access the company local area network (LAN), send and receive E-mail using high-speed (28,800 bps) modems, and solve problems with hardware and software to maximize their output. Microprocessor-driven telephones, cellular phones, and pagers added to the complexity of the job, and repetitive motion syndrome from using keyboards hour after hour created an army of people wearing wrist braces.

Many people left a job where their day was spent working at a computer terminal or PC and went home to enjoy the quiet, relaxing camaraderie they found in Internet chat rooms, visiting the World Wide Web, or reading their favorite newspapers and electronic magazines (ezines).

From its inception in 1975, the PC has become a focal point of business, education, and home life. The microprocessor, an amazing technology when it had 2,300 transistors on a single chip, is now even more amazing with over 3 billion transistors on an even smaller chip. In 1982, when Time magazine made the computer its "Machine of the Year," the PC was still in its infancy. "Big Iron" still dominated the high-tech environment, and having a personal computer was a luxury.

The creation and success of the PC would not have been possible without the elimination of the concept that a computer was a large, centralized data processor and number cruncher. Today the PC is a communication channel more than it is a computational tool. Millions of people work in their "electronic cottages," either operating their own business from home or telecommuting to work. And in a fitting irony, an oft-repeated (though disputed) story holds that one of the first Intel 4004 microprocessors was installed in the Pioneer spacecraft in 1972 and continues to operate over 5 billion miles from earth, leading the world to the outer edges of time and space.

The inventor of the modern digital computer - of Bulgarian origin


17-1 Who is John Atanasoff
(by Luben Boyanov)
                                    
The name John Atanasoff is not very well known, but this is the man who created the modern digital computer. More than fifty years have passed since John Atanasoff built the first digital computer.
                                    
In 1990, President Bush awarded Prof. John Atanasoff the National Medal of Technology, the highest American award for technical achievement.
                                    
For a long time it was considered that the first electronic digital computer was ENIAC (Electronic Numerical Integrator and Computer), and one can find that name in almost any computer science book as the first example of the first generation of digital computer systems.
                                    
ENIAC was built at the University of Pennsylvania under the direction of John Mauchly and J. P. Eckert. Work on ENIAC began in 1943 and it was completed in 1946. However, in the early seventies it was established in court that key ideas behind ENIAC were taken from the ABC (Atanasoff-Berry Computer).
                                    
John Atanasoff was born in Hamilton, New York in 1903. He was educated at the University of Florida, Iowa State College, and the University of Wisconsin (PhD, physics, 1930). With the help of Clifford Berry, Atanasoff built a working model of the Atanasoff-Berry Computer (ABC) in 1942. The ABC was a special-purpose machine for solving simultaneous linear equations. It was a serial, binary machine - electronic in its computing elements, though with electro-mechanical parts - and it employed various new techniques that Atanasoff invented, including novel uses of logical circuitry and regenerative memory.
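
To make "solving simultaneous linear equations" concrete: the ABC mechanized an elimination procedure over systems of equations (it was designed to handle systems of up to 29 unknowns). Below is a compact modern sketch of Gaussian elimination in Python; the ABC's own procedure worked in binary on rotating drum memory and differed in detail.

    def solve(a, b):
        """Solve the system a*x = b by Gaussian elimination."""
        n = len(b)
        # Forward elimination: zero out the entries below each pivot.
        for k in range(n):
            for i in range(k + 1, n):
                m = a[i][k] / a[k][k]
                for j in range(k, n):
                    a[i][j] -= m * a[k][j]
                b[i] -= m * b[k]
        # Back-substitution, starting from the last equation.
        x = [0.0] * n
        for i in reversed(range(n)):
            s = sum(a[i][j] * x[j] for j in range(i + 1, n))
            x[i] = (b[i] - s) / a[i][i]
        return x

    # Two equations, two unknowns: x + y = 3 and x - y = 1.
    print(solve([[1.0, 1.0], [1.0, -1.0]], [3.0, 1.0]))  # [2.0, 1.0]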
                                    
Only recently has Atanasoff achieved recognition as one of the "fathers" of the digital computer.
                                    
During his last visit to Bulgaria, to the birthplace of his father - an emigrant orphaned in the April Uprising against the Turks - John Atanasoff said: "Like a Bulgarian, I am also a restless and creative person, and the Slav root in my blood has helped me a great deal in life."
                                    
John Atanasoff, Jr., president of Cybernetics Products, Inc., has also visited Bulgaria recently. He considers the chances of cooperation between his company and the newly emerging Bulgarian private businesses to be good.
                                    
It is worth remembering that the inventor of the first modern digital computer was of Bulgarian origin.
                                    
                                    
                                    
-------------------------------------------------------------------------------
17-2 Who is John Atanasoff
(by John Bell), last updated: 19-Jun-1995
John V. Atanasoff, 91, who invented the first electronic computer in 1939 and later saw others take credit for his discovery, died of a stroke June 15 at his home in Monrovia, Md.

Dr. Atanasoff, whose pioneering work ultimately was acknowledged during lengthy patent litigation in the 1970s, never made money off his invention, which was the first computer to separate data processing from memory. It heads the family tree of today's personal computers and mainframes.

Two other scientists, J. Presper Eckert and John W. Mauchly, drew on Dr. Atanasoff's research. In the mid-1940s, they were the first to patent a digital computing device, which they called the ENIAC (Electronic Numerical Integrator and Computer). They said they had worked out the concept over ice cream and coffee in a Philadelphia restaurant. For many years, they were acclaimed as the fathers of modern computing.

But a court battle 20 years ago between two corporate giants, Honeywell and Sperry Rand, directed the spotlight to Dr. Atanasoff. He said the idea, in fact, had come to him over bourbon and water in a roadhouse in Illinois in 1937. He was out on a drive from Iowa State University, in Ames, where he taught mathematics and physics, and had stopped to think about the computing devices he had been working on since 1935.

He needed a machine that could do the complex mathematical work he and his graduate students had been trying on desk calculators. He and two others at Iowa State already had built an analog calculator, called a laplaciometer, which analyzed the geometry of surfaces.

It was that evening in the tavern, he said, that the possibility of regenerative memory and the concept of logic circuits came to him. The machine he envisioned was different from anything conceived before. It would be electronically operated and would use base-two (binary) numbers instead of the traditional base-10 numbers. It would have condensers for memory and a regenerative process to preclude loss of memory from electrical failures. It would use direct logical action for computing rather than the counting system used in analog processes.
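
"Direct logical action" means that results come straight out of logic operations on binary digits, rather than from counting pulses or wheel positions. A one-bit full adder shows the idea; this is an illustrative modern sketch in Python, not a description of the ABC's actual add-subtract circuits.

    # "Direct logical action": binary addition built from logic
    # operations rather than counting. A one-bit full adder:
    def full_adder(a, b, carry_in):
        total = a ^ b ^ carry_in                    # XOR yields the sum bit
        carry_out = (a & b) | (carry_in & (a ^ b))  # carry logic
        return total, carry_out

    # Chain the adder to add two 4-bit numbers, least-significant bit first:
    x = [1, 0, 1, 0]   # binary 0101 = 5
    y = [1, 1, 0, 0]   # binary 0011 = 3
    carry, result = 0, []
    for a, b in zip(x, y):
        bit, carry = full_adder(a, b, carry)
        result.append(bit)
    result.append(carry)
    print(result)      # [0, 0, 0, 1, 0] -> binary 01000 = 8 = 5 + 3
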
Within months, he and a talented graduate student, Clifford Berry, had developed a crude prototype of an electronic computer. Although it used a mechanical clock system, the computing was electronic. It had two rotating drums containing capacitors, which held the electrical charge for the memory. Data were entered using punch cards. For the first time, vacuum tubes were used in computing. The project, which cost $1,000, was detailed in a 35-page manuscript, and university lawyers sent a copy to a patent lawyer.

The next year, Mauchly, a physicist at Ursinus College, near Philadelphia, whom Dr. Atanasoff had met at a conference, came to see Dr. Atanasoff's work. Mauchly stayed several days at the Atanasoff home, where he was briefed extensively about the computer project and saw it demonstrated. He left with papers describing its design.

That same year, Dr. Atanasoff tried to interest Remington Rand in his invention, saying he believed it could lead to a "computing machine which will perform all the operations of the standard tabulators and many more at much higher speeds," but the company turned him down. Years later, it would eagerly seek his assistance.

Dr. Atanasoff had hoped to file a patent for his computer, but he was called away to Washington at the start of World War II to do physics research for the Navy. And there were complications with Iowa State, which held rights to his work but had discontinued efforts to secure a patent.

By the time the computer industry was off and running, Dr. Atanasoff was involved with other areas of defense research and out of touch with computer development. The Iowa State prototype had been dismantled while he was away working for the Navy. But he had kept his research papers.

He later said he "wasn't possessed with the idea I had invented the first computing machine. If I had known the things I had in my machine, I would have kept going on it."

The Atanasoff prototype finally was recognized as the father of modern computing when, in a patent infringement case Sperry Rand brought against Honeywell, a federal judge voided Sperry Rand's patent on the ENIAC, saying it had been derived from Dr. Atanasoff's invention.

                                         It was "akin to finding a new father of electricity to replace
                                    Thomas Edison," said a writer on the computer industry.  The
                                    decision made news in the industry, but Dr. Atanasoff, th this time
                                    retired, continued to live in relative obscurity in Frederick County.
Later, in 1988, two books about his work were published: "The First Electronic Computer: The Atanasoff Story," by Alice R. Burks and Arthur W. Burks, and "Atanasoff, Forgotten Father of the Computer," by Clark R. Mollenhoff. Other articles were published in the Annals of the History of Computing, Scientific American and Physics Today.

In 1990, President George Bush acknowledged Dr. Atanasoff's pioneering work by awarding him the National Medal of Technology.

John Vincent Atanasoff was born in Hamilton, N.Y. He was an electrical engineering graduate of the University of Florida and received a master's degree in mathematics from Iowa State University, where he taught for 15 years. He received a doctorate in physics from the University of Wisconsin.

Dr. Atanasoff left Iowa State in the early 1940s to become director of the underwater acoustics program at the Naval Ordnance Laboratory at White Oak, now the Naval Surface Weapons Center, where he worked largely with mines, mine countermeasures and depth charges.

He participated in the atomic weapons tests at Bikini Atoll after World War II and became chief scientist for the Army Field Forces, at Fort Monroe, Va., in 1949. He returned to the ordnance laboratory after two years to be director of the Navy Fuze programs, and in 1952 he began his own company, Ordnance Engineering Corp.

That company was sold to Aerojet Engineering Corp. in 1956, and Dr. Atanasoff was named a vice president. After he retired in 1961, he was a consultant and continued to work in computer education for young people. He also developed a phonetic alphabet for computers.

His honors included the Navy's Distinguished Civilian Service Award, five honorary doctorates, the Computer Pioneer Medal of the Institute of Electrical and Electronics Engineers, the Holley Medal of the American Society of Mechanical Engineers and the Distinguished Achievement Citation of Iowa State University. He was a member of the Iowa Inventors Hall of Fame. Dr. Atanasoff, whose father was born in Bulgaria, also was awarded Bulgaria's highest science award and was a member of the Bulgarian Academy of Science.

He was a member of the Phi Beta Kappa, Pi Mu Epsilon and Tau Beta Pi honorary societies and the Cosmos Club.

Dr. Atanasoff's marriage to Lura Meeks Atanasoff ended in divorce.

Survivors include his wife, Alice Crosby Atanasoff of Monrovia; three children from his first marriage, Elsie A. Whistler of Rockville, Joanne A. Gathers of Mission Viejo, Calif., and John V. Atanasoff II of Boulder, Colo.; four sisters; three brothers; 10 grandchildren; and seven great-grandchildren.

                                    

The History of Internet Explorer
by Scott Schnoll

Copyright © 1998-2001 - All Rights Reserved

Internet Explorer is Microsoft’s World Wide Web browser, and the name for a set of Internet-based technologies that provide browsing, email, collaboration and multimedia features to millions of people around the world.  It is a four-year-old product that has received glowing reviews from end users and the media, harsh criticism from Microsoft’s competitors and the anti-Microsoft crowd, and it is one of the cornerstones of an ongoing antitrust trial that the Department of Justice has brought against Microsoft.  It remains a testament to Microsoft’s ability to turn its product strategy on a dime, it is used by millions upon millions of users to navigate the World Wide Web, and it has emerged the victor in the long-standing browser wars with Microsoft’s competitor, Netscape Communications Corporation.

To properly understand the security aspects surrounding Internet Explorer, I believe one should begin with a historical perspective.  This is important for two reasons.  First, given the many different released versions of Internet Explorer, you need to determine where you are in the Internet Explorer product timeline.  Only then will you be able to determine what security issues you’re facing and what you can do about them.  Second, and more importantly, Internet Explorer is here to stay.  Microsoft has forever interwoven the Internet Explorer suite of products and set of technologies into its Windows, Office and BackOffice family product lines.  There are over 200 million Windows users, and I don’t think Windows is going to disappear any time soon.

The Beginning of an Era

In 1995, Microsoft was busily working on a very important project, code-named “Chicago.”  An extension of that project – code-named “O’Hare” after Chicago’s O’Hare Airport – was being developed in tandem.  Microsoft’s intent was to combine the technologies of both projects into a single consumer product.  Toward the end of these projects, Microsoft decided to take the O’Hare technologies and distribute them as part of a separate add-on pack to the Chicago product.  Chicago, now known as Windows 95, proved to be one of the most successful operating systems to date.  O’Hare, now known as Internet Explorer 1.0, first shipped as an Internet Jumpstart Kit in Microsoft Plus! for Windows 95.

Although Internet Explorer 1.0 integrated nicely with Windows 95, few customers used it, preferring instead to use the highly popular browser from Netscape Communications Corporation, or other web browsers such as Mosaic, Lynx and Opera.  Microsoft remained undeterred.  Microsoft’s market research indicated that their customers wanted to use Windows 95 as a universal network client; one that could connect to Windows NT, Novell NetWare, Banyan Vines, and the Internet.

Microsoft made great strides over the next year with version 2.0.  This was Microsoft’s first cross-platform browser, available to both Macintosh and 32-bit Windows users.  Version 2 introduced support for a wide variety of emerging Internet technologies, such as Secure Sockets Layer (SSL), HTTP cookies, RealAudio, Virtual Reality Modeling Language (“VRML”), and support for Internet newsgroups (NNTP).  We’ll discuss these things more in depth in forthcoming chapters.

Full Steam Ahead

In the summer of 1996, Microsoft released version 3.0, which seemingly overnight triggered a mass exodus from Netscape’s browser to Internet Explorer.  The Internet community became polarized on the issue of which web browser had the most features and the most support for the latest Internet technologies, as well as which one more closely adhered to RFCs and other Internet standards.  Internet Explorer 3 boasted a wide variety of features, including support for video and audio multimedia, Java applets, cascading style sheets, and Microsoft’s ActiveX controls.  Ever since the release of version 3, the browser wars have raged on.  But the debate was nearly made moot by one distinguishing aspect – Netscape charged nearly $50 for its web browser, while Microsoft gave Internet Explorer away for free.

One of the primary reasons behind the success of Microsoft Office was the fact that it was a bundled suite of products.  Microsoft felt that, by applying this practice to Internet Explorer, they would be able to duplicate this success.  So they introduced additional integrated components when they released version 3, such as Internet Mail and News 1.0, a Windows Address Book, and later on, Microsoft NetMeeting and the Windows Media Player.  As a result of these new compelling features, version 3’s popularity skyrocketed.  This new and quickly increased popularity had the unintended side effect of putting Microsoft and its web browser under intense public scrutiny.

Trouble Begins to Brew

Technologists and pundits began to write about how Microsoft was trying to dominate the Internet by flooding the market with their web browser and turning the Internet into a Microsoft proprietary domain.  Others were concentrating on other issues, such as browser security.  There was much to be concerned about.  On August 22, 1996, a mere nine days after Internet Explorer 3 was released, the first Internet Explorer security problem was reported – The Princeton Word Macro Virus Loophole.

The Princeton Word Macro Virus Loophole should have been a wake-up call for Microsoft.  Discovered by two well-known Princeton researchers – Edward Felten and Dirk Balfanz – this security hole enabled a malicious webmaster to download files to an unsuspecting user’s PC without their knowledge.  This could be any file, including a Microsoft Word macro that could in turn execute DOS commands.  Or worse, a malicious webmaster could transmit a virus, a Trojan program that could open a “back door” into the target system, or a program designed to discreetly transmit files back to the malicious web site.

The very next day, Microsoft released a patch to close the Princeton Word Macro Virus Loophole.  While Microsoft downplayed the significance of the loophole, the Internet community was becoming increasingly concerned.  Months before reporting this loophole, Felten reported his discovery of some serious Java vulnerabilities in Netscape Navigator.  The picture was becoming clear – this new territory called the Internet could be a dangerous place.

More and more security bugs started appearing.  In December 1996, Felten reported another security flaw in Internet Explorer.  This flaw allowed malicious websites to “spoof” other web sites.  A spoofed web site is a site that looks real; it can literally be an identical copy of a real site, except that it isn’t hosted on a web server that belongs to the web site you think you’re visiting.  In other words, while you think you’ve just purchased the latest subscription to Foo Magazine, you’ve actually just transmitted your credit card number and other personal information to a fake site.
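
The mechanics behind spoofing are mundane: the text a link displays and the address it actually points to are set independently. A small Python illustration, with invented URLs:

    # The text a link displays and the address it points to are set
    # independently -- the basic ingredient of a spoofed page.
    from html.parser import HTMLParser

    class LinkChecker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.href = None
            self.text = ""

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.href = dict(attrs).get("href")

        def handle_data(self, data):
            self.text += data

    page = '<a href="http://fake-site.example">www.foomagazine.com</a>'
    checker = LinkChecker()
    checker.feed(page)
    print("shown:  ", checker.text)   # www.foomagazine.com
    print("goes to:", checker.href)   # http://fake-site.example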

Month after month, one security problem after another was being steadily reported.  There were numerous vulnerabilities which exposed computer files to malicious web sites; there were other bugs that inadvertently transmitted encrypted information in plain text to unauthorized sites; and there was the revelation that Internet Explorer maintained a bit-by-bit record of where users went online.  Between Java bugs, scripting holes, Year 2000 problems, and a growing anti-Microsoft sentiment, Microsoft was being attacked on all sides, all because of Internet Explorer.

Goodbye Web Browser, Hello Integrated Functionality

Microsoft’s strategy for Internet Explorer took an interesting turn in late 1997 when Microsoft claimed that, once installed, Internet Explorer 3.0 could not be completely uninstalled from Windows 95.  This claim was made early on in the still-running antitrust trial against Microsoft, and hotly disputed by many, including the Department of Justice.  Again, Microsoft was undeterred.  In fact, in September 1997 they stepped up their efforts to improve upon version 3 by releasing an all new version – version 4 – one that was completely integrated into Windows 95, Windows NT and, when later released, Windows 98.

Internet Explorer 4 represented a quantum leap over the prior versions of Internet Explorer.  In 1990, Microsoft had unveiled its “Information at Your Fingertips” (IAYF) campaign.  According to Microsoft, IAYF means “the right information at the right time for the right purpose.”  Microsoft’s goal was to make finding, browsing and retrieving information easy, with access to the information location-independent.  Internet Explorer 4 was a major milestone in this campaign.  In fact, it was so critical to their vision, that Microsoft completely scrapped earlier betas and alphas of Internet Explorer in favor of the version that is available today.

Microsoft was targeting three major markets with this latest version.  For companies and organizations, Internet Explorer 4 would make users more productive and evangelize intranets, while allowing IS departments a granular level of control.  For home users, Internet Explorer 4 provided a much richer Internet experience.  For programmers and software developers, Internet Explorer 4 provided a platform for delivering interactive and compelling content.

But it was much more than that.  The launch of Internet Explorer 4 meant the end of the already extremely blurred line between Windows and Internet Explorer.  In Windows terminology, the word “shell” refers to the user interface (“UI”).  When Windows 95 debuted, the original Windows Program Manager shell was replaced with the Windows Explorer shell.  Explorer was a slick, new interface that caught on, and allowed novice users to quickly learn how to use Windows.  When a Windows 95 user installed Internet Explorer 4, their Explorer shell was replaced with Internet Explorer.  On the surface, the user didn’t notice much change.  The changes were there, however, and they were significant.  Internet Mail and News was replaced with Outlook Express, Microsoft Chat was added and Microsoft NetMeeting was upgraded.  In addition, Microsoft introduced a new feature called the “Active Desktop.”  This allowed Internet Explorer 4 users to replace their normal desktop and wallpaper with any web content they wanted.  Instead of icons and a single wallpaper image, Internet Explorer 4 users could, in effect, create their own custom UI for Windows.  It also brought drag-and-drop functionality to the Start Menu, and added integrated Favorites, a Quick Launch Bar and Address Bars.

Thanks, but No Thanks

Despite this power and flexibility, many users didn’t care for the Active Desktop.  Some felt that this feature was “code bloat,” that is, a feature that no one really wanted, but that Microsoft added anyway because they thought it was cool.  To a certain extent, they were right.  A lavishly customized Active Desktop can add quite a bit of resource overhead to a Windows PC.  Many Windows users were still running with 28.8Kbps modem connections and 32MB of RAM or less in their systems, and, when turned on, the Active Desktop would slow the system to a crawl.  Today’s systems, however, are significantly more powerful than those of 1997, making the Active Desktop features useful and richly interactive.

Internet Explorer 4 also introduced a slew of new features, such as Channels, Subscriptions, Dynamic HTML, enhanced multimedia, and webcasting.  Security was also beefed up with the addition of Authenticode 2.0 and Security Zones.  Channels, subscriptions and webcasting (aka “Push” technology) were Microsoft’s efforts to move from a technology company to a content company.  This only fueled the now prevalent fears that Microsoft’s intent was to dominate the Internet.  Some went so far as to claim that, by dumping its web browser into the market for free, Microsoft would control who got on the Internet, where they went, and what they would see.  The very nature of the Internet made this a technical impossibility, but nonetheless, people complained.

Despite Microsoft’s best attempts to add features, provide integration, and secure Internet Explorer, everything they did seemed to backfire.  Customers didn’t like Internet Explorer 4’s heavy footprint or the way Active Desktop performed.  Microsoft’s partners didn’t like having to license and distribute Internet Explorer 4 – unmodified – in order to retain their status as a Windows licensee.  And security experts worldwide, such as Carnegie Mellon’s Computer Emergency Response Team (“CERT”), were reporting one serious security hole after another.

A Very Long Life – In Internet Time

The concept of “Internet Time” refers to the frenzied and never-ending pace at which things on the Internet, or things related to the Internet, occur.  It’s a sort of “dog years” analogy for technology.  For example, say your company’s product happens to be a web browser.  Software development cycles can run anywhere from twelve months to several years.  But on Internet Time, the development cycle might now be six months to a year.  By Internet Time standards, Internet Explorer 4 has enjoyed an extremely long life cycle. 

It is common for development on the next version of a product to occur simultaneously with the release or near-release of the current version.  This is what happened with Internet Explorer 4.  Version 3 was an ambitious project to begin with.  The project – code-named “Athena” – was scheduled to be released in the Summer of 1996, and it was supposed to include a web browser, an email client and news reader, a new TCP/IP auto-dialer, and Microsoft NetMeeting.

Athena would also be the primary client in another project – code-named “Normandy.”  Normandy was a product line comprised of various Internet-related technologies, such as Microsoft Chat Server, Microsoft Personalization Server, Internet News Server, Microsoft Merchant Server, and others.  The “summer Internet package,” as it came to be known, would later become blended into another project – code-named “Nashville” – which was to be the successor to the Windows 95 UI shell.

Late in the development cycle for Internet Explorer 3, it became apparent that Microsoft would not be able to deliver Athena as planned in the Summer of 1996.  So, Microsoft cut back on their plans and released Internet Explorer 3, Internet Mail and News 1.0 and Microsoft NetMeeting 1.0.  Microsoft then began working on a new project under the code-name of “Nashville.”  Nashville was being billed as an “Internet Update Release.”  Microsoft had ambitious plans for Nashville.  It would be a web browser (at the time based on Internet Explorer 3), an email client, a news reader, a personal web server, data and audio conferencing, and a personal information manager.  More importantly, it would replace the existing Windows shell, making it a completely integrated product.  Their intent was to release a new version of Windows with Nashville blended in.

Nashville’s goal was to evolve the Windows 95 shell to provide integration between the user’s PC and the Internet, blurring (and removing) the boundary between Windows 95 and Internet Explorer.  The Nashville team merged elements from the Windows 95 Explorer with features from Internet Explorer, and created a new shell (which is still called Explorer).  That goal was realized on September 30, 1997, when Microsoft released Internet Explorer 4.

The demand for version 4 was impressive.  In the first 24 hours it was available, it was being downloaded once every six seconds.  This amounted to the transmission of a whopping ten terabytes of data!  The demand exceeded everyone’s expectations, including Microsoft’s.  But in a matter of days, security issues began cropping up, and Microsoft began releasing what was to be a long stream of patches, updates and service packs, resulting in a number of different builds for version 4.

Resistance is Futile

Microsoft continues to integrate Internet Explorer into its other product lines, including its Office and BackOffice families of products.  Microsoft Outlook 98 – like its cousin Outlook Express – uses Internet Explorer’s HTML parsing and rendering engine.  Therefore, if you install Outlook 98 onto a computer without version 4 or higher, Internet Explorer gets installed as well.  Office 2000 extends this practice by including and using Internet Explorer 5 technologies.  This foundational approach makes sense.  Why reinvent the wheel (or in this case, why rewrite the code) if it already exists?  On the other hand, this also means that security issues that affect Internet Explorer more often than not also affect products which use its codebase.  This only adds to the already mounting challenges of maintaining a safe and secure operating environment.

Internet Explorer 4 continues to be a popular browser.  Nearly two years after its release it is still the most popular version in use today.  It is feature-rich, user-friendly and highly configurable.  On March 18, 1999, Microsoft capitalized on version 4’s success with the release of Internet Explorer 5.  Before it was even released, over 2 million copies of the beta version were downloaded.  Version 5 isn’t too much of a departure from version 4.  It does add some very nice features, but like its predecessors, it, too, has security vulnerabilities.  In fact, it’s a pretty safe assumption that all future versions of Internet Explorer – as with any web browser – will be affected by one or more security issues.

So there you have it.  The history thus far of Internet Explorer.
