Computer History

Curious about the history of the computer and its evolution into its current-day form?  Well, below is a detailed history created especially for you by Reyes Enterprises CEO, Mr. Eugene Reyes.

Howard Andres
COO, Vice President
Reyes Enterprises


Computer History

By: Eugene Reyes

The history of computing began with mechanical calculating machines.  In 1623, German scientist Wilhelm Schickard invented a machine that used 11 complete and 6 incomplete sprocketed wheels and could add and, with the aid of logarithm tables, multiply and divide.

French philosopher, mathematician, and physicist Blaise Pascal invented a machine in 1642 that added and subtracted, automatically carrying and borrowing digits from column to column.  Pascal built 50 copies of his machine, but most served as curiosities in parlors of the wealthy.  Seventeenth-century German mathematician Gottfried Leibniz designed a special gearing system to enable multiplication on Pascal's machine.
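
The column-by-column carrying that Pascal's wheels performed mechanically is the same procedure still used for multi-digit addition.  The short C sketch below is purely illustrative (the eight-column register width and the function name are inventions for this example, not a model of the Pascaline's actual gearing); it adds two numbers one decimal column at a time, propagating the carry from each column to the next, just as the machine did.

    #include <stdio.h>

    #define COLUMNS 8   /* an assumed register width; real Pascalines varied */

    /* Add two numbers held as arrays of decimal digits (least significant first),
     * carrying from each column to the next the way the machine's wheels did. */
    static void add_columns(const int a[COLUMNS], const int b[COLUMNS], int sum[COLUMNS])
    {
        int carry = 0;
        for (int i = 0; i < COLUMNS; i++) {
            int column = a[i] + b[i] + carry;  /* one column of digits plus the carry */
            sum[i] = column % 10;              /* digit left showing on this wheel */
            carry  = column / 10;              /* carry handed to the next wheel */
        }
        /* A carry out of the last column is simply lost, like an overflowing wheel. */
    }

    int main(void)
    {
        /* 1742 + 359, digits stored least significant first */
        int a[COLUMNS] = {2, 4, 7, 1, 0, 0, 0, 0};
        int b[COLUMNS] = {9, 5, 3, 0, 0, 0, 0, 0};
        int sum[COLUMNS];

        add_columns(a, b, sum);

        for (int i = COLUMNS - 1; i >= 0; i--)
            printf("%d", sum[i]);
        printf("\n");                          /* prints 00002101 */
        return 0;
    }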

In the early 19th century, French inventor Joseph-Marie Jacquard devised a specialized type of computer: a loom.  Jacquard's loom used punched cards to program patterns that were output as woven fabrics by the loom.  Though Jacquard was rewarded and admired by French emperor Napoleon I for his work, he fled for his life from the city of Lyon pursued by weavers who feared their jobs were in jeopardy due to Jacquard's invention.  The loom prevailed, however: When Jacquard passed away, more than 30,000 of his looms existed in Lyon.  The looms are still used today, especially in the manufacture of fine furniture fabrics.

Another early mechanical computer was the Difference Engine, designed in the early 1820s by British mathematician and scientist Charles Babbage.  Although never completed by Babbage, the Difference Engine was intended to be a machine with a 20-decimal capacity that could solve mathematical problems.  Babbage also made plans for another machine, the Analytical Engine, considered the mechanical precursor of the modern computer.  The Analytical Engine was designed to perform all arithmetic operations efficiently; however, Babbage's lack of political skills kept him from obtaining the approval and funds to build it.  Augusta Ada Byron (Countess of Lovelace, 1815-52) was a personal friend and student of Babbage.  She was the daughter of the famous poet Lord Byron and one of only a few women mathematicians of her time.  She prepared extensive notes concerning Babbage's ideas and the Analytical Engine.  Ada's conceptual programs for the Engine led to the naming of a programming language (Ada) in her honor.  Although the Analytical Engine was never built, its key concepts, such as the capacity to store instructions, the use of punched cards as a primitive memory, and the ability to print, can be found in many modern computers.

Herman Hollerith, an American inventor, used an idea similar to Jacquard's loom when he combined the use of punched cards with devices that created and electrically read the cards.  Hollerith's tabulator was used for the 1890 U.S. census, and it made the computational time three to four times shorter than the time previously needed for hand counts.  Hollerith's Tabulating Machine Company eventually merged with other companies to form the firm that, in 1924, was renamed IBM.

In 1936, British mathematician Alan Turing proposed the idea of a machine that could process equations without human direction.  The machine (now known as a Turing machine) resembled an automatic typewriter that used symbols for math and logic instead of letters.  Turing intended the device to be used as a "universal machine" that could be programmed to duplicate the function of any other existing machine.  Turing's machine was the theoretical precursor to the modern digital computer.
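
As a rough illustration of Turing's idea (a tape of symbols, a read/write head, and a fixed table of rules), the hypothetical C sketch below simulates a trivial machine that scans right across a block of 1s and turns the first blank cell into a 1, in effect adding one to a number written in unary.  The states, symbols, and rule table are a toy example invented here, not anything Turing specified.

    #include <stdio.h>
    #include <string.h>

    #define TAPE_LEN 32
    #define HALT     (-1)

    /* One rule: in `state`, reading `read`, write `write`, move the head, enter `next`. */
    struct rule {
        int  state;
        char read;
        char write;
        int  move;   /* +1 = right, -1 = left */
        int  next;
    };

    /* A toy rule table (invented for this sketch): skip right over 1s,
     * turn the first blank into a 1, then halt. */
    static const struct rule rules[] = {
        { 0, '1', '1', +1, 0    },
        { 0, '_', '1', +1, HALT },
    };

    int main(void)
    {
        char tape[TAPE_LEN];
        memset(tape, '_', sizeof tape);   /* a blank tape... */
        memcpy(tape, "111", 3);           /* ...holding the number 3 in unary */

        int state = 0, head = 0;
        while (state != HALT && head >= 0 && head < TAPE_LEN) {
            int matched = 0;
            for (size_t i = 0; i < sizeof rules / sizeof rules[0]; i++) {
                if (rules[i].state == state && rules[i].read == tape[head]) {
                    tape[head] = rules[i].write;
                    head      += rules[i].move;
                    state      = rules[i].next;
                    matched    = 1;
                    break;
                }
            }
            if (!matched)
                break;                    /* no applicable rule: the machine halts */
        }

        printf("%.*s\n", TAPE_LEN, tape); /* prints 1111 followed by blanks: 3 + 1 = 4 */
        return 0;
    }

Changing only the rule table makes the same loop simulate a different machine, which is the sense in which Turing's single design was "universal."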

In the 1930s, American mathematician Howard Aiken developed the Mark I calculating machine, which was built by IBM.  This electromechanical calculating machine used relays and electromagnetic components to replace mechanical components.  In later machines, Aiken used vacuum tubes and solid-state transistors (tiny electrical switches) to manipulate the binary numbers.  Aiken also introduced computers to universities by establishing the first computer science program at Harvard University.  Aiken never trusted the concept of storing a program within the computer.  Instead, his computer had to read instructions from punched cards.

At the Institute for Advanced Study in Princeton, Hungarian-American mathematician John von Neumann developed one of the first computers used to solve problems in mathematics, meteorology, economics, and hydrodynamics.  Von Neumann's design for the Electronic Discrete Variable Automatic Computer (EDVAC), described in his 1945 report, was among the first for an electronic computer that stored its program entirely within its own memory.

John Mauchly, an American physicist, proposed an electronic digital computer, called the Electronic Numerical Integrator And Computer (ENIAC), which was built at the Moore School of Engineering at the University of Pennsylvania in Philadelphia by Mauchly and J. Presper Eckert, an American engineer.  ENIAC was completed in 1945 and is regarded as the first successful general-purpose electronic digital computer.  It weighed more than 27,000 kg (60,000 lb) and contained more than 18,000 vacuum tubes.  Roughly 2,000 of the computer's vacuum tubes were replaced each month by a team of six technicians.  Many of ENIAC's first tasks were for military purposes, such as calculating ballistic firing tables and designing atomic weapons.  Since ENIAC was initially not a stored-program machine, it had to be physically rewired for each new task.

Eckert and Mauchly eventually formed their own company, which was then bought by Remington Rand.  They produced the Universal Automatic Computer (UNIVAC), which was used for a broader variety of commercial applications.  By 1957, 46 UNIVACs were in use.

In 1948, at Bell Telephone Laboratories, American physicists Walter Houser Brattain, John Bardeen, and William Bradford Shockley developed the transistor, a device that can act as an electric switch.  The transistor had a tremendous impact on computer design, replacing costly, energy-inefficient, and unreliable vacuum tubes.

In the late 1960s integrated circuits, tiny transistors and other electrical components arranged on a single chip of silicon, replaced individual transistors in computers.  Integrated circuits became miniaturized, enabling more components to be designed into a single computer circuit.  In the 1970s refinements in integrated circuit technology led to the development of the modern microprocessor, integrated circuits that contained thousands of transistors.  Modern microprocessors contain as many as 10 million transistors.

On August 12, 1981, IBM executives held a press conference in New York to introduce a momentous new computer--the IBM Personal Computer, or the PC, as it became known.  The personal computer industry dates back to the introduction of the first microprocessor, the Intel 4004, in 1971.  Nevertheless, the industry really took off following the January 1975 issue of the Ziff-Davis magazine Popular Electronics, which trumpeted as its "Project Breakthrough" the Altair 8800 from MITS, a machine the magazine dubbed "the world's first minicomputer kit to rival commercial models."

By today's standards, that initial kit was quite limited.  Developed by Ed Roberts, who headed MITS, a small electronics company in Albuquerque, New Mexico, it was based on Intel's 8080 microprocessor and had only 256 bytes of memory.

Priced at a very affordable $397, the Altair was the first personal computer widely available to the public.  It attracted hundreds of orders from electronics enthusiasts.

One who noticed this seminal event was a young Honeywell programmer named Paul Allen, who showed the Popular Electronics article to an old friend, a Harvard freshman named Bill Gates.  The pair worked together to write a version of BASIC for the Altair.  Soon, Allen went to work for MITS as its director of software, and shortly thereafter Gates left Harvard to join Allen in Albuquerque and to start a company that would later be named Microsoft.

With the Altair's introduction, the personal computer industry took off.  The year 1977 saw an explosion of interest in personal computers and the introduction of a long succession of machines--the Commodore PET, the Radio Shack TRS-80, and most important, Steve Wozniak and Steve Jobs's Apple II.

The Apple II quickly developed into its own standard, helped out enormously by Wozniak's 1978 design for an inexpensive floppy disk drive and--even more important--by Dan Bricklin and Bob Frankston's VisiCalc, the first electronic spreadsheet.  With the introduction of VisiCalc, business people suddenly had a reason for using personal computers.  It was not just a hobbyist's world anymore.

The rest of the decade saw dozens of very different designs, as one new company after another tried to define a unique combination of power, price, performance, and features.  Machines introduced in this period ranged from offerings for home and hobbyist users--such as Commodore's Vic-20 and 64, Atari's 400 series, and Texas Instruments' TI 99--to more business-oriented devices, such as a series of machines from Tandy/Radio Shack and a host of designs that ran Digital Research's operating system CP/M, which was written by personal computing pioneer Gary Kildall.

Because the market was growing so fast, and because in those early days backward compatibility did not mean much, the period was marked by a burst of hardware creativity never seen since.  In addition, software began to grow as well, with the rapid appearance of a variety of early programming languages, games, and even business applications, such as the popular word processor WordStar.

Before long, nobody viewed personal computers as toys or hobbyist projects; they were devices for personal productivity with clear business applications.  The era of the personal computer was firmly established.  Moreover, IBM, which had long dominated mainframe computers, wanted a piece of the action.

The IBM of 1980, much more than the IBM of today, was not a company accustomed to fast-moving markets and consumer sales.  It sold business machines--primarily computers and typewriters--to businesses, using its own technology and relying very heavily on a very structured system of sales and service to large accounts.

The PC business required something different.  This new market was moving quite fast, and a new entrant would have to move quickly.  In addition, it would need to target individuals as well as businesses, even if the ultimate aim was to continue to sell business computers.  This is what William C. Lowe, laboratory director of IBM's Entry Level Systems (ELS) unit in Boca Raton, Florida, told IBM's Corporate Management Committee, including IBM president John Opel, in July 1980.

Lowe told the committee that IBM needed to build a personal computer and that there was room in the market that Apple and others had left untapped.  He also told the committee that such a machine could not be built within IBM's standard culture of the time, so the committee gave him the freedom to recruit 12 engineers to form a task force, called Project Chess, and to build a prototype computer.

In the next month, Lowe's task force had a number of meetings with other players in the young industry and made a number of key decisions that ultimately would affect the PC business for years to come.  One was the decision to sell IBM's personal computer through retail stores in addition to offering it through IBM's own commissioned sales staff.  Perhaps the company's most important decision, however, was to use an "open architecture": to choose the basic components and operating system from sources outside of IBM.  It was a big departure for IBM, which up to that point typically had designed all the major components of its machines.

In August, Lowe and two engineers, Bill Sydnes and Lew Eggebrecht, demonstrated a prototype to the Corporate Management Committee, which approved the basic plan and gave Project Chess the permission to proceed to create a personal computer, code-named Acorn.

To head the group that pulled it together, Lowe turned to Philip D. "Don" Estridge, another longtime IBM employee, who worked at the Boca Raton labs.  Estridge recruited a team that included Sydnes, who headed engineering, Dan Wilkie, who was in charge of manufacturing, and H.L. "Sparky" Sparks, who headed sales.

One early decision they had to make was to choose the processor to power the PC.  The task force had decided they wanted a 16-bit computer, because it would be more powerful and easier to program than existing 8-bit machines.  Intel had recently announced the 16-bit 8086, but Sydnes later said that IBM was concerned that the 8086 would be too powerful and compete too much with other IBM entries.

So instead, they chose the 8088, a version of the chip that had an 8-bit bus and a 16-bit internal structure.  This 8-bit technology offered the added benefit of working with existing 8-bit expansion cards and with relatively inexpensive 8-bit devices, such as controller chips, which could thus be incorporated easily and inexpensively into the new machine.

Another key decision was software.  In July, members of the task force went to visit Digital Research to ask the firm to port its CP/M operating system to the 8086 architecture.  Legend has it that founder Gary Kildall was flying his plane at the time.  Whatever the reason, Kildall's wife, Dorothy, and DR's attorneys did not sign the nondisclosure agreement IBM presented.  So the IBM team left and flew north to Seattle to meet with Microsoft, from which they had hoped to obtain a version of BASIC.

Microsoft officials signed an agreement with IBM for BASIC, and soon Bill Gates and company were discussing not only BASIC but also an operating system.  Quickly thereafter, Microsoft acquired an 8086 operating system called QDOS, or Quick and Dirty Operating System, written by Tim Paterson at a company called Seattle Computer Products.  Microsoft further developed this operating system and licensed it to IBM, which sold it as PC-DOS.

Fevered months of putting together the hardware and software ensued.  Then, on Wednesday, August 12, 1981, almost exactly a year after Project Chess was given the go-ahead, IBM introduced the IBM Personal Computer.  Sold initially at ComputerLand outlets and Sears Business Centers, that first PC--with an 8088 CPU, 64K of RAM, and a single-sided, 160K floppy disk drive--had a list price of $2,880.  When the IBM PC shipped in October, Estridge--by then considered the father of the PC--and his team had a runaway hit.

That original IBM PC had some great features--and some clear limitations.  It had a 4.77-MHz Intel 8088 processor, trumpeted as a "high-speed 16-bit microprocessor," but the PC only had an 8-bit data bus.  The machine initially came with 16K of RAM standard on the motherboard, expandable to 64K, but the processor could reach much more: its 20 address bits permitted the PC to address 1 megabyte of physical memory, a huge leap forward at the time.  While the PC was capable of displaying graphics, you had to buy an optional graphics card to do so, because the base machine came only with a monochrome adapter.  The advertised price didn't include a monitor--or even a serial or parallel port.
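
The 1-megabyte figure follows directly from those 20 address lines: 2^20 = 1,048,576 bytes.  On the 8088, a program formed such an address from a 16-bit segment and a 16-bit offset (physical address = segment x 16 + offset), which is how a 16-bit chip reached a 20-bit address space.  The small C program below simply works through that arithmetic; the example segment and offset values are just convenient illustrations.

    #include <stdio.h>

    int main(void)
    {
        /* 20 address lines give 2^20 distinct byte addresses. */
        unsigned long addressable = 1UL << 20;
        printf("2^20 = %lu bytes = %lu KB = 1 MB\n", addressable, addressable / 1024);

        /* Real-mode 8088 addressing: physical = segment * 16 + offset.
         * B000:0000 is the monochrome text buffer mentioned above. */
        unsigned long segment = 0xB000, offset = 0x0000;
        printf("segment %04lX, offset %04lX -> physical address %05lX\n",
               segment, offset, segment * 16 + offset);

        /* The highest combination, FFFF:FFFF, lands just past the 1MB mark. */
        printf("FFFF:FFFF -> physical address %05lX\n", 0xFFFFUL * 16 + 0xFFFFUL);
        return 0;
    }

The last line lands at 10FFEF, just beyond 1MB; later versions of DOS exploited that overflow region as the "high memory area."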

Limitations aside, it was an immediate hit.  Beginning in the fall, according to IDC, IBM sold 35,000 machines by the end of 1981, and overall sales were five times initial projections.  In part, the sales effort was helped by a brilliant marketing campaign featuring the Little Tramp, the character Charlie Chaplin popularized in movies such as Modern Times.

The technical limitations of the original PC helped spark the development of other, third-party markets.  For example, the 8-bit data bus opened the door for add-on board manufacturers, which almost immediately started offering boards that had serial or parallel ports, graphics adapters, or extra memory up to 256K per board.  These boards could be combined so that the machine could use the full 640K of the processor's 1MB address space that the PC's design set aside for system RAM.  Different vendors offered a wide variety of add-on features.  Major early players included Tecmar, Quadram, and AST, which originally gained fame with its Sixpack add-on card.

In software, the choices soon grew as well.  A BASIC language was included with the PC; many early users learned the language, ported applications from other systems, and created many interesting utilities.  There was also a lot of software written specifically for the IBM PC.  Besides providing PC-DOS, IBM also announced support for CP/M-86 and the UCSD p-System, operating systems that provided a little competition.  In fact, there were a few programs written for each of those systems, but it did not take long for PC-DOS to become accepted as the standard.

When the PC shipped, IBM announced a number of initial applications, including VisiCalc, a series of accounting programs from Peachtree Software, a word processor called EasyWriter from Information Unlimited Software (IUS), and Microsoft Adventure.  In most cases, these products were soon challenged by a variety of other packages.  For instance, though EasyWriter was first, more capable products--including WordStar, then MultiMate and later WordPerfect--eclipsed it over the next several years.

In the spreadsheet arena, VisiCalc's authors and publisher had a disagreement that delayed future versions, and it was soon challenged by new "integrated software."  Most important was a new program designed by Mitch Kapor, who had earlier designed a charting package that worked with VisiCalc on Apple II machines.  Kapor called his new company Lotus Development Corporation and named his product Lotus 1-2-3.  In the early days, 1-2-3 faced competition from programs like Context MBA, which relied on the UCSD p-System.  However, 1-2-3 wrote directly to the IBM PC's video memory, bypassing DOS.  As a result, it was fast, and it took over the PC market.  Just as VisiCalc had been the "killer application" for the Apple II, Lotus 1-2-3 played that role for the IBM PC.

Soon there were a variety of programs in other fields, ranging from programs that originally started life on other platforms--such as Ashton-Tate's dBASE II--to many programs written specifically for the IBM PC, including Microsoft Flight Simulator.  By 1983, most major software developers were writing their software for the IBM PC, and the effect on most of the competing machines was staggering.  Many companies that focused on earlier, non-IBM hardware and software standards, such as Osborne Computer, went out of business.

A number of competitors decided that DOS (the generic term for the operating system Microsoft called MS-DOS and IBM called PC-DOS) was the standard, but compatibility across many machines did not come quite that easily.  For a while, several vendors offered machines that were like the IBM PC, "only better."  The DEC Rainbow offered compatibility not only with the 8088 but also with Z80 software.  AT&T's 6300 and the Texas Instruments Professional offered better graphics.  And Microsoft soon pronounced that DOS 2.x would become the standard.

Despite this rush to DOS compatibility, all these machines ultimately failed, primarily because they could not run all the software that the IBM PC could.  Most of the popular early applications, such as Lotus 1-2-3, WordStar, SuperCalc, and MultiPlan, were DOS applications, but they were written to circumvent DOS and BIOS code to deal directly with the IBM hardware for benefits like faster displays.  Some programs were written for generic DOS and some were ported to various new machines, but soon every user wanted real IBM compatibility.  Lotus 1-2-3 and Microsoft Flight Simulator were universally used as IBM-compatibility test programs, because they were written in assembly language and spoke directly to the PC's hardware.  The PC standard began to settle in.
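
The speed trick these programs relied on was to treat the 80-by-25 text screen as what it really was in hardware: a block of memory (at segment B800 on a color adapter) holding a character byte and an attribute byte for every cell, which a program could fill directly instead of asking DOS to print one character at a time.  Because poking real video memory only works on a real PC or an emulator, the hypothetical C sketch below models that buffer with an ordinary array, just to show the two-bytes-per-cell layout the fast programs wrote into.

    #include <stdio.h>
    #include <string.h>

    #define COLS 80
    #define ROWS 25

    /* A stand-in for the color text buffer at B800:0000: each screen cell is a
     * character byte followed by an attribute byte (foreground/background colors). */
    static unsigned char screen[ROWS * COLS * 2];

    /* Write a string straight into the "video memory" at (row, col), the way the
     * fast DOS applications wrote into the real buffer instead of calling DOS. */
    static void put_text(int row, int col, const char *s, unsigned char attr)
    {
        unsigned char *cell = screen + (row * COLS + col) * 2;
        while (*s) {
            *cell++ = (unsigned char)*s++;   /* character byte */
            *cell++ = attr;                  /* attribute byte */
        }
    }

    int main(void)
    {
        memset(screen, 0, sizeof screen);
        put_text(0, 0, "A1: 1234.56", 0x07);     /* 0x07 = light grey on black */

        /* Dump the first row's character bytes so the sketch runs anywhere. */
        for (int col = 0; col < 20; col++)
            putchar(screen[col * 2] ? screen[col * 2] : '.');
        putchar('\n');
        return 0;
    }

Filling that buffer with one pass of memory writes repainted the entire screen at once, which is why these programs felt so much faster than anything that printed through DOS one character at a time.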

In 1982, Joseph R. "Rod" Canion and two other former Texas Instruments managers formed Compaq Computer Corporation to create a true IBM-compatible portable, which started shipping in March 1983.  While not strictly the first compatible--Columbia Data Systems had shipped one earlier--the Compaq Portable showed that there was a market for a true IBM-compatible machine while gaining tremendous attention as a portable PC.  In those days portable meant a mere 28 pounds.

IBM would not introduce a similar machine until nearly a year later.  In the following years the portable field would become even more crowded as numerous companies, including Data General, Texas Instruments, Toshiba, NEC, and Compaq raced each other to field innovative, truly laptop computers that not only could be carried on a plane but could also be used on one.

Compaq followed its initial portable PC introduction with its first desktop PC, the Deskpro, in July 1984, and in the years that followed, PC "clones"--portables and desktops--would establish themselves as part of the industry standard.

In 1984, IBM tried to extend its standard in two ways.  In March of that year, it introduced the PCjr, a $1,300 8088-based "home computer" that became known for its infamous wireless keyboard with Chiclets-style keys.  It flopped.

IBM had much more success with its August introduction of the PC AT (for Advanced Technology).  Based on Intel's 80286 processor, the AT cost nearly $4,000 with 256K of RAM but no hard disk or monitor.  Models with a 20MB hard disk sold for almost $6,000.  Most important, the AT moved the industry to the next processor level while maintaining compatibility with almost all original PC applications.

Several important standards debuted along with the AT, especially the 16-bit expansion bus that endures as a standard today, but also including EGA graphics, which supported 640-by-350 resolution with as many as 16 colors.  At the same time, IBM and Microsoft introduced DOS 3.0, which would be the standard for many years, and IBM launched TopView, an early windowing system that let users display multiple applications concurrently.

Also that year, Hewlett-Packard introduced the first laser printer, although dot matrix and daisy wheel printers continued to dominate the market for years.  During this time, the first bulletin-board software packages were just starting to appear.  More business-oriented online services, such as CompuServe, were still years away from market dominance.

In the early 1980s, there were still a number of other kinds of machines on the market.  The Commodore 64 and the Atari 800 series were still popular home computers.  But their day would soon pass, though they would be resurrected a few years later in the kind of dedicated game machines that Nintendo, Sega, and Sony would create.

In the business market, CP/M still existed, but it was quickly fading from mainstream use.  Apple continued to have a lot of success with the Apple II family.  It failed, however, with the introduction of the Apple III and with the technically impressive Lisa, based on technology inspired by work at Xerox's famed Palo Alto Research Center.  The Lisa was the first major attempt to popularize the combination of mouse, windows, icons, and graphical user interface.  But at nearly $10,000, it did not gain market acceptance.

Instead, the business world was beginning to adopt programs such as Lotus 1-2-3 and WordPerfect, which soon became corporate fixtures.  These programs popularized both the PC standard and the character-based DOS interface.  Perhaps that interface was not as exciting as the graphics technology on the Lisa, but it worked, it was affordable, and it soon became dominant.  These developments set the pace for the next decade of personal computing.

As corporate users around the world were flocking to the DOS standard, they found it a big advancement over the micros of previous years.  The PC standard had led to the development of the "clone" market for different kinds of machines that would run software written to the standard.  Software developers everywhere were taking advantage of the growing market for new DOS applications software.  It may not have been a graphical era, but it sure was productive.

Then a TV advertisement shown only once--during the 1984 Super Bowl--opened the door to the future of personal computing.  It depicted a young runner dashing through a crowd of faceless drones to throw a hammer that shattered a screen image of Big Brother.  "Macintosh. So 1984 won't be like 1984."  And just like the screen image destroyed by that hammer, the image of the IBM-compatible micro as the ultimate computer was shattered as well.

Suddenly, a computer could offer more than a DOS prompt and a character-based interface; it could have multiple windows, pull-down menus, and a mouse.  PC users, if they noticed, waited…

It was not that the Macintosh wasn't appealing to PC users.  It was that the Mac, unfortunately, was not compatible with their existing hardware and software.  It also initially did not have the applications they wanted, it was not expandable, and it looked a little like a toy.  Still, many software vendors tried to deliver equivalent functionality in programs for the PC.  If Apple, inspired by Xerox's work at its Palo Alto Research Center, pointed the way to graphical user interfaces, the PC took a circuitous route to get there.  But it surely wasn't for lack of trying.

One of the first PC graphical products that shipped and failed was IBM's TopView.  It was only character-based, but it allowed multiple programs to run on-screen at once.  Unfortunately, it wasn't all that compatible either, and developers often needed to adapt their programs to work on it.  Having embraced compatibility as the watchword for "standard" computing, users stayed away.

Clearly, Apple wasn't the only company committed to graphical computing.  By late 1983, Microsoft had already begun working on applications software for the Macintosh, and it announced Windows 1.0 for the PC that same year.  This earliest version of Windows was promised as an extension to DOS and positioned against graphical competitors of the day, notably Lisa (an earlier attempt by Apple) and VisiOn (a graphical environment promoted by VisiCorp, the publishers of the VisiCalc spreadsheet).  The Windows that was announced was very different from the Windows finally shipped, after many delays, almost two years later.

The early versions, also called Interface Manager, looked like early versions of Microsoft Word for DOS, with a single list of commands on the bottom of the screen, not the modern pull-down menus.  The windows could not overlap; instead, they would only tile (an option still present in modern-day Windows but rarely used).  The basics were there, including a mouse used for selecting menu items, cut-and-paste capability, and initially a list of 30 hardware vendors that would support it--which represented most of the world that supported DOS, with the notable exception of IBM.

By the time it shipped, Windows had evolved to include pull-down menus as well as the Windows Write and Windows Paint applications and memory above 640K.  But it failed to gain market acceptance, in part because there weren't very many Windows applications available, although the first PC version of PageMaker did follow in late 1986.

Of the many competing windowing systems subsequently released, the most successful attempt of the era may have been the entry from Quarterdeck, a small start-up in Santa Monica, California.  Many power users ended up running Quarterdeck's DESQview for years as their multitasking windowing system.

On the hardware front, although Intel introduced the 16-MHz 80386 processor in 1985, it didn't immediately find its way into IBM-compatible systems, as they were then known.  Maybe vendors were waiting for IBM to take the 386 lead, but IBM had other plans.  Instead, it was left to Compaq and Advanced Logic Research to introduce the first 386-based PCs in September 1986.

One bright spot was that networking for personal computers was beginning to come into its own.  Novell had introduced NetWare in 1983, and it was beginning to be established as a corporate standard.  So was Ethernet, which had been invented at Xerox PARC in the 1970s but was just beginning to receive corporate acceptance.  Another early networking force was IBM's own standard, Token-Ring, which came out in late 1985.

Compaq's 386 entry defined a new standard for the industry.  Companies did not need to follow IBM; with new technology they could be leaders themselves.  In place of a world of IBM computing, makers of compatible BIOSes, such as Phoenix Technologies and AMI, spurred the era of PC compatibles.

Integrated software became a hot topic in 1984, with the introduction of Lotus Symphony and Ashton-Tate's Framework, programs that combined word processing, spreadsheet, graphics, and database functions into one integrated package.  While these were never as successful as their developers envisioned, they foreshadowed the day when office suites would come to dominate productivity applications.

Overall, it was an era of false starts in many ways.  The growth rate for sales of PCs slowed a bit, perhaps reflecting the lack of industry emphasis on home users.  Moreover, much of the PC industry was on hold waiting for what was known in the press as "the PC II" from IBM and "the New DOS."

By 1987 the PC world was ready for something new, which came in April when IBM unveiled its most ambitious offering to date.  IBM's initial PS/2 machines ranged from the Model 30 (with an 8-MHz 8086 Intel processor and two 3.5-inch floppy disk drives) to the Models 50 and 60 (both with 10-MHz 286 CPUs) to the Model 80 (IBM's first 386-based machine, featuring a 16-MHz or a 20-MHz processor).

The PS/2 machines advanced the PC standard in many ways.  While other companies, notably HP and Apple, had shipped 3.5-inch floppy disk drives first, the PS/2 made this a standard, in part because you could not get an internal 5.25-inch drive on several of these models.  While other companies had previously announced graphics boards that provided 640-by-480 display resolution, the PS/2 brought with it the new Video Graphics Array (VGA) standard, which remains a standard to this day.  VGA was a big improvement over the earlier EGA standard, which offered 640-by-350 resolution.  Now each pixel became more "square," and the images had less distortion.  VGA also allowed more colors to be displayed simultaneously on-screen.

The most controversial aspect of the PS/2, however, was its introduction of a new bus for add-on cards--the Micro Channel architecture.  Micro Channel offered huge advantages over the older bus used in the AT: It was faster, it simplified the process of configuring cards, and it improved the ability of two add-in boards to work simultaneously.

However, Micro Channel was not compatible with the preceding IBM-compatible machines and the hundreds of expansion cards already on the market.  Moreover, IBM initially charged vendors significantly more money for a license to use the Micro Channel architecture than it had for the older AT design.  This was IBM's attempt to regain its technical lead--and market share--from the clone vendors, and IBM was willing to go so far as to abandon hardware compatibility.  By November, IBM was crowing that it had sold 1 million PS/2 devices in seven months, four times as fast as it had sold a million units of its original PC.

The vendors of compatible machines looked at the new specifications and gradually adopted the new 3.5-inch drive (although most software vendors had to provide their programs on both 3.5- and 5.25-inch disks for years to come) and VGA graphics standards.  But Compaq, AST, and others that had followed IBM's lead in 8088- and later 286-based (AT-compatible) computers balked at the licensing fees and the lack of backward compatibility of the Micro Channel architecture.  Instead, they came out the following year with an alternative, called the Extended Industry Standard Architecture (EISA), which offered several advantages, such as a 32-bit-wide data path, while retaining compatibility with existing 16-bit expansion cards.

For years, the industry debated the merits of Micro Channel and EISA.  Eventually, Micro Channel won support among IBM's customers, and EISA gathered some adherents for use in PC servers.  But most end users continued buying add-on boards that were still AT-compatible, which came to be known as Industry Standard Architecture (ISA) boards.  The era in which IBM by itself could dictate a change to the PC industry's hardware standards had come to an end.

But if the hardware world was to become fragmented following the PS/2 announcement, the effect of that development was nothing compared with the reaction to the announcement at the same time of a new operating system developed jointly by IBM and Microsoft: OS/2.  At that initial announcement, executives of both companies stood up and proclaimed their intention to make OS/2 the replacement for DOS.  After all, they said, DOS was not graphical, did not offer a standard user interface, could run only a single program at a time, and was still burdened by the 640K memory-address limit.  While Microsoft had been talking about a "multitasking DOS" for some time and IBM had long been rumored to be developing its own alternative, the two companies had signed a joint development agreement, through which OS/2 was to be the marriage of the two.

Almost from the beginning, however, that agreement was troubled.  OS/2 was to ship initially in two versions.  The first--which shipped in late 1987--would be OS/2 1.0, offering preemptive multitasking and support for large applications, up to 16MB, the limit of the 286 processor.  But the real graphical version--the one that caught the attention of most users and developers--would be OS/2 1.1 with Presentation Manager, and that didn't ship until October 1988.

More problematic would be compatibility concerns.  OS/2 was originally written for the 286, but the processor itself had some limitations.  The 286 had introduced what Intel called "protected" memory and the ability to write programs beyond the 640K barrier, but it did so in a way that sometimes made it incompatible with existing 8088/8086-based software.  A "compatibility box" let users run some existing DOS programs, but early versions of the compatibility box weren't all that compatible; users often called it the "penalty box."

Intel's 386 would address many of those issues by introducing what was known as Virtual 86 mode, which let a machine run multiple 8086 sessions.  OS/2, however, would not support virtual mode for several years.  In addition, there was confusion about a separate IBM-only version of OS/2, called the Extended Edition, which would add a database manager and communications.  In addition, some users mistakenly assumed the OS/2 name meant the operating system would work only on PS/2s.

Meanwhile, Microsoft continued work on Windows, which it positioned as a product that worked on top of DOS and was a "transition" to OS/2.  In 1987, Microsoft Windows 2.0 enhanced the Windows user interface to include such features as overlapping windows, the ability to resize windows, and keyboard accelerators (shortcut keys).  Thus, it moved much closer to today's Windows and OS/2.  It supported IBM's Systems Application Architecture (SAA) user interface standards more than a year before OS/2 would.  But this version worked in 8088/8086-compatible real mode, not the 286's more sophisticated protected mode, meaning applications still couldn't multitask well and still were limited in size.

Later that year, Windows was split into Windows/286 and Windows/386, the latter of which added multitasking capabilities, the ability to run applications in virtual machines, and support for up to 16MB of memory.  Windows/386 does not sound like much now, but at the time it was a lot.  And it marked the beginnings of competition between OS/2 and Windows, though at the time both IBM and Microsoft denied this.

More important, it became clear that Windows and OS/2 were not as compatible as they were initially promised to be, because the two supported very different models for writing graphics to the screen.  As a result, there was confusion among software developers, who were told by Microsoft to write for Windows with the promise that the programs could easily move to OS/2 later, and who were being told by IBM to write directly for OS/2.

Neither platform received a lot of support in those days.  Initial application support for the Windows platform was somewhat limited, with the exception of Aldus PageMaker and Microsoft Excel, which came out in 1987.  Notably, a full-featured Windows word processor would not appear until late 1989, when Samna's Ami Pro (which was later acquired by Lotus and survives under the name Word Pro) and the first Windows version of Microsoft Word appeared.  Just about every major developer promised support for the graphical version of OS/2, but applications were very slow in appearing.

Instead, the PC world was settling down into a world of DOS applications and basic networking.  Slowly but surely, computers were becoming a part of business life for just about all white-collar workers.  It was not exciting, but it sure worked.

Despite all the promises, the computing world at the end of the decade was left with standards that had existed for many years, such as DOS and the ISA bus, and with a lot of proposals for the future--but no clear direction.  IBM had failed to set a new direction for itself and for the industry, and no other vendor had really taken over as standard-setter.  This situation, however, was to change very soon.  Whether the computing world was looking for a new standard or not, it got one in May 1990, when Microsoft finally shipped Windows 3.0.

Windows 3.0 ran on top of DOS, so it offered compatibility with the DOS programs.  It took advantage of the 386 processor, so it could multitask both those DOS programs and Windows programs.  The user interface was designed to look a lot like the Presentation Manager, with both an icon-based Program Manager and a tree-based File Manager, and it included enhancements such as shaded icons.  Although Windows 3.0 required minor rewrites of just about every Windows program to date, there were not many of them to be rewritten.  Most important, almost as soon as Windows 3.0 was introduced, applications appeared, led by Microsoft's own applications division and followed by just about every other major developer.  Even after the Windows 3.0 announcement, Microsoft and IBM continued to talk about OS/2 and in particular about OS/2 2.0, the first real 32-bit version, which would finally appear in 1992.

Even more confusing, while IBM was positioning OS/2 as the future operating system for all users, Microsoft was positioning OS/2 as high-end, for mission-critical and server-based applications only.  Instead, Microsoft started talking about OS/2 3.0 (not to be confused with the later IBM OS/2 Warp 3.0), which would add improved security and multiprocessor support and would be able to execute Windows and Posix applications directly.  In that scenario, Windows NT was the kernel on which DOS, Windows, OS/2, and Posix compatibility would sit.

The two companies finally split their strategies in early 1991, with IBM's Jim Cannavino and Microsoft's Bill Gates and Steve Ballmer sounding like angry spouses during a bitter divorce.  OS/2 developed a strong niche in some large corporate applications, helped by its stability and robustness as compared with Windows 3.x.  Later, IBM would make one last attempt to make OS/2 mainstream with its more consumer-oriented OS/2 Warp 3.0, which shipped in late 1994.  It would sell millions of copies but not slow down the industry's broad move to Windows.

Microsoft eventually turned its one-time "OS/2 3.0" into Windows NT 3.1, which shipped in 1993 without graphical OS/2 support and found support initially as an operating system for applications servers, competing primarily against IBM's OS/2.  For most PC users, Microsoft offered the enhanced Windows 3.1 in 1992, which added better applications integration, drag-and-drop, and simply more stability.  Through the early nineties this became the dominant standard for PC applications, and Microsoft took a leadership role in defining multimedia specifications.

Microsoft came to dominate many more areas of computing in this time frame.  Its Visual Basic and Visual C++ overcame big competition from Borland to dominate programming languages.  And Microsoft's applications--led by its Office suite of Word, Excel, PowerPoint, and later Access--took the lion's share of the market for applications software (in part helped by delays in the Windows versions of Lotus 1-2-3, WordPerfect, and dBASE, the last of which had by then been acquired by Borland).

In this period, Apple's Macintosh line continued to grow and expand, and it found niches in graphics arts, multimedia, and education.  Nevertheless, in most corporate and government offices the primary business system was one that followed the standards of the original PC.  By then the term IBM-compatible had fallen out of favor, to be replaced by the processor as the primary descriptor of hardware.

The era of the 286 had already hastened to a close in late 1988 following Intel's introduction of the 386SX, a processor that had the 32-bit internals of the 386 but a 16-bit data bus like the 286, which made it inexpensive.  It and the original 386--rechristened the 386DX--dominated computer sales for years.  In April 1989, Intel followed up the 386 with its 486 processors.  With 1.2 million transistors, the 486 was effectively a faster, more refined version of the 386 plus a math coprocessor, so it ran all of the applications written for the 386 without a hitch.

This time around, no one waited for IBM or Compaq to go first.  Dozens of vendors raced to have 486 machines available as soon as possible after the Intel introduction, and these machines could run as much as 50 times as fast as the original IBM PC.

Intel introduced its 60-MHz Pentium processor in March 1993, but it was not just processors that continued to advance.  Hard disks continued to get bigger and faster.  Graphics display technology progressed beyond "frame buffer" graphics cards to graphics accelerators, which worked directly with Windows to increase screen response times and to enhance all graphics.

In this period, corporate local area networking really began to take off.  At the time, IBM was promoting OfficeVision, which was supposed to run on all the SAA platforms including OS/2.  And just about every other giant systems vendor had its own multiplatform office-automation strategy, such as DEC's All-in-One.  Almost all of these would fail relatively quickly.

What did succeed were PC servers, which hold their own data and can link to big corporate databases.  On the hardware front, the Compaq Systempro, introduced in 1989, led the charge for big applications that previously had lived on minicomputers and other large systems.  On the software side, SQL databases were coming to the PC market, and companies like Oracle and Sybase were beginning to target PC developers.  Rapid application development (RAD) tools soon helped make it easier to create good user interfaces for accessing corporate data.

E-mail had begun to become a way of corporate life, with early products such as cc:Mail, later acquired by Lotus, and a host of smaller competitors.  In December 1989, Lotus had changed the equation with Lotus Notes, the first "groupware" application.

By 1994, Microsoft and Intel had picked up the mantle of leadership in the PC industry, Windows was established as the standard for applications, and networking had become mainstream.

In early 1995, one might have expected new operating systems from Microsoft and new chips from Intel to remain the driving forces in computing for many years to come, given the history of previous years.  These are still important, but perhaps the most important change in the last few years came from a group of graduate students at the University of Illinois.  There, in early 1993, Marc Andreessen, Eric Bina, and others working for the National Center for Supercomputing Applications (NCSA) came up with Mosaic, a tool they would use to browse the Internet.

The Internet had been around for many years, dating back to the late 1960s, when the Pentagon's Advanced Research Projects Agency (ARPA, later DARPA) funded the connections among many university computers.  As the Internet continued to grow, the government passed control of it to the individual sites and technical committees.  In addition, in 1990, Tim Berners-Lee, then at the CERN physics lab in Geneva, Switzerland, created HyperText Markup Language (HTML), an easy way to link information together among Internet sites.  This in turn created the World Wide Web, which only waited for a graphical browser to begin growing exponentially.

Once Mosaic was released to the public in late 1993, suddenly the Internet--and the Web in particular--became accessible to just about anyone with a personal computer, helped in part by the fact that you could freely download the latest version of several different browsers.  And soon it seemed that just about everyone--and every company--was putting up a Web site.

New versions of Web browsers arrived very quickly as well.  Soon Netscape Corp.--a new company formed by Andreessen and Jim Clark, who had been a founder of Silicon Graphics--began to dominate Web browsers.  Netscape Navigator added many features, including plug-in support (which in turn led to many multimedia extensions) and the Java virtual machine (which let developers write Java applets that ran within the browser).

The tremendous excitement caused by the explosion of the World Wide Web came close to overshadowing Microsoft's major announcement of the period: Windows 95.  Introduced in August 1995, the software's debut was accompanied by more hype than any other computing announcement of the era.

Windows 95 was the version of Windows that many users had been waiting for.  It allowed for full 32-bit applications, had preemptive multitasking, was Plug and Play-compatible, supported new e-mail and communications standards, and of course featured a new user interface.  In fact, many users thought the new interface, which included a Start menu and a program desktop with folders and icons, moved Windows much closer to the original Lisa or Macintosh design of ten years earlier.

Microsoft had promised a 32-bit Windows for years, at one point saying that one would be ready in 1992, and developers had spent a long time waiting for "Chicago", as Windows 95 was known during development.  Once shipped, Windows 95 quickly became a standard for end-user computing, with many vendors having 32-bit versions of their applications ready when the new OS shipped or shortly thereafter.  Microsoft followed Windows 95 less than a year later with Windows NT 4.0, which incorporated the same user interface and ran most of the same applications using the Win32 programming interfaces.  Windows NT rapidly found favor among corporate IT managers because of its more stable design.  Moreover, Microsoft released Windows 2000 on February 17, 2000, as a replacement for both Windows 95 and Windows NT 4.0.

Still, there is lots of room left for advances in operating systems.  For years, software vendors have been talking about object-oriented languages (such as C++) and a more object-oriented operating system.  In such a design, data and applications should be split up, so that users might work with data independent of individual applications.  Ideally, objects could be spread out or distributed among multiple computers.

Microsoft has been talking about this concept for years, notably in Bill Gates's November 1990 "Information at Your Fingertips" speech, which emphasized the concept that all the data a user may need could someday be accessed from a personal computer regardless of where the data actually resides.  The idea, he said, was to move beyond applications and think about data.  This direction led to Microsoft's emphasis on compound documents, macros that work across applications, and new file systems (NTFS and FAT32).

Of course, Microsoft's competitors have continued down their own paths.  In 1989, Steve Jobs's NeXT Computer came up with an object-oriented OS, which was targeted toward corporate customers and recently was acquired by Apple Computer.  In the early nineties IBM and Apple merged two of their projects--Apple's "Pink" OS and an IBM/Metaphor experiment called the Patriot Partners--to create Taligent.  This project resulted in a fairly extensive series of frameworks for developers to use in creating object-based applications.  But though the frameworks were recently added to OS/2, plans for Taligent as a separate OS have been shelved.

Other object-based technology is in various stages of development.  Microsoft's OLE, which allows for compound documents, has been improving and is now a part of the firm's ActiveX specification.  Apple, IBM, and others came up with an alternative specification called OpenDoc, and such components are now called LiveObjects.  IBM defined a standard for objects to work together across a network called the Systems Object Model (SOM), which competes with Microsoft's Component Object Model (COM).

But all of this has been overshadowed in recent months by Sun Microsystems' Java, which started life as a variant of C++ designed for use on the Internet.  In the last year, it has grown to include a virtual-machine implementation that has been incorporated in browsers from Netscape and Microsoft, as well as the latest version of IBM's operating system OS/2 Warp.  Many developers are currently developing applets and even full applications within Java, with the hope that this will free them from having to rely on Microsoft standards.  More recently, Sun, Netscape and others have been promoting the JavaBeans specification as a great way of linking objects.

On the Web itself, a current push is for technologies and products that allow content to be delivered automatically over the Internet, so that users do not have to search out specific information.  Pioneered by PointCast, which implemented a screen saver that collects information from many sources, this approach is being pursued by several new competitors, including Marimba's Castanet and BackWeb.  In addition, both Netscape Communicator and Microsoft Internet Explorer feature background delivery of Internet content, with Microsoft even incorporating it into Windows 98.

Keeping pace with these software developments, computer hardware continues to evolve.  Faster and faster machines remain the watchword.  Intel's Pentium processor with 3.2 million transistors was introduced in 1993 and by 1995 had become the standard processor in mainstream computing. Intel has since followed this up with the Pentium Pro, Pentium II, and Pentium III processors, offering even better 32-bit performance.

In early 1997, Intel introduced the MMX instructions, the first major enhancement to the Intel instruction set since the 386.  These instructions, which are designed to improve game and multimedia performance, were added to both Pentium and Pentium Pro designs, leading to the Pentium MMX and Pentium II.  Then, in early 1999, Intel added yet another enhancement to the Pentium II core: streaming SIMD extensions (SSE), shipped with the Pentium III and designed to accelerate floating-point-intensive multimedia applications and games.

At the same time, other hardware continues to evolve.  Graphics adapters have become faster and more powerful and are now adding new 3-D capabilities, particularly exploiting the Direct3D features of DirectX 7.0a for Windows 98 SE and Windows 2000.  Hard disks continue to get faster and larger; in early 1997, hard disks larger than 2 gigabytes became standard features of high-end PCs.

CD-ROM drives came into the market in 1985 but did not begin to take off until the early nineties.  Now they are a standard part of virtually all desktop machines sold into the home and small-business markets, and the preferred way of loading applications.  CD-ROM speed has improved markedly; today 40X CD-ROM drives (those that spin about 40 times the speed of the first drives) are becoming standard.  However, where the 660MB capacity of a CD-ROM once seemed enormous, the DVD standard--which allows at least 4.7GB of storage on a single disk of the same size--dwarfs it.  That is enough for a full-length movie, perhaps with multiple languages or multiple endings.

Even printers have advanced at an amazing pace.  Laser printers have become standard corporate printers, offering excellent black-and-white printing, often at high speed, in printers shared over a network.  In the home market, color ink jet printers have become standard, offering great black-and-white and superb color printing for under $200.

Meanwhile, some companies--notably Sun and Oracle--are arguing that the speed of machines is no longer as important as it used to be, because of the prevalence of the Internet.  Instead, they are promoting a new Network Computer (or NC) specification, which argues for "thin clients" with most processing taking place on servers, where the operating system and all applications reside.  Such systems, they argue, would be easier to administer and thus would offer a lower total cost of ownership.  Microsoft and Intel are fighting back with a NetPC specification, which they maintain would have the same advantages plus the additional power and flexibility of PCs because NetPCs would run Windows and existing applications and would allow for standalone computing.

How many and which of these hardware, software, and networking initiatives will succeed?  As always, it is hard to tell.  Nevertheless, it is clear that the Internet and the Web will be major factors in the years to come, as will the inevitable increases in hardware and software capabilities.

Even though the last 20 years in the PC industry have been a wild ride, the next 20 promise to be even more interesting as the pace of technological development continues to accelerate.

