Friday, December 5, 2008

The Evolution of the Modern Computer

An Open Source Graphical History

(1934 to 1950)

This is the home of the Computer Evolution File. This file attempts to provide a comprehensive graphical representation of the evolution of the modern computer for the period 1934 to 1950. The file is licensed under a Creative Commons Attribution-ShareAlike license. Please feel free to download it and make improvements and derivative works. Please send a copy of any changes to me and I will share the updates on this page.

foobar@bigfoot.com

Latest version: 0.3, released 2003-12-23

File                         Size   Description
ComputerEvolution_V0.3.txt   70k    Dot file for Graphviz
ComputerEvolution_V0.3.png   468k   Full-size Portable Network Graphics (PNG) file
Computer Evolution
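
The graph is rendered from the dot file with Graphviz. For anyone who would rather build such a graph programmatically than edit the dot file by hand, here is a minimal sketch using the graphviz Python package; the node names and edges below are invented for illustration and are not taken from the actual ComputerEvolution dot file.

    # Minimal sketch using the graphviz Python package (pip install
    # graphviz; the Graphviz binaries must also be installed). The
    # machines and edges below are illustrative only, not taken from
    # the actual ComputerEvolution dot file.
    from graphviz import Digraph

    g = Digraph("computer_evolution_fragment")
    g.attr(rankdir="TB")  # time flows top to bottom

    # A few machines from the 1934-1950 period the file covers.
    g.node("ae", "Babbage's Analytical Engine (design, 1830s)")
    g.node("mk1", "Harvard Mark I / IBM ASCC (1944)")
    g.node("eniac", "ENIAC (1945)")
    g.node("edvac", "EDVAC (stored-program design, 1945)")

    g.edge("ae", "mk1", label="inspired")
    g.edge("eniac", "edvac", label="led to")

    # Writes computer_evolution_fragment.png, like the full-size PNG above.
    g.render("computer_evolution_fragment", format="png", cleanup=True)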
 

Wikipedia Article for the term Computer

( Technology )

I just rewrote the first four sections of the Wikipedia article for the term Computer. The current page is here.

This link to the change history page for the article currently shows one change on line 6. This was a trivial typo that I fixed in my final version. Over time this link should change to show all the modifications to the article made by other people. I'm curious to see how significant the changes will be...

Posted by John on 2004/07/28 | Comments (0) | TrackBack (2184)

Charles Babbage and Howard Aiken: How the Analytical Engine influenced the IBM Automatic Sequence Controlled Calculator, a.k.a. the Harvard Mk I

( Technology )

In 1936, [Howard] Aiken had proposed his idea [to build a giant calculating machine] to the [Harvard University] Physics Department, ... He was told by the chairman, Frederick Saunders, that a lab technician, Carmelo Lanza, had told him about a similar contraption already stored up in the Science Center attic.

Intrigued, Aiken had Lanza lead him to the machine, which turned out to be a set of brass wheels from English mathematician and philosopher Charles Babbage's unfinished "analytical engine" from nearly 100 years earlier.

Aiken immediately recognized that he and Babbage had the same mechanism in mind. Fortunately for Aiken, where lack of money and poor materials had left Babbage's dream incomplete, he would have much more success.

Later, those brass wheels, along with a set of books that had been given to him by the grandson of Babbage, would occupy a prominent spot in Aiken's office. In an interview with I. Bernard Cohen '37, PhD '47, Victor S. Thomas Professor of the History of Science Emeritus, Aiken pointed to Babbage's books and said, "There's my education in computers, right there; this is the whole thing, everything I took out of a book."

[The Harvard University Gazette. Howard Aiken: Makin' a Computer Wonder, by Cassie Ferguson]

A fragment of one of Charles Babbage's machines, similar to the one seen by Aiken in 1936

more >>

Posted by John on 2004/03/30 | Comments (8) | TrackBack (692)

Vannevar Bush and The Limits of Prescience

( Technology )

Today Vannevar Bush (rhymes with achiever) is often remembered for his July 1945 Atlantic Monthly article As We May Think, in which he describes a hypothetical machine called a Memex. This machine contained a large indexed store of information and allowed a user to navigate through the store using a system similar to hypertext links. At the time he wrote his essay, Bush knew more about the state of technology development in the US than almost any other person. During the war, he was Roosevelt's chief adviser on military research, responsible for many wartime research projects including radar, the atomic bomb, and the development of early computers. If anyone should ever have been capable of predicting the future, it was Vannevar Bush in 1945. He is an almost unprecedented test case for the art of prediction. Unlike almost anyone else before or since, Bush was actually in possession of ALL the facts - as only the head of technology research in a country at war could be.
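
Restated in modern terms, the Memex is an indexed store plus associative links. As a rough sketch only - the class and method names below are mine, not Bush's - it might look like this:

    # Rough sketch of a memex-like store: an indexed collection of
    # documents joined by associative links ("trails" in Bush's essay).
    # Class and method names are illustrative assumptions, not Bush's.

    class Memex:
        def __init__(self):
            self.store = {}   # name -> document text (the indexed store)
            self.links = {}   # name -> names linked from that document

        def add(self, name, text):
            self.store[name] = text
            self.links.setdefault(name, [])

        def join(self, a, b):
            """Tie two documents together with an associative link."""
            self.links[a].append(b)

        def trail(self, start):
            """Follow first links from a starting document, as a reader might."""
            name, path = start, []
            while name is not None and name not in path:
                path.append(name)
                targets = self.links.get(name, [])
                name = targets[0] if targets else None
            return path

    m = Memex()
    m.add("as_we_may_think", "Bush's 1945 Atlantic Monthly essay")
    m.add("memex", "A desk-sized store of microfilmed documents")
    m.join("as_we_may_think", "memex")
    print(m.trail("as_we_may_think"))  # ['as_we_may_think', 'memex']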

more >>

Posted by John on 2004/02/11 | Comments (1) | TrackBack (2113)

Source Code as History

( Technology )

When the history of early software development is written, it will be a travesty. Few historians will have the ability, and even fewer the inclination, to learn long-dead programming languages. History will be derived from the documentation, not the source code. Alan Turing's perplexed, handwritten annotation "How did this happen?" on a cutting of Autocode taped into his notebook will remain a mystery.

"How did this happen?" - annotation of a program bug by Alan Turing

What kind of bug would stump Alan Turing? Was it merely a typo that took a few hours to find? A simple mistake, maybe? Or did the discipline of the machine expose a fundamental misconception and thereby teach him a lesson? The only way to know would be to learn Autocode.

more >>



The 7 Layers of the OSI Model
Last updated: March 03, 2008


The OSI, or Open Systems Interconnection, model defines a networking framework for implementing protocols in seven layers. Control is passed from one layer to the next, starting at the application layer in one station, proceeding down to the bottom layer, over the channel to the next station, and back up the hierarchy.
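
Each of the seven layers is described below. To make the down-and-back-up path concrete first, here is a toy sketch of that encapsulation: one header per layer is added on the sending side and stripped in reverse order on the receiving side. The bracketed header format is invented for illustration; real protocols define their own formats.

    # Toy sketch of OSI-style encapsulation. Each layer wraps the data
    # from the layer above in its own header on the way down, and the
    # receiving station strips the headers in reverse order on the way up.

    LAYERS = ["application", "presentation", "session",
              "transport", "network", "data link", "physical"]

    def send(data):
        """Pass data down the stack, adding one header per layer."""
        for layer in LAYERS:
            data = f"[{layer}]{data}"
        return data  # the frame that crosses the channel

    def receive(frame):
        """Pass the frame back up the stack, removing one header per layer."""
        for layer in reversed(LAYERS):
            header = f"[{layer}]"
            assert frame.startswith(header), f"malformed {layer} header"
            frame = frame[len(header):]
        return frame

    wire = send("GET /index.html")
    print(wire)                                # outermost header is the physical layer's
    assert receive(wire) == "GET /index.html"  # the round trip restores the payload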

Application (Layer 7)
This layer supports application and end-user processes. Communication partners are identified, quality of service is identified, user authentication and privacy are considered, and any constraints on data syntax are identified. Everything at this layer is application-specific. This layer provides application services for file transfers, e-mail, and other network software services. Telnet and FTP are applications that exist entirely at the application level. Tiered application architectures are part of this layer.

Presentation (Layer 6)
This layer provides independence from differences in data representation (e.g., encryption) by translating from application to network format, and vice versa. The presentation layer works to transform data into the form that the application layer can accept. This layer formats and encrypts data to be sent across a network, providing freedom from compatibility problems. It is sometimes called the syntax layer.

Session (Layer 5)
This layer establishes, manages and terminates connections between applications. The session layer sets up, coordinates, and terminates conversations, exchanges, and dialogues between the applications at each end. It deals with session and connection coordination.

Transport (Layer 4)
This layer provides transparent transfer of data between end systems, or hosts, and is responsible for end-to-end error recovery and flow control. It ensures complete data transfer.

Network (Layer 3)
This layer provides switching and routing technologies, creating logical paths, known as virtual circuits, for transmitting data from node to node. Routing and forwarding are functions of this layer, as well as addressing, internetworking, error handling, congestion control and packet sequencing.

Data Link (Layer 2)
At this layer, data packets are encoded and decoded into bits. It furnishes transmission protocol knowledge and management and handles errors in the physical layer, flow control and frame synchronization. The data link layer is divided into two sublayers: the Media Access Control (MAC) layer and the Logical Link Control (LLC) layer. The MAC sublayer controls how a computer on the network gains access to the data and permission to transmit it. The LLC layer controls frame synchronization, flow control and error checking.

Physical (Layer 1)
This layer conveys the bit stream - electrical impulse, light or radio signal - through the network at the electrical and mechanical level. It provides the hardware means of sending and receiving data on a carrier, including defining cables, cards and physical aspects. Fast Ethernet, RS232, and ATM are protocols with physical layer components.


The Five Generations of Computers
October 24, 2008

The history of computer development is often discussed in terms of the different generations of computing devices. Each generation of computer is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful and more efficient and reliable devices.

Read about each generation and the developments that led to the current devices that we use today.

First Generation - 1940-1956: Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.

First generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts.

The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer, delivered to its first client, the U.S. Census Bureau, in 1951.


Second Generation - 1956-1963: Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 50s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.

Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers that stored their instructions in their memory, which moved from a magnetic drum to magnetic core technology.
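
To make that shift concrete, the sketch below shows a toy assembler for an invented accumulator machine. The opcodes and mnemonics are made up for illustration and do not correspond to any real second-generation instruction set; the point is only the mapping from symbolic words to the numeric machine words the hardware executes.

    # Toy assembler for an invented accumulator machine, illustrating the
    # step from raw machine language to symbolic assembly. Opcodes and
    # mnemonics here are made up for the example.

    OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3, "HALT": 0xF}

    def assemble(source):
        """Translate 'MNEMONIC operand' lines into numeric machine words."""
        words = []
        for line in source.strip().splitlines():
            parts = line.split()
            op = OPCODES[parts[0]]
            operand = int(parts[1]) if len(parts) > 1 else 0
            words.append((op << 8) | operand)  # 4-bit opcode, 8-bit address
        return words

    program = """
    LOAD 10
    ADD 11
    STORE 12
    HALT
    """
    print([f"{w:03X}" for w in assemble(program)])  # ['10A', '20B', '30C', 'F00']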

The first computers of this generation were developed for the atomic energy industry.

Third Generation - 1964-1971: Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Fourth Generation - 1971-Present: Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer - from the central processing unit and memory to input/output controls - on a single chip.

In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

Fifth Generation - Present and Beyond: Artificial Intelligence
Fifth generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

DID YOU KNOW...
An integrated circuit (IC) is a small electronic device made out of a semiconductor material. The first integrated circuit was developed in the 1950s by Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor.


1. First Generation (1939-1954) - vacuum tube

  • 1937 - John V. Atanasoff designed the first digital electronic computer
  • 1939 - Atanasoff and Clifford Berry demonstrate in Nov. the ABC prototype
  • 1941 - Konrad Zuse in Germany developed in secret the Z3
  • 1943 - In Britain, the Colossus was designed in secret at Bletchley Park to decode German messages
  • 1944 - Howard Aiken developed the Harvard Mark I mechanical computer for the Navy
  • 1945 - John W. Mauchly and J. Presper Eckert built ENIAC at U of PA for the U.S. Army
  • 1946 - Mauchly and Eckert started Electronic Control Co., received a grant from the National Bureau of Standards to build an ENIAC-type computer with magnetic tape input/output, renamed UNIVAC in 1947, but ran out of money and formed in Dec. 1947 the new company Eckert-Mauchly Computer Corporation (EMCC).
  • 1948 - Howard Aiken developed the Harvard Mark III electronic computer with 5000 tubes
  • 1948 - U of Manchester in Britain developed the SSEM Baby electronic computer with CRT memory
  • 1949 - Mauchly and Eckert in March successfully tested the BINAC stored-program computer for Northrop Aircraft, with mercury delay line memory and a primitive magnetic tape drive; Remington Rand bought EMCC Feb. 1950 and provided funds to finish UNIVAC
  • 1950 - Commander William C. Norris led Engineering Research Associates to develop the Atlas, based on the secret code-breaking computers used by the Navy in WWII; the Atlas was 38 feet long, 20 feet wide, and used 2700 vacuum tubes
  • 1951 - S. A. Lebedev developed the MESM computer in the Soviet Union
  • 1951 - Remington Rand successfully tested UNIVAC March 30, 1951, and announced to the public its sale to the Census Bureau June 14, 1951, the first commercial computer to feature a magnetic tape storage system, the eight UNISERVO tape drives that stood separate from the CPU and control console on the other side of a garage-size room. Each tape drive was six feet high and three feet wide, used 1/2-inch metal tape of nickel-plated bronze 1200 feet long, recorded data on eight channels at 100 inches per second with a transfer rate of 7,200 characters per second. The complete UNIVAC system weighed 29,000 pounds, included 5200 vacuum tubes, and an offline typewriter-printer UNIPRINTER with an attached metal tape drive. Later, a punched card-to-tape machine was added to read IBM 80-column and Remington Rand 90-column cards.
  • 1952 - Remington Rand bought ERA in Dec. 1951 and combined it with the UNIVAC product line in 1952: the ERA 1101 computer became the UNIVAC 1101. The UNIVAC I was used in November to calculate the presidential election returns and successfully predict the winner, although it was not trusted by the TV networks, who refused to use the prediction.
  • 1954 - The SAGE aircraft-warning system was the largest vacuum tube computer system ever built. It began in 1954 at MIT's Lincoln Lab with funding from the Air Force. The first of 23 Direction Centers went online in Nov. 1956, and the last in 1962. Each Center had two 55,000-tube computers built by IBM, MIT, and Bell Labs. The 275-ton computers known as "Clyde" were based on Jay Forrester's Whirlwind I and had magnetic core memory, magnetic drum and magnetic tape storage. The Centers were connected by an early network, and pioneered development of the modem and graphics display.
Atanasoff-Berry Computer 1939, from IEEE
magnetic drum memory of the Atanasoff-Berry Computer 1939, from Smithsonian NMAH
Whirlwind core memory 1951, from IEEE
first computer bug 1945, from IEEE


UNIVAC 1951, from Smithsonian NMAH
UNIVAC I ca. 1955, from Smithsonian
UNIVAC ad 1955/01/17, from Time
UNIVAC ad 1955/02/28, from Time
UNIVAC I of 1951 was the first business computer made in the U.S. "Many people saw a computer for the first time on television when UNIVAC I predicted the outcome of the 1952 presidential elections."


Bendix G-15 of 1956, inexpensive at $60,000, for science and industry but could also be used by a single user; several hundred were built - used magnetic tape drive and key punch terminal


IBM 650 that "became the most popular medium-sized computer in America in the 1950's" - rental cost was $5000 per month - 1500 were installed - able to read punched cards or magnetic tape - used rotating magnetic drum main memory unit that could store 4000 words, from Smithsonian NMAH



2. Second Generation Computers (1954-1959) - transistor

  • 1950 - National Bureau of Standards (NBS) introduced its Standards Eastern Automatic Computer (SEAC) with 10,000 newly developed germanium diodes in its logic circuits, and the first magnetic disk drive designed by Jacob Rabinow
  • 1953 - Tom Watson, Jr., led IBM to introduce the model 604 computer, its first with transistors, that became the basis of the model 608 of 1957, the first solid-state computer for the commercial market. Transistors were expensive at first, costing $8 vs. $.75 for a vacuum tube. But Watson was impressed with the new transistor radios and gave them to his engineers to study. IBM also developed the 650 Magnetic Drum Calculator, the first by IBM to use magnetic drum memory rather than punched cards, and began shipment of the 701 scientific "Defense Calculator" that was the first of the Model 700 line that dominated mainframe computers for the next decade
  • 1955 - IBM introduced the 702 business computer; Watson on the cover of Time magazine March 28
  • 1956 - The Bendix G-15, a small business computer designed by Harry Huskey of NBS, sold for only $45,000
  • 1959 - General Electric Corporation delivered its Electronic Recording Machine Accounting (ERMA) computing system to the Bank of America in California; based on a design by SRI, the ERMA system employed Magnetic Ink Character Recognition (MICR) as the means to capture data from the checks and introduced automation in banking that continued with ATM machines in 1974
transistor, from Smithsonian NMAH
"First transistor (model), December 1947. Constructed by John Bardeen, Walter Brattain and William Shockley at Bell Laboratories," from Smithsonian NMAH
Regency transistor radio 1954 (TL), Zenith transistor hearing aid 1952, from Smithsonian NMAH
Regency transistor radio 1954, from Smithsonian NMAH
Philco and Emerson transistor radios, from Smithsonian NMAH
transistor radios, from Smithsonian NMAH
transistor radios, from Smithsonian NMAH
Maico hearing aid before and after transistors, from Fortune 1953/03
Morton, Shockley, and White, who developed the transistor, from Fortune 1953/03
RCA transistor ad, from Fortune 1953/03

3. Third Generation Computers (1959-1971) - IC

  • 1959 - Jack Kilby of Texas Instruments patented the first integrated circuit in Feb. 1959; Kilby had made his first germanium IC in Oct. 1958; Robert Noyce at Fairchild used planar process to make connections of components within a silicon IC in early 1959; the first commercial product using IC was the hearing aid in Dec. 1963; General Instrument made LSI chip (100+ components) for Hammond organs 1968
  • 1964 - IBM produced SABRE, the first airline reservation tracking system, for American Airlines; IBM announced the System/360 all-purpose computer, using the 8-bit character word length (a "byte") that was pioneered in the 7030 of April 1961, which grew out of the Air Force contract of Oct. 1958 following Sputnik to develop transistor computers for BMEWS
  • 1968 - DEC introduced the first "mini-computer", the PDP-8, named after the mini-skirt; DEC was founded in 1957 by Kenneth H. Olsen, who came from the SAGE project at MIT, and began sales of the PDP-1 in 1960
  • 1969 - Development began on ARPAnet, funded by the DOD
  • 1971 - Intel produced large scale integrated (LSI) circuits that were used in the digital delay line, the first digital audio device
IC, from Smithsonian NMAH
Polaroid IC 1961, from Smithsonian NMAH
DEC PDP-1 of 1960, from CHM
DEC PDP8/E minicomputer 1973, from SDCM
Anderson Jacobson ADC 260 acoustic coupler 1963, from SDCM
early transistor calculators - Casio "Mini" used chips from TI (left); TI SR-10 calculator showing circuit in transparent case, used a single chip 1972, from Smithsonian NMAH
IC, from Smithsonian NMAH
IC, from Smithsonian NMAH


4. Fourth Generation (1971-1991) - microprocessor

  • 1971 - Gilbert Hyatt at Micro Computer Co. patented the microprocessor; Ted Hoff at Intel in February introduced the 4-bit 4004, a VLSI of 2300 components, for the Japanese company Busicom to create a single chip for a calculator; IBM introduced the first 8-inch "memory disk", as it was called then, or the "floppy disk" later; Hoffmann-La Roche patented the passive LCD display for calculators and watches; in November Intel announced the first microcomputer, the MCS-4; Nolan Bushnell designed the first commercial arcade video game "Computer Space"
  • 1972 - Intel made the 8-bit 8008 and 8080 microprocessors; Gary Kildall wrote his Control Program/Microprocessor (CP/M) disk operating system to provide instructions for floppy disk drives to work with the 8080 processor. He offered it to Intel, but was turned down, so he sold it on his own, and soon CP/M was the standard operating system for 8-bit microcomputers; Bushnell created Atari and introduced the successful "Pong" game
  • 1973 - IBM developed the first true sealed hard disk drive, called the "Winchester" after the rifle company, using two 30 Mb platters; Robert Metcalfe at Xerox PARC created Ethernet as the basis for a local area network, and later founded 3COM
  • 1974 - Xerox developed the Alto workstation at PARC, with a monitor, a graphical user interface, a mouse, and an ethernet card for networking
  • 1975 - The Altair personal computer was sold in kit form and influenced Steve Jobs and Steve Wozniak
  • 1976 - Jobs and Wozniak developed the Apple personal computer; Alan Shugart introduced the 5.25-inch floppy disk
  • 1977 - Nintendo in Japan began to make computer games that stored the data on chips inside a game cartridge that sold for around $40 but cost only a few dollars to manufacture. It introduced its most popular game "Donkey Kong" in 1981, and Super Mario Bros. in 1985
  • 1978 - VisiCalc spreadsheet software was written by Daniel Bricklin and Bob Frankston
  • 1979 - MicroPro released WordStar, which set the standard for word processing software
  • 1980 - IBM signed a contract with the Microsoft Co. of Bill Gates, Paul Allen, and Steve Ballmer to supply an operating system for IBM's new PC model. Microsoft paid $25,000 to Seattle Computer for the rights to QDOS, which became Microsoft DOS, and Microsoft began its climb to become the dominant computer company in the world.
  • 1984 - Apple Computer introduced the Macintosh personal computer January 24.
  • 1987 - Bill Atkinson of Apple Computer created a software program called HyperCard that was bundled free with all Macintosh computers. This program for the first time made hypertext popular and usable for a wide number of people. Ted Nelson coined the terms "hypertext" and "hypermedia" in 1965, based on the pre-computer ideas of Vannevar Bush published in his "As We May Think" article in the July 1945 issue of The Atlantic Monthly.
Intel 4004 microprocessor in 1971, from Intel Museum
Apple I of 1976, from Smithsonian NMAH
Wozniak and Jobs introduced Apple II in 1977
