HISTORY OF COMPUTERS
The history of computer development is often described in terms of the different generations of computing devices. A generation refers to the state of improvement in the product development process, and the term is also used to describe successive advances in computer technology. With each new generation, the circuitry has become smaller and more advanced than in the previous generation. As a result of this miniaturization, speed, power and computer memory have increased proportionally. New discoveries are constantly being made that affect the way we live, work and play.
Each generation of computer is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, more efficient and more reliable devices. There are five generations of computers, namely:
1. First generation
2. Second generation
3. Third generation
4. Fourth generation
5. Fifth generation
First Generation: 1940-1956 (Vacuum Tubes)
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. A magnetic drum, also referred to simply as a drum, is a metal cylinder coated with magnetic iron-oxide material on which data and programs can be stored. Magnetic drums were once used as a primary storage device but have since been implemented as auxiliary storage devices.
They were very expensive to operate and, in addition, they used a great deal of electricity and generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language to perform operations, and they could solve only one problem at a time.
The UNIVAC and ENIAC (Electronic Numerical Integrator and Computer) computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau, in 1951.
Second Generation: 1956-1963 (Transistors)
Transistors replaced vacuum tubes and ushered in the second generation of computers. A transistor is a device composed of semiconductor material that amplifies a signal or opens or closes a circuit. Invented in 1947 at Bell Labs, transistors have become a key ingredient of all digital circuits, including computers.
The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that could damage the computer, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. These were also the first computers to store their instructions in memory, which moved from magnetic drum to magnetic core technology.
The first computers of this generation were developed for the atomic energy industry.
Third Generation: 1964-1971 (Integrated Circuits)
The development of the integrated circuit was the hallmark of the third generation of computers. Integrated circuits drastically increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third-generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers became accessible to a mass audience for the first time because they were smaller and cheaper than their predecessors.
Fourth Generation: 1971-Present (Microprocessors)
The microprocessor brought about the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. Microprocessors also control the logic of almost all digital devices, from clock radios to fuel-injection systems for automobiles.
What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer – from the central processing unit and memory to input/output controls – on a single chip.
As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth-generation computers also saw the development of GUIs, the mouse and handheld devices.
Fifth Generation: Present and Beyond (Artificial Intelligence)
Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today.
Artificial intelligence is the branch of computer science concerned with making computers behave like humans. The term was coined in 1956 by John McCarthy at the Massachusetts Institute of Technology. Artificial intelligence includes:
· Games playing: programming computers to play games such as chess and checkers.
· Expert systems: programming computers to make decisions in real-life situations (for example, some expert systems help doctors diagnose diseases based on symptoms).
· Natural language: programming computers to understand natural human languages.
· Neural networks: systems that simulate intelligence by attempting to reproduce the types of physical connections that occur in animal brains (a simple illustrative sketch follows this list).
· Robotics: programming computers to see and hear and react to other sensory stimuli.
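To make the neural-network item above slightly more concrete, here is a minimal sketch in Python of a single artificial neuron; the function name, weights and threshold are illustrative assumptions and not part of the original text:

# Illustrative sketch only: one artificial "neuron" that weights its inputs
# and fires when the weighted sum crosses a threshold, a very rough analogue
# of a connection in an animal brain.
def neuron(inputs, weights, threshold):
    # Weighted sum of the inputs, followed by an all-or-nothing response.
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total > threshold else 0

# Example: with these (assumed) weights the neuron behaves like a logical AND.
print(neuron([1, 1], [0.6, 0.6], 1.0))  # prints 1
print(neuron([1, 0], [0.6, 0.6], 1.0))  # prints 0

Real neural networks chain many such units together and adjust the weights automatically during training.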