UNDERSTANDING THE COMPUTER

 Introduction

A computer is an electronic machine that takes input from the user, processes that input, and generates output in the form of useful information. A computer accepts input in different forms, such as data, programs, and user responses.

A computer includes various devices that function as an integrated system to perform the tasks described above. These devices are:

1. Central processing unit (CPU)

It is the processor of the computer, responsible for controlling and executing instructions.

2. Monitor

It is a screen that displays information in visual form after receiving video signals from the computer.

3. Keyboard and Mouse

These are the devices through which the computer receives input from the user.

Characteristics of a Computer

Apart from the devices listed above, a computer is defined by the following characteristics:

1. Speed

A computer is a fast electronic device that can solve large and complex problems in a few seconds.

2. Storage Capacity

A computer can store huge amounts of data in its various storage components, in many different formats.

3. Accuracy

A computer carries out calculations with great accuracy.

4. Reliability

A computer produces consistent, error-free results; errors that do occur are usually traceable to incorrect input or instructions rather than to the machine itself.

5. Versatility

Computers are versatile machines, capable of performing tasks as varied as scientific calculation, document preparation, communication, and entertainment.

6. Diligence

Computers can perform repetitive calculations any number of times with the same accuracy.

EVOLUTION OF COMPUTERS




In ancient times, people used different devices and methods for performing computing operations. However, these devices and methods were neither very fast nor very accurate, and this led to the invention of the computer, a machine developed to produce accurate results at very high speed. The computer has gone through several phases of technological development, which we can trace by looking at its history.

1. Manual Computing Devices

The idea of representing numbers with stones and arranging them in fixed places for simple calculations led to a device called the sand table. A sand table arranged stones in three channels drawn in the sand, and each channel could hold a maximum of ten stones. Addition was performed by placing one stone in the right-hand channel, incrementing its count by one. As soon as the right-hand channel reached its maximum capacity, its stones were removed and one stone was added to the channel to its left.
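To make the carry mechanism concrete, the following short Python sketch simulates addition on such a sand table. The channel layout and names are illustrative assumptions, not a historical reconstruction.

    # Minimal sketch of a three-channel sand table.
    # Channels are ordered right to left, as in the description above;
    # each channel holds at most 10 stones before it "carries" left.

    MAX_STONES = 10

    def add_stone(channels, position=0):
        """Add one stone to the channel at `position`, carrying left if full."""
        channels[position] += 1
        if channels[position] == MAX_STONES:
            channels[position] = 0                 # empty the full channel
            if position + 1 < len(channels):
                add_stone(channels, position + 1)  # carry one stone left

    channels = [0, 0, 0]   # [units, tens, hundreds]
    for _ in range(25):    # count to 25 one stone at a time
        add_stone(channels)
    print(channels)        # [5, 2, 0] -> represents 25

This is exactly the carry rule of positional base-10 arithmetic, which is why the sand table is considered an ancestor of the abacus.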

2. Automated Computing Devices

Charles Babbage, a professor of mathematics at Cambridge University, made some worthwhile efforts towards automatic computing, and he is widely considered the father of the modern computer. In 1812, Babbage decided to automate the repeated series of steps needed to tabulate various functions, such as polynomial, logarithmic, and trigonometric functions. In 1822, he presented a working model of his concept in the form of an automatic mechanical computing machine.

GENERATIONS OF COMPUTERS

First Generation: Vacuum Tubes (1940–1956)

The first computer systems used vacuum tubes for circuitry and magnetic drums for main memory, and they were often enormous, taking up entire rooms. These computers were very expensive to operate; in addition to consuming a great deal of electricity, they generated a lot of heat, which was often the cause of malfunctions. The maximum internal storage capacity was 20,000 characters.

First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. It would take operators days or even weeks to set up a new problem. Input was based on punched cards and paper tape, and output was displayed on printouts.

It was in this generation that the Von Neumann architecture was introduced, which describes the design of an electronic digital computer in which program instructions and data are held in the same memory. The ENIAC and UNIVAC computers, built by J. Presper Eckert and John Mauchly, became examples of first-generation computer technology. The UNIVAC was the first commercially available computer; its first unit was delivered to the U.S. Census Bureau in 1951.
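The defining feature of the Von Neumann design is that instructions and data share one memory, and the processor repeatedly fetches, decodes, and executes instructions from it. The toy Python simulator below is only an illustrative sketch: the three-instruction set and its numeric opcodes are invented here, not taken from any real machine. Note that the program is just numbers stored in memory, which is also what machine language amounts to in practice.

    # Toy Von Neumann machine: instructions and data live in one memory,
    # and the CPU runs a fetch-decode-execute loop over it.
    # Opcodes (illustrative): 1 = LOAD addr, 2 = ADD addr, 3 = STORE addr, 0 = HALT.

    memory = [
        1, 7,    # LOAD  memory[7] into the accumulator
        2, 8,    # ADD   memory[8] to the accumulator
        3, 9,    # STORE the accumulator into memory[9]
        0,       # HALT
        20, 22,  # data: the two operands
        0,       # data: the result goes here
    ]

    pc, acc = 0, 0                       # program counter and accumulator
    while True:
        opcode = memory[pc]              # fetch
        if opcode == 0:                  # decode and execute
            break
        operand = memory[pc + 1]
        if opcode == 1:
            acc = memory[operand]
        elif opcode == 2:
            acc += memory[operand]
        elif opcode == 3:
            memory[operand] = acc
        pc += 2                          # advance to the next instruction

    print(memory[9])   # prints 42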

Second Generation: Transistors (1956–1963)

The world would see transistors replace vacuum tubes in the second generation of computers. The transistor was invented at Bell Labs in 1947 but did not see widespread use in computers until the late 1950s. This generation of computers also included hardware advances like magnetic core memory, magnetic tape, and the magnetic disk.

The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient, and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. A second-generation computer still relied on punched cards for input and printouts for output.

Third Generation: Integrated Circuits (1964–1971)

The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips (silicon being a semiconductor material), which drastically increased the speed and efficiency of computers.

Instead of punched cards and printouts, users would interact with a third-generation computer through keyboards, monitors, and interfaces with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers, for the first time, became accessible to a mass audience because they were smaller and cheaper than their predecessors.
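A loose sketch of what "running many different applications at one time" means: a central scheduler gives each program a short turn and then switches to the next, so the programs appear to run together on one processor. The Python sketch below is an analogy built with generators, not how any real operating system is implemented, and the program names are made up for illustration.

    # Loose analogy for third-generation multiprogramming: a central
    # scheduler switches between programs, each yielding control back
    # after a small unit of work.

    def program(name, steps):
        for i in range(steps):
            print(f"{name}: step {i + 1}")
            yield                           # hand control back to the scheduler

    def scheduler(programs):
        # Round-robin: give each program one step per pass until all finish.
        while programs:
            prog = programs.pop(0)
            try:
                next(prog)
                programs.append(prog)       # not finished: back of the queue
            except StopIteration:
                pass                        # finished: drop it

    scheduler([program("payroll", 2), program("inventory", 3)])

Running this interleaves the two programs step by step, which is the essential trick that let third-generation machines serve many applications with one central program monitoring memory.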

Fourth Generation: Microprocessors (1971–Present)

The microprocessor ushered in the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. Technology that had filled an entire room in the first generation could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, placed the entire central processing unit on a single chip, with companion chips providing memory and input/output controls.

In 1981, IBM introduced its first personal computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use the microprocessor chip.

As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. The fourth generation also saw the development of GUIs, the mouse, and handheld devices.

Fifth Generation: Artificial Intelligence (Present and Beyond)

Fifth-generation computer technology, based on artificial intelligence, is still in development, though some applications, such as voice recognition, are already in use today. Parallel processing and superconductors are helping to make artificial intelligence a reality. This generation has also packed far greater amounts of storage into compact, portable devices than any before it.
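Parallel processing means dividing one computation across several processors that work simultaneously. The sketch below uses Python's standard concurrent.futures module to split a large sum across worker processes; the workload function is an arbitrary stand-in for a genuinely heavy computation.

    # Minimal parallel-processing sketch: split a large sum across
    # worker processes and combine the partial results.
    from concurrent.futures import ProcessPoolExecutor

    def partial_sum(bounds):
        lo, hi = bounds
        return sum(range(lo, hi))       # stand-in for a heavy computation

    if __name__ == "__main__":
        chunks = [(0, 2_500_000), (2_500_000, 5_000_000),
                  (5_000_000, 7_500_000), (7_500_000, 10_000_000)]
        with ProcessPoolExecutor(max_workers=4) as pool:
            total = sum(pool.map(partial_sum, chunks))
        print(total)   # same answer as sum(range(10_000_000)), computed in parallel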

Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that will respond to natural language input and are capable of learning and self-organization.

 

