
Thursday, April 18, 2013

History and Generations of Computers | HSEB Notes on Computer Science Class 11


Computer Science
HSEB Notes on the History of Computer Science
Class: 11

The history of computer science began long before the modern discipline emerged in the twentieth century; it was hinted at in the centuries prior. The progression from mechanical inventions and mathematical theories towards modern concepts and machines formed a major academic field and the basis of a massive worldwide industry.

Mechanical computers:
A mechanical computer is built from mechanical components such as levers and gears, rather than electronic components. The most common examples are adding machines and mechanical counters, which use the turning of gears to increment output displays. More complex examples can carry out multiplication and division, and even differential analysis.


Abacus
The abacus, also called a counting frame, is a calculating tool used primarily in parts of Asia for performing arithmetic processes. Today, abaci are often constructed as a bamboo frame with beads sliding on wires, but originally they were beans or stones moved in grooves in sand or on tablets of wood, stone, or metal. The abacus was in use centuries before the adoption of the written modern numeral system and is still widely used by merchants, traders and clerks in Asia, Africa, and elsewhere. The user of an abacus is called an abacist.

Napier’s Bones
Napier's bones is a calculating device created by John Napier for computing products and quotients of numbers, based on Arab mathematics and lattice multiplication. The device consists of a board with a rim; the user places Napier's rods inside the rim to carry out multiplication or division. The board's left edge is divided into 9 squares holding the numbers 1 to 9. The rods are strips of wood, metal or heavy cardboard; Napier's original bones were three-dimensional, square in cross-section, with a different rod engraved on each of the four faces. A set of such bones might be enclosed in a convenient carrying case. A rod's surface comprises 9 squares, and each square, except for the top one, is divided into two halves by a diagonal line. The top square of each rod holds a single digit, and the squares below hold that number's double, triple, quadruple, and so on, until the last square contains nine times the number in the top square. The digits of each product are written one to each side of the diagonal; products less than 10 occupy the lower triangle, with a zero in the top half. A full set consists of 10 rods corresponding to the digits 0 to 9; the 0 rod, although it may look unnecessary, is still needed for multipliers or multiplicands that contain a 0.
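To make the diagonal addition concrete, here is a minimal Python sketch (ours, not part of the original notes) of multiplying a number by a single digit the way the rods do: each rod square contributes one product, and the tens half of each square is carried along the diagonal into the next.

```python
def napier_multiply(multiplicand, digit):
    """Multiply by a single digit as Napier's rods do: read one square
    from each rod and add along the diagonals (carry the tens of each
    product into the units of the next)."""
    result, carry = [], 0
    for d in reversed(str(multiplicand)):      # rightmost rod first
        product = int(d) * digit + carry       # one rod square plus the diagonal carry
        result.append(str(product % 10))       # lower triangle of the square
        carry = product // 10                  # upper triangle, carried diagonally
    if carry:
        result.append(str(carry))
    return int("".join(reversed(result)))

print(napier_multiply(425, 6))  # 2550, read off the rods for 4, 2 and 5 at row 6
```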

Slide rule
The slide rule is a mechanical computer used primarily for multiplication and division, and also for functions such as roots, logarithms and trigonometry; it is not normally used for addition or subtraction. Slide rules come in a diverse range of styles and generally appear in a linear or circular form with a standardized set of markings (scales) essential to performing mathematical computations. Slide rules manufactured for specialized fields such as aviation or finance typically feature additional scales that aid in calculations common to that field. William Oughtred and others developed the slide rule in the 17th century, based on the emerging work on logarithms by John Napier. Before the advent of the pocket calculator, it was the most commonly used calculation tool in science and engineering. The use of slide rules continued to grow through the 1950s and 1960s even as digital computing devices were gradually introduced, but around 1974 the electronic scientific calculator made them largely obsolete and most suppliers left the business.
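The principle behind the slide rule is that sliding two logarithmic scales against each other adds lengths, and adding logarithms multiplies numbers: log(a) + log(b) = log(ab). A minimal Python sketch of that idea (the function name is ours):

```python
import math

def slide_rule_multiply(a, b):
    """Sliding the scales adds the lengths log10(a) + log10(b);
    the product is read where that combined length falls."""
    return 10 ** (math.log10(a) + math.log10(b))

print(slide_rule_multiply(3, 4))  # ~12.0, to slide-rule precision
```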

Pascal's calculator
Blaise Pascal invented the mechanical calculator in 1642. He conceived the idea while trying to help his father, who had been assigned the task of reorganizing the tax revenues of the French province of Haute-Normandie. First called the Arithmetic Machine, then Pascal's Calculator, and later the Pascaline, it could add and subtract directly, and could multiply and divide by repetition. Pascal went through 50 prototypes before presenting his first machine to the public in 1645. He dedicated it to Pierre Séguier, the chancellor of France at the time. He built around twenty more machines during the next decade, often improving on his original design. Nine machines have survived the centuries, most of them on display in European museums. In 1649 a royal privilege, signed by Louis XIV of France, gave him exclusivity over the design and manufacture of calculating machines in France.
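Since the machine could only add and subtract directly, its operator multiplied by repetition. A tiny Python sketch of the method (the function name is ours, for illustration):

```python
def multiply_by_repetition(a, b):
    """Multiply the way the Pascaline's operator did: add the
    multiplicand to the accumulator once per unit of the multiplier."""
    total = 0
    for _ in range(b):
        total += a
    return total

print(multiply_by_repetition(7, 6))  # 42
```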

Stepped Reckoner
The Stepped Reckoner (also known as the Step Reckoner) was a digital mechanical calculator invented by the German mathematician Gottfried Wilhelm Leibniz around 1672 and completed in 1694. The name comes from the German term for its operating mechanism, Staffelwalze, meaning 'stepped drum'. It was the first calculator that could perform all four arithmetic operations: addition, subtraction, multiplication and division. Its intricate precision gearwork, however, was somewhat beyond the fabrication technology of the time; mechanical problems, in addition to a design flaw in the carry mechanism, prevented the machines from working reliably.

Jacquard loom
The Jacquard loom is a mechanical loom, invented by Joseph Marie Jacquard in 1801, that simplifies the process of manufacturing textiles with complex patterns such as brocade, damask and matelassé. The loom is controlled by punched cards, each row of holes corresponding to one row of the design. Multiple rows of holes are punched on each card, and the many cards that compose the design of the textile are strung together in order. It is based on earlier inventions by the Frenchmen Basile Bouchon (1725), Jean-Baptiste Falcon (1728) and Jacques Vaucanson (1740).

Charles Babbage's Difference engine
A difference engine is an automatic mechanical calculator designed to tabulate polynomial functions. The name derives from the method of divided differences, a way to interpolate or tabulate functions by using a small set of polynomial coefficients. Both logarithmic and trigonometric functions, functions commonly used by both navigators and scientists, can be approximated by polynomials, so a difference engine can compute many useful sets of numbers. The historical difficulty in producing error-free tables by teams of mathematicians and human "computers" spurred Charles Babbage's desire to build a mechanism to automate the process.
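The method of differences lets a machine tabulate a polynomial using repeated addition alone. Here is a short Python sketch of the idea (our illustration, not Babbage's design): start from the value of f and its finite differences at x = 0, and generate each new table entry with additions only.

```python
def difference_engine(initial, count):
    """Tabulate a polynomial by the method of differences.
    `initial` holds [f(0), Δf(0), Δ²f(0), ...]; for a polynomial of
    degree n the n-th difference is constant, so additions suffice."""
    vals = list(initial)
    table = []
    for _ in range(count):
        table.append(vals[0])          # current value of f
        for i in range(len(vals) - 1):
            vals[i] += vals[i + 1]     # each row absorbs the difference below it
    return table

# f(x) = x**2 + x + 1 gives f(0)=1, Δf(0)=2, Δ²f=2:
print(difference_engine([1, 2, 2], 6))  # [1, 3, 7, 13, 21, 31]
```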

Analytical Engine
The Analytical Engine was a proposed mechanical general-purpose computer designed by the English mathematician Charles Babbage. It was first described in 1837 as the successor to Babbage's difference engine, a design for a mechanical calculator. The Analytical Engine incorporated an arithmetic unit, control flow in the form of conditional branching and loops, and integrated memory, making it the first Turing-complete design for a general-purpose computer.

Charles Babbage (1791-1871) the Father of Computers
Charles Babbage is recognized today as the Father of Computers because his impressive designs for the Difference Engine and the Analytical Engine foreshadowed the invention of the modern electronic digital computer. He led a fascinating life, as did many of the people involved in the history of computers. He also invented the cowcatcher, the dynamometer, the standard railroad gauge, uniform postal rates, occulting lights for lighthouses, Greenwich time signals, the heliograph, and the ophthalmoscope.

Lady Augusta Ada Countess of Lovelace (First Computer Programmer)
Babbage owes a great debt to Lady Augusta Ada, Countess of Lovelace. The daughter of the famous Romantic poet Lord Byron, she was a brilliant mathematician who helped Babbage in his work. Above all, she documented his work, which Babbage himself never bothered to do; it is largely thanks to her that we know about Babbage's machines at all. Lady Augusta Ada also wrote programs to be run on Babbage's machines. For this, she is recognized as the first computer programmer.

Electro-Mechanical Computers:

Census Tabulating Machine
Herman Hollerith developed the tabulating machine, an electrical device designed to assist in summarizing information and, later, accounting. The machine was built to help process data for the 1890 U.S. Census, and it spawned a larger class of devices known as unit record equipment, as well as the data processing industry.

Hollerith worked as a statistician for the U.S. Census Bureau in the 1880s and 1890s. The U.S. Constitution requires a census every ten years so that the membership of the House of Representatives will be proportional to the population of each state. The 1880 census had taken seven years to process, and the turn of the twentieth century saw the highest rate of immigration in United States history, so Hollerith deduced that the next census would take longer than ten years to process, meaning its results would not be available before the following census had to begin. Necessity became the mother of invention: Hollerith designed and built a census counting machine. Punched cards (in the manner of Jacquard looms) were used to collect the census data (the origin of the IBM punched card), and the cards were fed into a sorting machine before being read by the counting machine, which recorded and tabulated the results. Each card was laid on an open grid and a matrix of wires was lowered onto it; wherever there was a hole in the card, a wire fell through, making an electrical connection that triggered a count on the appropriate dial(s) on the face of the machine. The 1890 census took just three months to process, even though considerably more data was collected than ever before.

Hollerith was the first American associated with the history of computers and, as you might expect, also the first to make a good deal of money from it. His company, the Tabulating Machine Company, merged with other firms in 1911 to form the Computing-Tabulating-Recording Company. The company hired Thomas J. Watson in 1914, and he was instrumental in turning it around. In 1924 the company was renamed the International Business Machines (IBM) Corporation.
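A rough Python sketch of the tabulating idea (the cards and attribute names are hypothetical): each card is treated as a set of punched positions, and every hole advances the matching counter dial by one.

```python
from collections import Counter

def tabulate_cards(cards):
    """Hollerith-style tabulation: every hole that lets a wire through
    closes a circuit and advances the matching counter dial by one."""
    dials = Counter()
    for card in cards:
        dials.update(card)
    return dials

# Hypothetical census cards; each hole position encodes one attribute.
cards = [{"male", "age_20_29"}, {"female", "age_20_29"}, {"male", "age_30_39"}]
print(tabulate_cards(cards))  # Counter({'male': 2, 'age_20_29': 2, ...})
```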

Harvard Mark I
The IBM Automatic Sequence Controlled Calculator (ASCC), called the Mark I by Harvard University, was an electromechanical computer. It was devised by Howard H. Aiken, built at IBM, and shipped to Harvard in February 1944. It began computations for the U.S. Navy Bureau of Ships in May and was officially presented to the university on August 7, 1944. The ASCC was built from switches, relays, rotating shafts, and clutches. It used 765,000 components and hundreds of miles of wire, filling a volume 51 feet (16 m) long, eight feet (2.4 m) high, and two feet (about 61 cm) deep, and weighing about 10,000 pounds (4,500 kg). The basic calculating units had to be synchronized mechanically, so they were run by a 50-foot (about 15.5 m) shaft driven by a five-horsepower (4 kW) electric motor. From the IBM Archives: the Automatic Sequence Controlled Calculator (Harvard Mark I) was the first operating machine that could execute long computations automatically. A project conceived by Harvard University's Dr. Howard Aiken, the Mark I was built by IBM engineers in Endicott, N.Y.

The first computer bug
U.S. Rear Admiral Dr. Grace Murray Hopper worked with Howard Aiken from 1944, using his machine for gunnery and ballistics calculations for the U.S. Bureau of Ordnance's Computation Project. One day, the program she was running gave incorrect results and, upon examination, a moth was found blocking one of the relays. The bug was removed and the program performed to perfection. Since then, a program error in a computer has been called a bug.

Electronic digital computers

The Turing Machine
The "Turing" machine was described by Alan Turing in 1936, who called it an "automatic‐machine". The Turing machine is not intended as a practical computing technology, but rather as a hypothetical device representing a computing machine. Turing machines help computer scientists understand the limits of mechanical computation. A Turing machine is a device that manipulates symbols on a strip of tape according to a table of rules. Despite its simplicity, a Turing machine can be adapted to simulate the logic of any computer algorithm, and is particularly useful in explaining the functions of a CPU inside a computer.

Atanasoff–Berry Computer
The Atanasoff–Berry Computer (ABC) was the first electronic digital computing device. It was built by Dr. John Vincent Atanasoff and graduate student Clifford Berry in the basement of the physics building at Iowa State College during 1939–42. Conceived in 1937, the machine was not programmable, being designed only to solve systems of linear equations. It was successfully tested in 1942. However, its intermediate result storage mechanism, a paper card writer/reader, was unreliable, and when Atanasoff left Iowa State College for World War II assignments, work on the machine was discontinued. The ABC pioneered important elements of modern computing, including binary arithmetic and electronic switching elements, but its special-purpose nature and lack of a changeable, stored program distinguish it from modern computers.

Colossus computer
Colossus was the world's first electronic, digital, programmable computer. Colossus and its successors were used by British codebreakers to help read encrypted German messages during World War II. They used thermionic valves (vacuum tubes) to perform the calculations. Colossus was designed by engineer Tommy Flowers, with input from Sidney Broadhurst, William Chandler, Allen Coombs and Harry Fensom, at the Post Office Research Station, Dollis Hill, to solve a problem posed by mathematician Max Newman at Bletchley Park. The prototype, Colossus Mark 1, was shown to be working in December 1943 and was operational at Bletchley Park by February 1944. An improved Colossus Mark 2 first worked on 1 June 1944, just in time for the Normandy landings. Ten Colossus computers were in use by the end of the war. The Colossus computers were used to help decipher teleprinter messages that had been encrypted using the Lorenz SZ40/42 machine; British codebreakers referred to encrypted German teleprinter traffic as "Fish" and called the SZ40/42 machine and its traffic "Tunny". Colossus compared two data streams, counting each match based on a programmable Boolean function. The encrypted message was read at high speed from a paper tape, while the other stream was generated internally as an electronic simulation of the Lorenz machine at various trial settings. If the match count for a setting was above a certain threshold, it was sent as output to an electric typewriter.
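The counting step can be sketched in a few lines of Python (the streams and the Boolean condition below are invented for illustration):

```python
def count_matches(cipher_stream, key_stream, condition):
    """Colossus-style counting: compare two bit streams position by
    position and count where a programmable Boolean condition holds."""
    return sum(1 for a, b in zip(cipher_stream, key_stream) if condition(a, b))

# Invented example: count positions where the two streams agree (XOR = 0).
cipher = [1, 0, 1, 1, 0, 1]
trial  = [1, 1, 1, 0, 0, 1]
print(count_matches(cipher, trial, lambda a, b: (a ^ b) == 0))  # 4
```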

ENIAC
ENIAC (Electronic Numerical Integrator And Computer) was conceived and designed by John Mauchly and J. Presper Eckert of the University of Pennsylvania. The team of design engineers assisting the development included Robert F. Shaw (function tables), Jeffrey Chuan Chu (divider/square‐rooter), Thomas Kite Sharpless (master programmer), Arthur Burks (multiplier), Harry Huskey (reader/printer) and Jack Davis (accumulators). ENIAC was the first general‐purpose electronic computer. It was a Turing‐complete digital computer capable of being reprogrammed to solve a full range of computing problems. ENIAC was designed to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory. When
ENIAC was announced in 1946 it was heralded in the press as a "Giant Brain". It boasted speeds one thousand times faster than electro‐mechanical machines, a leap in computing power that no single machine has since matched. This mathematical power, coupled with general‐purpose programmability, excited scientists and industrialists. The inventors promoted the spread of these new ideas by conducting a series of lectures on computer architecture.

Generations of Computers

The history of computer development is often described in terms of the different generations of computing devices. A generation refers to a stage of improvement in the product development process, and the term is also used for successive advancements in computer technology. With each new generation, the circuitry has become smaller and more advanced than in the generation before it. As a result of this miniaturization, speed, power, and memory have increased proportionally. New discoveries are constantly being made that affect the way we live, work and play. Each generation of computers is characterized by a major technological development that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more powerful, and more efficient and reliable devices. Read about each generation and the developments that led to the devices we use today.

First Generation (1940-1956) : Vacuum Tubes
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were often enormous, taking up entire rooms. They were very expensive to operate, and in addition to using a great deal of electricity they generated a lot of heat, which was often the cause of malfunctions. First-generation computers relied on machine language, the lowest-level programming language understood by computers, to perform operations, and they could only solve one problem at a time. Input was based on punched cards and paper tape, and output was displayed on printouts. The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau, in 1951.

Second Generation (1956-1963) : Transistors
Transistors replaced vacuum tubes and ushered in the second generation of computers. The transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The transistor was far superior to the vacuum tube, allowing computers to become smaller, faster, cheaper, more energy-efficient and more reliable than their first-generation predecessors. Though the transistor still generated a great deal of heat that subjected the computer to damage, it was a vast improvement over the vacuum tube. Second-generation computers still relied on punched cards for input and printouts for output. They moved from cryptic binary machine language to symbolic, or assembly, languages, which allowed programmers to specify instructions in words. High-level programming languages were also being developed at this time, such as early versions of COBOL and FORTRAN. These were also the first computers to store their instructions in memory, which moved from a magnetic drum to magnetic core technology.

Third Generation (1964-1971) : Integrated Circuits
The development of the integrated circuit was the hallmark of the third generation of computers. Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically increased the speed and efficiency of computers. Instead of punched cards and printouts, users interacted with third generation computers through keyboards and monitors and interfaced with an operating system, which allowed the device to run many different applications at one time with a central program that monitored the memory. Computers for the first time became accessible to a mass audience because they were smaller and cheaper than their predecessors.

Fourth Generation (1971-Present) : Microprocessors
The microprocessor brought the fourth generation of computers, as thousands of integrated circuits were built onto a single silicon chip. What in the first generation filled an entire room could now fit in the palm of the hand. The Intel 4004 chip, developed in 1971, located all the components of the computer—from the central processing unit and memory to input/output controls—on a single chip. In 1981 IBM introduced its first computer for the home user, and in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many areas of life as more and more everyday products began to use microprocessors. As these small computers became more powerful, they could be linked together to form networks, which eventually led to the development of the Internet. Fourth generation computers also saw the development of GUIs, the mouse and handheld devices.

Fifth Generation (Present and Beyond) : Artificial Intelligence
Fifth-generation computing devices, based on artificial intelligence, are still in development, though there are some applications, such as voice recognition, that are being used today. The use of parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum computation and molecular and nanotechnology will radically change the face of computers in years to come. The goal of fifth-generation computing is to develop devices that respond to natural language input and are capable of learning and self-organization.

