
The history of computers

The first computers


The history of the computer goes back far beyond the 1900s; in fact, computing devices have been around for more than 5,000 years.

In earlier times, a “computer” (sometimes spelled “computor”) was a person who performed numerical calculations, often under the direction of a mathematician.

Some of the best-known early computing devices are the abacus and the Antikythera mechanism.

Around 1725, Basile Bouchon used perforated paper on a loom to set the pattern to be woven into cloth. This ensured that the pattern was always the same and greatly reduced human error.

Later, in 1801, Joseph Jacquard (1752-1834) used the idea of punched cards to automate more devices, with great success.

The first computers?


Charles Babbage (1791-1871) was ahead of his time and, using the idea of punched cards, developed the first computing devices intended for scientific purposes. He invented the Difference Engine, which he began building in 1823 but never completed. Later he began work on the Analytical Engine, designed in 1842.

Babbage was also credited with inventing computing concepts like conditional branches, iterative loops, and index variables.

Ada Lovelace (1815-1852) was Babbage’s colleague and is often regarded as the first computer programmer for her work on the Analytical Engine.

Many people improved upon Babbage’s inventions. George Scheutz, along with his son Edvard, worked on a smaller version, and by 1853 they had built a machine that could process 15-digit numbers and calculate fourth-order differences.

One of the first notable (and successful) commercial uses of computing machinery was at the United States Census Bureau, which used punched-card equipment designed by Herman Hollerith to tabulate the 1890 census data.

To offset the cyclical nature of the Census Bureau’s demand for his machines, Hollerith founded the Tabulating Machine Company (1896), which was one of three companies that merged to form IBM in 1911.

Later, Claude Shannon (1916-2001) first suggested the use of digital electronics in computers, and in 1937 John V. Atanasoff built the first electronic computer, which could solve 29 simultaneous equations with 29 unknowns. This device, however, was not programmable.

During the war years, computers evolved at a rapid rate, but due to wartime secrecy many projects remained under wraps until much later. A notable example is the British military “Colossus”, developed in 1943 by Tommy Flowers and his team at Bletchley Park.

During World War II, the US Army commissioned John W. Mauchly to develop a device for calculating ballistics. The machine was not ready until 1945, but the Electronic Numerical Integrator and Computer, or ENIAC, proved to be a turning point in the history of computing.

ENIAC proved to be a very fast machine, but it was not easy to operate: any change to its program required the machine to be physically rewired. Engineers were well aware of this obvious problem and developed the “stored-program architecture,” in which instructions are held in memory alongside data.

John von Neumann (an ENIAC consultant), Mauchly, and their team developed EDVAC, a new design that stored its program in memory.

Later, Eckert and Mauchly developed what was arguably the first commercially successful computer, the UNIVAC.

Software technology during this period was very primitive. The first programs were written directly in machine code. In the 1950s, programmers used a symbolic notation, known as assembly language, and then manually translated it into machine code. Later, programs known as assemblers performed the translation automatically.
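The translation job an early assembler did can be sketched in a few lines. The mnemonics and opcodes below are a hypothetical toy instruction set, purely for illustration, not any historical machine's:

```python
# Toy assembler: translate symbolic mnemonics into numeric machine code.
# The instruction set here is invented for illustration only.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03, "HALT": 0xFF}

def assemble(source):
    """Translate lines like 'LOAD 7' into (opcode, operand) pairs."""
    program = []
    for line in source.strip().splitlines():
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        program.append((OPCODES[mnemonic], operand))
    return program

machine_code = assemble("""
LOAD 7
ADD 3
STORE 12
HALT
""")
print(machine_code)  # [(1, 7), (2, 3), (3, 12), (255, 0)]
```

Early programmers did exactly this lookup-and-translate step by hand, one instruction at a time, which is why automating it was such a relief.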

The age of transistors, the end of the lone inventor


The late 1950s saw the end of vacuum-tube (valve) computers. Transistor-based computers took over because they were smaller, cheaper, faster, and much more reliable.

Corporations, rather than inventors, were now producing the new computers.

Some of the best known are:

  • TRADIC at Bell Laboratories in 1954,
  • TX-0 at MIT’s Lincoln Laboratory
  • IBM 704 and its successors, the 709 and 7094. The latter introduced I/O processors for better performance between I/O devices and main memory.
  • The first supercomputers, the Livermore Atomic Research Computer (LARC) and the IBM 7030 (also known as Stretch)
  • The Texas Instruments Advanced Scientific Computer (TI-ASC)

Now the foundations of the modern computer were in place: with transistors, computers were faster, and with the stored-program architecture, they could be used for almost anything.

Soon new high-level programming languages arrived: FORTRAN (1956), ALGOL (1958), and COBOL (1959). Cambridge and the University of London cooperated in the development of CPL (Combined Programming Language, 1963), and Martin Richards of Cambridge developed a simplified version of CPL called BCPL (Basic Combined Programming Language, 1967).

In 1969, the CDC 7600 was released; it could perform 10 million floating-point operations per second (10 Mflops).

The years of the network.


Beginning in 1985, the race was on to pack as many transistors as possible into a computer, each transistor performing a simple switching operation. But apart from becoming faster and able to perform more operations, the basic design of the computer did not change much.

The concept of parallel processing has been used more widely since the 1990s.
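The idea behind parallel processing is simple: split a computation across several workers that run at the same time. A minimal sketch using only Python's standard library (the workload here is illustrative, not from the original article):

```python
# Minimal parallel-processing sketch: a pool of worker processes
# each computes part of the result independently.
from multiprocessing import Pool

def square(n):
    return n * n

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # pool.map splits the inputs among the 4 workers,
        # which compute their shares in parallel.
        results = pool.map(square, range(10))
    print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]
```

The same divide-the-work pattern, scaled up to thousands of processors, is what drove supercomputing from the 1990s onward.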

In the area of computer networks, both wide area network (WAN) and local area network (LAN) technology developed at a rapid pace.

Get a more detailed history of the computer [http://www.myoddpc.com/other/history_of_computer.php].
