History Of Computers
Computing / July 04, 2021
The history of the PC begins with an invention that has nothing to do with it: Joseph Marie Jacquard's loom, invented in 1804. The great innovation of this machine was that the woven pattern was controlled by means of punched cards, saving time and labor for its operators.
In 1837, Charles Babbage applied the idea of punched cards to the design of an "analytical engine", in which the cards held the instructions for the calculations to be performed.
Babbage's machine served as a model for later designs; the first calculating machine to combine mechanical functions with electricity was built by Herman Hollerith to tabulate the 1890 United States census.
After the Second World War, in 1946, the University of Pennsylvania unveiled the first fully electronic computer, ENIAC, built to calculate ballistic firing tables for artillery. Like Jacquard's loom, ENIAC worked with punched cards. It weighed 30 tons and took up a room of about 170 square meters.
The invention of the transistor in 1947 and the creation of the integrated circuit in 1958 made it possible to enclose an entire electronic circuit in a single package. Applied to computing, this led to the microprocessor: Intel released the first one, the 4004, in 1971.
This allowed computers far more powerful than ENIAC to sit on any desk, but using them required memorizing a series of text-mode programming commands.
In the early 1980s, Apple introduced the Mac OS, one of the first mass-market graphical systems, which did not require knowing any commands to use the computer. In 1985, Microsoft released its own version of the graphical interface for the IBM PC platform: the Windows operating system.
Miniaturization has made it possible to create microprocessors that are ever more powerful and physically smaller, packing 2, 4, or even 8 processing units, called cores, into a single chip. Processing speeds have also multiplied: from a few MHz in the early 1970s to around 3 GHz per core in an i7 processor.