1623
Wilhelm Schickard invents the first mechanical calculator, the Calculating Clock, which allowed the user to add and subtract numbers of up to six digits. He wrote to the astronomer Johannes Kepler explaining how the machine could be used to calculate ephemerides, i.e. the positions of the planets as time advanced. The Calculating Clock was destroyed in a fire before it could be completed.

Wilhelm Schickard, inventor of the first mechanical calculator.
1645
Blaise Pascal carries out the first computations on the Pascaline, the world's first operational mechanical calculator. It was able to add and subtract (though subtraction was rather awkward to perform), and in 1649 Pascal was granted a royal monopoly on the sale of his machines in France. Three years later he had sold a dozen of them. In recognition of his role, an important programming language of the 1980s was named Pascal.

The Pascaline
1672
Gottfried Wilhelm Leibniz constructs the Stepped Reckoner, the first calculator able to carry out all four basic arithmetic operations (addition, subtraction, multiplication and division). Only two examples of this machine were ever built. Leibniz also deserves credit as the inventor of the system of binary (base-two) arithmetic that is used in all modern computers, as illustrated below.

The Stepped Reckoner
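
Leibniz's binary system is easy to demonstrate with modern tools. The following sketch (in Python, purely for illustration; the function name is our own) converts an integer to base two by repeated division by 2, essentially the procedure Leibniz described, and checks that binary addition carries digit by digit just as in base ten.

```python
# A minimal sketch of base-two arithmetic (modern notation, not
# Leibniz's own): every integer is a sum of powers of 2.

def to_binary(n: int) -> str:
    """Return the base-two digits of a non-negative integer."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % 2))  # remainder is the next binary digit
        n //= 2
    return "".join(reversed(digits))

print(to_binary(6))    # '110'   (4 + 2)
print(to_binary(13))   # '1101'  (8 + 4 + 1)
print(int(to_binary(6), 2) + int(to_binary(13), 2))  # 19, so 6 + 13 checks out
```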
1820
Thomas de Colmar builds his first Arithmomètre, the first reliable calculator to be mass-produced. More than 5,000 were constructed before production ceased at the start of the First World War (1914-1918).

The Arithmomètre
1822
Charles Babbage begins work on mechanical calculating engines, developing over the following decades the theoretical concept of a programmable computer. This computer was intended to replace calculations done by humans, which were subject to frequent errors. Babbage's Analytical Engine had an architecture very similar to that of modern computers, with distinct memory areas for the data and the program, conditional jumps, and a separate unit for handling input and output (a toy sketch of this arrangement follows below).
This concept was described in detail by Ada Byron, Countess of Lovelace (daughter of the famous Romantic poet Lord Byron). She is also credited with having written the first computer program (to calculate a sequence of Bernoulli numbers), though historians disagree over the relative contributions of Ada Byron and Charles Babbage to this program. In 1980, the United States Department of Defense named the language it had developed "Ada" in her honor.

Ada Byron, Countess of Lovelace
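
No executable form of the Analytical Engine survives, but the arrangement described above, separate program and data memory plus a conditional jump, can be sketched in a few lines of modern code. The toy interpreter below (Python, with an instruction set invented here for illustration) counts down from 3; JNZ plays the role of the conditional jump.

```python
# A toy stored-program machine: program and data live in separate
# memory areas, and JNZ is a conditional jump back to line 0.

data = {"x": 3}                      # data memory ("the store")
program = [                          # program memory, one instruction per line
    ("PRINT", "x"),                  # output the current value of x
    ("DEC",   "x"),                  # x <- x - 1
    ("JNZ",   "x", 0),               # if x != 0, jump to line 0
    ("HALT",),
]

pc = 0                               # program counter
while True:
    op = program[pc]
    if op[0] == "PRINT":
        print(data[op[1]])
    elif op[0] == "DEC":
        data[op[1]] -= 1
    elif op[0] == "JNZ" and data[op[1]] != 0:
        pc = op[2]
        continue                     # jump taken: skip the pc increment
    elif op[0] == "HALT":
        break
    pc += 1                          # prints 3, 2, 1 and halts
```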
1854
The English mathematician George Boole publishes The Laws of Thought, which lays the mathematical foundations for binary arithmetic and symbolic logic (using the logical operators AND, OR and NOT), ideas that were of fundamental importance in the construction of 20th-century computers (their truth tables are shown below).

George Boole
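
Boole's three operators survive essentially unchanged in every modern programming language. The short snippet below (Python, for illustration) prints their truth tables.

```python
# Truth tables for Boole's operators: NOT, AND, OR.
# Python's `not`, `and`, `or` act directly on the truth values True/False.

for p in (False, True):
    print(f"NOT {p!s:<5} = {not p}")

for p in (False, True):
    for q in (False, True):
        print(f"{p!s:<5} AND {q!s:<5} = {str(p and q):<5}   "
              f"{p!s:<5} OR {q!s:<5} = {p or q}")
```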
1890
Herman Hollerith constructs a machine which can read punched cards and then carry out statistical calculations based on the data read from them (the counting idea is sketched below). The first system of this kind was ordered by the United States Census Bureau. A few years later Hollerith founded the Tabulating Machine Company, which in 1911 merged into the Computing-Tabulating-Recording Company, renamed IBM (International Business Machines) in 1924.

A Hollerith keyboard (pantograph) punch
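
At its core, Hollerith's tabulator counted records by the value punched in a given field. A minimal sketch of that idea follows (the card layout here is invented for illustration, not Hollerith's actual 1890 encoding).

```python
# Counting "cards" by the value of one field, the core operation of
# Hollerith's tabulator. The fields below are invented for illustration.

from collections import Counter

cards = [
    {"state": "NY", "age": 34},
    {"state": "PA", "age": 51},
    {"state": "NY", "age": 8},
]

tally = Counter(card["state"] for card in cards)
print(tally)  # Counter({'NY': 2, 'PA': 1})
```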
1936
Alan Turing develops the concept of the Turing machine, a purely theoretical device able to manipulate abstract symbols and capture the logical processes of any possible computer. The machine is in fact a thought experiment (Gedankenexperiment), which allowed Turing and later computer scientists to better understand the nature of logic and the limits of numerical computation, leading to the precise formulation of such concepts as algorithms and computability (a small simulator is sketched below).

Alan Turing
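
A Turing machine is simple enough to simulate in a few lines. The sketch below (Python, with a rule format invented here) runs a machine that inverts every bit on its tape; richer rule tables yield machines of arbitrary power, which is precisely Turing's point.

```python
# A minimal Turing machine simulator (our own sketch, not Turing's
# 1936 notation). Rules map (state, read symbol) to
# (symbol to write, head move, next state).

def run(tape, rules, state="start", blank="_"):
    cells = dict(enumerate(tape))    # tape as position -> symbol
    pos = 0
    while state != "halt":
        symbol = cells.get(pos, blank)
        write, move, state = rules[(state, symbol)]
        cells[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# A machine that flips every bit, then halts on reading a blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run("1011", flip))   # -> '0100'
```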
1939
Bell Laboratories in New Jersey completes the CNC (Complex Number Calculator), a digital computer based on electromagnetic relays. In 1940 its inventor, George Stibitz, operated the computer, installed in New York, from Dartmouth College in New Hampshire over a telephone line, thereby demonstrating the first remote use of a computer.

The CNC (Complex Number Calculator)