
History of Computer Science


Presentation Transcript


  1. History of Computer Science A lesson by Matt Smith

  2. Before 1900 – The Abacus • Computational devices have been in use for a very long time • The abacus is estimated to have been in use since around 3000 B.C. • An abacus is a device with beads that slide up and down along rods to aid counting and calculating.

  3. Before 1900 – John Napier • Another computing device that was used for a very long time is Napier’s bones • Napier’s bones were invented circa 1617 by John Napier • John Napier was a Scottish mathematician who developed the use of logarithms

  4. Images: an ancient Chinese abacus and Napier’s bones

  5. Before 1900 – Joseph-Marie Jacquard • In the early 1800s, Joseph-Marie Jacquard developed a loom that could weave complicated patterns • The loom created patterns depending on the number and arrangement of holes in punched cards fed into the machine • This punched-card method proved significant: many later computers used punched cards for input

  6. Joseph-Marie Jacquard’s loom

  7. 1900-1939 • Devices used for calculation kept improving • Calculating devices for specific purposes, such as factoring integers, also began to appear • Electrically operated devices first appeared around this time • Big names in computer science history, such as Alan Turing and Kurt Gödel, became prominent in the field

  8. Alan Turing • In 1936, Alan Turing and Alonzo Church independently formalized the notion of an algorithm • Turing contributed a significant amount to computer science, including the Church–Turing thesis and a description of the well-known “Turing machine” • The Turing machine, in theory, can simulate any algorithm that can be performed on a modern computer, given enough resources • The Turing machine works by reading, erasing, and rewriting symbols on an infinitely long tape (a minimal simulator sketch follows this slide) • Many years later, Turing was arrested for “gross indecency” for being a homosexual • He is believed to have committed suicide by eating an apple laced with cyanide
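
The tape-and-head model described above maps directly onto a short program. The following is a minimal sketch of a Turing machine simulator in Python; the simulator design and the example transition table (a unary increment machine) are illustrative assumptions, not part of the original lesson:

    from collections import defaultdict

    def run_turing_machine(transitions, tape_input, start, halt, max_steps=1000):
        # The tape is modeled as a dict from position to symbol; "_" is blank.
        tape = defaultdict(lambda: "_")
        for i, symbol in enumerate(tape_input):
            tape[i] = symbol
        head, state = 0, start
        for _ in range(max_steps):
            if state == halt:
                break
            # Each transition maps (state, symbol) -> (write, move, next_state).
            write, move, state = transitions[(state, tape[head])]
            tape[head] = write
            head += 1 if move == "R" else -1
        # Collect the non-blank cells in order to show the final tape contents.
        cells = sorted(pos for pos, sym in tape.items() if sym != "_")
        return "".join(tape[pos] for pos in cells)

    # Example machine (hypothetical): unary increment. It scans right over
    # the 1s and writes one more 1 at the first blank cell, then halts.
    increment = {
        ("scan", "1"): ("1", "R", "scan"),
        ("scan", "_"): ("1", "R", "halt"),
    }
    print(run_turing_machine(increment, "111", start="scan", halt="halt"))  # prints 1111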

  9. Turing Machine

  10. 1940’s • Wartime needs sparked even more development in the field of computing, given its usefulness for warfare • The first electronic digital computer was created • Computers were used to calculate targets for ballistic weapons, making the weapons more accurate and faster to use • Computers were also used for decryption and code-breaking during the war • A stored-program electronic computer called the EDVAC was designed • One of the people who worked on the EDVAC was John von Neumann, a notable mathematician

  11. John von Neumann • Von Neumann was an extremely talented mathematician • He developed many theories and published a great deal of work in mathematics and programming, which accelerated the growth of computing • Von Neumann also helped create the atomic bomb, which was later dropped on Hiroshima and Nagasaki in the Second World War.

  12. 1940’s • In Germany, a man named Konrad Zuse built the first working general-purpose, program-controlled calculator in 1941 • This calculator was called the “Z3” • The invention of the transistor in 1947 paved the way for the microprocessor and the microprocessor revolution in computer science • The transistor earned its inventors, John Bardeen, Walter Brattain, and William Shockley, the Nobel Prize in Physics

  13. Computer transistors

  14. 1950’s • The first computer “bug” was discovered • The bug was a moth that had gotten into a computer at Harvard and caused issues with the machine • The term was later adopted and is used today to describe a problem in a computer, usually in its code or programming • A significant programming language called LISP was invented for artificial intelligence programming by John McCarthy in 1958

  15. 1960’s • Work began on ARPAnet, a precursor to today’s Internet • There were many advances in the use and design of operating systems • The first real microprocessor chip was invented near the end of the decade • Many new programming languages were developed, such as BASIC

  16. A modern microprocessor

  17. 1970’s • The Unix operating system was developed at Bell Laboratories • A notable programming language called C was also created in the 70’s, as were Pascal and Ada • The first supercomputers were created in this period; one of the first was the CRAY-1 • The CRAY-1 could perform 160 million operations per second • There were further advances in the use of algorithms • Computers grew more and more complex over the years

  18. 1980’s • This decade is when the first personal computers started to become popular • Steve Wozniak and Steve Jobs, founders of Apple, contributed significantly to this • The first computer viruses were developed in 1981 • The first commercially successful portable computer, the Osborne 1, was marketed • The US National Science Foundation started NSFnet, another precursor to today’s Internet

  19. 1990’s and on • Interest in quantum computing started to grow • Computers keep growing smaller and smaller as technology advances • Parallel computing, in which multiple computations run at once on a computer, continues to grow more powerful (see the sketch after this slide)
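
As a rough illustration of the parallel computing idea above, here is a minimal sketch using Python’s standard multiprocessing module; the worker function and inputs are illustrative assumptions, not part of the original lesson:

    from multiprocessing import Pool

    def square(n):
        # Illustrative work item; any independent CPU-bound task fits here.
        return n * n

    if __name__ == "__main__":
        # A pool of four worker processes runs square() on the inputs in parallel.
        with Pool(processes=4) as pool:
            results = pool.map(square, range(10))
        print(results)  # [0, 1, 4, 9, 16, 25, 36, 49, 64, 81]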

  20. Bibliography • https://cs.uwaterloo.ca/~shallit/Courses/134/history.html • http://www.computerhistory.org/ • http://www.eingang.org/Lecture/ • http://www.mrtc.mdh.se/publications/0337.pdf • http://www.cs.brown.edu/~jes/papers/09_ch5.pdf • http://www.firstpost.com/topic/person/john-von-neumann-history-of-computer-video-anLX3hir0cg-14840-17.html • http://systemcomputing.org/turing%20award/Maurice_1967/TheFirstDraft.pdf • http://mathworld.wolfram.com/TuringMachine.html
