
Can Machines Think?




Presentation Transcript


  1. Can Machines Think? Peter Bock, Professor of Machine Intelligence and Cognition; Director of Project ALISA; Department of Computer Science, The George Washington University

  2. Background Issues
  Assumption: ... the question of whether Machines Can Think ... is about as relevant as the question of whether Submarines Can Swim. [Dijkstra 1984]
  Axiom: The whole is greater than the sum of its parts. [??????????]
  Definition: A part of an entity consists exclusively of matter and/or energy. [Bock 2005]
  Axiom: The whole is exactly equal to the sum of its parts; if it seems otherwise, at least one of its parts has been overlooked. [Bock 2005]
  Definition: A set may be arbitrarily large and complex. [Cantor 1874]

  3. Fundamental Propositions (Background Issues, continued)
  Definition: Intelligence is the ability of an entity to synthesize responses that are significantly correlated with its stimuli. [Bock 1993]
  Postulate: Intelligence capacity is a measure of the amount of information that can be stored in the memory of an entity. [Bock 1993]
  Definition: The standard unit of information is the bit, which is the base-2 logarithm of the number of unique states an entity can be in. [Shannon & Weaver 1949]
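The bit definition above is directly computable. A minimal sketch (the function name is illustrative, not from the presentation):

```python
import math

def information_bits(num_states: int) -> float:
    """Information content per Shannon: the base-2 logarithm of the
    number of unique states an entity can be in."""
    return math.log2(num_states)

# A toggle switch has 2 states -> 1 bit.
print(information_bits(2))    # -> 1.0
# A byte can be in 256 states -> 8 bits.
print(information_bits(256))  # -> 8.0
```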

  4. Examples of Intelligence Capacity
  Entity — Intelligence Capacity (bits)
  toggle switch: 10^0 = 1
  worm: 10^4 = 10,000
  sea slug: 10^7 = 10,000,000
  tiny lizard: 10^8 = 100,000,000 = 10 MB
  desktop computer: 10^10 = 10,000,000,000 = 1 GB
  DNA molecule: 10^10 = 10,000,000,000 = 1 GB
  frog: 10^11 = 100,000,000,000 = 10 GB
  mainframe computer: 10^12 = 1,000,000,000,000 = 100 GB
  dog: 10^14 = 100,000,000,000,000 = 10,000 GB = 10 TB
  human being: 10^15 = 1,000,000,000,000,000 = 100 TB
  human species: 10^25 = 10,000,000,000,000,000,000,000,000 = 1 YB
  universe: 10^84 (number of baryons)
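The byte figures in the table follow from dividing the bit count by 8 and rounding to the nearest order of magnitude. A small sketch of that conversion (function name and decimal-prefix steps are my assumptions, chosen to match the table's 10^10 bits ≈ 1 GB scale):

```python
def bits_to_readable(bits: float) -> str:
    """Convert a bit count to an order-of-magnitude byte figure,
    using 1 byte = 8 bits and decimal (powers-of-1000) prefixes."""
    n = bits / 8
    for unit in ["B", "KB", "MB", "GB", "TB", "PB", "EB", "ZB", "YB"]:
        if n < 1000:
            return f"{n:.1f} {unit}"
        n /= 1000
    return f"{n:.1f} YB"  # clamp beyond yotta

print(bits_to_readable(1e10))  # desktop computer: ~1 GB
print(bits_to_readable(1e15))  # human being: ~100 TB
```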

  5. Growth of Computer Memory Capacity — RAM capacity (bytes)
  generation | period | technology | mainframe | PC | % human
  1 | 1952-1958 | vacuum tube | 0.1 KB | — | —
  2 | 1958-1964 | transistor | 1 KB | — | —
  3 | 1964-1970 | SSI | 10 KB | — | —
  4 | 1970-1976 | MSI | 100 KB | — | —
  5 | 1976-1982 | LSI | 1 MB | 100 KB | 0.000001
  6 | 1982-1988 | VLSI | 10 MB | 1 MB | 0.00001
  7 | 1988-1994 | CISC | 100 MB | 10 MB | 0.0001
  8 | 1994-2000 | RISC | 1 GB | 100 MB | 0.001
  9 | 2000-2006 | MP RISC | 10 GB | 1 GB | 0.01 (NOW — frog level)
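The table's pattern is a tenfold increase in mainframe RAM every six-year generation, starting from 0.1 KB in generation 1. A sketch of that trend (function name is illustrative; the extrapolation beyond generation 9 assumes the trend simply continues):

```python
def mainframe_ram_bytes(generation: int) -> int:
    """Mainframe RAM per the table: 0.1 KB (100 bytes) in
    generation 1, growing tenfold each 6-year generation."""
    return 100 * 10 ** (generation - 1)

print(mainframe_ram_bytes(9))   # generation 9 (2000-2006): 1e10 bytes = 10 GB
print(mainframe_ram_bytes(13))  # extrapolated: 1e14 bytes = 100 TB, human-brain scale
```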

  6. Growth of Computer Memory Capacity [chart: memory capacity (1 Kilobyte to 1 Petabyte, log scale) vs. time period (generations 1-9, 1952-2006), with curves for Mainframe RAM and PC RAM and a horizontal line marking the human brain; NOW marks generation 9]

  7. Growth of Computer Memory Capacity [same chart, with data points added for my PC RAM capacities and my PC disk capacities]

  8. Growth of Computer Memory Capacity [chart extended through generation 14 (2006-2036), with a technology change marked; the extrapolated capacity curves approach the human-brain line]

  9. Knowledge Acquisition
  Definition: Knowledge is the instantiation of intelligence.
  Definition: Cognition (Thinking) is the mental process of acquiring, representing, processing, and applying knowledge.

  10. Knowledge Acquisition — Programming
  10% capacity of the brain ≈ 10^14 bits
  1 line of code (rule) ≈ 1000 bits → ≈ 100 billion rules
  software production rate ≈ 10 lines of code per person-hour
  software production time ≈ 10^10 person-hours ≈ 10,000,000 person-years
  !!! IMPOSSIBLE !!!
  Fact: This approach for achieving robust AI was abandoned in the mid-1980s. NONETHELESS...
  Fact: CYC: rule-based system funded by DARPA and directed by Douglas Lenat
  • under construction for more than 20 years at MCC in Texas
  • objective is to include 1 billion "common sense" rules
  • no significant successes and many, many failures
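The slide's impossibility argument is a chain of order-of-magnitude divisions; sketched below (the 2,000 work-hours-per-year figure is my assumption, which lands within a factor of two of the slide's 10,000,000 person-years):

```python
# How long would it take to hand-code 10% of the brain's capacity as rules?
brain_bits = 1e14        # 10% of brain capacity, in bits
bits_per_rule = 1000     # one line of code (rule)

rules = brain_bits / bits_per_rule   # 1e11 = 100 billion rules
person_hours = rules / 10            # at 10 lines of code per person-hour
person_years = person_hours / 2000   # assuming ~2000 work hours per year

print(f"{rules:.0e} rules, {person_hours:.0e} person-hours, "
      f"{person_years:.0e} person-years")  # millions of person-years
```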

  11. Knowledge Acquisition — Direct Transfer
  10% capacity of the brain ≈ 10^14 bits
  data transfer rate ≈ 10^8 bits per second
  data transfer time ≈ 10^6 seconds ≈ 12 days
  GREAT !!! HOW ???
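The direct-transfer estimate checks out arithmetically; a one-liner sketch:

```python
brain_bits = 1e14        # 10% of brain capacity, in bits
transfer_rate = 1e8      # bits per second

seconds = brain_bits / transfer_rate  # 1e6 seconds
days = seconds / 86400                # ~11.6 days, i.e. ~12 on the slide

print(f"{seconds:.0e} s = {days:.1f} days")
```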

  12. Knowledge Acquisition — Learning
  Definition: Learning is the dynamic acquisition and application of knowledge based on unsupervised and supervised training.
  10% capacity of the brain ≈ 10^14 bits
  average rate of sensory input ≈ 500,000 bits per second
  knowledge acquisition time ≈ 200,000,000 seconds ≈ 3500 days (16 hours per day) ≈ 10 years
  THAT'S BETTER !!!
  Collective Learning Systems (CLS) [Bock 1976]
  Definition: Project ALISA is an adaptive non-parametric parallel-processing statistical knowledge acquisition and classification system based on CLS theory. [Bock, et al. 1992]
  Practical applications are illustrated on my website.
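The learning estimate follows the same pattern, with sensory input as the channel and only waking hours counted; a sketch:

```python
brain_bits = 1e14     # 10% of brain capacity, in bits
sensory_rate = 5e5    # average sensory input, bits per second

seconds = brain_bits / sensory_rate  # 2e8 seconds
days = seconds / (16 * 3600)         # learning 16 hours per day -> ~3500 days
years = days / 365                   # ~10 years

print(f"{days:.0f} days = {years:.1f} years")
```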

  13. Training Style / Derived Art: Edvard Munch (10 training images); mimicry = 25%, brush size = thick, influence = high. Source image: photograph. Courtesy of Ben Rubinger.

  14. Training Style / Derived Art: Monet (39 training images); mimicry = 28%, brush size = large, influence = high. Source image: photograph. Courtesy of Ben Rubinger.

  15. Training Style / Derived Art: Sam Brown (171 training images); mimicry = 28%, brush size = medium, influence = medium. Source image: photograph. Courtesy of Ben Rubinger.

  16. Training Style / Derived Art: brick walls (6 training images); mimicry = 24%, brush size = medium, influence = high. Source image: photograph. Courtesy of Ben Rubinger.

  17. le début (French: "the beginning")
