
Present Status and Future Trend of Autonomous Mobile Robot



  1. Present Status and Future Trend of Autonomous Mobile Robot Presented by Young Hoon Joo http://mouse.kunsan.ac.kr/~RAIC

  2. Overview and Definition of Robots ROBOT (manipulator): a multi-functional handling device designed to move and assemble materials, parts, etc. through various programmed motions in order to perform a variety of tasks. In short, a robot that imitates the working function of the human arm and performs its tasks at a fixed location. MOBILE ROBOT: an automated device that can move in real time and perform useful tasks without human intervention. That is, a conventional robot given a human-like locomotion capability in order to endow it with a wider range of functions. Legged Mobile Robot: uses legs composed of several joints as its locomotion mechanism. Wheeled Mobile Robot: can move over the ground on its own by driving wheels in contact with the ground surface (the AGV has been put to practical use in industry).

  3. Description of Position and Orientation (Figure: coordinate frames {X, Y, Z} attached to objects.) -. Robotics deals with objects in three-dimensional space: links, parts, tools, etc. -. We describe an object by its position and orientation, and study how to treat these mathematically. -. To express position and orientation, a coordinate system (frame) is attached to the object. -. We study the transformation from one frame to another.

  4. Z Y 2 TOOL 3 2 X TOOL X 1 1 Z Y 3 Z Z Y Y X BASE BASE -. Kinematics : 운동을 야기시키는 힘을 고려하지 않고 운동을 취급 -. Dynamics : 운동을 일으키는데 필요한 힘을 공부 -. Inverse Kinematics : 말단효과 장치의 position과 orientation이 주어졌을 때 그 주어진 position과 orientation을 얻기 위해 사용될 수 있는 모든 가능한 관절각을 구하는 것 -. Forward Kinematics : 한 조의 관절각이 주어졌을 때 base frame에 대한 tool frame의 position과 orientation을 계산하는 것

  5. Generations of Robots -. 1st Generation (1960-1970): Types - playback robot; Features - link-and-cam teaching/playback; Mobility - conventional; Applications - paint spraying, spot welding, pick and place. -. 2nd Generation (1980s): Types - feedback sensors (visual, tactile, ultrasound); Features - learning, decision making; Mobility - mobile systems; Applications - arc welding, assembling, automated assembling, inspection. -. 3rd Generation (1990s): Types - perception, problem-solving, learning interface; Features - advanced; Mobility - omni-directional mobile, walking; Applications - medical/nursing, remote operation, independent operation, domestic.

  6. Classification of Robots ROBOT -. Robot Manipulator: 1-axis, 2-axis, 3-axis, 4-axis, 5-axis, 6-axis, redundant manipulator -. Mobile Robot -. Legged Mobile Robot: hopping type, human type, horse type, insect type -. Wheeled Mobile Robot: steering & driving type, driving & driving type, hybrid type, Stanford wheel type, Mecanum wheel type, ball wheel type In recent years the area of robotics application has been expanded due to the addition of mobility.

  7.  Robot Manipulator (examples): 2-axis, 4-axis, 5-axis, 6-axis

  8.  Legged Mobile Robot (examples): Hopping Type (1-legged), Human Type (2-legged), Horse Type (4-legged), Insect Type (6-legged)

  9. Wheeled Mobile Robot -. Driving & Driving Type -. Steering & Driving Type -. Hybrid Type I -. Hybrid Type II

  10.  Application Example of Mobile Robot

  11. RESEARCH CHALLENGES OF MOBILE ROBOT Hardware Aspect: vehicle body, controller, actuator, sensors. Software Aspect: operating system, map & path-planning, autonomous navigation, position estimation.

  12. Block Diagram of Mobile Robot System MOBILE ROBOT SYSTEM: mechanical part, sensing part, control part, manipulation part, application part. Application Area of Mobile Robot System -. Automated Guided Vehicle (AGV) -. Document delivery robot in office buildings -. Nursing robot in hospitals -. Guide robot for handicapped people -. Serving robot -. Wafer-handling robot in semiconductor factories

  13. DESIGN EXAMPLE (Figures: construction of the wheels, and the arrangement of the sonar sensors S[0]-S[17] around the vehicle; the robot pose (xk, yk, thetak) at the reference point is expressed in the local coordinate system (XL, YL) and in the world coordinate system (X, Y), with thetak the heading angle.)

  14. Hardware Block Diagram MAIN CONTROLLER (IBM PC), connected over a standard bus to: -. RF modem module (antenna) -. Voice synthesizer module (speaker) -. Vehicle control module (servo controllers driving the DC motors) -. Manipulator control module (servo controller) -. Sensor acquisition module (sonar sensors, infrared sensors) -. Image processing module (CCD camera)

  15. Assembly of Mobile Robot

  16. Specification of Mobile Robot Appearance of Mobile Robot The Specifications of the Vehicle -. Weight: 150 kg -. Size: 70 x 70 x 140 cm -. Velocity: 0-1000 mm/sec -. Acceleration: 0-1000 mm/sec^2 -. Power: 24 VDC battery -. Motor: DC servo motor The Specifications of the Sonar Sensor [Polaroid Co.] -. Distance: 0.15-10.7 m -. Beam width: 15 deg -. Resolution: 1 cm/m -. Period: 56 pulses/msec The Specifications of the CCD Camera [SONY XC-77] -. Pixels: 756 x 493 -. Resolution: 570 x 485 lines -. Cell size: 11 x 13 um -. Size: 44 x 29 x 107 mm

  17. MAP & PATH-PLANNING Mapping Method -. Graph method: precise but slow -. Grid method: less precise but fast -. Voronoi method: precise but slow -. Potential method: precise but produces oscillatory motion Path-Planning Method -. 4- or 8-connection method: correct but slow -. Movable window method Using a sensor-data-fusion technique on the data obtained by the sonar sensors, generate the grid map; then, given the generated grid map, plan the navigation path by the movable window method.
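
The movable-window planner itself is not reproduced here; as a minimal stand-in, the sketch below shows the 8-connection idea named above: a breadth-first search over an occupancy grid (the grid values and the `plan_path` name are illustrative assumptions, not the paper's method):

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest path on an occupancy grid using 8-connection breadth-first
    search: cells marked 1 are occupied, cells marked 0 are free."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []               # walk the predecessor chain back to start
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):   # all 8 neighbours (and self, already visited)
                nxt = (r + dr, c + dc)
                if (0 <= nxt[0] < rows and 0 <= nxt[1] < cols
                        and grid[nxt[0]][nxt[1]] == 0 and nxt not in prev):
                    prev[nxt] = cell
                    queue.append(nxt)
    return None                     # goal unreachable

grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = plan_path(grid, (0, 0), (0, 2))
```

On this 3x3 map the planner must detour around the occupied middle column through the free bottom row.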

  18. Block Diagram for the Navigation System of the Mobile Robot (Graphic simulator on an IBM PC/AT.) -. ENVIRONMENT DETECTOR / Environment Identifier: converts the sensor inputs into the distance difference x1 and the orientation difference x2 -. NAVIGATOR: fuzzy inference rules with state evaluation and an inference engine, producing the steering angle y and velocity v (outputs) according to the control objective -. OBSTACLE AVOIDER: a learned neural network -. PATH PLANNER: programmed mapping (manual) and environment mapping (auto), producing a navigation data file -. POSITION MONITOR: position corrector using landmarks and vision -. MOBILE ROBOT: receives velocity and orientation commands and reports its position & orientation

  19. Mapping Technique -. Modeling of the Sonar Sensor: occupancy probabilities P(r) over range and P(theta) over bearing within the beam -. Footprint of the beam and the rearranged cells: occupied region vs. non-occupied region
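
A hedged sketch of the kind of sonar beam model P(r), P(theta) referred to above, loosely following the classic Moravec/Elfes quadratic profile (the exact functional forms, the beam half-angle, and the tolerance value are assumptions, not the paper's):

```python
import math

def sonar_occupancy(r, theta, measured_range,
                    beam_half_angle=math.radians(15), tolerance=0.1):
    """Occupancy evidence for a cell at range r [m] and bearing theta [rad]
    inside one sonar beam. Returns a value in [-1, 1]: negative means
    probably empty, positive means probably occupied, 0 means no information."""
    if abs(theta) > beam_half_angle or r > measured_range + tolerance:
        return 0.0                                    # outside the beam cone
    # Confidence falls off toward the edge of the cone: the P(theta) term.
    angular = 1.0 - (theta / beam_half_angle) ** 2
    if r < measured_range - tolerance:
        # Region swept before the echo: probably empty (the P(r) term).
        radial = 1.0 - (r / measured_range) ** 2
        return -angular * radial
    # Region around the measured range: probably occupied.
    radial = 1.0 - ((r - measured_range) / tolerance) ** 2
    return angular * radial
```

Cells well short of the echo get negative (empty) evidence, cells near the measured range get positive (occupied) evidence, and cells outside the cone are untouched; accumulating this over many readings yields the grid map.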

  20. Path-Planning Technique -. Environmental coordinate (Oe, Xe, Ye): robot at (xr, yr), target at (x'e, y'e), with clearances al and ar around an obstacle at distance d0 -. The movable window defined on the grid

  21. Block Diagram for Map & Path-Planning Modeling for the ultrasonic sensor -> thresholding -> mapper (grid map) -> modified grid map -> path planner -> path following, using the position & orientation of the mobile robot.

  22. Experimental Results of the Mapping Method -. Experiment example -. Simulation example

  23. Experimental Results of the Map & Path-Planning Method -. Experiment example -. Simulation example

  24. AUTONOMOUS NAVIGATION Guided Methods -. Dead-reckoning navigation method: blind running -. Sonar-guided method -. Wire-guided method: AGV (Automated Guided Vehicle) -. Landmark-guided method -. Laser-guided method -. Vision-guided method Feature of the Mobile Robot It is difficult to model the navigation of a mobile robot mathematically, since various environmental conditions such as wheel slippage, wheel wear, and oscillation caused by an irregular floor must be considered. Hence, an AI technique is used to model the control actions of an expert: fuzzy control + neural network.
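
The dead-reckoning ("blind running") method listed first can be sketched for a differential-drive vehicle as follows (the wheel-base value and function name are illustrative assumptions; in practice the estimate drifts under exactly the slippage and wheel-wear effects mentioned above, which is why absolute corrections are needed):

```python
import math

def dead_reckoning(x, y, heading, d_left, d_right, wheel_base=0.5):
    """Update the estimated pose (x, y, heading) of a differential-drive
    robot from the distances [m] travelled by the left and right wheels,
    as measured by the wheel encoders."""
    d_center = (d_left + d_right) / 2.0
    d_theta = (d_right - d_left) / wheel_base
    # First-order update: move along the mean heading of the step.
    x += d_center * math.cos(heading + d_theta / 2.0)
    y += d_center * math.sin(heading + d_theta / 2.0)
    heading += d_theta
    return x, y, heading
```

For example, equal wheel travel gives a straight-line update, while opposite wheel travel turns the robot in place.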

  25. The Geometry for Navigation (Figure: robot pose (xk, yk, thetak) relative to the reference line in coordinates (Xn, Yn).) -. Controller inputs: distance difference x1 from the reference line and heading angle x2 -. Controller output: steering angle y

  26. Control Strategy for Autonomous Navigation Sensor inputs: distance difference x1, wall information x2 (orientation difference), and obstacle information from the sensor data. Outputs: steering angle y [deg/sec] from fuzzy reasoning for navigation, and turning angle [deg] from the neural network for obstacle avoidance, combined according to the control objective.

  27. Control Block Diagram for Autonomous Navigation The fuzzy control rules (state evaluation + inference engine), driven by the difference x1 and orientation x2 together with the control objective, produce the steering angle y; the neural network produces the turning angle for the mobile robot, whose position & orientation are fed back. Key point: using the input-output data obtained from the control actions of an expert, -. identify the fuzzy control rules using FCM, GA, and nonlinear programming techniques; -. train the neural network for obstacle avoidance.

  28. Block Diagram for Fuzzy Modeling Pattern data -> (1) Decision of the number of fuzzy rules: hard clustering and FCM clustering decide the optimal number of inference rules. (2) Identification of the membership functions: a genetic algorithm + complex method followed by a gradient method identify the parameters of the fuzzy membership functions. -> Generation of the fuzzy inference rules.
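
The FCM clustering step used above to decide the number of rules can be sketched on 1-D data as a textbook fuzzy C-means iteration (the data set, seed, and parameters below are illustrative, not from the paper):

```python
import random

def fcm(points, c, m=2.0, iters=100):
    """Fuzzy C-Means on 1-D data: returns the cluster centers and the
    membership matrix u[i][j] in [0, 1] (each row sums to 1)."""
    random.seed(1)
    n = len(points)
    # Random initial memberships, normalized per point.
    u = []
    for _ in range(n):
        row = [random.random() for _ in range(c)]
        s = sum(row)
        u.append([v / s for v in row])
    for _ in range(iters):
        # Centers: membership-weighted means (weights raised to power m).
        centers = [sum(u[i][j] ** m * points[i] for i in range(n)) /
                   sum(u[i][j] ** m for i in range(n)) for j in range(c)]
        # Memberships: the standard inverse-distance FCM update.
        for i in range(n):
            for j in range(c):
                d_j = abs(points[i] - centers[j]) + 1e-12
                u[i][j] = 1.0 / sum(
                    (d_j / (abs(points[i] - centers[k]) + 1e-12)) ** (2.0 / (m - 1.0))
                    for k in range(c))
    return centers, u

data = [0.9, 1.0, 1.1, 4.9, 5.0, 5.1]
centers, u = fcm(data, c=2)
```

On this well-separated data the two centers settle near 1.0 and 5.0, and each point's membership row shows how strongly it belongs to each cluster.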

  29. Identification Technique of Fuzzy Control Rules Simplified fuzzy reasoning method: Rule 1: IF x1 is A11 (Small) and x2 is A12 (Big) THEN y1 = w1 ... Rule n: IF x1 is An1 (Big) and x2 is An2 (Medium) THEN yn = wn, with each membership function Aij parameterized by (aij, bij). Given input pattern data (x10, x20) and the desired output yr, minimize the error between the reasoned output y and yr: the genetic algorithm + complex method provides the initial identification, and the gradient method refines it to the optimized result.
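
A minimal sketch of the simplified fuzzy reasoning itself: each rule fires with the product of its antecedent membership grades, and the crisp output is the firing-strength-weighted average of the rule consequents wi. The Gaussian membership functions and the three example rules below are hypothetical stand-ins, not the rules identified in the paper:

```python
import math

def gaussian(x, center, width):
    """Membership grade of x in a Gaussian fuzzy set."""
    return math.exp(-((x - center) / width) ** 2)

def simplified_fuzzy(x1, x2, rules):
    """Simplified fuzzy reasoning: each rule is ((c1, w1), (c2, w2), y),
    i.e. two Gaussian antecedents and a crisp consequent y.
    The output is the firing-strength-weighted average of the y values."""
    num = den = 0.0
    for (c1, w1), (c2, w2), y in rules:
        strength = gaussian(x1, c1, w1) * gaussian(x2, c2, w2)
        num += strength * y
        den += strength
    return num / den

# Hypothetical rules mapping distance difference x1 and heading error x2
# to a steering angle [deg/sec]; the paper identifies such rules from
# expert input-output data instead of writing them by hand.
rules = [
    ((-1.0, 0.5), (-30.0, 15.0), 20.0),   # far left, heading left   -> steer right
    (( 0.0, 0.5), (  0.0, 15.0),  0.0),   # on path, heading ok      -> go straight
    (( 1.0, 0.5), ( 30.0, 15.0), -20.0),  # far right, heading right -> steer left
]
steer = simplified_fuzzy(0.0, 0.0, rules)
```

On the reference path the rules cancel to zero steering; off the path the output blends smoothly toward the corrective consequents.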

  30. Learning Technique of the Neural Network A multi-layered neural network with inputs x1-x7 and output y* is trained by the back-propagation algorithm to minimize the error between the network output y* and the desired output yr, yielding the learned network.
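
A self-contained sketch of back-propagation training for a small multi-layer network. The slides' network has seven inputs for obstacle avoidance; here a 2-input toy target stands in, and the hidden-layer size, learning rate, and epoch count are assumptions:

```python
import math, random

def train_mlp(samples, hidden=4, lr=0.5, epochs=5000):
    """Train a 2-input, 1-output MLP with one sigmoid hidden layer by
    plain back-propagation (gradient descent on squared error)."""
    random.seed(0)
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    # Hidden weights w1[h] = [w_x1, w_x2, bias]; output weights w2 (+bias).
    w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(hidden)]
    w2 = [random.uniform(-1, 1) for _ in range(hidden + 1)]

    def forward(x):
        h = [sig(w[0] * x[0] + w[1] * x[1] + w[2]) for w in w1]
        y = sig(sum(w2[j] * h[j] for j in range(hidden)) + w2[hidden])
        return h, y

    for _ in range(epochs):
        for x, t in samples:
            h, y = forward(x)
            # Output delta, then propagate it back through the hidden layer.
            d_out = (y - t) * y * (1 - y)
            d_hid = [d_out * w2[j] * h[j] * (1 - h[j]) for j in range(hidden)]
            for j in range(hidden):
                w2[j] -= lr * d_out * h[j]
                w1[j][0] -= lr * d_hid[j] * x[0]
                w1[j][1] -= lr * d_hid[j] * x[1]
                w1[j][2] -= lr * d_hid[j]
            w2[hidden] -= lr * d_out

    return lambda x: forward(x)[1]

# A simple separable target (logical AND) as a stand-in for the
# obstacle-avoidance mapping learned in the slides.
samples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
net = train_mlp(samples)
```

After training, the network output approximates the target on all four patterns.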

  31. Experimental Results in Obstacle Avoidance -. Autonomous navigation experiment 1 -. Autonomous navigation experiment 2

  32. POSITION ESTIMATION Sensors for Position Estimation Absolute sensors (external sensors): measure the external (absolute) position. -. Vision sensor (CCD camera) -. Laser scanning sensor -. Optical sensor -. Sonar sensor Relative sensors (internal sensors): measure the relative position. -. Encoder -. Accelerometer -. Gyroscope The position is estimated by fusing the absolute-sensor and relative-sensor data with a Kalman filter.
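
The fusion idea can be sketched in one dimension: predict with the relative (encoder) displacement, then correct with an absolute (e.g. landmark) fix, weighted by the Kalman gain. The noise variances q and r below are illustrative assumptions, not identified values:

```python
def kalman_step(x, p, u, z, q=0.01, r=0.04):
    """One cycle of a 1-D Kalman filter for robot position [m]:
    predict with the relative (encoder) displacement u, then correct
    with the absolute (vision/landmark) measurement z.
    q and r are the encoder and measurement noise variances."""
    # Prediction: dead reckoning plus growing uncertainty.
    x_pred = x + u
    p_pred = p + q
    # Correction: blend in the absolute fix, weighted by the Kalman gain.
    k = p_pred / (p_pred + r)
    x_new = x_pred + k * (z - x_pred)
    p_new = (1 - k) * p_pred
    return x_new, p_new

x, p = 0.0, 1.0                         # vague initial position estimate
x, p = kalman_step(x, p, u=0.5, z=0.45) # encoder says +0.5 m, landmark says 0.45 m
```

Because the initial uncertainty is large relative to the measurement noise, the corrected estimate lands close to the absolute fix and the variance shrinks sharply.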

  33. Position Estimation Techniques -. Position estimation with the CCD camera (landmark observed at measurements Z1, Z2) -. Position estimation with the encoders (Figures: side and top views of the robot observing a landmark; the robot pose (xk, yk, thetak) is expressed in the reference coordinate system, with robot frame (XR, YR) and world frame (Xw, Yw) separated by distance d.)

  34. Absolute Coordinate in a Vision System The camera-landmark coordinate frames: world (W), robot (R), camera (C), landmark (L), and screen (S); the landmark vertices P0-P3 project through the center of the lens (focal length f) onto the screen points Q0-Q3. The homogeneous transforms compose as WTL = WTR RTC CTL, and solving for the robot pose gives WTR = WTL CTL^-1 RTC^-1.
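
The composition WTL = WTR RTC CTL and its inversion to recover the robot pose can be checked numerically with 4x4 homogeneous transforms (the particular rotations and translations below are made-up test values, not calibration data):

```python
import math

def mat_mul(a, b):
    """Product of two 4x4 homogeneous transformation matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def mat_inv(t):
    """Inverse of a rigid transform [R p; 0 1]: transpose the rotation,
    rotate-and-negate the translation."""
    r = [[t[j][i] for j in range(3)] for i in range(3)]
    p = [-sum(r[i][j] * t[j][3] for j in range(3)) for i in range(3)]
    return [r[0] + [p[0]], r[1] + [p[1]], r[2] + [p[2]], [0, 0, 0, 1]]

def rot_z(theta, x=0.0, y=0.0, z=0.0):
    """Homogeneous transform: rotation about Z by theta plus a translation."""
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s, 0, x], [s, c, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

# Hypothetical world->robot, robot->camera, camera->landmark transforms.
WTR = rot_z(math.radians(30), 1.0, 2.0)
RTC = rot_z(0.0, 0.1, 0.0, 0.5)
CTL = rot_z(math.radians(-10), 0.0, 0.0, 2.0)

WTL = mat_mul(mat_mul(WTR, RTC), CTL)                 # WTL = WTR RTC CTL
WTR_back = mat_mul(mat_mul(WTL, mat_inv(CTL)), mat_inv(RTC))
```

Multiplying the composed world-landmark transform by the inverted camera-landmark and robot-camera transforms recovers the original robot pose, which is exactly the inversion used in the slide.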

  35. Position Estimation Algorithm by Vision System 1. Obtain the image. 2. Calculate the threshold values for binarization. 3. Landmark detected? If no, return to step 1. 4. Calculate the vertices of the landmark in the frame coordinate: (FXi, FYi). 5. Calculate the vertices of the landmark in the screen coordinate: (SXi, SYi). 6. Calculate the vertices of the landmark in the camera coordinate: (CXi, CYi, CZi). 7. Calculate the camera-landmark transformation matrix, CTL. 8. Calculate the robot-camera transformation matrix, RTC. 9. Calculate the position of the robot, WTR.

  36. Experimental Results in Position Estimation -. Experiment 1 -. Experiment 2

  37. FUTURE TREND OF MOBILE ROBOT The development of the autonomous mobile robot system requires: • Development and standardization of basic and fundamental technologies • Integration of component technologies into a complete system -. Sensor-data-fusion techniques to recognize the environment -. Autonomous navigation methods using AI techniques -. Man-machine interface techniques • Development of real-time perception capabilities -. Real-time position estimation using vision • Development of key component hardware technologies -. Sensors and actuators, controller H/W, etc.
