
Distributed Edge Computing





Presentation Transcript


  1. Distributed Edge Computing Jing Yue Information Science and Engineering Electrical Engineering and Computer Science KTH Royal Institute of Technology 28th January 2019

  2. Outline • Distributed Edge Computing • Machine Learning • Distributed Machine Learning

  3. Outline • Distributed Edge Computing • What? • Why? • How? • Machine Learning • Distributed Machine Learning

  4. Distributed Edge Computing (figure: the cloud)

  5. Cloud Computing Share resources over the Internet (figure: the cloud)

  6. Cloud Computing Share resources over the Internet. Cut costs!

  7. Cloud Computing (figure build continues)

  8. Cloud Computing (figure build continues)

  9. Cloud Computing (figure build continues)

  10. Cloud Computing Share resources over the Internet. Perfect?

  11. Cloud Computing Limitations • Large amounts of data • Mobile devices with limited resources • Long distance to the cloud, causing delay/latency

  12. Cloud Computing Limitations (figure build continues)

  13. Cloud Computing (figure: the cloud)

  14. Distributed Edge Computing A typical architecture of edge computing networks • Front-end: end devices, e.g., sensors and actuators • Near-end: edge/cloudlet servers, for data computation and storage • Far-end: cloud servers, with more computing power and larger data storage

  15. Distributed Edge Computing (figure: four edge servers)

  16. Distributed Edge Computing (figure build continues)

  17. Distributed Edge Computing (figure build continues)

  18. Distributed Edge Computing (figure: four edge servers connected to the cloud)

  19. Distributed Edge Computing (figure build continues)

  20. Distributed Edge Computing Advantages of edge computing • Shorter transmission time • Reduce latency/delay • Save transmission resources • Reduce energy consumption • …

  21. Distributed Edge Computing Challenges of edge computing • Stragglers (figure: a computation task and Servers 1-4)

  22. Distributed Edge Computing Challenges of edge computing • Stragglers (figure: the task divided into Subtasks 1-3, assigned to servers)

  23. Distributed Edge Computing Challenges of edge computing • Stragglers (figure build continues)

  24. Distributed Edge Computing Challenges of edge computing • Stragglers (figure build continues)
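The straggler effect illustrated in slides 21-24 can be sketched in a few lines: a distributed task finishes only when its slowest subtask finishes, so a single slow server dominates the completion time. This is a hypothetical simulation; the worker speeds and straggle probability below are made up for illustration.

```python
import random

def simulate_round(num_workers=4, slow_prob=0.25):
    """One round of a split task: each worker needs 1 time unit normally,
    10 units if it straggles. The task waits for the slowest worker."""
    times = [10.0 if random.random() < slow_prob else 1.0
             for _ in range(num_workers)]
    return max(times)

random.seed(0)
rounds = [simulate_round() for _ in range(1000)]
avg = sum(rounds) / len(rounds)
# With 4 workers and a 25% straggle probability, most rounds contain at
# least one straggler (1 - 0.75**4 ≈ 68%), so the average completion time
# sits far above the 1-unit cost of a single fast worker.
print(f"average completion time: {avg:.2f}")
```

Even a modest per-server straggle probability makes slow rounds the common case, which is why straggler mitigation is listed as a core challenge.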

  25. Distributed Edge Computing Example 1: Industrial Manufacturing - computation task offloading. Requirements: high reliability, low latency/delay, … Decisions: whether, where, and when to offload? (figure: sensors and actuators connected to Server 1, Server 2, …, Server N)

  26. Distributed Edge Computing Example 1: Industrial Manufacturing - computation task offloading (figure build continues)

  27. Distributed Edge Computing Example 1: Industrial Manufacturing - computation task offloading (figure build continues)
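The whether/where/when offloading questions can, in the simplest case, be framed as a latency comparison: offload only if some server's compute time plus network round trip beats local execution. The function below is a hypothetical sketch; the server list and all numbers are invented for illustration.

```python
def offload_decision(task_cycles, local_cps, servers):
    """Decide whether and where to offload a task.

    task_cycles : CPU cycles the task requires
    local_cps   : cycles/second available on the local device
    servers     : list of (name, cycles_per_sec, rtt_seconds)

    Returns ("local", local_time) or (server_name, total_time).
    """
    local_time = task_cycles / local_cps
    best = ("local", local_time)
    for name, cps, rtt in servers:
        total = task_cycles / cps + rtt   # remote compute + network round trip
        if total < best[1]:
            best = (name, total)
    return best

# A weak device (1e8 cycles/s) versus two hypothetical edge servers:
choice = offload_decision(
    task_cycles=5e8,
    local_cps=1e8,
    servers=[("Server 1", 5e9, 0.02), ("Server 2", 2e9, 0.01)],
)
print(choice)
```

Here local execution would take 5 s, so offloading to the faster server wins despite its slightly larger round-trip time; the "when" question adds server load and queueing, which this sketch omits.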

  28. Distributed Edge Computing Example 2: Smart Transportation - mobile devices and server migration. Requirements: high reliability, low latency/delay, seamless service, … Decisions: whether, where, and when to migrate? (figure: sensors and actuators connected to Server 1, Server 2, …, Server N)

  29. Distributed Edge Computing Example 2: Smart Transportation - server migration (figure build continues)

  30. Distributed Edge Computing Example 2: Smart Transportation - server migration. When and where to migrate as the device moves?

  31. Distributed Edge Computing Example 2: Smart Transportation - server migration. Is the device's mobility pattern known or unknown?

  32. Distributed Edge Computing Example 2: Smart Transportation - server migration (figure build continues)
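The when/where migration questions can likewise be sketched as a simple threshold rule: migrate the service only when a candidate server improves latency by a clear margin, which avoids ping-pong migrations as the vehicle moves between coverage areas. The threshold value and the latency trace below are invented for illustration.

```python
def should_migrate(current_latency, candidate_latency, threshold=0.2):
    """Migrate only if the candidate improves latency by more than
    `threshold` (relative), to avoid ping-pong migrations."""
    return candidate_latency < current_latency * (1 - threshold)

# A vehicle moving away from its current server toward another
# (made-up latencies in seconds, sampled as the vehicle drives):
trace = [(0.010, 0.030), (0.015, 0.020), (0.030, 0.012)]
for cur, cand in trace:
    print(f"current={cur:.3f}s candidate={cand:.3f}s "
          f"migrate={should_migrate(cur, cand)}")
```

Only the last sample triggers a migration: the candidate must be clearly better, not merely equal, before the service moves. If the mobility pattern is known, the decision can instead be planned ahead; when it is unknown, reactive rules like this one are the fallback.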

  33. Distributed Edge Computing Challenges of edge computing • System integration • Resource management • Security and privacy • Smart system support • …

  34. Outline • Distributed Edge Computing • Machine Learning • Distributed Machine Learning

  35. Machine Learning When you first heard the term “Machine Learning”, did you think of something similar to this figure? Artificial Intelligence ⊃ Machine Learning • A specific technological subset of AI • Finds a model from data through training

  36. Machine Learning Applying a model based on actual data Once the Machine Learning process finds the model from the training data, we apply the model to the actual data supplied in the field application.

  37. Machine Learning Types of Machine Learning techniques • Supervised Learning • Unsupervised Learning

  38. Machine Learning Types of Machine Learning techniques • Supervised Learning Similar to the process in which a human learns things: • Select an exercise problem. • Apply current knowledge to solve the problem. • Compare the answer with the solution. • If the answer is wrong, modify current knowledge. • Repeat Steps 1 to 4 for all the exercise problems. • Each training dataset should consist of input and correct-output pairs. • The correct output is what the model is supposed to produce for the given input.
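The five steps above map directly onto a minimal training loop. The sketch below fits a one-weight linear model y = w·x with per-sample gradient updates; the data, learning rate, and model are chosen purely for illustration.

```python
# A minimal supervised-learning loop mirroring the steps above:
# pick an example, predict, compare with the correct output, and
# adjust the model when the answer is wrong.

training_data = [(1.0, 2.0), (2.0, 4.0), (3.0, 6.0)]  # input/correct-output pairs
w = 0.0           # current "knowledge": a single model weight
lr = 0.05         # learning rate

for epoch in range(200):
    for x, y_true in training_data:      # 1. select an exercise problem
        y_pred = w * x                   # 2. apply current knowledge
        error = y_pred - y_true          # 3. compare with the solution
        w -= lr * error * x              # 4. if wrong, modify knowledge
                                         # 5. repeat for all problems

print(f"learned w = {w:.3f}")
```

Because every pair satisfies y = 2x exactly, the weight converges to 2; with noisy data the same loop would settle on the best-fit value instead.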

  39. Machine Learning Example: Supervised Learning - Image recognition Dog or Cat?

  40. Machine Learning Example: Supervised Learning - Image recognition

  41. Machine Learning Example: Supervised Learning - Image recognition Model

  42. Machine Learning Types of Machine Learning techniques • Unsupervised Learning Similar to a student who merely sorts problems by their structure and attributes, without learning how to solve them, because no correct outputs are available • The training data of unsupervised learning contains only inputs, without correct outputs. • Unsupervised learning is generally used for investigating the characteristics of the data and for preprocessing it.

  43. Machine Learning Unsupervised Learning - Clustering
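Clustering, the unsupervised example on this slide, can be sketched with a minimal k-means: the algorithm groups points by proximity using inputs only, with no correct outputs. The 1-D data and the `kmeans_1d` helper below are invented for illustration.

```python
import random

def kmeans_1d(points, k=2, iters=20, seed=0):
    """Minimal k-means on 1-D data: repeatedly assign each point to its
    nearest center, then move each center to its cluster's mean."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)          # start from k random points
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                     # assignment step
            i = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[i].append(p)
        centers = [sum(c) / len(c) if c else centers[i]   # update step
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two obvious groups, around 1 and around 10:
data = [0.9, 1.0, 1.1, 9.8, 10.0, 10.2]
print(kmeans_1d(data))
```

No labels were supplied, yet the centers land on the two natural groups, which is exactly the "sorting by attribute" behavior the slide describes.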

  44. Outline • Distributed Edge Computing • Machine Learning • Distributed Machine Learning • Why? • What? • How?

  45. Distributed Machine Learning Traditional: a training task is completed on a single computer, device, machine, … Limited by: • Computation resources • Training time • Dataset

  46. Distributed Machine Learning Traditional: a training task is completed on a single computer, device, machine, … • Computation resources • Training time • Dataset Distributed: multiple computers (or devices, machines, servers, …) work together to complete a training task

  47. Distributed Machine Learning Traditional: a training task is completed on a single computer, device, machine, … • Computation resources • Training time • Dataset Distributed: multiple computers (or devices, machines, servers, …) work together to complete a training task • A master node divides a training task into multiple subtasks • Each subtask corresponds to a sub-dataset • The master node assigns subtasks to distributed worker nodes • Worker nodes compute and transmit the intermediate training results to the master node • The master node aggregates the intermediate results, updates the training parameters, and sends the updated parameters to the worker nodes for the next round of computation
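The master/worker steps above can be sketched as a synchronous single-process simulation: the master splits the dataset into sub-datasets, each worker computes a partial gradient on its shard, and the master averages the gradients, updates the parameter, and (conceptually) broadcasts it for the next round. All names and numbers are illustrative; a real system exchanges these messages over the network.

```python
# Synchronous master/worker training on a one-weight model y = w*x.

data = [(x, 2.0 * x) for x in range(1, 9)]   # 8 samples of y = 2x
num_workers = 4
shards = [data[i::num_workers] for i in range(num_workers)]  # sub-datasets

def worker_gradient(shard, w):
    """Worker node: mean gradient of squared error on its own sub-dataset."""
    return sum(2 * (w * x - y) * x for x, y in shard) / len(shard)

w = 0.0    # model parameter held by the master node
lr = 0.01
for _ in range(100):
    grads = [worker_gradient(s, w) for s in shards]  # workers compute
    w -= lr * sum(grads) / num_workers               # master aggregates & updates
    # the master then broadcasts the new w for the next round

print(f"learned w = {w:.3f}")
```

This synchronous scheme waits for every worker each round, which is precisely where the straggler problem from the edge computing section reappears in distributed training.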

  48. Distributed Machine Learning Example: the master node assigns subtasks to the worker nodes (figure: Master node with Workers 1-4, each holding Sub-datasets 1-4)

  49. Distributed Machine Learning Example: workers compute and transmit the intermediate training results to the master node (figure: Workers 1-4 sending results to the Master node)

  50. Distributed Machine Learning Example: the master node updates the training parameters (figure: Master node with Workers 1-4)
