
Toward Optimal Configuration Space Sampling


Presentation Transcript


  1. Toward Optimal Configuration Space Sampling Presented by: Yan Ke NUS CS5247

  2. Sampling Problem • Tool: Sample points. • Target: Construct a roadmap representing the complete connectivity of the configuration space.

  3. More Points ≠ Better Sampling

  4. How to Sample Smartly? • Complete knowledge of the configuration space (usually unavailable). • Use information from past experience (our approach).

  5. Section 1: Modeling Configuration Space

  6. Build a Model from Past Experience • Machine learning is concerned with how to automate learning from experience. • An existing obstructed node indicates that its neighbors are also likely to be obstructed. • And vice versa: neighbors of a free node are likely to be free.

  7. Probability for a Single Node • P(q = i | M), where q is a newly sampled point, i is 1 (free) or 0 (obstructed), and M is the model built from past experience. • We learn P based on M. • We prefer samples with high P(q = 1 | M).

  8. Basic Idea • Model the configuration space as binary classification: C(p) ∈ {0, 1}. • If q is a neighbor of p: C(p) = 1 ⇒ P(q = 1 | M)↑; C(p) = 0 ⇒ P(q = 1 | M)↓.

  9. Approximation Function • Denote Ĉ(q) = P(q = 1 | M). • Obviously Ĉ(q) ∈ [0, 1].

  10. K-nearest Neighbors • Q = { qi | i = 1, 2, …, n } – the set of previously sampled points. • N(q, k) – the function that returns the k nearest neighbors of q in Q. • Ĉ(q) = (1/k) Σ_{p ∈ N(q,k)} C(p), i.e. the fraction of q's k nearest neighbors that are free (a code sketch follows).
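
To make slide 10's estimate concrete, here is a minimal Python sketch of the k-nearest-neighbor approximation, assuming plain Euclidean distance and a simple average of 0/1 neighbor labels; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def c_hat(q, points, labels, k=5):
    """Estimate Ĉ(q) = P(q = 1 | M) as the fraction of free
    configurations among the k nearest previously sampled points.

    q      : (d,) query configuration
    points : (n, d) past sampled configurations (the set Q)
    labels : (n,) outcomes, 1 = free, 0 = obstructed
    """
    # N(q, k): indices of the k nearest neighbors of q in Q
    dists = np.linalg.norm(points - q, axis=1)
    nearest = np.argsort(dists)[:k]
    # The mean of 0/1 labels lies in [0, 1], matching Ĉ's range
    return labels[nearest].mean()
```

For example, if four of the five nearest past samples were free, c_hat returns 0.8.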

  11. A Screenshot from the Paper

  12. Probabilities • P(q = 1 | M) = Ĉ(q) • P(q = 0 | M) = 1 - Ĉ(q)

  13. Section 2: Utility Function

  14. Utility Function • Purpose: characterize how relevant a configuration is for successfully guiding sampling. • What makes a configuration relevant: • unexplored regions near existing roadmap components? • points maximally distant from existing components in unexplored regions of the configuration space?

  15. Utility Function • U(q = i, R), where q is a newly sampled point, i is 1 (free) or 0 (obstructed), and R is the roadmap.

  16. Information Gain • IG(S, K) = H(S) – H(S | K) • S – some system • K – new knowledge • H(·) – entropy function • As S gains more information, H(S)↓ (a toy numeric example follows).
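
As a toy numeric illustration of the IG(S, K) = H(S) – H(S | K) identity, consider the status of one unsampled configuration as a single uncertain bit; this is only a sketch of the entropy bookkeeping, not the paper's roadmap entropy.

```python
import math

def entropy(p):
    """Binary entropy in bits; zero when the outcome is certain."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# S: the status of one unsampled configuration, believed free with p = 0.5.
h_before = entropy(0.5)            # H(S) = 1 bit of uncertainty
# K: we sample it and observe the outcome, so no uncertainty remains.
h_after = entropy(1.0)             # H(S | K) = 0
print("IG =", h_before - h_after)  # IG(S, K) = 1.0 bit
```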

  17. Utility Function • U(q = i, R) = IG(R, q) = H(R) – H(R | q) • We claim that an obstructed sample provides no information gain, i.e. U(q = 0, R) = 0.

  18. Another Screenshot

  19. How to Get Around It? • Return to our very basic goal: full connectivity. • We treat the current roadmap as a set of disjoint components. • The maximal IG is likely to appear near the midpoint between two large disjoint components (a sketch of this heuristic follows).
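
Here is a minimal sketch of that midpoint heuristic, assuming each roadmap component is given as an array of node configurations and reading "near the midpoint" as the midpoint of the closest node pair; all names are illustrative, not the paper's.

```python
import numpy as np
from itertools import combinations

def midpoint_candidates(components):
    """Propose one candidate per pair of disjoint roadmap components:
    the midpoint of their closest pair of nodes, where a free sample
    is most likely to merge the pair and so yield high utility.

    components : list of (m_i, d) arrays of node configurations
    """
    candidates = []
    for a, b in combinations(components, 2):
        # Pairwise distances between the two components' nodes
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
        i, j = np.unravel_index(np.argmin(d), d.shape)
        candidates.append((a[i] + b[j]) / 2.0)
    return candidates
```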

  20. Section 3: Utility-Guided Sampling

  21. Utility-Guided Sampling

  22. Algorithm:
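
The transcript loses the algorithm image on this slide, so below is a hedged Python sketch of one utility-guided sampling round assembled from the preceding slides; the model/roadmap interface (random_configuration, c_hat, utility, is_free, insert, update) is assumed for illustration and is not the paper's API.

```python
def utility_guided_sample(model, roadmap, n_candidates=20):
    """One round of utility-guided sampling: pick the candidate with
    the highest expected utility, then pay for a single collision
    check and update both the model and the roadmap.
    """
    # Draw a pool of candidate configurations (e.g. uniformly at random)
    candidates = [model.random_configuration() for _ in range(n_candidates)]

    # Expected utility: an obstructed sample yields no information
    # gain (U(q = 0, R) = 0), so only the free case is weighted.
    def expected_utility(q):
        return model.c_hat(q) * roadmap.utility(q)

    best = max(candidates, key=expected_utility)

    free = roadmap.is_free(best)  # the expensive collision check
    model.update(best, free)      # refine P(q = 1 | M) with the outcome
    if free:
        roadmap.insert(best)      # add the node, try to connect components
    return best, free
```

The design point is that collision checking dominates the cost, so the cheap model is queried for many candidates while only the single most promising one is checked.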

  23. Experiment • Environment: two workspaces with robots of varying degrees of freedom. • Each robot has 3-4 links. • Each joint has 3 degrees of freedom. • Total: 9 or 12 DOF.

  24. Result: Faster

  25. Conclusion • Utility-guided sampling guides sampling toward more relevant configurations. • Experimentally shown to be efficient.
