
Adaptive Optimization in the Jalapeño JVM




Presentation Transcript


  1. Adaptive Optimization in the Jalapeño JVM Matthew Arnold Stephen Fink David Grove Michael Hind Peter F. Sweeney Presentation by Michael Bond

  2. Talk overview • Introduction: Background & Jalapeño JVM • Adaptive Optimization System (AOS) • Multi-level recompilation • Miscellaneous issues • Feedback-directed inlining • Conclusion

  3. Background • Three waves of JVMs: • First: Compile method when first encountered; use fixed set of optimizations • Second: Determine hot methods dynamically and compile them with more advanced optimizations • Third: Feedback-directed optimizations • Jalapeño JVM targets third wave, but current implementation is second wave

  4. Jalapeño JVM • Written in Java (core services precompiled to native code in boot image) • Compiles at four levels: baseline, 0, 1, & 2 • Why three levels of optimization? • Compile-only strategy (no interpretation) • Advantages? Disadvantages? • Yield points for quasi-preemptive switching • Advantages? (Disadvantages later)

  5. Talk progress • Introduction: Background & Jalapeño JVM • Adaptive Optimization System (AOS) • Multi-level recompilation • Miscellaneous issues • Feedback-directed inlining • Conclusion

  6. Adaptive Optimization System

  7. AOS: Design • “Distributed, asynchronous, object-oriented design” useful for managing lots of data, say authors • Each successive pipeline stage (from raw data to compilation decisions) performs increasingly complex analysis on decreasing amounts of data

  8. Talk progress • Introduction: Background & Jalapeño JVM • Adaptive Optimization System (AOS) • Multi-level recompilation • Other issues • Feedback-directed inlining • Conclusion

  9. Multi-level recompilation

  10. Multi-level recompilation: Sampling • Sampling occurs on thread switch • Thread switch triggered by clock interrupt • Thread switch can occur only at yield points • Yield points are method invocations and loop back edges • Discussion: Is this approach biased?
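The sampling mechanism above can be sketched roughly as follows; the class and method names are hypothetical illustrations, not Jalapeño's actual listener API:

```java
import java.util.HashMap;
import java.util.Map;

// Rough sketch of yield-point sampling (names are illustrative, not
// Jalapeño's actual API): on each clock-triggered thread switch, the
// method executing at the yield point receives one sample.
public class MethodSampler {
    private final Map<String, Integer> samples = new HashMap<>();

    // Called on a thread switch; topMethod is the method at the yield
    // point (a method invocation or a loop back edge).
    public void onThreadSwitch(String topMethod) {
        samples.merge(topMethod, 1, Integer::sum);
    }

    public int samplesFor(String method) {
        return samples.getOrDefault(method, 0);
    }
}
```

Note that a method containing no invocations and no loop back edges can never appear at a yield point, which is exactly the bias the discussion question raises.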

  11. Multi-level recompilation: Biased sampling [Figure contrasting a short method, a long method containing method calls, and code with no method calls or back edges — illustrating which code yield-point sampling can and cannot observe]

  12. Multi-level recompilation: Cost-benefit analysis • Method m compiled at level i; estimate: • Ti, expected time program will spend executing m if m not recompiled • Cj, the cost of recompiling m at optimization level j, for i ≤ j ≤ N. • Tj, expected time program will spend executing method m if m recompiled at level j. • If, for best j, Cj + Tj < Ti, recompile m at level j.
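The decision rule on this slide can be sketched as a small routine; all names are hypothetical, and the arrays simply hold the slide's estimates T and C indexed by optimization level:

```java
// Sketch of the slide's cost-benefit rule (names hypothetical):
// given current level i, future-time estimates T[j], and recompile
// costs C[j] for i <= j <= N, choose the level j minimizing
// C[j] + T[j], recompiling only if that beats leaving m at level i.
public class RecompileDecision {
    // Returns the chosen optimization level; returning i means
    // "don't recompile".
    public static int chooseLevel(int i, double[] T, double[] C) {
        int best = i;
        double bestCost = T[i];          // cost of doing nothing
        for (int j = i; j < T.length; j++) {
            if (C[j] + T[j] < bestCost) {
                bestCost = C[j] + T[j];
                best = j;
            }
        }
        return best;
    }
}
```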

  13. Multi-level recompilation: Cost-benefit analysis (continued) • Estimate Ti : Ti = Tf * Pm • Tf is the future running time of the program • We estimate that the program will run for as long as it has run so far • Reasonable assumption?

  14. Multi-level recompilation: Cost-benefit analysis (continued) • Pm is the percentage of Tf spent in m • Pm estimated from sampling • Sample frequencies decay over time • Why is this a good idea? • Could it be a disadvantage in certain cases?
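Decaying sample frequencies can be sketched as below; the decay factor and class design are illustrative choices, not the values Jalapeño uses:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of decaying sample counts (constants illustrative):
// periodically multiplying every count by a factor < 1 makes the
// estimate of Pm track recent behavior rather than the whole
// execution history.
public class DecayingSamples {
    private final Map<String, Double> counts = new HashMap<>();
    private final double decay;      // e.g., 0.5 per decay period

    public DecayingSamples(double decay) {
        this.decay = decay;
    }

    public void sample(String method) {
        counts.merge(method, 1.0, Double::sum);
    }

    // Invoked periodically to age all accumulated samples.
    public void decayAll() {
        counts.replaceAll((method, count) -> count * decay);
    }

    public double countFor(String method) {
        return counts.getOrDefault(method, 0.0);
    }
}
```

A possible disadvantage, matching the slide's question: a method with steady but infrequent activity may never accumulate enough weight to trigger recompilation.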

  15. Multi-level recompilation: Cost-benefit analysis (continued) • Statically-measured speedups Si and Sj used to determine Tj: Tj = Ti * Si / Sj • Statically-measured speedups?! • Is there any way to do better?

  16. Multi-level recompilation: Cost-benefit analysis (continued) • Cj (cost of recompilation) estimated using a linear model of compilation cost for each optimization level: Cj = aj * size(m), where aj = constant for level j • Is it reasonable to assume a linear model? • OK to use statically-determined aj?
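The three estimates from slides 13–16 can be collected into one small sketch; all inputs are illustrative, and the constants (percentages, speedups, the aj values) are made up for the example:

```java
// Sketch combining the slide's estimates (all constants illustrative):
//   Ti = Tf * Pm        (future time in m at the current level)
//   Tj = Ti * Si / Sj   (future time in m if recompiled at level j)
//   Cj = aj * size(m)   (cost of recompiling m at level j)
public class CostBenefitModel {
    // Tf: assume the program runs for as long again as it has so far.
    public static double ti(double runtimeSoFar, double pm) {
        return runtimeSoFar * pm;
    }

    // Scale Ti by the ratio of statically-measured speedups Si / Sj.
    public static double tj(double ti, double si, double sj) {
        return ti * si / sj;
    }

    // Linear compile-cost model in method size, slope aj per level.
    public static double cj(double aj, int methodSize) {
        return aj * methodSize;
    }
}
```

With these pieces, the recompilation decision from slide 12 reduces to comparing Cj + Tj against Ti for each candidate level j.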

  17. Multi-level recompilation: Results

  18. Multi-level recompilation: Results (continued)

  19. Multi-level recompilation: Discussion • Adaptive multi-level compilation does better than JIT compilation at any single level in the short term. • But in the long run, performance is slightly worse than JIT compilation. • The primary target is server applications, which tend to run for a long time.

  20. Multi-level recompilation: Discussion (continued) • So what’s so great about Jalapeño’s AOS? • Current AOS implementation gives good results for both short and long term – JIT compiler can’t do both cases well because optimization level is fixed. • The AOS can be extended to support feedback-directed optimizations such as • fragment creation (as in Dynamo) • determining if an optimization was effective

  21. Talk progress • Introduction: Background & Jalapeño JVM • Adaptive Optimization System (AOS) • Multi-level recompilation • Miscellaneous issues • Feedback-directed inlining • Conclusion

  22. Miscellaneous issues: Multiprocessing • Authors say that if a processor is idle, recompilation can be done almost for free. • Why almost for free? • Are there situations when you could get free recompilation on a uniprocessor?

  23. [Cartoon: the AOS Controller and a hot method — “You’re so hot! Adaptively optimize me all night long!”]

  24. Miscellaneous issues: Models vs. heuristics • Authors moving toward “analytic model of program behavior” and elimination of ad-hoc tuning parameters. • Tuning parameters proved difficult because of “unforeseen differences in application behavior.” • Is it believable that ad-hoc parameters can be eliminated and replaced with models?

  25. Miscellaneous issues: More intrusive optimizations • The future of Jalapeño is more intrusive optimizations, such as compiler-inserted instrumentation for profiling • Advantages and disadvantages compared with current system? • Advantages: • Performance gains in the long term • Adjusts to phased behavior • Disadvantages: • Unlike with sampling, you can’t profile all the time • Harder to adaptively throttle overhead

  26. Miscellaneous issues: Stack frame rewriting • In the future, Jalapeño will support rewriting a baseline stack frame as an optimized stack frame • Authors say that rewriting an optimized stack frame as another optimized stack frame is more difficult • Why?

  27. Talk progress • Introduction: Background & Jalapeño JVM • Adaptive Optimization System (AOS) • Multi-level recompilation • Miscellaneous issues • Feedback-directed inlining • Conclusion

  28. Feedback-directed inlining: More cost-benefit analysis • Boost factor estimated: • Boost factor b is a function of • The fraction f of dynamic calls attributed to the call edge in the sampling-approximated call graph • Estimate s of the benefit (i.e., speedup) from eliminating virtually all calls from the program • Presumably something like b = f * s.
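The presenter's guess at the boost-factor formula can be sketched directly; the product form b = f * s is explicitly labeled "presumably" on the slide, and the inputs below are illustrative:

```java
// Sketch of the presenter's guessed boost factor b = f * s
// (the slide says "presumably"; this is not confirmed as Jalapeño's
// actual formula). f is the fraction of dynamic calls attributed to
// the call edge in the sampled call graph; s is the estimated speedup
// from eliminating virtually all calls from the program.
public class InlineBoost {
    public static double boostFactor(double callFraction,
                                     double speedup) {
        return callFraction * speedup;
    }
}
```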

  29. Feedback-directed inlining: Results [Results figure with two data points annotated “Why?”]

  30. Talk progress • Introduction: Background & Jalapeño JVM • Adaptive Optimization System (AOS) • Multi-level recompilation • Other issues • Feedback-directed inlining • Conclusion

  31. Conclusion • AOS designed to support feedback-directed optimizations (third wave) • Current AOS implementation only supports selective optimization (second wave) • Improves short-term performance without hurting the long term • Uses a mix of cost-benefit models and ad-hoc methods • Future work will use more intrusive performance monitoring (e.g., instrumentation for path profiling, checking that an optimization improved performance)
