
Space Weather Prediction: Challenges in Computational Magnetohydrodynamics


Presentation Transcript


  1. Space Weather Prediction: Challenges in Computational Magnetohydrodynamics • Gábor Tóth • Center for Space Environment Modeling • University of Michigan

  2. Collaborators • Tamas Gombosi, Kenneth Powell • Ward Manchester, Ilia Roussev • Darren De Zeeuw, Igor Sokolov • Aaron Ridley, Kenneth Hansen • Richard Wolf, Stanislav Sazykin (Rice University) • József Kóta (Univ. of Arizona) • Grants: DoD MURI and NASA CT Projects

  3. Outline of Talk • What is Space Weather and Why Predict It? • Parallel MHD Code: BATSRUS • Space Weather Modeling Framework (SWMF) • Some Results • Concluding Remarks

  4. What Space Weather Means • Conditions on the Sun and in the solar wind, magnetosphere, ionosphere, and thermosphere that can influence the performance and reliability of space-borne and ground-based technological systems and can endanger human life or health. • In short: space physics that affects us.

  5. Affects Earth: The Aurorae

  6. Other Effects of Space Weather

  7. MHD Code: BATSRUS • Block Adaptive Tree Solar-wind Roe Upwind Scheme • Conservative finite-volume discretization • Shock-capturing Total Variation Diminishing schemes • Parallel block-adaptive grid (Cartesian and generalized) • Explicit and implicit time stepping • Classical and semi-relativistic MHD equations • Multi-species chemistry • Splitting the magnetic field into B0 + B1 • Various methods to control the divergence of B

  8. MHD Equations in Conservative vs. Non-Conservative Form • Conservative form is required for correct jump conditions across shock waves. • Energy conservation provides the proper amount of Joule heating for reconnection, even in ideal MHD. • The non-conservative pressure equation is preferred for maintaining positivity. • Hybrid scheme: use the pressure equation where possible.
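
For reference, the two forms being contrasted can be written as follows; this is the standard ideal-MHD statement (units with mu_0 = 1), not a transcription of the slide's own equations.

```latex
% Conservative (total energy) form:
\frac{\partial e}{\partial t}
  + \nabla\cdot\left[\left(e + p + \frac{B^2}{2}\right)\mathbf{u}
  - \mathbf{B}\,(\mathbf{u}\cdot\mathbf{B})\right] = 0,
\qquad
e = \frac{p}{\gamma-1} + \frac{\rho u^2}{2} + \frac{B^2}{2}

% Non-conservative (pressure) form:
\frac{\partial p}{\partial t} + \mathbf{u}\cdot\nabla p + \gamma\, p\,\nabla\cdot\mathbf{u} = 0
```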

  9. Splitting the Magnetic Field • The magnetic field has huge gradients near the Sun and Earth: large truncation errors. • Pressure calculated from the total energy can become negative. • Difficult to maintain boundary conditions. • Solution: split the magnetic field as B = B0 + B1, where B0 is a divergence- and curl-free analytic function. • Gradients in B1 are small. • The total energy contains B1 only. • The boundary condition for B1 is simple.
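
A compact statement of the splitting, in standard notation (the modified energy is written as is usual for this type of scheme):

```latex
\mathbf{B} = \mathbf{B}_0 + \mathbf{B}_1, \qquad
\nabla\cdot\mathbf{B}_0 = 0, \qquad
\nabla\times\mathbf{B}_0 = 0

% Only B_1 enters the total energy that is evolved:
e_1 = \frac{p}{\gamma-1} + \frac{\rho u^2}{2} + \frac{B_1^2}{2}
```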

  10. Vastly Disparate Scales • Spatial: • Resolution needed at Earth: 1/4 RE • Resolution needed at Sun: 1/32 RS • Sun-Earth distance: 1 AU • 1 AU = 215 RS = 23,456 RE • Temporal: • A CME needs about 3 days to arrive at Earth. • The time step is limited to a fraction of a second in some regions.
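
A quick back-of-the-envelope check of why this matters for explicit time stepping; the constants are rounded, and the 0.1 s step is an assumed value for "a fraction of a second" in the most restrictive blocks:

```python
# Back-of-the-envelope illustration (rounded constants, assumed explicit step).
AU_KM = 1.496e8      # astronomical unit [km]
RS_KM = 6.96e5       # solar radius [km]
RE_KM = 6.37e3       # Earth radius [km]

print(f"1 AU ~ {AU_KM / RS_KM:.0f} R_S ~ {AU_KM / RE_KM:,.0f} R_E")

cme_travel_s = 3 * 86400     # ~3 days from Sun to Earth
dt_explicit_s = 0.1          # assumed CFL-limited step near Earth
print(f"Explicit steps to cover the event: ~{cme_travel_s / dt_explicit_s:,.0f}")
```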

  11. Adaptive Block Structure • Each block is N×N×N cells. • Blocks communicate with neighbors through “ghost” cells.
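
A minimal sketch of what such a block might look like as a data structure; the class name, array layout, and defaults are illustrative assumptions, not the BATSRUS implementation:

```python
import numpy as np

class Block:
    """One grid block: an N x N x N brick of cells surrounded by ghost-cell
    layers that are filled from neighboring blocks before each update."""

    def __init__(self, n=8, nghost=2, nvar=8):
        self.n = n                      # physical cells per direction
        self.ng = nghost                # ghost layers per side
        side = n + 2 * nghost
        self.state = np.zeros((nvar, side, side, side))

    def physical(self):
        """View of the physical (non-ghost) cells."""
        g = self.ng
        return self.state[:, g:-g, g:-g, g:-g]
```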

  12. Parallel Distribution of the Blocks

  13. Optimized Load Balancing

  14. Parallel Performance

  15. Why Explicit Time-Stepping May Not Be Good Enough • Explicit schemes have a time step limited by the CFL condition: Δt < Δx / (fastest wave speed). • High Alfvén speeds and/or small cells may lead to smaller time steps than required for accuracy. • The problem is particularly acute near planets with strong magnetic fields. • Implicit schemes do not have Δt limited by the CFL condition.
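
A sketch of the local CFL limit for ideal MHD, assuming the fast magnetosonic speed bounds the signal speed; the function and argument names are illustrative, not from BATSRUS:

```python
import numpy as np

def cfl_timestep(dx, rho, p, b, u, gamma=5.0/3.0, cfl=0.8):
    """Largest stable explicit step for one cell (illustrative, mu_0 = 1 units).
    sqrt(c_s^2 + v_A^2) is an upper bound on the ideal-MHD wave speeds, so
    |u| plus that bound limits the step."""
    c_s2 = gamma * p / rho          # sound speed squared
    v_a2 = b * b / rho              # Alfven speed squared
    v_fast = np.sqrt(c_s2 + v_a2)   # upper bound on the fast speed
    return cfl * dx / (abs(u) + v_fast)

# Example in normalized units: a small cell with a strong field gives a tiny step.
print(cfl_timestep(dx=0.25, rho=1.0, p=0.1, b=20.0, u=1.0))
```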

  16. Building a Parallel Implicit Solver • The BDF2 second-order implicit time-stepping scheme requires the solution of a large nonlinear system of equations at each time step. • Newton linearization allows the nonlinear system to be solved by an iterative process in which large linear systems are solved. • Krylov solvers (GMRES, BiCGSTAB) with preconditioning are robust and efficient for solving large linear systems. • Schwarz preconditioning allows the process to be done in parallel: • Each adaptive block preconditions using its local data only • MBILU preconditioner
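
For reference, the BDF2 update, written here for a constant time step, with R denoting the spatially discretized right-hand side:

```latex
\frac{3\,U^{\,n+1} - 4\,U^{\,n} + U^{\,n-1}}{2\,\Delta t} = R\!\left(U^{\,n+1}\right)
```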

  17. Getting the Best of Both Worlds - Partial Implicit • Fully implicit scheme has no CFL limit, but each iteration is expensive (memory and CPU) • Fully explicit is inexpensive for one iteration, but CFL limit may mean a very small Δt • Set optimal Δt limited by accuracy requirement: • Solve blocks with unrestrictive CFL explicitly • Solve blocks with restrictive CFL implicitly • Load balance explicit and implicit blocks separately
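
A sketch of the block-selection logic implied above; the function name and the dt_cfl attribute are illustrative assumptions:

```python
def partition_blocks(blocks, dt_accuracy):
    """Split blocks into explicit and implicit sets (illustrative sketch).
    Blocks whose local CFL-limited step already allows the accuracy-driven
    time step are advanced explicitly; the rest are advanced implicitly."""
    explicit, implicit = [], []
    for blk in blocks:
        (explicit if blk.dt_cfl >= dt_accuracy else implicit).append(blk)
    # The two sets are then load-balanced separately across the processors.
    return explicit, implicit
```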

  18. Timing Results for Space Weather on Compaq

  19. From Codes To Framework • The Sun-Earth system consists of many different interconnected domains that are modeled independently. • Each physics-domain model is a separate application with its own optimal mathematical and numerical representation. • Our goal is to integrate the models into a flexible software framework. • The framework incorporates physics models with minimal changes. • The framework can be extended with new components. • The performance of a well-designed framework can exceed that of monolithic codes or ad hoc couplings of models.
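
As a rough illustration of what "incorporating a model with minimal changes" typically entails, a coupling framework of this kind asks each physics model to sit behind a small wrapper interface. The sketch below is a generic example only; the class and method names are assumptions, not the actual SWMF wrapper API:

```python
from abc import ABC, abstractmethod

class ComponentWrapper(ABC):
    """Generic wrapper a coupling framework might require from each physics model."""

    @abstractmethod
    def init_session(self, params, comm):
        """Read parameters and set up the model on its assigned processors."""

    @abstractmethod
    def run(self, t_couple):
        """Advance the model's state up to the next coupling time t_couple."""

    @abstractmethod
    def get_coupling_data(self, target):
        """Return the fields another component needs (e.g. pressure along field lines)."""

    @abstractmethod
    def finalize(self):
        """Write final output and release resources."""
```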

  20. Physics Domains and Models
  Physics Domain               ID   Model
  Solar Corona                 SC   BATSRUS
  Eruptive Event Generator     EE   BATSRUS
  Inner Heliosphere            IH   BATSRUS
  Solar Energetic Particles    SP   Kóta's SEP model
  Global Magnetosphere         GM   BATSRUS
  Inner Magnetosphere          IM   Rice Convection Model
  Ionosphere Electrodynamics   IE   Ridley's potential solver
  Upper Atmosphere             UA   General Ionosphere Thermosphere Model (GITM)

  21. Space Weather Modeling Framework

  22. The SWMF Architecture

  23. Parallel Layout and Execution • LAYOUT.in for 20 PEs (SC/IH, GM, IM/IE):
  ID  ROOT  LAST  STRIDE
  #COMPONENTMAP
  SC    0     9     1
  IH    0     9     1
  GM   10    17     1
  IE   18    19     1
  IM   19    19     1
  #END
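
Reading the map: each row assigns one component to a processor range, with ROOT and LAST giving the first and last processor and STRIDE the spacing between used processors. In this example SC and IH share PEs 0-9, GM runs on PEs 10-17, IE on 18-19, and IM on PE 19 alone (this reading follows directly from the numbers in the file).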

  24. Parallel Field Line Tracing • Stream line and field line tracing is a common problem in space physics. Two examples: coupling the inner and global magnetosphere models; coupling the solar energetic particle model with MHD. • Tracing a single line is an inherently serial procedure. • Tracing many lines can be parallelized, but: the vector field may be distributed over many PEs, and collecting the vector field onto one PE may be too slow and requires a lot of memory.

  25. Coupling Inner and Global Magnetosphere Models • Inner magnetosphere model: needs the field line volumes and the average pressure and density along field lines connected to the 2D grid on the ionosphere. • Global magnetosphere model: needs the pressure correction along the closed field lines.

  26. Interpolated Tracing Algorithm • 1. Trace lines inside blocks, starting from faces. • 2. Interpolate and communicate the mapping. • 3. Repeat step 2 until the mapping is obtained for all faces. • 4. Trace lines inside blocks, starting from cell centers. • 5. Interpolate the mapping to cell centers.

  27. Parallel Algorithm without Interpolation (sketched for 4 PEs) • 1. Find the next local field line. • 2. If there is a local field line: 2a. Integrate it in the local domain. 2b. If it is not done, send it to the other PE. • 3. Go to 1, unless it is time to receive. • 4. Receive lines from other PEs. • 5. If a line was received, go to 2a. • 6. Go to 1, unless all lines are finished.
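
An illustrative Python rendering of this message-passing loop; the comm object and its methods are placeholders, not real MPI or SWMF calls:

```python
def trace_all_lines(local_lines, my_domain, comm):
    """Trace field lines in parallel, handing lines between PEs (sketch only)."""
    while not comm.all_finished():
        # Steps 1-2: integrate every line currently living on this PE.
        for line in local_lines.take_pending():
            line.integrate_through(my_domain)             # 2a
            if not line.finished():
                comm.send(line, line.exit_neighbor())      # 2b: line left our subdomain
        # Steps 4-5: accept lines handed over by other PEs and queue them
        # so they are integrated on the next pass through the loop.
        for line in comm.receive_lines():
            local_lines.add(line)
```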

  28. Interpolated versus No Interpolation

  29. Modeling a Coronal Mass Ejection • Set B0 to a magnetogram-based potential field. • Obtain an MHD steady-state solution. • Use source terms to model solar wind acceleration and heating so that the steady solution matches the observed solar wind parameters. • Perturb this initial state with a “flux rope”. • Follow the CME propagation. • Let the CME hit the magnetosphere of the Earth.

  30. The Zoom Movie

  31. More Detail at Earth • Figure panels: pressure and magnetic field before and after the shock; density and magnetic field at shock arrival time; south-turning and north-turning BZ.

  32. Ionosphere Electrodynamics • Figure panels: current and potential, before and after the shock hits. • After the shock: the currents and the resulting electric potential increase. • Region-2 currents develop. • Although the region-1 currents are strong, the potential decreases due to the shielding effect.

  33. Upper Atmosphere • Figure panels: conductance before and after shock arrival. • The Hall conductance is calculated by the Upper Atmosphere component and is used by the Ionosphere Electrodynamics. • After the shock hits, the conductance increases in the polar regions due to electron precipitation. • Note that the conductance caused by solar illumination at low latitudes does not change significantly.

  34. Performance of the SWMF

  35. 2003 Halloween Storm Simulation with GM, IM and IE Components • The magnetosphere during the solar storm associated with an X17 solar eruption. • Satellite data are used for the solar wind parameters. • Solar wind speed: 1800 km/s. • Time: October 29, 0730 UT. • Shown are the last closed field lines, shaded with the thermal pressure. • The cut planes are shaded with the values of the electric current density.

  36. GM, IM, IE Run vs. Observations

  37. Concluding Remarks • The Space Weather Modeling Framework (SWMF) uses state-of-the-art methods to achieve flexible and efficient coupling and execution of the physics models. • Missing pieces for space weather prediction: • Better models for solar wind heating and acceleration; • Better understanding of CME initiation; • More observational data to constrain the model; • Even faster computers and improved algorithms.
