  1. 3D multi-fluid model
  Erika Harnett, University of Washington

  2. Equations solved
  For each ion species $\alpha$ (the set below is the multi-fluid formulation published in the papers cited under Validation; electrons are treated as a massless fluid):

  $$\frac{\partial \rho_\alpha}{\partial t} + \nabla \cdot \left( \rho_\alpha \mathbf{v}_\alpha \right) = 0$$

  $$\rho_\alpha \frac{d\mathbf{v}_\alpha}{dt} = q_\alpha n_\alpha \left( \mathbf{E} + \mathbf{v}_\alpha \times \mathbf{B} \right) - \nabla P_\alpha + \rho_\alpha \mathbf{g}$$

  $$\frac{\partial P_\alpha}{\partial t} = -\gamma \nabla \cdot \left( P_\alpha \mathbf{v}_\alpha \right) + \left( \gamma - 1 \right) \mathbf{v}_\alpha \cdot \nabla P_\alpha$$

  The massless electron fluid closes the system through quasi-neutrality ($n_e = \sum_i n_i q_i / e$), a generalized Ohm's law, and Faraday's law:

  $$\mathbf{E} = - \sum_i \frac{n_i q_i}{e n_e} \mathbf{v}_i \times \mathbf{B} + \frac{\mathbf{J} \times \mathbf{B}}{e n_e} - \frac{\nabla P_e}{e n_e}, \qquad \frac{\partial \mathbf{B}}{\partial t} = - \nabla \times \mathbf{E}, \qquad \mathbf{J} = \frac{\nabla \times \mathbf{B}}{\mu_0}$$

  3. Numerical scheme
  • Second-order Runge-Kutta with flux-correction smoothing on the plasma parameters only (B not smoothed); no time averaging (a sketch of one step follows below).
  • Solved on a Cartesian grid; the spacing on each grid is equal in all three dimensions.
  • A nested grid system is used for higher resolution (see next section).
  • Assume continuous boundary conditions at the outer boundaries.
  • Assume a constant density and temperature (thus pressure) for all species at the inner (planetary) boundary; the plasma there is assumed to have no bulk speed (but has a thermal velocity).
  • No chemistry is used in this version of the model.
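Below is a minimal sketch, in the Fortran the deck says the model is written in, of one midpoint (second-order Runge-Kutta) step with smoothing applied to the plasma state only. The routine names, the stub right-hand side, and the diffusive form of the smoother are illustrative assumptions, not the model's actual interfaces.

```fortran
! Minimal sketch: one second-order Runge-Kutta (midpoint) step with
! flux-correction-style smoothing on the plasma variables only.
! All names and the smoother form are illustrative placeholders.
module rk2_sketch
  implicit none
contains
  ! Stand-in for the multi-fluid right-hand side (flux divergences,
  ! Lorentz force, pressure terms); a stub here.
  subroutine compute_rhs(q, b, dq, db)
    real(8), intent(in)  :: q(:), b(:)
    real(8), intent(out) :: dq(:), db(:)
    dq = -q
    db = -b
  end subroutine compute_rhs

  ! Illustrative three-point smoother standing in for flux correction.
  subroutine fc_smooth(q)
    real(8), intent(inout) :: q(:)
    real(8), parameter :: eps = 0.05d0   ! assumed smoothing strength
    real(8) :: tmp(size(q))
    integer :: i
    tmp = q
    do i = 2, size(q) - 1
      q(i) = tmp(i) + eps*(tmp(i+1) - 2d0*tmp(i) + tmp(i-1))
    end do
  end subroutine fc_smooth

  subroutine rk2_step(q, b, dt)
    real(8), intent(inout) :: q(:)   ! plasma parameters (per species)
    real(8), intent(inout) :: b(:)   ! magnetic field
    real(8), intent(in)    :: dt
    real(8) :: qh(size(q)), bh(size(b)), dq(size(q)), db(size(b))
    call compute_rhs(q, b, dq, db)     ! predictor: advance to half step
    qh = q + 0.5d0*dt*dq
    bh = b + 0.5d0*dt*db
    call compute_rhs(qh, bh, dq, db)   ! corrector: full step, half-step rates
    q = q + dt*dq
    b = b + dt*db
    call fc_smooth(q)   ! smooth plasma variables only; B is not smoothed
  end subroutine rk2_step
end module rk2_sketch
```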

  4. Numerical scheme - continued
  • Variable time step, determined from stability conditions, i.e. Δt < (smallest Δx) / (fastest speed); a sketch follows below.
  • A div B correction can be turned on/off. It is only necessary when simulating strong shocks at unmagnetized planets after initialization.
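A minimal sketch of that stability computation. The Courant safety factor and the fast-speed estimate (|v| + sqrt(cs² + vA²), a bound on the fast magnetosonic signal speed) are assumed choices, and the function name is illustrative.

```fortran
! Sketch of the variable time step from the stability condition
! dt < (smallest dx) / (fastest speed).  Courant factor and the
! fast-speed estimate are assumed choices, not the model's.
function cfl_dt(dx_min, v, cs, va) result(dt)
  implicit none
  real(8), intent(in) :: dx_min          ! smallest spacing over all grids
  real(8), intent(in) :: v(:)            ! bulk speed per cell
  real(8), intent(in) :: cs(:), va(:)    ! sound and Alfven speeds per cell
  real(8), parameter  :: courant = 0.4d0 ! assumed safety factor < 1
  real(8) :: dt, vmax
  ! signal speed bound per cell: |v| + sqrt(cs^2 + vA^2)
  vmax = maxval(abs(v) + sqrt(cs**2 + va**2))
  dt = courant * dx_min / vmax
end function cfl_dt
```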

  5. Nested Grid System
  • All grids are assumed to have shape (nx, ny, nz).
  • All equations are solved on each grid independently.
  • Values from the inner portion of higher-resolution grids (X) reset the overlapping points in lower-resolution grids when the boundary conditions are determined in each time step.
  • Values at the outer boundary of a high-resolution grid (O) are taken from the overlapping points in the lower-resolution grid.
  • Remaining boundary values (S) are determined from interpolation.
  [Slide diagram: a patch of grid points labeled X (inner high-resolution points), O (boundary points coincident with coarse-grid points), and S (interpolated boundary points).]
  A 1D sketch of this coupling follows below.
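A 1D reduction of the coupling might look like the following. The routine name and indexing convention are assumptions for illustration; the factor-of-2 refinement matches the grid spacings on the next slide, and in the model this operates on the faces of 3D grids.

```fortran
! 1D sketch of the nested-grid coupling, refinement ratio 2.
! Fine point i sits over coarse point ic = i0 + (i-1)/2 when i is odd
! (i0 = coarse index under fine point 1; nf assumed odd).
subroutine couple_grids(qc, nc, qf, nf, i0)
  implicit none
  integer, intent(in)    :: nc, nf, i0
  real(8), intent(inout) :: qc(nc)   ! lower-resolution grid
  real(8), intent(inout) :: qf(nf)   ! higher-resolution grid
  integer :: i, ic

  ! (X) interior fine values reset the overlapping coarse points
  do i = 3, nf - 2, 2
    ic = i0 + (i - 1)/2
    qc(ic) = qf(i)
  end do

  ! (O) fine boundary points coincident with coarse points copy directly
  qf(1)  = qc(i0)
  qf(nf) = qc(i0 + (nf - 1)/2)

  ! (S) boundary points lying between coarse points: linear interpolation
  qf(2)      = 0.5d0*(qc(i0) + qc(i0 + 1))
  qf(nf - 1) = 0.5d0*(qc(i0 + (nf - 1)/2 - 1) + qc(i0 + (nf - 1)/2))
end subroutine couple_grids
```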

  6. Gridding System
  4 grids, each with the shape (nx, ny, nz) = 111 x 89 x 89.
  Grid 1 - resolution: Δx = 168 km; outer edge of grid along x: -2.67 RM and 2.77 RM; y, z: -2.18 RM and 2.18 RM
  Grid 2 - resolution: Δx = 336 km; outer edge of grid along x: -4.46 RM and 6.53 RM; y, z: -4.36 RM and 4.36 RM
  Grid 3 - resolution: Δx = 673 km; outer edge of grid along x: -5.94 RM and 15.8 RM; y, z: -8.71 RM and 8.71 RM
  Grid 4 - resolution: Δx = 1345 km; outer edge of grid along x: -13.9 RM and 73.3 RM; y, z: -17.4 RM and 17.4 RM
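As a quick arithmetic check, a grid spans (n-1)·Δx along each axis, so with ny = nz = 89 the quoted y, z edges follow if RM ≈ 3390 km. The Mars radius is an assumption here (the deck never names the planet), though it is consistent with the 24.6-hour rotation period on slide 8.

```fortran
! Consistency check on the quoted grid edges: a grid spans (n-1)*dx
! per axis.  RM = 3390 km (Mars) is an assumption consistent with the
! quoted y,z edges; the deck itself never names the planet.
program grid_extents
  implicit none
  real(8), parameter :: rm = 3390d0   ! assumed planetary radius [km]
  real(8), parameter :: dx(4) = [168d0, 336d0, 673d0, 1345d0]
  integer, parameter :: ny = 89
  integer :: g
  do g = 1, 4
    ! e.g. grid 1: 88 * 168 km / 3390 km = 4.36 RM, i.e. +/- 2.18 RM
    print '(a,i0,a,f6.2,a)', 'grid ', g, ': y,z span = ', &
          (ny - 1)*dx(g)/rm, ' RM'
  end do
end program grid_extents
```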

  7. Model Input Parameters
  Three species: H+ (solar wind), H+ (ionosphere), O+ (ionosphere).
  Note: a small amount of the ionospheric species must be present in the solar wind, and a small amount of the solar wind species at the inner boundary, to prevent divide-by-zero errors.
  • Solar wind (set statically or driven with changing conditions read in from an input file; a hypothetical input file is sketched below):
  • Density = 3 cm^-3
  • Velocity = 400 km s^-1 in the x direction
  • Te = 10 eV, Ti = 4.3 eV
  • Bx = -1.64 nT, By = 2.52 nT, Bz = 0 nT
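The deck does not show the input file format; a hypothetical Fortran namelist carrying the values above might look like this (the group and variable names are invented for illustration).

```fortran
! Hypothetical namelist input carrying the solar-wind values above.
! Group/variable names are invented; the deck only states that the
! conditions are set statically or read from an input file.
&solar_wind
  density_cm3 = 3.0      ! number density [cm^-3]
  vx_kms      = 400.0    ! flow speed along x [km/s]
  te_ev       = 10.0     ! electron temperature [eV]
  ti_ev       = 4.3      ! ion temperature [eV]
  bx_nt       = -1.64    ! IMF components [nT]
  by_nt       =  2.52
  bz_nt       =  0.0
/
```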

  8. Model Input Parameters - continued
  • Planetary (inner) boundary (single grid point width):
  • Altitude = 302 km
  • O+ density = 875 cm^-3 at the equator, decreasing by 70% toward the poles (polar values are 30% of equatorial)
  • H+ density = 100 cm^-3 at the equator, decreasing by 70% toward the poles
  • No day/night asymmetry in density
  • Te = 0.3 eV, Ti = 0.07 eV
  • Rotation period = 24.6 Earth hours
  One possible smooth latitude profile is sketched below.
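The deck states only the equator-to-pole ratio, not the functional form of the falloff. One smooth profile consistent with it is a cos²(latitude) weighting; this form is purely an assumption for illustration.

```fortran
! Illustrative latitudinal profile giving a boundary density 70% lower
! at the poles than at the equator.  The cos^2(latitude) weighting is
! an assumption; the deck only fixes the equator/pole ratio.
function boundary_density(n_eq, lat_rad) result(n)
  implicit none
  real(8), intent(in) :: n_eq      ! equatorial density [cm^-3]
  real(8), intent(in) :: lat_rad   ! latitude [radians]
  real(8) :: n
  n = n_eq * (0.3d0 + 0.7d0*cos(lat_rad)**2)   ! n(0) = n_eq, n(pole) = 0.3*n_eq
end function boundary_density
```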

  9. Odds and Ends
  • Code written in Fortran; fastest with the Intel compiler. Parallelized to run on multi-core processors using OpenMP. Currently run on a high-end desktop (dual quad-core Intel chips at ~3 GHz, with ~4-8 GB of RAM).
  • Output necessary to restart the simulation (density, momentum, pressure, magnetic field for each species at all grid points) is written in binary during the course of a simulation and at the end of a run. The time between outputs varies and is set as an input parameter (a sketch of such a dump follows below).
  • Wave reflection is only an issue for sub-sonic/sub-Alfvenic incident winds. Reflections at the outflow boundary are prevented by making the outer simulation grid large enough that the bow shock contacts the outer boundaries downstream of the planet.
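A minimal sketch of an unformatted restart dump of the kind described above; the file naming, array layout, and subroutine name are assumptions (B is stored here as a single field).

```fortran
! Sketch of a binary restart dump: density, momentum, and pressure per
! species at all grid points, plus the magnetic field, written
! unformatted.  File name and array layout are illustrative.
subroutine write_restart(step, rho, mom, p, b)
  implicit none
  integer, intent(in) :: step
  real(8), intent(in) :: rho(:,:,:,:)     ! (nx,ny,nz, species)
  real(8), intent(in) :: mom(:,:,:,:,:)   ! (nx,ny,nz, 3, species)
  real(8), intent(in) :: p(:,:,:,:)       ! (nx,ny,nz, species)
  real(8), intent(in) :: b(:,:,:,:)       ! (nx,ny,nz, 3)
  integer :: u
  character(len=32) :: fname
  write (fname, '(a,i6.6,a)') 'restart_', step, '.dat'
  open (newunit=u, file=fname, form='unformatted', access='stream')
  write (u) rho, mom, p, b
  close (u)
end subroutine write_restart
```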

  10. Validation
  • Comparison to hybrid results:
  Harnett, E., R. Winglee, and P. Delamere (2005), 3D multi-fluid simulations of Pluto's magnetosphere: a comparison with 3D hybrid simulation results, Geophys. Res. Lett., 32, L19104, doi:10.1029/2005GL023178.
  • Standard reconnection (Harris current sheet) problem:
  Winglee, R. M., E. Harnett, A. Stickle, and J. Porter (2008), Multiscale/multifluid simulations of flux ropes at the magnetopause within a global magnetospheric model, J. Geophys. Res., 113, A02209, doi:10.1029/2007JA012653.
  • Comparison to satellite observations:
  Paty, C., W. Paterson, and R. Winglee (2008), Ion energization in Ganymede's magnetosphere: Using multifluid simulations to interpret ion energy spectrograms, J. Geophys. Res., 113, A06211, doi:10.1029/2007JA012848.

  11. Validation - continued
  • Comparison to satellite observations:
  Snowden, D., R. Winglee, C. Bertucci, and M. Dougherty (2007), Three-dimensional multifluid simulation of the plasma interaction at Titan, J. Geophys. Res., 112, A12221, doi:10.1029/2007JA012393.
  Convergence
  • Spatial convergence tested with results from:
  Harnett, E. M., R. M. Winglee, and C. Paty (2006), Multi-scale/multi-fluid simulations of the post-plasmoid current sheet in the terrestrial magnetosphere, Geophys. Res. Lett., 33, L21110, doi:10.1029/2006GL027376.
  • Temporal convergence is not really an issue.
