PICurv 0.1.0
A Parallel Particle-In-Cell Solver for Curvilinear LES
Welcome to the Developer Portal. This document provides a high-level architectural overview of the PICurv C codebase. It is intended for developers who wish to understand, modify, or extend the solver's capabilities.
The entire simulation is orchestrated by the `main()` function in `src/main.c`. The program follows a clear, five-stage execution path, which provides a roadmap to the entire codebase; a skeleton of this path is sketched in code after the stage list.
**Stage 1: Initialization (`PetscInitialize`)**

**Stage 2: Configuration (`CreateSimulationContext`, in `src/setup.c`)**

Creates the `SimCtx` struct and populates it by parsing the `.control` file generated by `pic-flow`. It reads every possible flag (e.g., `-ren`, `-dt`, `-nblk`) and sets the corresponding member in `simCtx`.
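To make that pattern concrete, here is a minimal sketch of how such flags are typically read through PETSc's options database; the struct and function names are illustrative assumptions, and only the flag names come from this page:

```c
/* Sketch only: the generic PETSc options pattern that a function like
 * CreateSimulationContext() could use. Struct and member names here are
 * illustrative assumptions; only the flag names come from this page. */
#include <petscsys.h>

typedef struct {
  PetscReal ren;  /* Reynolds number, from -ren   */
  PetscReal dt;   /* time step, from -dt          */
  PetscInt  nblk; /* number of blocks, from -nblk */
} FlagSketch;

static PetscErrorCode ParseFlagsSketch(FlagSketch *s)
{
  PetscFunctionBeginUser;
  /* Each call copies one entry from the PETSc options database (filled
   * from the command line and any loaded .control file) into the struct. */
  PetscCall(PetscOptionsGetReal(NULL, NULL, "-ren", &s->ren, NULL));
  PetscCall(PetscOptionsGetReal(NULL, NULL, "-dt", &s->dt, NULL));
  PetscCall(PetscOptionsGetInt(NULL, NULL, "-nblk", &s->nblk, NULL));
  PetscFunctionReturn(0);
}
```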
**Stage 3: Setup (`Setup*` Functions, in `src/setup.c`, `src/grid.c`, `src/Boundaries.c`, `src/ParticleSwarm.c`)**

The following functions are called from `main()`:
- `SetupGridAndSolvers()`: The main grid constructor. It allocates the multi-block/multi-grid data hierarchy (`UserMG`, `MGCtx`, `UserCtx`), creates all PETSc `DMDA`s and `Vec`s, and computes all grid metrics (see the sketch after this list).
- `SetupBoundaryConditions()`: Parses the `bcs.dat` files and initializes the boundary condition handlers for each block.
- `SetupDomainRankInfo()`: Computes and shares the bounding box and cell ownership information for all MPI ranks, which is critical for particle migration.
- `InitializeEulerianState()`: Sets the fluid fields to their `t = 0` state, either from a restart file or by applying initial conditions.
- `InitializeParticleSwarm()`: If particles are enabled, this function creates the `DMSwarm`, sets the number of particles, and initializes their properties.
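As a rough illustration of what `SetupGridAndSolvers()` must do for each block, the following sketch shows the standard PETSc idiom for creating a 3D `DMDA` and a matching field `Vec`; the grid sizes, `dof` count, and function name are placeholders, not PICurv's actual parameters:

```c
/* Sketch only: the standard PETSc pattern for building one structured grid
 * and a field vector on it. Sizes and dof counts are illustrative. */
#include <petscdmda.h>

static PetscErrorCode CreateBlockGridSketch(PetscInt M, PetscInt N, PetscInt P,
                                            DM *da, Vec *field)
{
  PetscFunctionBeginUser;
  /* 3D distributed array: 1 dof per node, box stencil of width 1,
   * process decomposition chosen by PETSc. */
  PetscCall(DMDACreate3d(PETSC_COMM_WORLD,
                         DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                         DMDA_STENCIL_BOX, M, N, P,
                         PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                         1, 1, NULL, NULL, NULL, da));
  PetscCall(DMSetUp(*da));
  /* A global Vec whose parallel layout matches the DMDA decomposition. */
  PetscCall(DMCreateGlobalVector(*da, field));
  PetscFunctionReturn(0);
}
```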
**Stage 4: Time Loop (`AdvanceSimulation`, in `src/simulation.c`)**

A `for` loop that advances the simulation from `start_step` to `total_steps`. See the 13_Core_Algorithms page for a detailed breakdown of this loop.

**Stage 5: Finalization (`PetscFinalize`)**
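Assembled from the stage list above, `main()` reduces to roughly the following skeleton. The function signatures and error-handling style are assumptions; only the function names come from this page:

```c
/* Sketch only: the five-stage shape of src/main.c, reconstructed from the
 * stage list above. Signatures are assumed, not copied from the code. */
#include <petscsys.h>
#include "variables.h" /* assumed to declare SimCtx and the stage functions */

int main(int argc, char **argv)
{
  SimCtx *simCtx = NULL;

  PetscCall(PetscInitialize(&argc, &argv, NULL, NULL));    /* Stage 1 */
  PetscCall(CreateSimulationContext(argc, argv, &simCtx)); /* Stage 2 */

  PetscCall(SetupGridAndSolvers(simCtx));                  /* Stage 3 */
  PetscCall(SetupBoundaryConditions(simCtx));
  PetscCall(SetupDomainRankInfo(simCtx));
  PetscCall(InitializeEulerianState(simCtx));
  PetscCall(InitializeParticleSwarm(simCtx));

  PetscCall(AdvanceSimulation(simCtx));                    /* Stage 4 */

  PetscCall(PetscFinalize());                              /* Stage 5 */
  return 0;
}
```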
Understanding the two main context structs is essential to understanding the code. All data is passed explicitly through these structs—there are no global variables.
**`SimCtx` (the master context)**

Defined in `include/variables.h`. It holds all global settings originating from the `.yml` files: the Reynolds number (`ren`), time step (`dt`), number of blocks (`block_number`), pointers to top-level managers like the `UserMG` struct, and all physics/model flags. It is created once, by `CreateSimulationContext()`. It is then passed down (by pointer) to almost every function in the solver.
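A minimal sketch of what `SimCtx` might look like, built only from the fields named above; the member name `usermg` and the exact types are assumptions, and the real definition lives in `include/variables.h`:

```c
/* Sketch only: a plausible shape for SimCtx based on the fields named
 * above. The real definition is in include/variables.h. */
#include <petscsys.h>

typedef struct UserMG UserMG; /* top-level multigrid manager (opaque here) */

typedef struct {
  PetscReal ren;          /* Reynolds number (-ren) */
  PetscReal dt;           /* time step (-dt) */
  PetscInt  block_number; /* number of blocks */
  UserMG   *usermg;       /* pointer to the top-level manager (assumed name) */
  /* ... physics/model flags, I/O settings, restart info, ... */
} SimCtxSketch;
```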
**`UserCtx` (the per-grid context)**

Defined in `include/variables.h`. Each `UserCtx` holds the data for one block at one multigrid level: its `DMDA`s (`da`, `fda`), all PETSc `Vec`s for that grid's fields (`Ucont`, `P`, `Csi`, etc.), and pointers to its coarser/finer neighbors in the multigrid hierarchy. A 2D array of `UserCtx` structs is allocated in `AllocateContextHierarchy`: `mgctx[level][block]`. Each `UserCtx` contains a back-pointer (`simCtx`) to the master simulation context.
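A sketch of how those pieces could fit together; the member types, the `coarser`/`finer` names, and the access pattern are assumptions, and only the identifiers quoted above come from the code:

```c
/* Sketch only: how one UserCtx and the mgctx[level][block] hierarchy could
 * fit together. The real structs are in include/variables.h. */
#include <petscdmda.h>

typedef struct SimCtx SimCtx; /* master context (opaque here) */

typedef struct UserCtx {
  DM              da, fda;         /* scalar/vector DMDAs for this level+block */
  Vec             Ucont, P, Csi;   /* a few of the per-grid field vectors */
  struct UserCtx *coarser, *finer; /* multigrid neighbors (assumed names) */
  SimCtx         *simCtx;          /* back-pointer to the master context */
} UserCtx;

/* Typical access pattern implied by the layout above:
 *   UserCtx *user = &mgctx[level][block];
 *   user->simCtx gives any routine access to the global settings. */
```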
The codebase is organized into logical modules, typically as pairs of `.c` and `.h` files.
**Core Drivers (`main.c`, `simulation.c`)**

The program entry point and the main time-stepping driver.

**Setup & I/O (`setup.c`, `io.c`)**

Handle file I/O and `SimCtx` creation, and orchestrate the setup of all other modules.
**Grid & Metrics (`grid.c`, `Metric.c`)**

Responsible for creating the `DMDA`s, assigning coordinates, and computing all curvilinear metric terms (`Csi`, `Eta`, `Zet`, `Aj`, etc.).
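For orientation, in standard curvilinear finite-volume formulations names like these usually correspond to the transformation metrics below. This is a conventional reading, not a statement of PICurv's exact definitions:

$$
\mathrm{Csi} \sim J^{-1}\,\nabla\xi, \qquad
\mathrm{Eta} \sim J^{-1}\,\nabla\eta, \qquad
\mathrm{Zet} \sim J^{-1}\,\nabla\zeta, \qquad
\mathrm{Aj} \sim J^{-1}, \qquad
J = \det\frac{\partial(x,y,z)}{\partial(\xi,\eta,\zeta)}.
$$

On this reading, `Ucont` would hold the contravariant velocity components associated with these coordinate directions.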
**Solvers (`solvers.c`, `implicitsolvers.c`, `poisson.c`, `rhs.c`)**

The numerical core: the explicit and implicit flow solvers, the Poisson solver, and right-hand-side assembly.

**Boundary Conditions (`Boundaries.c`, `BC_Handlers.c`)**

`Boundaries.c` contains the parser and high-level logic, while `BC_Handlers.c` would contain the specific implementations (e.g., `constant_velocity`, `noslip`).
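One common way to implement such a parser/handler split is a registry of function pointers. The following sketch is illustrative only; every name in it except `constant_velocity` and `noslip` is an assumption:

```c
/* Sketch only: a registry-based parser/handler split, one plausible way to
 * realize the Boundaries.c / BC_Handlers.c division described above. */
#include <stddef.h>
#include <string.h>

typedef int (*BCHandler)(void *userCtx, int face);

/* Would live in BC_Handlers.c (illustrative prototypes). */
extern int BC_ConstantVelocity(void *userCtx, int face);
extern int BC_NoSlip(void *userCtx, int face);

/* Registry consulted by the bcs.dat parser in Boundaries.c. */
static const struct { const char *name; BCHandler fn; } bc_table[] = {
  { "constant_velocity", BC_ConstantVelocity },
  { "noslip",            BC_NoSlip           },
};

static BCHandler LookupBCHandler(const char *name)
{
  for (size_t i = 0; i < sizeof bc_table / sizeof bc_table[0]; ++i)
    if (strcmp(bc_table[i].name, name) == 0) return bc_table[i].fn;
  return NULL; /* unknown keyword in bcs.dat */
}
```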
**Particles (`ParticleSwarm.c`, `ParticleMotion.c`, `walkingsearch.c`)**

Manage all aspects of the `DMSwarm`. This includes particle initialization, advection (`UpdateAllParticlePositions`), and the robust "walking search" algorithm for locating particles on the grid (`LocateAllParticlesInGrid_TEST`); the core idea is sketched after the module list.

**Interpolation (`interpolation.c`)**

Field interpolation routines (e.g., between the Eulerian grid and particle locations).
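The walking-search idea, in generic form: starting from a last-known cell, step cell by cell toward the particle until a containment test passes. This is a textbook illustration of the technique, not the `LocateAllParticlesInGrid_TEST` implementation; the `ExitFace` predicate is hypothetical:

```c
/* Sketch only: a generic "walking search" on a structured grid. The real
 * walkingsearch.c must also handle curvilinear cell shapes, block borders,
 * and MPI ownership; none of that is shown here. */
typedef struct { int i, j, k; } CellIndex;

/* Hypothetical predicate: returns 0 if point x lies inside cell c;
 * otherwise fills dir with -1/0/+1 steps toward the cell x escaped into. */
extern int ExitFace(CellIndex c, const double x[3], int dir[3]);

static CellIndex WalkToPoint(CellIndex c, const double x[3], int maxSteps)
{
  for (int s = 0; s < maxSteps; ++s) {
    int dir[3] = {0, 0, 0};
    if (ExitFace(c, x, dir) == 0) break; /* found the containing cell */
    /* Walk one cell in the indicated direction(s) and retest. */
    c.i += dir[0]; c.j += dir[1]; c.k += dir[2];
  }
  return c;
}
```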
This page provides the "map" to the codebase. To understand how these components work together during a time step, proceed to the next page.