PICurv 0.1.0
A Parallel Particle-In-Cell Solver for Curvilinear LES
io.h File Reference

Public interface for data input/output routines. More...

#include "variables.h"
#include "logging.h"
#include "Boundaries.h"

Go to the source code of this file.

Functions

PetscErrorCode ReadGridGenerationInputs (UserCtx *user)
 Parses command-line options for a programmatically generated grid for a SINGLE block.
 
PetscErrorCode ReadGridFile (UserCtx *user)
 Sets grid dimensions from a file for a SINGLE block using a one-time read cache.
 
PetscErrorCode ReadSimulationFields (UserCtx *user, PetscInt ti)
 Reads binary field data for velocity, pressure, and other required vectors.
 
PetscErrorCode ReadFieldData (UserCtx *user, const char *field_name, Vec field_vec, PetscInt ti, const char *ext)
 Reads data for a specific field from a file into the provided vector.
 
PetscErrorCode ReadStatisticalFields (UserCtx *user, PetscInt ti)
 Reads statistical fields used for time-averaged simulations.
 
PetscErrorCode ReadLESFields (UserCtx *user, PetscInt ti)
 Reads LES-related fields used in turbulence modeling.
 
PetscErrorCode ReadRANSFields (UserCtx *user, PetscInt ti)
 Reads RANS-related fields for turbulence modeling.
 
PetscErrorCode WriteFieldData (UserCtx *user, const char *field_name, Vec field_vec, PetscInt ti, const char *ext)
 Writes data from a specific PETSc vector to a file.
 
PetscErrorCode WriteSimulationFields (UserCtx *user)
 Writes simulation fields to files.
 
PetscErrorCode WriteStatisticalFields (UserCtx *user)
 Writes statistical fields for averaging purposes.
 
PetscErrorCode WriteLESFields (UserCtx *user)
 Writes LES-related fields.
 
PetscErrorCode WriteRANSFields (UserCtx *user)
 Writes RANS-related fields.
 
PetscErrorCode WriteSwarmField (UserCtx *user, const char *field_name, PetscInt ti, const char *ext)
 Writes data from a specific field in a PETSc Swarm to a file.
 
PetscErrorCode WriteSwarmIntField (UserCtx *user, const char *field_name, PetscInt ti, const char *ext)
 Writes integer data from a specific PETSc Swarm field to a file.
 
PetscErrorCode WriteAllSwarmFields (UserCtx *user)
 Writes a predefined set of PETSc Swarm fields to files.
 
PetscInt ReadDataFileToArray (const char *filename, double **data_out, PetscInt *Nout, MPI_Comm comm)
 Reads an ASCII file of one double per line on rank 0 and broadcasts the resulting array to all ranks.
 
PetscInt CreateVTKFileFromMetadata (const char *filename, const VTKMetaData *meta, MPI_Comm comm)
 Writes a combined VTK file from pre-gathered metadata on rank 0.
 
PetscErrorCode VecToArrayOnRank0 (Vec inVec, PetscInt *N, double **arrayOut)
 Gathers the contents of a distributed PETSc Vec into a single array on rank 0.
 
PetscErrorCode SwarmFieldToArrayOnRank0 (DM swarm, const char *field_name, PetscInt *n_total_particles, PetscInt *n_components, void **gathered_array)
 Gathers any DMSwarm field from all ranks to a single, contiguous array on rank 0.
 
PetscErrorCode ReadSwarmField (UserCtx *user, const char *field_name, PetscInt ti, const char *ext)
 Reads data from a file into a specified field of a PETSc DMSwarm.
 
PetscErrorCode ReadSwarmIntField (UserCtx *user, const char *field_name, PetscInt ti, const char *ext)
 Reads integer swarm data by using ReadFieldData and casting the result.
 
PetscErrorCode ReadAllSwarmFields (UserCtx *user, PetscInt ti)
 Reads multiple fields (positions, velocity, CellID, and weight) into a DMSwarm.
 
PetscErrorCode ReadPositionsFromFile (PetscInt timeIndex, UserCtx *user, double **coordsArray, PetscInt *Ncoords)
 Reads coordinate data (for particles) from file into a PETSc Vec, then gathers it to rank 0.
 
PetscErrorCode ReadFieldDataToRank0 (PetscInt timeIndex, const char *fieldName, UserCtx *user, double **scalarArray, PetscInt *Nscalars)
 Reads a named field from file into a PETSc Vec, then gathers it to rank 0.
 
PetscErrorCode DisplayBanner (SimCtx *simCtx)
 Displays a structured banner summarizing the simulation configuration.
 
PetscErrorCode StringToBCFace (const char *str, BCFace *face_out)
 Converts a string representation of a face to a BCFace enum.
 
PetscErrorCode StringToBCType (const char *str, BCType *type_out)
 Converts a string representation of a BC type to a BCType enum.
 
PetscErrorCode StringToBCHandlerType (const char *str, BCHandlerType *handler_out)
 Converts a string representation of a handler to a BCHandlerType enum.
 
PetscErrorCode ValidateBCHandlerForBCType (BCType type, BCHandlerType handler)
 Validates that a specific handler is compatible with a general BC type.
 
void FreeBC_ParamList (BC_Param *head)
 Frees the memory allocated for a linked list of BC_Param structs.
 
PetscErrorCode ParseAllBoundaryConditions (UserCtx *user, const char *bcs_input_filename)
 Parses the boundary conditions file to configure the type, handler, and any associated parameters for all 6 global faces of the domain.
 
void TrimWhitespace (char *str)
 Helper function to trim leading/trailing whitespace from a string.
 
PetscErrorCode ParsePostProcessingSettings (SimCtx *simCtx)
 Initializes post-processing settings from a config file and command-line overrides.
 

Detailed Description

Public interface for data input/output routines.

This header declares functions responsible for parsing grid geometry information (either from command-line options for programmatically generated grids or by reading the header of a grid definition file), reading and writing simulation and particle (swarm) fields, gathering distributed data onto rank 0, and parsing boundary-condition configuration.

Definition in file io.h.

Function Documentation

◆ ReadGridGenerationInputs()

PetscErrorCode ReadGridGenerationInputs (UserCtx *user)

Parses command-line options for a programmatically generated grid for a SINGLE block.

This function reads all per-block array options related to grid geometry, such as dimensions (-im), domain bounds (-xMins, -xMaxs), and stretching ratios (-rxs). It reads the entire array for each option, then uses the block index stored in user->_this to select the correct value and populate the geometric fields of the provided UserCtx struct.

Parameters
user  Pointer to the UserCtx for a specific block. The function will populate the geometric fields (IM, Min_X, rx, etc.) within this struct.
Returns
PetscErrorCode 0 on success, or a PETSc error code on failure.
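
For illustration, a hypothetical two-block run could pass each option as a comma-separated list with one entry per block. The sketch below sets a few of these options programmatically via PetscOptionsSetValue (equivalent to passing, e.g., -im 64,128 on the command line); all values are illustrative, not defaults.

#include <petscsys.h>

/* Sketch: register per-block grid options for an assumed two-block case. */
static PetscErrorCode SetExampleGridOptions(void)
{
  PetscErrorCode ierr;
  PetscFunctionBeginUser;
  ierr = PetscOptionsSetValue(NULL, "-im",    "64,128");   CHKERRQ(ierr); /* IM per block   */
  ierr = PetscOptionsSetValue(NULL, "-jm",    "64,64");    CHKERRQ(ierr); /* JM per block   */
  ierr = PetscOptionsSetValue(NULL, "-km",    "32,32");    CHKERRQ(ierr); /* KM per block   */
  ierr = PetscOptionsSetValue(NULL, "-xMins", "0.0,1.0");  CHKERRQ(ierr); /* lower x bounds */
  ierr = PetscOptionsSetValue(NULL, "-xMaxs", "1.0,2.0");  CHKERRQ(ierr); /* upper x bounds */
  ierr = PetscOptionsSetValue(NULL, "-rxs",   "1.0,1.05"); CHKERRQ(ierr); /* x stretching   */
  PetscFunctionReturn(0);
}

With these options set, calling ReadGridGenerationInputs on the UserCtx of block 1 (user->_this == 1) yields IM=128, Min_X=1.0, Max_X=2.0, rx=1.05.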

Definition at line 47 of file io.c.

48{
49 PetscErrorCode ierr;
50 SimCtx *simCtx = user->simCtx;
51 PetscInt nblk = simCtx->block_number;
52 PetscInt block_index = user->_this;
53 PetscBool found;
54
55 // Temporary arrays to hold the parsed values for ALL blocks
56 PetscInt *IMs = NULL, *JMs = NULL, *KMs = NULL, *cgrids = NULL;
57 PetscReal *xMins = NULL, *xMaxs = NULL, *rxs = NULL;
58 PetscReal *yMins = NULL, *yMaxs = NULL, *rys = NULL;
59 PetscReal *zMins = NULL, *zMaxs = NULL, *rzs = NULL;
60
61 PetscFunctionBeginUser;
62
63 LOG_ALLOW(LOCAL, LOG_DEBUG, "Rank %d: Reading generated grid inputs for block %d.\n", simCtx->rank, block_index);
64
65 if (block_index >= nblk) {
66 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_OUTOFRANGE, "Block index %d is out of range for nblk=%d", block_index, nblk);
67 }
68
69 // --- Allocate temporary storage for all array options ---
70 ierr = PetscMalloc4(nblk, &IMs, nblk, &JMs, nblk, &KMs, nblk, &cgrids); CHKERRQ(ierr);
71 ierr = PetscMalloc6(nblk, &xMins, nblk, &xMaxs, nblk, &rxs, nblk, &yMins, nblk, &yMaxs, nblk, &rys); CHKERRQ(ierr);
72 ierr = PetscMalloc3(nblk, &zMins, nblk, &zMaxs, nblk, &rzs); CHKERRQ(ierr);
73
74 // --- Set default values for the temporary arrays ---
75 for (PetscInt i = 0; i < nblk; ++i) {
76 IMs[i] = 10; JMs[i] = 10; KMs[i] = 10; cgrids[i] = 0;
77 xMins[i] = 0.0; xMaxs[i] = 1.0; rxs[i] = 1.0;
78 yMins[i] = 0.0; yMaxs[i] = 1.0; rys[i] = 1.0;
79 zMins[i] = 0.0; zMaxs[i] = 1.0; rzs[i] = 1.0;
80 }
81
82 // --- Parse the array options from the command line / control file ---
83 PetscInt count;
84 count = nblk; ierr = PetscOptionsGetIntArray(NULL, NULL, "-im", IMs, &count, &found); CHKERRQ(ierr);
85 count = nblk; ierr = PetscOptionsGetIntArray(NULL, NULL, "-jm", JMs, &count, &found); CHKERRQ(ierr);
86 count = nblk; ierr = PetscOptionsGetIntArray(NULL, NULL, "-km", KMs, &count, &found); CHKERRQ(ierr);
87 count = nblk; ierr = PetscOptionsGetRealArray(NULL, NULL, "-xMins", xMins, &count, &found); CHKERRQ(ierr);
88 count = nblk; ierr = PetscOptionsGetRealArray(NULL, NULL, "-xMaxs", xMaxs, &count, &found); CHKERRQ(ierr);
89 count = nblk; ierr = PetscOptionsGetRealArray(NULL, NULL, "-rxs", rxs, &count, &found); CHKERRQ(ierr);
90 count = nblk; ierr = PetscOptionsGetRealArray(NULL, NULL, "-yMins", yMins, &count, &found); CHKERRQ(ierr);
91 count = nblk; ierr = PetscOptionsGetRealArray(NULL, NULL, "-yMaxs", yMaxs, &count, &found); CHKERRQ(ierr);
92 count = nblk; ierr = PetscOptionsGetRealArray(NULL, NULL, "-rys", rys, &count, &found); CHKERRQ(ierr);
93 count = nblk; ierr = PetscOptionsGetRealArray(NULL, NULL, "-zMins", zMins, &count, &found); CHKERRQ(ierr);
94 count = nblk; ierr = PetscOptionsGetRealArray(NULL, NULL, "-zMaxs", zMaxs, &count, &found); CHKERRQ(ierr);
95 count = nblk; ierr = PetscOptionsGetRealArray(NULL, NULL, "-rzs", rzs, &count, &found); CHKERRQ(ierr);
96 count = nblk; ierr = PetscOptionsGetIntArray(NULL, NULL, "-cgrids", cgrids, &count, &found); CHKERRQ(ierr);
97
98 // --- Assign the parsed values to the specific UserCtx struct passed in ---
99 user->IM = IMs[block_index];
100 user->JM = JMs[block_index];
101 user->KM = KMs[block_index];
102 user->Min_X = xMins[block_index];
103 user->Max_X = xMaxs[block_index];
104 user->rx = rxs[block_index];
105 user->Min_Y = yMins[block_index];
106 user->Max_Y = yMaxs[block_index];
107 user->ry = rys[block_index];
108 user->Min_Z = zMins[block_index];
109 user->Max_Z = zMaxs[block_index];
110 user->rz = rzs[block_index];
111 user->cgrid = cgrids[block_index];
112
113 LOG_ALLOW(LOCAL, LOG_DEBUG, "Rank %d: Block %d grid generation inputs set: IM=%d, JM=%d, KM=%d\n",
114 simCtx->rank, block_index, user->IM, user->JM, user->KM);
115 LOG_ALLOW(LOCAL, LOG_DEBUG, "Rank %d: Block %d bounds: X=[%.2f, %.2f], Y=[%.2f, %.2f], Z=[%.2f, %.2f]\n",
116 simCtx->rank, block_index, user->Min_X, user->Max_X, user->Min_Y, user->Max_Y, user->Min_Z, user->Max_Z);
117
118 // --- Clean up temporary storage ---
119 ierr = PetscFree4(IMs, JMs, KMs, cgrids); CHKERRQ(ierr);
120 ierr = PetscFree6(xMins, xMaxs, rxs, yMins, yMaxs, rys); CHKERRQ(ierr);
121 ierr = PetscFree3(zMins, zMaxs, rzs); CHKERRQ(ierr);
122
123 PetscFunctionReturn(0);
124}
#define LOCAL
Logging scope definitions for controlling message output.
Definition logging.h:44
#define LOG_ALLOW(scope, level, fmt,...)
Logging macro that checks both the log level and whether the calling function is in the allowed-funct...
Definition logging.h:207
@ LOG_DEBUG
Detailed debugging information.
Definition logging.h:33
PetscMPIInt rank
Definition variables.h:516
PetscInt cgrid
Definition variables.h:645
PetscInt block_number
Definition variables.h:562
SimCtx * simCtx
Back-pointer to the master simulation context.
Definition variables.h:633
PetscReal Min_X
Definition variables.h:640
PetscInt KM
Definition variables.h:639
PetscInt _this
Definition variables.h:643
PetscReal ry
Definition variables.h:644
PetscReal Max_Y
Definition variables.h:640
PetscReal rz
Definition variables.h:644
PetscInt JM
Definition variables.h:639
PetscReal Min_Z
Definition variables.h:640
PetscReal Max_X
Definition variables.h:640
PetscReal Min_Y
Definition variables.h:640
PetscInt IM
Definition variables.h:639
PetscReal rx
Definition variables.h:644
PetscReal Max_Z
Definition variables.h:640
The master context for the entire simulation.
Definition variables.h:513

◆ ReadGridFile()

PetscErrorCode ReadGridFile (UserCtx *user)

Sets grid dimensions from a file for a SINGLE block using a one-time read cache.

This function uses a static-variable pattern to ensure the grid file header is read only once, collectively, by all processes. The first time any process calls this function, rank 0 reads the file header and broadcasts the dimensions for all blocks; this data is stored in static "cached" arrays.

On every call (including the first), the function retrieves the dimensions for the specific block (identified by user->_this) from the cached arrays and populates the IM, JM, and KM fields of the provided UserCtx.

Parameters
user  Pointer to the UserCtx for a specific block. This function will populate the IM, JM, and KM fields.
Returns
PetscErrorCode 0 on success, or a PETSc error code on failure.
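
As an illustration (dimensions hypothetical), a two-block grid file accepted by this reader looks like:

PICGRID
2
64 64 64
128 64 32

The PICGRID header line is optional: when it is absent, the first token in the file is interpreted directly as the number of blocks. In either case the block count must match the -nblk option, and each subsequent line supplies IM, JM, and KM for one block.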

Definition at line 145 of file io.c.

146{
147 PetscErrorCode ierr;
148 SimCtx *simCtx = user->simCtx;
149 PetscInt block_index = user->_this;
150
151 PetscFunctionBeginUser;
152
153 // --- One-Time Read and Broadcast Logic ---
154 if (!g_file_has_been_read) {
155 LOG_ALLOW_SYNC(GLOBAL, LOG_INFO, "First call to ReadGridFile. Reading and broadcasting grid file header from '%s'...\n", simCtx->grid_file);
156 PetscMPIInt rank = simCtx->rank;
157 PetscInt nblk = simCtx->block_number;
158
159 if (rank == 0) {
160 FILE *fd = fopen(simCtx->grid_file, "r");
161 if (!fd) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_FILE_OPEN, "Cannot open file: %s", simCtx->grid_file);
162
164 // ---- Read first token; decide whether it is the header or nblk ----
165 char firstTok[32] = {0};
166 if (fscanf(fd, "%31s", firstTok) != 1)
167 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_FILE_READ, "Empty grid file: %s", simCtx->grid_file);
168
169 if (strcmp(firstTok, "PICGRID") == 0) {
170 // Header is present – read nblk from the next line
171 if (fscanf(fd, "%d", &g_nblk_from_file) != 1)
172 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_FILE_READ, "Expected number of blocks after \"PICGRID\" in %s", simCtx->grid_file);
173 } else {
174 // No header – the token we just read is actually nblk
175 g_nblk_from_file = (PetscInt)strtol(firstTok, NULL, 10);
176 }
177 if (g_nblk_from_file != nblk) {
178 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_FILE_UNEXPECTED, "Mismatch: -nblk is %d but grid file specifies %d blocks.", nblk, g_nblk_from_file);
179 }
180
181 ierr = PetscMalloc3(nblk, &g_IMs_from_file, nblk, &g_JMs_from_file, nblk, &g_KMs_from_file); CHKERRQ(ierr);
182 for (PetscInt i = 0; i < nblk; ++i) {
183 if (fscanf(fd, "%d %d %d\n", &g_IMs_from_file[i], &g_JMs_from_file[i], &g_KMs_from_file[i]) != 3) {
184 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_FILE_READ, "Expected 3 integers for block %d in %s", i, simCtx->grid_file);
185 }
186 }
187 fclose(fd);
188 }
189
190 // Broadcast nblk to verify (optional, good practice)
191 ierr = MPI_Bcast(&g_nblk_from_file, 1, MPI_INT, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
192
193 // Allocate on other ranks before receiving the broadcast
194 if (rank != 0) {
195 ierr = PetscMalloc3(nblk, &g_IMs_from_file, nblk, &g_JMs_from_file, nblk, &g_KMs_from_file); CHKERRQ(ierr);
196 }
197
198 // Broadcast the data arrays
199 ierr = MPI_Bcast(g_IMs_from_file, nblk, MPI_INT, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
200 ierr = MPI_Bcast(g_JMs_from_file, nblk, MPI_INT, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
201 ierr = MPI_Bcast(g_KMs_from_file, nblk, MPI_INT, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
202
203 g_file_has_been_read = PETSC_TRUE;
204 LOG_ALLOW(GLOBAL, LOG_INFO, "Grid file header read and broadcast complete.\n");
205 }
206
207 // --- Per-Block Assignment Logic (runs on every call) ---
208 user->IM = g_IMs_from_file[block_index];
209 user->JM = g_JMs_from_file[block_index];
210 user->KM = g_KMs_from_file[block_index];
211
212 LOG_ALLOW(LOCAL, LOG_DEBUG, "Rank %d: Set file inputs for Block %d: IM=%d, JM=%d, KM=%d\n",
213 simCtx->rank, block_index, user->IM, user->JM, user->KM);
214
215 PetscFunctionReturn(0);
216}
static PetscInt * g_IMs_from_file
Caches the IM dimensions for all blocks read from the grid file.
Definition io.c:19
static PetscInt g_nblk_from_file
Stores the number of blocks read from the grid file.
Definition io.c:17
static PetscBool g_file_has_been_read
A flag to ensure the grid file is read only once.
Definition io.c:25
static PetscInt * g_KMs_from_file
Caches the KM dimensions for all blocks read from the grid file.
Definition io.c:23
static PetscInt * g_JMs_from_file
Caches the JM dimensions for all blocks read from the grid file.
Definition io.c:21
#define LOG_ALLOW_SYNC(scope, level, fmt,...)
----— DEBUG ---------------------------------------— #define LOG_ALLOW(scope, level,...
Definition logging.h:274
#define GLOBAL
Scope for global logging across all processes.
Definition logging.h:45
@ LOG_INFO
Informational messages about program execution.
Definition logging.h:32
char grid_file[PETSC_MAX_PATH_LEN]
Definition variables.h:567

◆ ReadSimulationFields()

PetscErrorCode ReadSimulationFields (UserCtx *user, PetscInt ti)

Reads binary field data for velocity, pressure, and other required vectors.

This function reads contravariant velocity (Ucont) from vfield, Cartesian velocity (Ucat) from ufield, pressure (P) from pfield, and the node state field (Nvert) from nvfield. If particles are present it also reads the ParticleCount field, and it conditionally reads LES, RANS, and statistical fields when those models are enabled.

Parameters
[in,out]  user  Pointer to the UserCtx structure containing the simulation context.
[in]      ti    Time index for constructing the file names.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 823 of file io.c.

824{
825 PetscErrorCode ierr;
826
827 SimCtx *simCtx = user->simCtx;
828
829 LOG_ALLOW(GLOBAL, LOG_INFO, "Starting to read simulation fields.\n");
830
831 // Read Cartesian velocity field
832 ierr = ReadFieldData(user, "ufield", user->Ucat, ti, "dat"); CHKERRQ(ierr);
833
834 // Read contravariant velocity field
835 ierr = ReadFieldData(user, "vfield", user->Ucont, ti, "dat"); CHKERRQ(ierr);
836
837 // Read pressure field
838 ierr = ReadFieldData(user, "pfield", user->P, ti, "dat"); CHKERRQ(ierr);
839
840 // Read node state field (nvert)
841 ierr = ReadFieldData(user, "nvfield", user->Nvert, ti, "dat"); CHKERRQ(ierr);
842
843 if(simCtx->np>0){
844 // Read Particle Count field
845 if(!user->ParticleCount){
846 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "ParticleCount Vec is NULL but np>0");
847 }
848 ierr = ReadFieldData(user, "ParticleCount", user->ParticleCount, ti, "dat"); CHKERRQ(ierr);
849 }
850 else{
851 LOG_ALLOW(GLOBAL, LOG_INFO, "No particles in simulation, skipping ParticleCount field read.\n");
852 }
853 // Process LES fields if enabled
854 if (simCtx->les) {
855 ierr = ReadLESFields(user,ti); CHKERRQ(ierr);
856 }
857
858 // Process RANS fields if enabled
859 if (simCtx->rans) {
860 ierr = ReadRANSFields(user,ti); CHKERRQ(ierr);
861 }
862
863 // Process statistical fields if averaging is enabled
864 if (simCtx->averaging) {
865 ierr = ReadStatisticalFields(user,ti); CHKERRQ(ierr);
866 }
867
868 LOG_ALLOW(GLOBAL, LOG_INFO, "Finished reading simulation fields.\n");
869
870 return 0;
871}
PetscErrorCode ReadStatisticalFields(UserCtx *user, PetscInt ti)
Reads statistical fields for averaging purposes.
Definition io.c:884
PetscErrorCode ReadRANSFields(UserCtx *user, PetscInt ti)
Reads RANS-related fields.
Definition io.c:940
PetscErrorCode ReadLESFields(UserCtx *user, PetscInt ti)
Reads LES-related fields.
Definition io.c:911
PetscErrorCode ReadFieldData(UserCtx *user, const char *field_name, Vec field_vec, PetscInt ti, const char *ext)
Reads data for a specific field from a file into the provided vector.
Definition io.c:639
PetscInt rans
Definition variables.h:578
PetscInt np
Definition variables.h:585
PetscBool averaging
Definition variables.h:582
Vec Ucont
Definition variables.h:657
Vec Ucat
Definition variables.h:657
Vec ParticleCount
Definition variables.h:697
PetscInt les
Definition variables.h:578
Vec Nvert
Definition variables.h:657

◆ ReadFieldData()

PetscErrorCode ReadFieldData (UserCtx *user, const char *field_name, Vec field_vec, PetscInt ti, const char *ext)

Reads data for a specific field from a file into the provided vector.

This function uses the field name to construct the file path and reads the data from the corresponding file into the provided PETSc vector.

Parameters
[in]   user        Pointer to the UserCtx structure containing simulation context.
[in]   field_name  Name of the field (e.g., "ufield", "vfield", "pfield").
[out]  field_vec   PETSc vector to store the field data.
[in]   ti          Time index for constructing the file name.
[in]   ext         File extension (e.g., "dat").
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.
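
As a usage sketch (step index hypothetical), reading the pressure restart for time step 10 into the already-created vector user->P:

ierr = ReadFieldData(user, "pfield", user->P, 10, "dat"); CHKERRQ(ierr);
/* reads results/pfield00010_0.dat */

The destination vector must already have the correct global size; the routine compares the file contents against it and errors out on a mismatch.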

Definition at line 639 of file io.c.

644{
645 PetscErrorCode ierr;
646 char filename[PETSC_MAX_PATH_LEN];
647 MPI_Comm comm;
648 PetscMPIInt rank,size;
649
650 PetscFunctionBeginUser;
651
652 ierr = PetscObjectGetComm((PetscObject)field_vec,&comm);CHKERRQ(ierr);
653 ierr = MPI_Comm_rank(comm,&rank);CHKERRQ(ierr);
654 ierr = MPI_Comm_size(comm,&size);CHKERRQ(ierr);
655
656 /* ---------------------------------------------------------------------
657 * Compose <results>/<field_name><step with 5 digits>_0.<ext>
658 * (all restart files are written by rank-0 with that naming scheme).
659 * ------------------------------------------------------------------ */
660 ierr = PetscSNPrintf(filename,sizeof(filename),
661 "results/%s%05" PetscInt_FMT "_0.%s",
662 field_name,ti,ext);CHKERRQ(ierr);
663
665 "Attempting to read <%s> on rank %d/%d\n",
666 filename,(int)rank,(int)size);
667
668 /* ======================================================================
669 * 1. SERIAL JOB – just hand the Vec to VecLoad()
670 * ==================================================================== */
671 if(size==1)
672 {
673 PetscViewer viewer;
674 PetscBool found;
675 Vec temp_vec;
676 PetscInt expectedSize,loadedSize;
677
678 ierr = PetscTestFile(filename,'r',&found);CHKERRQ(ierr);
679 if(!found) SETERRQ(comm,PETSC_ERR_FILE_OPEN,
680 "Restart file not found: %s",filename);
681
682 ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF,filename,FILE_MODE_READ,&viewer);CHKERRQ(ierr);
683// ---- START MODIFICATION ----
684 // DO NOT load directly into field_vec, as this can resize it, which is
685 // illegal for DMSwarm "view" vectors. Instead, load into a temporary vector.
686 ierr = VecCreate(PETSC_COMM_SELF, &temp_vec); CHKERRQ(ierr);
687 ierr = VecLoad(temp_vec,viewer);CHKERRQ(ierr);
688 ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
689
690 // Sanity check: ensure the file size matches the expected vector size.
691 ierr = VecGetSize(field_vec, &expectedSize);CHKERRQ(ierr);
692 ierr = VecGetSize(temp_vec, &loadedSize);CHKERRQ(ierr);
693 if (loadedSize != expectedSize) {
694 SETERRQ(comm,PETSC_ERR_FILE_UNEXPECTED,
695 "File %s holds %d entries – expected %d for field '%s'",
696 filename, loadedSize, expectedSize, field_name);
697 }
698
699 // Now, safely copy the data from the temporary vector to the final destination.
700 ierr = VecCopy(temp_vec, field_vec);CHKERRQ(ierr);
701
702 // Clean up the temporary vector.
703 ierr = VecDestroy(&temp_vec);CHKERRQ(ierr);
704
705 // ---- END MODIFICATION ----
706
707 /* create EMPTY sequential Vec – VecLoad() will size it correctly */
708 /*
709 ierr = VecCreate(PETSC_COMM_SELF,&seq_vec);CHKERRQ(ierr);
710 ierr = VecSetType(seq_vec,VECSEQ);CHKERRQ(ierr);
711
712 ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF,filename,
713 FILE_MODE_READ,&viewer);CHKERRQ(ierr);
714
715 ierr = VecLoad(field_vec,viewer);CHKERRQ(ierr);
716 ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
717 */
719 "Loaded <%s> (serial path)\n",filename);
720 PetscFunctionReturn(0);
721 }
722
723 /* ======================================================================
724 * 2. PARALLEL JOB
725 * ==================================================================== */
726 PetscInt globalSize;
727 ierr = VecGetSize(field_vec,&globalSize);CHKERRQ(ierr);
728
729 /* -------------------- rank-0 : read the sequential file -------------- */
730 Vec seq_vec = NULL; /* only valid on rank-0 */
731 const PetscScalar *seqArray = NULL; /* borrowed pointer on rank-0 only */
732
733 if(rank==0)
734 {
735 PetscViewer viewer;
736 PetscBool found;
737
738 ierr = PetscTestFile(filename,'r',&found);CHKERRQ(ierr);
739 if(!found) SETERRQ(PETSC_COMM_SELF,PETSC_ERR_FILE_OPEN,
740 "Restart file not found: %s",filename);
741
742 /* create EMPTY sequential Vec – VecLoad() will size it correctly */
743 ierr = VecCreate(PETSC_COMM_SELF,&seq_vec);CHKERRQ(ierr);
744 ierr = VecSetType(seq_vec,VECSEQ);CHKERRQ(ierr);
745
746 ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF,filename,
747 FILE_MODE_READ,&viewer);CHKERRQ(ierr);
748 ierr = VecLoad(seq_vec,viewer);CHKERRQ(ierr);
749 ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
750
751 /* size sanity-check */
752 PetscInt loaded;
753 ierr = VecGetSize(seq_vec,&loaded);CHKERRQ(ierr);
754 if(loaded != globalSize)
755 SETERRQ(comm,PETSC_ERR_FILE_UNEXPECTED,
756 "File %s holds %d entries – expected %d",
757 filename,loaded,globalSize);
758
759 /* borrow array for later Bcast */
760 ierr = VecGetArrayRead(seq_vec,&seqArray);CHKERRQ(ierr);
761
763 "ReadFieldData - Rank 0 successfully loaded <%s>\n",filename);
764 }
765
766 /* -------------------- everybody : broadcast raw data ----------------- */
767 PetscScalar *buffer = NULL; /* receives the full field */
768 if(rank==0)
769 {
770 /* shallow-copy: const-cast is safe, we do not modify the data */
771 buffer = (PetscScalar *)seqArray;
772 }
773 else
774 { /* non-root ranks allocate a receive buffer */
775 ierr = PetscMalloc1(globalSize,&buffer);CHKERRQ(ierr);
776 }
777
778 ierr = MPI_Bcast(buffer, (int)globalSize, MPIU_SCALAR, 0, comm);CHKERRQ(ierr);
779
780 /* -------------------- copy my slice into field_vec ------------------- */
781 PetscInt rstart,rend,loc;
782 PetscScalar *locArray;
783
784 ierr = VecGetOwnershipRange(field_vec,&rstart,&rend);CHKERRQ(ierr);
785 loc = rend - rstart; /* local length */
786
787 ierr = VecGetArray(field_vec,&locArray);CHKERRQ(ierr);
788 ierr = PetscMemcpy(locArray,
789 buffer + rstart,
790 loc*sizeof(PetscScalar));CHKERRQ(ierr);
791 ierr = VecRestoreArray(field_vec,&locArray);CHKERRQ(ierr);
792
793 /* -------------------- tidy up ---------------------------------------- */
794 if(rank==0)
795 {
796 ierr = VecRestoreArrayRead(seq_vec,&seqArray);CHKERRQ(ierr);
797 ierr = VecDestroy(&seq_vec);CHKERRQ(ierr);
798 }
799 else
800 {
801 ierr = PetscFree(buffer);CHKERRQ(ierr);
802 }
803
805 "ReadFieldData - Loaded <%s> (parallel path)\n",filename);
806
807 PetscFunctionReturn(0);
808}

◆ ReadStatisticalFields()

PetscErrorCode ReadStatisticalFields (UserCtx *user, PetscInt ti)

Reads statistical fields used for time-averaged simulations.

This function reads data for fields such as Ucat_sum, Ucat_cross_sum, Ucat_square_sum, and P_sum, which accumulate velocity and pressure statistics for time-averaged analysis.

Parameters
[in,out]  user  Pointer to the UserCtx structure containing the simulation context.
[in]      ti    Time index for constructing the file name.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 884 of file io.c.

885{
886 PetscErrorCode ierr;
887
888 LOG_ALLOW(GLOBAL, LOG_INFO, "Starting to read statistical fields.\n");
889
890 ierr = ReadFieldData(user, "su0", user->Ucat_sum, ti, "dat"); CHKERRQ(ierr);
891 ierr = ReadFieldData(user, "su1", user->Ucat_cross_sum, ti, "dat"); CHKERRQ(ierr);
892 ierr = ReadFieldData(user, "su2", user->Ucat_square_sum, ti, "dat"); CHKERRQ(ierr);
893 ierr = ReadFieldData(user, "sp", user->P_sum, ti, "dat"); CHKERRQ(ierr);
894
895 LOG_ALLOW(GLOBAL, LOG_INFO, "Finished reading statistical fields.\n");
896
897 return 0;
898}
Vec Ucat_square_sum
Definition variables.h:683
Vec P_sum
Definition variables.h:683
Vec Ucat_sum
Definition variables.h:683
Vec Ucat_cross_sum
Definition variables.h:683

◆ ReadLESFields()

PetscErrorCode ReadLESFields (UserCtx *user, PetscInt ti)

Reads LES-related fields used in turbulence modeling.

This function reads the Smagorinsky coefficient field Cs into a temporary global vector and scatters it into the local vector user->lCs.

Parameters
[in,out]  user  Pointer to the UserCtx structure containing the simulation context.
[in]      ti    Time index for constructing the file name.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 911 of file io.c.

912{
913 PetscErrorCode ierr;
914 Vec Cs;
915
916 LOG_ALLOW(GLOBAL, LOG_INFO, "Starting to read LES fields.\n");
917
918 VecDuplicate(user->P, &Cs);
919 ierr = ReadFieldData(user, "cs", Cs, ti, "dat"); CHKERRQ(ierr);
920 DMGlobalToLocalBegin(user->da, Cs, INSERT_VALUES, user->lCs);
921 DMGlobalToLocalEnd(user->da, Cs, INSERT_VALUES, user->lCs);
922 VecDestroy(&Cs);
923
924 LOG_ALLOW(GLOBAL, LOG_INFO, "Finished reading LES fields.\n");
925
926 return 0;
927}
Vec lCs
Definition variables.h:680

◆ ReadRANSFields()

PetscErrorCode ReadRANSFields (UserCtx *user, PetscInt ti)

Reads RANS-related fields for turbulence modeling.

This function reads the K_Omega field, copies it into K_Omega_o, and scatters both into their local counterparts (lK_Omega, lK_Omega_o).

Parameters
[in,out]  user  Pointer to the UserCtx structure containing the simulation context.
[in]      ti    Time index for constructing the file name.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 940 of file io.c.

941{
942 PetscErrorCode ierr;
943
944 LOG_ALLOW(GLOBAL, LOG_INFO, "Starting to read RANS fields.\n");
945
946 ierr = ReadFieldData(user, "kfield", user->K_Omega, ti, "dat"); CHKERRQ(ierr);
947 VecCopy(user->K_Omega, user->K_Omega_o);
948
949 DMGlobalToLocalBegin(user->fda2, user->K_Omega, INSERT_VALUES, user->lK_Omega);
950 DMGlobalToLocalEnd(user->fda2, user->K_Omega, INSERT_VALUES, user->lK_Omega);
951
952 DMGlobalToLocalBegin(user->fda2, user->K_Omega_o, INSERT_VALUES, user->lK_Omega_o);
953 DMGlobalToLocalEnd(user->fda2, user->K_Omega_o, INSERT_VALUES, user->lK_Omega_o);
954
955 LOG_ALLOW(GLOBAL, LOG_INFO, "Finished reading RANS fields.\n");
956
957 return 0;
958}
Vec K_Omega_o
Definition variables.h:680
Vec K_Omega
Definition variables.h:680
Vec lK_Omega_o
Definition variables.h:680
Vec lK_Omega
Definition variables.h:680

◆ WriteFieldData()

PetscErrorCode WriteFieldData (UserCtx *user, const char *field_name, Vec field_vec, PetscInt ti, const char *ext)

Writes data from a specific PETSc vector to a single, sequential file.

This function is the companion of ReadFieldData(): it always produces one file (e.g., results/ufield00006_0.dat) regardless of how many MPI ranks are running, so the output stays in a simple, portable format usable by the restart and post-processing tools.

Behaviour by communicator size:

1 rank:   a direct VecView() into the file.
>1 ranks: VecScatterCreateToZero() gathers the distributed Vec onto rank 0, which writes the sequential Vec; all other ranks allocate no storage.

The routine never alters or destroys the parallel Vec passed in; the gather buffer is created and freed internally.

Parameters
[in]  user        Pointer to the UserCtx structure (used only for logging).
[in]  field_name  Logical field name (e.g., "position"); forms part of the filename.
[in]  field_vec   Distributed PETSc Vec containing the data to write.
[in]  ti          Time index used in the filename.
[in]  ext         File extension ("dat" in this workflow).
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.
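
As a usage sketch (step index ti assumed), a write/read round trip uses the same field name and extension:

ierr = WriteFieldData(user, "pfield", user->P, ti, "dat"); CHKERRQ(ierr); /* writes results/pfield<ti>_0.dat */
ierr = ReadFieldData(user, "pfield", user->P, ti, "dat"); CHKERRQ(ierr);  /* reads it back */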

Definition at line 1180 of file io.c.

1185{
1186 PetscErrorCode ierr;
1187 MPI_Comm comm;
1188 PetscMPIInt rank, size;
1189
1190 const PetscInt placeholder_int = 0; /* keep legacy name */
1191 char filename[PETSC_MAX_PATH_LEN];
1192
1193 PetscFunctionBeginUser;
1194
1195 /* ------------------------------------------------------------ */
1196 /* Basic communicator information */
1197 /* ------------------------------------------------------------ */
1198 ierr = PetscObjectGetComm((PetscObject)field_vec,&comm);CHKERRQ(ierr);
1199 ierr = MPI_Comm_rank(comm,&rank);CHKERRQ(ierr);
1200 ierr = MPI_Comm_size(comm,&size);CHKERRQ(ierr);
1201
1202 ierr = PetscSNPrintf(filename,sizeof(filename),
1203 "results/%s%05" PetscInt_FMT "_%d.%s",
1204 field_name,ti,placeholder_int,ext);CHKERRQ(ierr);
1205
1207 "WriteFieldData - Preparing to write <%s> on rank %d/%d\n",
1208 filename,rank,size);
1209
1210 /* ------------------------------------------------------------ */
1211 /* 1. Serial path */
1212 /* ------------------------------------------------------------ */
1213 if (size == 1) {
1214 PetscViewer viewer;
1215
1216 ierr = PetscViewerBinaryOpen(comm,filename,
1217 FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
1218 ierr = VecView(field_vec,viewer);CHKERRQ(ierr);
1219 ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
1220
1222 "WriteFieldData - Wrote <%s> (serial path)\n",filename);
1223 PetscFunctionReturn(0);
1224 }
1225
1226 /* ------------------------------------------------------------ */
1227 /* 2. Parallel path */
1228 /* ------------------------------------------------------------ */
1229 VecScatter scatter;
1230 Vec seq_vec=NULL; /* created by PETSc, lives only on rank 0 */
1231
1232 /* 2.1 Create gather context and buffer */
1233 ierr = VecScatterCreateToZero(field_vec,&scatter,&seq_vec);CHKERRQ(ierr);
1234
1235 /* 2.2 Gather distributed → sequential (on rank 0) */
1236 ierr = VecScatterBegin(scatter,field_vec,seq_vec,
1237 INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
1238 ierr = VecScatterEnd (scatter,field_vec,seq_vec,
1239 INSERT_VALUES,SCATTER_FORWARD);CHKERRQ(ierr);
1240
1241 /* 2.3 Rank 0 writes the file */
1242 if (rank == 0) {
1243 PetscViewer viewer;
1244
1245 /* (optional) value diagnostics */
1246 PetscReal vmin,vmax;
1247 ierr = VecMin(seq_vec,NULL,&vmin);CHKERRQ(ierr);
1248 ierr = VecMax(seq_vec,NULL,&vmax);CHKERRQ(ierr);
1250 "WriteFieldData - <%s> range = [%.4e … %.4e]\n",
1251 field_name,(double)vmin,(double)vmax);
1252
1253 ierr = PetscViewerBinaryOpen(PETSC_COMM_SELF,filename,
1254 FILE_MODE_WRITE,&viewer);CHKERRQ(ierr);
1255 ierr = VecView(seq_vec,viewer);CHKERRQ(ierr);
1256 ierr = PetscViewerDestroy(&viewer);CHKERRQ(ierr);
1257
1259 "WriteFieldData - Wrote <%s> (parallel path)\n",filename);
1260 }
1261
1262 /* 2.4 Cleanup */
1263 ierr = VecScatterDestroy(&scatter);CHKERRQ(ierr);
1264 ierr = VecDestroy(&seq_vec);CHKERRQ(ierr);
1265
1266 PetscFunctionReturn(0);
1267}

◆ WriteSimulationFields()

PetscErrorCode WriteSimulationFields (UserCtx *user)

Writes simulation fields to files.

This function writes contravariant velocity, Cartesian velocity, pressure, and node state fields to their respective binary files. It also conditionally writes LES, RANS, and statistical fields if they are enabled.

Parameters
[in]userPointer to the UserCtx structure containing simulation context.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 1280 of file io.c.

1281{
1282 PetscErrorCode ierr;
1283
1284 SimCtx *simCtx = user->simCtx;
1285
1286 LOG_ALLOW(GLOBAL, LOG_INFO, "WriteSimulationFields - Starting to write simulation fields.\n");
1287
1288 // Write contravariant velocity field
1289 ierr = WriteFieldData(user, "vfield", user->Ucont, simCtx->step, "dat"); CHKERRQ(ierr);
1290
1291 // Write Cartesian velocity field
1292 ierr = WriteFieldData(user, "ufield", user->Ucat, simCtx->step, "dat"); CHKERRQ(ierr);
1293
1294 // Write pressure field
1295 ierr = WriteFieldData(user, "pfield", user->P, simCtx->step, "dat"); CHKERRQ(ierr);
1296
1297 // Write node state field (nvert)
1298 ierr = WriteFieldData(user, "nvfield", user->Nvert, simCtx->step, "dat"); CHKERRQ(ierr);
1299
1300 // Write ParticleCountPerCell if enabled.
1301 if(simCtx->np>0){
1302 ierr = WriteFieldData(user, "ParticleCount",user->ParticleCount,simCtx->step,"dat"); CHKERRQ(ierr);
1303 }
1304
1305 // Write LES fields if enabled
1306 if (simCtx->les) {
1307 ierr = WriteLESFields(user); CHKERRQ(ierr);
1308 }
1309
1310 // Write RANS fields if enabled
1311 if (simCtx->rans) {
1312 ierr = WriteRANSFields(user); CHKERRQ(ierr);
1313 }
1314
1315 // Write statistical fields if averaging is enabled
1316 if (simCtx->averaging) {
1317 ierr = WriteStatisticalFields(user); CHKERRQ(ierr);
1318 }
1319
1320 LOG_ALLOW(GLOBAL, LOG_INFO, "WriteSimulationFields - Finished writing simulation fields.\n");
1321
1322 return 0;
1323}
PetscErrorCode WriteStatisticalFields(UserCtx *user)
Writes statistical fields for averaging purposes.
Definition io.c:1335
PetscErrorCode WriteFieldData(UserCtx *user, const char *field_name, Vec field_vec, PetscInt ti, const char *ext)
Writes data from a specific PETSc vector to a single, sequential file.
Definition io.c:1180
PetscErrorCode WriteRANSFields(UserCtx *user)
Writes RANS-related fields.
Definition io.c:1393
PetscErrorCode WriteLESFields(UserCtx *user)
Writes LES-related fields.
Definition io.c:1363
PetscInt step
Definition variables.h:521

◆ WriteStatisticalFields()

PetscErrorCode WriteStatisticalFields (UserCtx *user)

Writes statistical fields for averaging purposes.

This function writes data for fields such as Ucat_sum, Ucat_cross_sum, Ucat_square_sum, and P_sum to their respective binary files.

Parameters
[in]userPointer to the UserCtx structure containing simulation context.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 1335 of file io.c.

1336{
1337 PetscErrorCode ierr;
1338
1339 SimCtx *simCtx = user->simCtx;
1340
1341 LOG_ALLOW(GLOBAL, LOG_INFO, "WriteStatisticalFields - Starting to write statistical fields.\n");
1342
1343 ierr = WriteFieldData(user, "su0", user->Ucat_sum, simCtx->step, "dat"); CHKERRQ(ierr);
1344 ierr = WriteFieldData(user, "su1", user->Ucat_cross_sum, simCtx->step, "dat"); CHKERRQ(ierr);
1345 ierr = WriteFieldData(user, "su2", user->Ucat_square_sum, simCtx->step, "dat"); CHKERRQ(ierr);
1346 ierr = WriteFieldData(user, "sp", user->P_sum, simCtx->step, "dat"); CHKERRQ(ierr);
1347
1348 LOG_ALLOW(GLOBAL, LOG_INFO, "WriteStatisticalFields - Finished writing statistical fields.\n");
1349
1350 return 0;
1351}

◆ WriteLESFields()

PetscErrorCode WriteLESFields (UserCtx *user)

Writes LES-related fields.

This function writes LES-related fields such as Cs (Smagorinsky constant) to their respective binary files.

Parameters
[in]userPointer to the UserCtx structure containing simulation context.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 1363 of file io.c.

1364{
1365 PetscErrorCode ierr;
1366 Vec Cs;
1367
1368 SimCtx *simCtx = user->simCtx;
1369
1370 LOG_ALLOW(GLOBAL, LOG_INFO, "WriteLESFields - Starting to write LES fields.\n");
1371
1372 VecDuplicate(user->P, &Cs);
1373 DMLocalToGlobalBegin(user->da, user->lCs, INSERT_VALUES, Cs);
1374 DMLocalToGlobalEnd(user->da, user->lCs, INSERT_VALUES, Cs);
1375 ierr = WriteFieldData(user, "cs", Cs, simCtx->step, "dat"); CHKERRQ(ierr);
1376 VecDestroy(&Cs);
1377
1378 LOG_ALLOW(GLOBAL, LOG_INFO, "WriteLESFields - Finished writing LES fields.\n");
1379
1380 return 0;
1381}

◆ WriteRANSFields()

PetscErrorCode WriteRANSFields (UserCtx *user)

Writes RANS-related fields.

This function writes RANS-related fields such as K_Omega to their respective binary files.

Parameters
[in]userPointer to the UserCtx structure containing simulation context.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 1393 of file io.c.

1394{
1395 PetscErrorCode ierr;
1396
1397 SimCtx *simCtx = user->simCtx;
1398
1399 LOG_ALLOW(GLOBAL, LOG_INFO, "WriteRANSFields - Starting to write RANS fields.\n");
1400
1401 ierr = WriteFieldData(user, "kfield", user->K_Omega, simCtx->step, "dat"); CHKERRQ(ierr);
1402
1403 LOG_ALLOW(GLOBAL, LOG_INFO, "WriteRANSFields - Finished writing RANS fields.\n");
1404
1405 return 0;
1406}

◆ WriteSwarmField()

PetscErrorCode WriteSwarmField (UserCtx *user, const char *field_name, PetscInt ti, const char *ext)

Writes data from a specific field in a PETSc Swarm to a file.

This function retrieves the Swarm from the UserCtx (i.e., user->swarm) and creates a global PETSc vector from the specified Swarm field. It then calls the existing WriteFieldData() function to handle the actual I/O operation. After writing the data, the function destroys the temporary global vector to avoid memory leaks.

Parameters
[in]  user        Pointer to the UserCtx structure containing simulation context and the PetscSwarm (as user->swarm).
[in]  field_name  Name of the Swarm field to be written (e.g., "my_field").
[in]  ti          Time index used to construct the output file name.
[in]  ext         File extension (e.g., "dat", "bin").
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.
Note
Compatible with PETSc 3.14.4 and newer.
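
As a usage sketch, writing the particle positions for the current step, exactly as WriteAllSwarmFields() does:

ierr = WriteSwarmField(user, "position", simCtx->step, "dat"); CHKERRQ(ierr);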

Definition at line 1427 of file io.c.

1428{
1429 PetscErrorCode ierr;
1430 Vec fieldVec;
1431 DM swarm;
1432
1433 PetscFunctionBeginUser; /* PETSc macro indicating start of function */
1434
1435 /*
1436 * 1) Retrieve the PetscSwarm from the user context.
1437 * Ensure user->swarm is initialized and not NULL.
1438 */
1439 swarm = user->swarm;
1440
1441 /*
1442 * 2) Create a global vector from the specified swarm field.
1443 * This function is available in PETSc 3.14.4.
1444 * It provides a read/write "view" of the swarm field as a global Vec.
1445 */
1447 "WriteSwarmField - Attempting to create global vector from field: %s\n",
1448 field_name);
1449 ierr = DMSwarmCreateGlobalVectorFromField(swarm, field_name, &fieldVec);CHKERRQ(ierr);
1450
1451 /*
1452 * 3) Use your existing WriteFieldData() to write the global vector to a file.
1453 * The field name, time index, and extension are passed along for naming.
1454 */
1456 "WriteSwarmField - Calling WriteFieldData for field: %s\n",
1457 field_name);
1458 ierr = WriteFieldData(user, field_name, fieldVec, ti, ext);CHKERRQ(ierr);
1459
1460 /*
1461 * 4) Destroy the global vector once the data is successfully written.
1462 * This step is crucial for avoiding memory leaks.
1463 * DMSwarmDestroyGlobalVectorFromField() is also available in PETSc 3.14.4.
1464 */
1466 "WriteSwarmField - Destroying the global vector for field: %s\n",
1467 field_name);
1468 ierr = DMSwarmDestroyGlobalVectorFromField(swarm, field_name, &fieldVec);CHKERRQ(ierr);
1469
1470 /* Log and return success. */
1472 "WriteSwarmField - Successfully wrote swarm data for field: %s\n",
1473 field_name);
1474
1475 PetscFunctionReturn(0); /* PETSc macro indicating end of function */
1476}

◆ WriteSwarmIntField()

PetscErrorCode WriteSwarmIntField (UserCtx *user, const char *field_name, PetscInt ti, const char *ext)

Writes integer data from a specific PETSc Swarm field to a file.

This function provides a bridge for writing integer-based swarm fields (like DMSwarm_CellID), which cannot be viewed directly as a Vec of PetscScalars, using the existing Vec-based I/O routine (WriteFieldData). It works by:

  1. Creating a temporary parallel Vec with the same layout as other swarm fields.
  2. Accessing the local integer data from the swarm field.
  3. Populating the temporary Vec by casting each integer to a PetscScalar.
  4. Calling the standard WriteFieldData() function with the temporary Vec.
  5. Destroying the temporary Vec.

Parameters
[in]  user        Pointer to the UserCtx structure containing the PetscSwarm.
[in]  field_name  Name of the integer Swarm field to be written.
[in]  ti          Time index for the output file.
[in]  ext         File extension.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 1496 of file io.c.

1497{
1498 PetscErrorCode ierr;
1499 DM swarm = user->swarm;
1500 Vec temp_vec; // Temporary Vec to hold casted data
1501 PetscInt nlocal, nglobal,bs,i;
1502 void *field_array_void;
1503 PetscScalar *scalar_array; // Pointer to the temporary Vec's scalar data
1504
1505 PetscFunctionBeginUser;
1506
1507 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Casting '%s' to Vec for writing.\n", field_name);
1508
1509 // Get the swarm field properties
1510 ierr = DMSwarmGetLocalSize(swarm, &nlocal); CHKERRQ(ierr);
1511 ierr = DMSwarmGetSize(swarm, &nglobal); CHKERRQ(ierr);
1512 ierr = DMSwarmGetField(swarm, field_name, &bs, NULL, &field_array_void); CHKERRQ(ierr);
1513
1514 // Create a temporary parallel Vec with the CORRECT layout
1515 ierr = VecCreate(PETSC_COMM_WORLD, &temp_vec); CHKERRQ(ierr);
1516 ierr = VecSetType(temp_vec, VECMPI); CHKERRQ(ierr);
1517 ierr = VecSetSizes(temp_vec, nlocal*bs, nglobal*bs); CHKERRQ(ierr);
1518 ierr = VecSetUp(temp_vec); CHKERRQ(ierr);
1519
1520 // Define the swarm's vector field via the mandatory 'position' field
1521 DMSwarmVectorDefineField(swarm,"position");
1522
1523 ierr = VecGetArray(temp_vec, &scalar_array); CHKERRQ(ierr);
1524
1525 if(strcasecmp(field_name,"DMSwarm_pid") == 0){
1526 PetscInt64 *int64_array = (PetscInt64 *)field_array_void;
1527 // Perform the cast from PetscInt64 to PetscScalar
1528 for (i = 0; i < nlocal*bs; i++) {
1529 scalar_array[i] = (PetscScalar)int64_array[i];
1530 }
1531 }else{
1532 PetscInt *int_array = (PetscInt *)field_array_void;
1533 //Perform the cast from PetscInt to PetscScalar
1534 for (i = 0; i < nlocal*bs; i++) {
1535 scalar_array[i] = (PetscScalar)int_array[i];
1536 }
1537 }
1538
1539 // Restore access to both arrays
1540 ierr = VecRestoreArray(temp_vec, &scalar_array); CHKERRQ(ierr);
1541 ierr = DMSwarmRestoreField(swarm, field_name, &bs, NULL, &field_array_void); CHKERRQ(ierr);
1542
1543 // Call your existing writer with the temporary, populated Vec
1544 ierr = WriteFieldData(user, field_name, temp_vec, ti, ext); CHKERRQ(ierr);
1545
1546 // Clean up
1547 ierr = VecDestroy(&temp_vec); CHKERRQ(ierr);
1548
1549 PetscFunctionReturn(0);
1550}

◆ WriteAllSwarmFields()

PetscErrorCode WriteAllSwarmFields (UserCtx *user)

Writes a predefined set of PETSc Swarm fields to files.

This function iterates through a hardcoded list of common swarm fields (position, velocity, etc.) and calls the WriteSwarmField() helper function for each one. This provides a straightforward way to output essential particle data at a given simulation step.

This function will only execute if particles are enabled in the simulation (i.e., user->simCtx->np > 0 and user->swarm is not NULL).

Parameters
[in]userPointer to the UserCtx structure containing the simulation context and the PetscSwarm.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 1568 of file io.c.

1569{
1570 PetscErrorCode ierr;
1571 SimCtx *simCtx = user->simCtx;
1572
1573 PetscFunctionBeginUser;
1574
1575 // If no swarm is configured or there are no particles, do nothing and return.
1576 if (!user->swarm || simCtx->np <= 0) {
1577 PetscFunctionReturn(0);
1578 }
1579
1580 LOG_ALLOW(GLOBAL, LOG_INFO, "WriteAllSwarmFields - Starting to write swarm fields.\n");
1581
1582 // Write particle position field
1583 ierr = WriteSwarmField(user, "position", simCtx->step, "dat"); CHKERRQ(ierr);
1584
1585 // Write particle velocity field
1586 ierr = WriteSwarmField(user, "velocity", simCtx->step, "dat"); CHKERRQ(ierr);
1587
1588 // Write particle weight field
1589 ierr = WriteSwarmField(user, "weight", simCtx->step, "dat"); CHKERRQ(ierr);
1590
1591 // Write custom particle field "Psi"
1592 ierr = WriteSwarmField(user, "Psi", simCtx->step, "dat"); CHKERRQ(ierr);
1593
1594 // Integer fields require special handling
1595
1596 // Write the background mesh cell ID for each particle
1597 ierr = WriteSwarmIntField(user, "DMSwarm_CellID", simCtx->step, "dat"); CHKERRQ(ierr);
1598
1599 // Write the particle location status (e.g., inside or outside the domain)
1600 ierr = WriteSwarmIntField(user, "DMSwarm_location_status", simCtx->step, "dat"); CHKERRQ(ierr);
1601
1602 // Write the unique particle ID
1603 ierr = WriteSwarmIntField(user, "DMSwarm_pid", simCtx->step, "dat"); CHKERRQ(ierr);
1604
1605 LOG_ALLOW(GLOBAL, LOG_INFO, "WriteAllSwarmFields - Finished writing swarm fields.\n");
1606
1607 PetscFunctionReturn(0);
1608}
PetscErrorCode WriteSwarmField(UserCtx *user, const char *field_name, PetscInt ti, const char *ext)
Writes data from a specific field in a PETSc Swarm to a file.
Definition io.c:1427
PetscErrorCode WriteSwarmIntField(UserCtx *user, const char *field_name, PetscInt ti, const char *ext)
Writes integer swarm data by casting it to a temporary Vec and using WriteFieldData.
Definition io.c:1496

◆ ReadDataFileToArray()

PetscInt ReadDataFileToArray (const char *filename, double **data_out, PetscInt *Nout, MPI_Comm comm)

Reads an ASCII file containing one double-precision value per line on rank 0 and broadcasts the resulting array to every rank in the communicator. Returns 0 on success, or a small positive error code (null argument, missing file, open or allocation failure).

Definition at line 2050 of file io.c.

2054{
2055 /* STEP 0: Prepare local variables & log function entry */
2056 PetscMPIInt rank, size;
2057 PetscErrorCode ierr;
2058 FILE *fp = NULL;
2059 PetscInt N = 0; /* number of lines/values read on rank 0 */
2060 double *array = NULL; /* pointer to local array on each rank */
2061 PetscInt fileExistsFlag = 0; /* 0 = doesn't exist, 1 = does exist */
2062
2064 "ReadDataFileToArray - Start reading from file: %s\n",
2065 filename);
2066
2067 /* Basic error checking: data_out, Nout must be non-null. */
2068 if (!filename || !data_out || !Nout) {
2070 "ReadDataFileToArray - Null pointer argument provided.\n");
2071 return 1;
2072 }
2073
2074 /* Determine rank/size for coordinating I/O. */
2075 MPI_Comm_rank(comm, &rank);
2076 MPI_Comm_size(comm, &size);
2077
2078 /* STEP 1: On rank 0, check if file can be opened. */
2079 if (!rank) {
2080 fp = fopen(filename, "r");
2081 if (fp) {
2082 fileExistsFlag = 1;
2083 fclose(fp);
2084 }
2085 }
2086
2087 /* STEP 2: Broadcast file existence to all ranks. */
2088 // In ReadDataFileToArray:
2089 ierr = MPI_Bcast(&fileExistsFlag, 1, MPI_INT, 0, comm); CHKERRQ(ierr);
2090
2091 if (!fileExistsFlag) {
2092 /* If file does not exist, log & return. */
2093 if (!rank) {
2095 "ReadDataFileToArray - File '%s' not found.\n",
2096 filename);
2097 }
2098 return 2;
2099 }
2100
2101 /* STEP 3: Rank 0 re-opens and reads the file, counting lines, etc. */
2102 if (!rank) {
2103 fp = fopen(filename, "r");
2104 if (!fp) {
2106 "ReadDataFileToArray - File '%s' could not be opened for reading.\n",
2107 filename);
2108 return 3;
2109 }
2110
2111 /* (3a) Count lines first. */
2112 {
2113 char line[256];
2114 while (fgets(line, sizeof(line), fp)) {
2115 N++;
2116 }
2117 }
2118
2120 "ReadDataFileToArray - File '%s' has %d lines.\n",
2121 filename, N);
2122
2123 /* (3b) Allocate array on rank 0. */
2124 array = (double*)malloc(N * sizeof(double));
2125 if (!array) {
2126 fclose(fp);
2128 "ReadDataFileToArray - malloc failed for array.\n");
2129 return 4;
2130 }
2131
2132 /* (3c) Rewind & read values into array. */
2133 rewind(fp);
2134 {
2135 PetscInt i = 0;
2136 char line[256];
2137 while (fgets(line, sizeof(line), fp)) {
2138 double val;
2139 if (sscanf(line, "%lf", &val) == 1) {
2140 array[i++] = val;
2141 }
2142 }
2143 }
2144 fclose(fp);
2145
2147 "ReadDataFileToArray - Successfully read %d values from '%s'.\n",
2148 N, filename);
2149 }
2150
2151 /* STEP 4: Broadcast the integer N to all ranks. */
2152 ierr = MPI_Bcast(&N, 1, MPI_INT, 0, comm); CHKERRQ(ierr);
2153
2154 /* STEP 5: Each rank allocates an array to receive the broadcast if rank>0. */
2155 if (rank) {
2156 array = (double*)malloc(N * sizeof(double));
2157 if (!array) {
2159 "ReadDataFileToArray - malloc failed on rank %d.\n",
2160 rank);
2161 return 5;
2162 }
2163 }
2164
2165 /* STEP 6: Broadcast the actual data from rank 0 to all. */
2166 ierr = MPI_Bcast(array, N, MPI_DOUBLE, 0, comm); CHKERRQ(ierr);
2167
2168 /* STEP 7: Assign outputs on all ranks. */
2169 *data_out = array;
2170 *Nout = N;
2171
2172 LOG_ALLOW(GLOBAL, LOG_INFO,
2173 "ReadDataFileToArray - Done. Provided array of length=%d to all ranks.\n",
2174 N);
2175 return 0; /* success */
2176}
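
The routine reads one double per line on rank 0 and broadcasts both the count and the values, so every rank returns with an identical array. Note that the buffer comes from malloc, not PetscMalloc, so it must be released with free(). A minimal usage sketch (the file name is illustrative):

    double  *profile = NULL;          /* hypothetical per-line data file */
    PetscInt nvals   = 0;
    PetscInt rc = ReadDataFileToArray("inflow_profile.dat", &profile, &nvals, PETSC_COMM_WORLD);
    if (rc) SETERRQ(PETSC_COMM_WORLD, PETSC_ERR_FILE_READ,
                    "ReadDataFileToArray failed with code %d", (int)rc);
    /* ... use profile[0 .. nvals-1] on any rank ... */
    free(profile);                    /* allocated with malloc, so use free() */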

◆ CreateVTKFileFromMetadata()

PetscInt CreateVTKFileFromMetadata ( const char *  filename,
const VTKMetaData *  meta,
MPI_Comm  comm 
)

Definition at line 128 of file vtk_io.c.

129{
130 PetscMPIInt rank;
131 MPI_Comm_rank(comm, &rank);
132 PetscErrorCode ierr = 0;
133
134 if (!rank) {
135 LOG_ALLOW(GLOBAL, LOG_INFO, "Rank 0 writing combined VTK file '%s'.\n", filename);
136 FILE *fp = fopen(filename, "wb");
137 if (!fp) {
138 LOG_ALLOW(GLOBAL, LOG_ERROR, "fopen failed for %s.\n", filename);
139 return PETSC_ERR_FILE_OPEN;
140 }
141
142 PetscInt boffset = 0;
143
144 ierr = WriteVTKFileHeader(fp, meta, &boffset);
145 if(ierr) { fclose(fp); return ierr; }
146
147 if (meta->coords) {
148 ierr = WriteVTKAppendedBlock(fp, meta->coords, 3 * meta->npoints, sizeof(PetscScalar));
149 if(ierr) { fclose(fp); return ierr; }
150 }
151 /*
152 // ======== DEBUG: Dump first few values of Ucat_nodal if present ========
153 if (!rank) {
154 for (PetscInt i = 0; i < meta->num_point_data_fields; i++) {
155 const VTKFieldInfo* f = &meta->point_data_fields[i];
156 if (strcasecmp(f->name, "Ucat_nodal") == 0) {
157 const PetscScalar *a = f->data;
158 const PetscInt npts = meta->npoints;
159 const PetscInt nshow = (npts < 5) ? npts : 5;
160
161 LOG_ALLOW(GLOBAL, LOG_INFO,
162 "DBG (writer) Ucat_nodal: first %d of %" PetscInt_FMT " tuples:\n",
163 (int)nshow, npts);
164 for (PetscInt t = 0; t < nshow; ++t) {
165 LOG_ALLOW(GLOBAL, LOG_INFO,
166 " Ucat_nodal[%3" PetscInt_FMT "] = (%g, %g, %g)\n",
167 t, (double)a[3*t+0], (double)a[3*t+1], (double)a[3*t+2]);
168 }
169 }
170 }
171 }
172 */
173 // ======== END DEBUG ========
174
175
176 for (PetscInt i = 0; i < meta->num_point_data_fields; i++) {
177 const VTKFieldInfo* field = &meta->point_data_fields[i];
178 if (field->data) {
179 ierr = WriteVTKAppendedBlock(fp, field->data, field->num_components * meta->npoints, sizeof(PetscScalar));
180 if(ierr) { fclose(fp); return ierr; }
181 }
182 }
183 if (meta->fileType == VTK_POLYDATA) {
184 if (meta->connectivity) {
185 ierr = WriteVTKAppendedBlock(fp, meta->connectivity, meta->npoints, sizeof(PetscInt));
186 if(ierr) { fclose(fp); return ierr; }
187 }
188 if (meta->offsets) {
189 ierr = WriteVTKAppendedBlock(fp, meta->offsets, meta->npoints, sizeof(PetscInt));
190 if(ierr) { fclose(fp); return ierr; }
191 }
192 }
193 ierr = WriteVTKFileFooter(fp, meta);
194 if(ierr) { fclose(fp); return ierr; }
195
196 fclose(fp);
197 LOG_ALLOW(GLOBAL, LOG_INFO, "Rank 0 finished writing VTK file '%s'.\n", filename);
198 }
199 return 0;
200}
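
The writer touches the file on rank 0 only; all other ranks return immediately, so the call is safe from every rank. Below is a minimal sketch that writes a gathered particle cloud as polydata. It assumes npts, coords, and vel_on_rank0 were already collected on rank 0 (for example with the gather helpers later on this page) and that VTKFieldInfo.name is a fixed character buffer; all names and sizes are illustrative:

    PetscErrorCode ierr;
    VTKMetaData meta;
    memset(&meta, 0, sizeof(meta));            /* zero all counts and pointers */
    meta.fileType = VTK_POLYDATA;
    meta.npoints  = npts;                      /* particle count gathered on rank 0 */
    meta.coords   = coords;                    /* 3*npts PetscScalars (rank 0 only) */

    /* one VTK vertex cell per particle */
    PetscInt *conn, *offs;
    ierr = PetscMalloc1(npts, &conn); CHKERRQ(ierr);
    ierr = PetscMalloc1(npts, &offs); CHKERRQ(ierr);
    for (PetscInt i = 0; i < npts; i++) { conn[i] = i; offs[i] = i + 1; }
    meta.connectivity = conn;
    meta.offsets      = offs;

    VTKFieldInfo *vel = &meta.point_data_fields[0];
    strcpy(vel->name, "velocity");             /* assumes name is a char[] buffer */
    vel->num_components = 3;
    vel->data = vel_on_rank0;                  /* 3*npts PetscScalars (rank 0 only) */
    meta.num_point_data_fields = 1;

    ierr = CreateVTKFileFromMetadata("results/particles.vtp", &meta, PETSC_COMM_WORLD); CHKERRQ(ierr);
    /* PetscFree(conn) and PetscFree(offs) when done */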

◆ VecToArrayOnRank0()

PetscErrorCode VecToArrayOnRank0 ( Vec  inVec,
PetscInt *  N,
double **  arrayOut 
)

Gathers the contents of a distributed PETSc Vec into a single array on rank 0.

Parameters
[in]inVecThe input (possibly distributed) Vec.
[out]NThe global size of the vector.
[out]arrayOutOn rank 0, points to the newly allocated array holding all data. On other ranks, it is set to NULL.
Returns
PetscErrorCode Return 0 on success, nonzero on failure.

Gathers the contents of a distributed PETSc Vec into a single array on rank 0.

If the Vec has a DMDA attached, the gather is performed in DMDA "natural" ordering.

Parameters
[in]inVecThe PETSc Vec (may be distributed).
[out]NGlobal length of the Vec (includes dof).
[out]arrayOutOn rank 0, newly allocated buffer with the gathered values (PetscScalar-sized). On other ranks, set to NULL.

Definition at line 1619 of file io.c.

1620{
1621 PetscErrorCode ierr;
1622 MPI_Comm comm;
1623 PetscMPIInt rank;
1624 PetscInt globalSize;
1625 DM dm = NULL;
1626 const char *dmtype = NULL;
1627
1628 /* For DMDA path */
1629 Vec nat = NULL, seqNat = NULL;
1630 VecScatter scatNat = NULL;
1631 const PetscScalar *nar = NULL;
1632 PetscScalar *buf = NULL;
1633
1634 /* For generic (no DM) path */
1635 Vec seq = NULL;
1636 VecScatter scat = NULL;
1637 const PetscScalar *sar = NULL;
1638
1639 PetscFunctionBeginUser;
1640
1641 ierr = PetscObjectGetComm((PetscObject)inVec, &comm); CHKERRQ(ierr);
1642 ierr = MPI_Comm_rank(comm, &rank); CHKERRQ(ierr);
1643 ierr = VecGetSize(inVec, &globalSize); CHKERRQ(ierr);
1644 *N = globalSize;
1645 *arrayOut = NULL;
1646
1647 ierr = VecGetDM(inVec, &dm); CHKERRQ(ierr);
1648 if (dm) { ierr = DMGetType(dm, &dmtype); CHKERRQ(ierr); }
1649
1650 if (dmtype && !strcmp(dmtype, DMDA)) {
1651 /* --- DMDA path: go to NATURAL ordering, then gather to rank 0 --- */
1652 ierr = DMDACreateNaturalVector(dm, &nat); CHKERRQ(ierr);
1653 ierr = DMDAGlobalToNaturalBegin(dm, inVec, INSERT_VALUES, nat); CHKERRQ(ierr);
1654 ierr = DMDAGlobalToNaturalEnd (dm, inVec, INSERT_VALUES, nat); CHKERRQ(ierr);
1655
1656 ierr = VecScatterCreateToZero(nat, &scatNat, &seqNat); CHKERRQ(ierr);
1657 ierr = VecScatterBegin(scatNat, nat, seqNat, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);
1658 ierr = VecScatterEnd (scatNat, nat, seqNat, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);
1659
1660 if (rank == 0) {
1661 PetscInt nseq;
1662 ierr = VecGetLocalSize(seqNat, &nseq); CHKERRQ(ierr);
1663 ierr = VecGetArrayRead(seqNat, &nar); CHKERRQ(ierr);
1664
1665 ierr = PetscMalloc1(nseq, &buf); CHKERRQ(ierr);
1666 ierr = PetscMemcpy(buf, nar, (size_t)nseq * sizeof(PetscScalar)); CHKERRQ(ierr);
1667
1668 ierr = VecRestoreArrayRead(seqNat, &nar); CHKERRQ(ierr);
1669 *arrayOut = (double*)buf; /* hand back as double* for drop-in compatibility */
1670 }
1671
1672 ierr = VecScatterDestroy(&scatNat); CHKERRQ(ierr);
1673 ierr = VecDestroy(&seqNat); CHKERRQ(ierr);
1674 ierr = VecDestroy(&nat); CHKERRQ(ierr);
1675 } else {
1676 /* --- No DM attached: plain gather in Vec's global (parallel) layout order --- */
1677 ierr = VecScatterCreateToZero(inVec, &scat, &seq); CHKERRQ(ierr);
1678 ierr = VecScatterBegin(scat, inVec, seq, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);
1679 ierr = VecScatterEnd (scat, inVec, seq, INSERT_VALUES, SCATTER_FORWARD); CHKERRQ(ierr);
1680
1681 if (rank == 0) {
1682 PetscInt nseq;
1683 ierr = VecGetLocalSize(seq, &nseq); CHKERRQ(ierr);
1684 ierr = VecGetArrayRead(seq, &sar); CHKERRQ(ierr);
1685
1686 ierr = PetscMalloc1(nseq, &buf); CHKERRQ(ierr);
1687 ierr = PetscMemcpy(buf, sar, (size_t)nseq * sizeof(PetscScalar)); CHKERRQ(ierr);
1688
1689 ierr = VecRestoreArrayRead(seq, &sar); CHKERRQ(ierr);
1690 *arrayOut = (double*)buf;
1691 }
1692
1693 ierr = VecScatterDestroy(&scat); CHKERRQ(ierr);
1694 ierr = VecDestroy(&seq); CHKERRQ(ierr);
1695 }
1696
1697 PetscFunctionReturn(0);
1698}
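
Since the gathered buffer is allocated with PetscMalloc1, rank 0 must release it with PetscFree(); N is set on every rank, while the array is non-NULL on rank 0 only. A short sketch (the choice of Vec is illustrative, any global Vec works):

    PetscErrorCode ierr;
    PetscInt  N    = 0;
    double   *vals = NULL;
    ierr = VecToArrayOnRank0(someGlobalVec, &N, &vals); CHKERRQ(ierr);
    if (vals) {                                   /* true on rank 0 only */
        PetscPrintf(PETSC_COMM_SELF, "gathered %d values, first = %g\n", (int)N, vals[0]);
        ierr = PetscFree(vals); CHKERRQ(ierr);
    }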

◆ SwarmFieldToArrayOnRank0()

PetscErrorCode SwarmFieldToArrayOnRank0 ( DM  swarm,
const char *  field_name,
PetscInt *  n_total_particles,
PetscInt *  n_components,
void **  gathered_array 
)

Gathers any DMSwarm field from all ranks to a single, contiguous array on rank 0.

This is a generic, type-aware gather: the element size is chosen from the field name, so scalar, integer, and 64-bit particle-ID fields are all handled. It is a COLLECTIVE operation.

Parameters
[in]swarmThe DMSwarm to gather from.
[in]field_nameThe name of the field to gather.
[out]n_total_particles(Output on rank 0) Total number of particles in the global swarm.
[out]n_components(Output on rank 0) Number of components for the field.
[out]gathered_array(Output on rank 0) A newly allocated array containing the full, gathered data. The caller is responsible for freeing this memory and for casting it to the correct type.
Returns
PetscErrorCode

Definition at line 1714 of file io.c.

1715{
1716 PetscErrorCode ierr;
1717 PetscMPIInt rank, size;
1718 PetscInt nlocal, nglobal, bs;
1719 void *local_array_void;
1720 size_t element_size = 0;
1721 MPI_Datatype mpi_type = MPI_BYTE; // We'll send raw bytes
1722
1723 PetscFunctionBeginUser;
1724
1725 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
1726 ierr = MPI_Comm_size(PETSC_COMM_WORLD, &size); CHKERRQ(ierr);
1727
1728 // All ranks get swarm properties to determine send/receive counts
1729 ierr = DMSwarmGetLocalSize(swarm, &nlocal); CHKERRQ(ierr);
1730 ierr = DMSwarmGetSize(swarm, &nglobal); CHKERRQ(ierr);
1731 ierr = DMSwarmGetField(swarm, field_name, &bs, NULL, &local_array_void); CHKERRQ(ierr);
1732
1733 // Determine the size of one element of the field's data type
1734 if (strcasecmp(field_name, "DMSwarm_pid") == 0) {
1735 element_size = sizeof(PetscInt64);
1736 } else if (strcasecmp(field_name, "DMSwarm_CellID") == 0 || strcasecmp(field_name, "DMSwarm_location_status") == 0) {
1737 element_size = sizeof(PetscInt);
1738 } else {
1739 element_size = sizeof(PetscScalar);
1740 }
1741
1742 if (rank == 0) {
1743 *n_total_particles = nglobal;
1744 *n_components = bs;
1745 *gathered_array = NULL; // Initialize output
1746 }
1747
1748 if (size == 1) { // Serial case is a simple copy
1749 if (rank == 0) {
1750 ierr = PetscMalloc(nglobal * bs * element_size, gathered_array); CHKERRQ(ierr);
1751 ierr = PetscMemcpy(*gathered_array, local_array_void, nglobal * bs * element_size); CHKERRQ(ierr);
1752 }
1753 } else { // Parallel case: use MPI_Gatherv
1754 PetscMPIInt *recvcounts = NULL, *displs = NULL; /* MPI_Gatherv requires int-sized counts and displacements */
1755 if (rank == 0) {
1756 ierr = PetscMalloc1(size, &recvcounts); CHKERRQ(ierr);
1757 ierr = PetscMalloc1(size, &displs); CHKERRQ(ierr);
1758 }
1759 PetscMPIInt sendcount = (PetscMPIInt)(nlocal * bs);
1760
1761 // Gather the number of elements (not bytes) from each rank
1762 ierr = MPI_Gather(&sendcount, 1, MPI_INT, recvcounts, 1, MPI_INT, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
1763
1764 if (rank == 0) {
1765 displs[0] = 0;
1766 // Convert counts and calculate displacements in terms of BYTES
1767 for (PetscMPIInt i = 0; i < size; i++) recvcounts[i] *= element_size;
1768 for (PetscMPIInt i = 1; i < size; i++) displs[i] = displs[i-1] + recvcounts[i-1];
1769
1770 ierr = PetscMalloc(nglobal * bs * element_size, gathered_array); CHKERRQ(ierr);
1771 }
1772
1773 // Use Gatherv with MPI_BYTE to handle any data type generically
1774 ierr = MPI_Gatherv(local_array_void, nlocal * bs * element_size, MPI_BYTE,
1775 *gathered_array, recvcounts, displs, MPI_BYTE,
1776 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
1777
1778 if (rank == 0) {
1779 ierr = PetscFree(recvcounts); CHKERRQ(ierr);
1780 ierr = PetscFree(displs); CHKERRQ(ierr);
1781 }
1782 }
1783
1784 ierr = DMSwarmRestoreField(swarm, field_name, &bs, NULL, &local_array_void); CHKERRQ(ierr);
1785
1786 PetscFunctionReturn(0);
1787}
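
The gathered buffer is returned as void* because the element type depends on the field; the caller casts it and frees it with PetscFree(). A sketch gathering particle positions (a 3-component PetscScalar field):

    PetscErrorCode ierr;
    PetscInt n_total = 0, ncomp = 0;
    void    *raw = NULL;                          /* keep NULL-initialized on non-root ranks */
    ierr = SwarmFieldToArrayOnRank0(user->swarm, "position", &n_total, &ncomp, &raw); CHKERRQ(ierr);
    if (raw) {                                    /* rank 0 only */
        const PetscScalar *pos = (const PetscScalar*)raw;
        PetscPrintf(PETSC_COMM_SELF, "particle 0 at (%g, %g, %g)\n",
                    (double)pos[0], (double)pos[1], (double)pos[2]);
        ierr = PetscFree(raw); CHKERRQ(ierr);
    }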

◆ ReadSwarmField()

PetscErrorCode ReadSwarmField ( UserCtx *  user,
const char *  field_name,
PetscInt  ti,
const char *  ext 
)

Reads data from a file into a specified field of a PETSc DMSwarm.

This function is the counterpart to WriteSwarmField(). It creates a global PETSc vector that references the specified DMSwarm field, uses ReadFieldData() to read the data from a file, and then destroys the global vector reference.

Parameters
[in]userPointer to the UserCtx structure (containing user->swarm).
[in]field_nameName of the DMSwarm field to read into (must be previously declared/allocated).
[in]tiTime index used to construct the input file name.
[in]extFile extension (e.g., "dat" or "bin").
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.
Note
Compatible with PETSc 3.14.x.

Definition at line 977 of file io.c.

978{
979 PetscErrorCode ierr;
980 DM swarm;
981 Vec fieldVec;
982
983 PetscFunctionBegin;
984
985 swarm = user->swarm;
986
987 LOG_ALLOW(GLOBAL,LOG_DEBUG," ReadSwarmField Begins \n");
988
989 /* 2) Create a global vector that references the specified Swarm field. */
990 ierr = DMSwarmCreateGlobalVectorFromField(swarm, field_name, &fieldVec);CHKERRQ(ierr);
991
992 LOG_ALLOW(GLOBAL,LOG_DEBUG," Vector created from Field \n");
993
994 /* 3) Use the ReadFieldData() function to read data into fieldVec. */
995 ierr = ReadFieldData(user, field_name, fieldVec, ti, ext);CHKERRQ(ierr);
996
997 /* 4) Destroy the global vector reference. */
998 ierr = DMSwarmDestroyGlobalVectorFromField(swarm, field_name, &fieldVec);CHKERRQ(ierr);
999
1000 PetscFunctionReturn(0);
1001}

◆ ReadSwarmIntField()

PetscErrorCode ReadSwarmIntField ( UserCtx *  user,
const char *  field_name,
PetscInt  ti,
const char *  ext 
)

Reads integer swarm data by using ReadFieldData and casting the result.

This function is the counterpart to WriteSwarmIntField. It reads a file containing floating-point data (that was originally integer) into a temporary Vec and then casts it back to the integer swarm field. It works by:

  1. Creating a temporary parallel Vec.
  2. Calling the standard ReadFieldData() to populate this Vec.
  3. Accessing the local data of both the Vec and the swarm field.
  4. Populating the swarm's integer field by casting each PetscScalar back to a PetscInt.
  5. Destroying the temporary Vec.
Parameters
[in]userPointer to the UserCtx structure.
[in]field_nameName of the integer Swarm field to be read.
[in]tiTime index for the input file.
[in]extFile extension.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 1022 of file io.c.

1023{
1024 PetscErrorCode ierr;
1025 DM swarm = user->swarm;
1026 Vec temp_vec;
1027 PetscInt nlocal, nglobal, bs, i;
1028 const PetscScalar *scalar_array; // Read-only pointer from the temp Vec
1029 void *field_array_void;
1030
1031
1032 PetscFunctionBeginUser;
1033
1034 LOG_ALLOW(GLOBAL, LOG_DEBUG, "ReadSwarmIntField - Reading '%s' via temporary Vec.\n", field_name);
1035
1036 // Get the properties of the swarm field to determine the expected layout
1037 ierr = DMSwarmGetLocalSize(swarm, &nlocal); CHKERRQ(ierr);
1038 ierr = DMSwarmGetSize(swarm, &nglobal); CHKERRQ(ierr);
1039 // We get the block size but not the data pointer yet
1040 ierr = DMSwarmGetField(swarm, field_name, &bs, NULL, NULL); CHKERRQ(ierr);
1041 ierr = DMSwarmRestoreField(swarm, field_name, &bs, NULL, NULL); CHKERRQ(ierr);
1042
1043 // Create a temporary Vec with the CORRECT layout to receive the data
1044 ierr = VecCreate(PETSC_COMM_WORLD, &temp_vec); CHKERRQ(ierr);
1045 ierr = VecSetType(temp_vec, VECMPI); CHKERRQ(ierr);
1046 ierr = VecSetSizes(temp_vec, nlocal * bs, nglobal * bs); CHKERRQ(ierr);
1047 ierr = VecSetBlockSize(temp_vec, bs); CHKERRQ(ierr);
1048 ierr = VecSetUp(temp_vec); CHKERRQ(ierr);
1049
1050 // Call your existing reader to populate the temporary Vec
1051 ierr = ReadFieldData(user, field_name, temp_vec, ti, ext); CHKERRQ(ierr);
1052
1053 // Get local pointers
1054 ierr = VecGetArrayRead(temp_vec, &scalar_array); CHKERRQ(ierr);
1055 ierr = DMSwarmGetField(swarm, field_name, NULL, NULL, &field_array_void); CHKERRQ(ierr);
1056
1057 // Perform the cast back, using the correct loop size (nlocal * bs)
1058 if (strcmp(field_name, "DMSwarm_pid") == 0) {
1059 PetscInt64 *int64_array = (PetscInt64 *)field_array_void;
1060 for (i = 0; i < nlocal * bs; i++) {
1061 int64_array[i] = (PetscInt64)scalar_array[i];
1062 }
1063 } else {
1064 PetscInt *int_array = (PetscInt *)field_array_void;
1065 for (i = 0; i < nlocal * bs; i++) {
1066 int_array[i] = (PetscInt)scalar_array[i];
1067 }
1068 }
1069
1070 // Restore access
1071 ierr = DMSwarmRestoreField(swarm, field_name, NULL, NULL, &field_array_void); CHKERRQ(ierr);
1072 ierr = VecRestoreArrayRead(temp_vec, &scalar_array); CHKERRQ(ierr);
1073
1074 // 6. Clean up
1075 ierr = VecDestroy(&temp_vec); CHKERRQ(ierr);
1076
1077 PetscFunctionReturn(0);
1078}
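
Because the on-disk format is scalar, integer fields survive the round trip exactly only while their values are representable in a PetscScalar (up to 2^53 for double precision). The pairing with the writer is symmetric:

    PetscErrorCode ierr;
    /* write side, end of step ti */
    ierr = WriteSwarmIntField(user, "DMSwarm_CellID", ti, "dat"); CHKERRQ(ierr);
    /* read side, restart from step ti: same field name and extension */
    ierr = ReadSwarmIntField(user, "DMSwarm_CellID", ti, "dat"); CHKERRQ(ierr);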

◆ ReadAllSwarmFields()

PetscErrorCode ReadAllSwarmFields ( UserCtx *  user,
PetscInt  ti 
)

Reads multiple fields (positions, velocity, CellID, and weight) into a DMSwarm.

This function is analogous to ReadSimulationFields() but targets a DMSwarm. Each Swarm field is read from a separate file using ReadSwarmField().

Parameters
[in,out]userPointer to the UserCtx structure containing the DMSwarm (user->swarm).
[in]tiTime index for constructing the file name.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Reads multiple fields (positions, velocity, CellID, and weight) into a DMSwarm.

This function reads all necessary and optional fields for a DMSwarm for a given timestep. It assumes the swarm has already been resized to match the particle count in the input files.

The 'position' field is considered MANDATORY. If its file is missing or corrupt, the function will return a fatal error.

All other fields (velocity, CellID, weight, etc.) are OPTIONAL. If their corresponding files are not found, a warning is logged, and the function continues without error. If an optional file IS found but is corrupt or has a size mismatch, it is treated as a fatal error.

Parameters
[in,out]userPointer to the UserCtx structure containing the DMSwarm (user->swarm).
[in]tiTime index for constructing the file names.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 1100 of file io.c.

1101{
1102 PetscErrorCode ierr;
1103 PetscInt nGlobal;
1104
1105 PetscFunctionBeginUser;
1106 ierr = DMSwarmGetSize(user->swarm, &nGlobal); CHKERRQ(ierr);
1107 LOG_ALLOW(GLOBAL, LOG_INFO, "Reading DMSwarm fields for timestep %d (swarm size is %d).\n", ti, nGlobal);
1108
1109 if (nGlobal == 0) {
1110 LOG_ALLOW(GLOBAL, LOG_INFO, "Swarm is empty for timestep %d. Nothing to read.\n", ti);
1111 PetscFunctionReturn(0);
1112 }
1113
1114 /* 1) Read positions (REQUIRED) */
1115 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Reading mandatory position field...\n");
1116 ierr = ReadSwarmField(user, "position", ti, "dat");
1117 if (ierr) {
1118 SETERRQ(PETSC_COMM_WORLD, ierr, "Failed to read MANDATORY 'position' field for step %d. Cannot continue.", ti);
1119 }
1120 LOG_ALLOW(GLOBAL, LOG_INFO, "Successfully read mandatory position field for step %d.\n", ti);
1121
1122 /* 2) Read all OPTIONAL fields using the helper function. */
1123 /* The helper will print a warning and continue if a file is not found. */
1124 ierr = ReadOptionalSwarmField(user, "velocity", "Velocity", ti, "dat"); CHKERRQ(ierr);
1125 ierr = ReadOptionalSwarmField(user, "DMSwarm_pid", "Particle ID", ti, "dat"); CHKERRQ(ierr);
1126 ierr = ReadOptionalSwarmField(user, "DMSwarm_CellID", "Cell ID", ti, "dat"); CHKERRQ(ierr);
1127 ierr = ReadOptionalSwarmField(user, "weight", "Particle Weight", ti, "dat"); CHKERRQ(ierr);
1128 ierr = ReadOptionalSwarmField(user, "Psi", "Scalar Psi", ti, "dat"); CHKERRQ(ierr);
1129 ierr = ReadOptionalSwarmField(user, "DMSwarm_location_status", "Migration Status", ti, "dat"); CHKERRQ(ierr);
1130
1131 LOG_ALLOW(GLOBAL, LOG_INFO, "Finished reading DMSwarm fields for timestep %d.\n", ti);
1132 PetscFunctionReturn(0);
1133}
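
The swarm must already hold the correct number of particles before this call; the function fills existing storage rather than resizing. A typical restart sequence, in which the resize helper named here is hypothetical:

    PetscErrorCode ierr;
    PetscInt ti = simCtx->StartStep;
    /* resize the swarm to the particle count recorded for step ti (hypothetical helper) */
    ierr = ResizeSwarmForTimestep(user, ti); CHKERRQ(ierr);
    /* mandatory positions plus whichever optional fields exist on disk */
    ierr = ReadAllSwarmFields(user, ti); CHKERRQ(ierr);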

◆ ReadPositionsFromFile()

PetscErrorCode ReadPositionsFromFile ( PetscInt  timeIndex,
UserCtx *  user,
double **  coordsArray,
PetscInt *  Ncoords 
)

Reads coordinate data (for particles) from file into a PETSc Vec, then gathers it to rank 0.

This function uses ReadFieldData to fill a PETSc Vec with coordinate data, then leverages VecToArrayOnRank0 to gather that data into a contiguous array (valid on rank 0 only).

Parameters
[in]timeIndexThe time index used to construct file names.
[in]userPointer to the user context.
[out]coordsArrayOn rank 0, will point to a newly allocated array holding the coordinates.
[out]NcoordsThe global number of coordinate values; set on all ranks (the array itself is allocated on rank 0 only).
Returns
PetscErrorCode Returns 0 on success, or non-zero on failures.

Definition at line 2192 of file io.c.

2196{
2197 PetscFunctionBeginUser;
2198
2199 PetscErrorCode ierr;
2200 Vec coordsVec;
2201
2202 LOG_ALLOW(GLOBAL, LOG_DEBUG, "ReadPositionsFromFile - Creating coords Vec.\n");
2203 ierr = VecCreate(PETSC_COMM_WORLD, &coordsVec);CHKERRQ(ierr);
2204 ierr = VecSetFromOptions(coordsVec);CHKERRQ(ierr);
2205
2206 // For example: "position" is the name of the coordinate data
2207 ierr = ReadFieldData(user, "position", coordsVec, timeIndex, "dat");
2208 if (ierr) {
2209 LOG_ALLOW(GLOBAL, LOG_ERROR,
2210 "ReadPositionsFromFile - Error reading position data (ti=%d).\n",
2211 timeIndex);
2212 PetscFunctionReturn(ierr);
2213 }
2214
2215 LOG_ALLOW(GLOBAL, LOG_DEBUG, "ReadPositions - Gathering coords Vec to rank 0.\n");
2216 ierr = VecToArrayOnRank0(coordsVec, Ncoords, coordsArray);CHKERRQ(ierr);
2217
2218 ierr = VecDestroy(&coordsVec);CHKERRQ(ierr);
2219
2220 LOG_ALLOW(GLOBAL, LOG_INFO,
2221 "ReadPositionsFromFile - Successfully gathered coordinates. Ncoords=%d.\n", *Ncoords);
2222 PetscFunctionReturn(0);
2223}

◆ ReadFieldDataToRank0()

PetscErrorCode ReadFieldDataToRank0 ( PetscInt  timeIndex,
const char *  fieldName,
UserCtx *  user,
double **  scalarArray,
PetscInt *  Nscalars 
)

Reads a named field from file into a PETSc Vec, then gathers it to rank 0.

This function wraps ReadFieldData and VecToArrayOnRank0 into a single step. The gathered data is stored in scalarArray on rank 0, with its length in Nscalars.

Parameters
[in]timeIndexThe time index used to construct file names.
[in]fieldNameName of the field to be read (e.g., "velocity").
[in]userPointer to the user context.
[out]scalarArrayOn rank 0, a newly allocated array holding the field data.
[out]NscalarsThe global number of scalars; set on all ranks (the array itself is allocated on rank 0 only).
Returns
PetscErrorCode Returns 0 on success, or non-zero on failures.

Definition at line 2240 of file io.c.

2245{
2246 PetscFunctionBeginUser;
2247
2248 PetscErrorCode ierr;
2249 Vec fieldVec;
2250
2251 LOG_ALLOW(GLOBAL, LOG_DEBUG, "ReadFieldDataWrapper - Creating field Vec.\n");
2252 ierr = VecCreate(PETSC_COMM_WORLD, &fieldVec);CHKERRQ(ierr);
2253 ierr = VecSetFromOptions(fieldVec);CHKERRQ(ierr);
2254
2255 ierr = ReadFieldData(user, fieldName, fieldVec, timeIndex, "dat");
2256 if (ierr) {
2257 LOG_ALLOW(GLOBAL, LOG_ERROR,
2258 "ReadFieldDataWrapper - Error reading field '%s' (ti=%d).\n",
2259 fieldName, timeIndex);
2260 PetscFunctionReturn(ierr);
2261 }
2262
2263 LOG_ALLOW(GLOBAL, LOG_DEBUG, "ReadFieldDataWrapper - Gathering field Vec to rank 0.\n");
2264 ierr = VecToArrayOnRank0(fieldVec, Nscalars, scalarArray);CHKERRQ(ierr);
2265
2266 ierr = VecDestroy(&fieldVec);CHKERRQ(ierr);
2267
2268 LOG_ALLOW(GLOBAL, LOG_INFO,
2269 "ReadFieldDataWrapper - Successfully gathered field '%s'. Nscalars=%d.\n",
2270 fieldName, *Nscalars);
2271 PetscFunctionReturn(0);
2272}
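
ReadPositionsFromFile and ReadFieldDataToRank0 are the same three-step wrapper (create a Vec, ReadFieldData, VecToArrayOnRank0); only the field name differs, so one sketch covers both:

    PetscErrorCode ierr;
    double  *vel = NULL;
    PetscInt n   = 0;
    ierr = ReadFieldDataToRank0(ti, "velocity", user, &vel, &n); CHKERRQ(ierr);
    if (vel) {                                    /* rank 0 only */
        /* ... post-process the n gathered scalars ... */
        ierr = PetscFree(vel); CHKERRQ(ierr);     /* buffer comes from VecToArrayOnRank0 */
    }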

◆ DisplayBanner()

PetscErrorCode DisplayBanner ( SimCtx *  simCtx)

Displays a structured banner summarizing the simulation configuration.

This function prints key simulation parameters to standard output. It is intended to be called ONLY by MPI rank 0. It pulls the grid, time-stepping, and solver settings from the SimCtx, reads the boundary condition for each face from the finest-level UserCtx (user->boundary_faces), and computes the global domain bounds from the per-rank bounding-box list (simCtx->bboxlist).

Parameters
[in]simCtxPointer to the SimCtx structure holding the simulation configuration.
Returns
PetscErrorCode Returns 0 on success.

Definition at line 1807 of file io.c.

1808{
1809 PetscErrorCode ierr;
1810 PetscMPIInt rank;
1811 Cmpnts global_min_coords = {0.0, 0.0, 0.0}, global_max_coords = {0.0, 0.0, 0.0}; /* zeroed in case bboxlist is unavailable */
1812 PetscReal StartTime;
1813 PetscInt StartStep,StepsToRun,total_num_particles;
1814 PetscMPIInt num_mpi_procs;
1815
1816 // SimCtx *simCtx = user->simCtx;
1817 UserCtx *user = simCtx->usermg.mgctx[simCtx->usermg.mglevels - 1].user;
1818 num_mpi_procs = simCtx->size;
1819 StartTime = simCtx->StartTime;
1820 StartStep = simCtx->StartStep;
1821 StepsToRun = simCtx->StepsToRun;
1822 total_num_particles = simCtx->np;
1823 BoundingBox *bboxlist_on_rank0 = simCtx->bboxlist;
1824
1825
1826 PetscFunctionBeginUser;
1827
1828 if (!user) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "DisplayBanner - UserCtx pointer is NULL.");
1829 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
1830
1831 if (rank == 0) {
1832 // If global_domain_bbox is not pre-populated in UserCtx, compute it here from bboxlist_on_rank0
1833 // This assumes bboxlist_on_rank0 is valid and contains all local bounding boxes on rank 0.
1834 if (bboxlist_on_rank0 && num_mpi_procs > 0) {
1835 global_min_coords = bboxlist_on_rank0[0].min_coords;
1836 global_max_coords = bboxlist_on_rank0[0].max_coords;
1837 for (PetscMPIInt p = 1; p < num_mpi_procs; ++p) {
1838 global_min_coords.x = PetscMin(global_min_coords.x, bboxlist_on_rank0[p].min_coords.x);
1839 global_min_coords.y = PetscMin(global_min_coords.y, bboxlist_on_rank0[p].min_coords.y);
1840 global_min_coords.z = PetscMin(global_min_coords.z, bboxlist_on_rank0[p].min_coords.z);
1841 global_max_coords.x = PetscMax(global_max_coords.x, bboxlist_on_rank0[p].max_coords.x);
1842 global_max_coords.y = PetscMax(global_max_coords.y, bboxlist_on_rank0[p].max_coords.y);
1843 global_max_coords.z = PetscMax(global_max_coords.z, bboxlist_on_rank0[p].max_coords.z);
1844 }
1845 // Optionally store this in user->global_domain_bbox if it's useful elsewhere
1846 // user->global_domain_bbox.min_coords = global_min_coords;
1847 // user->global_domain_bbox.max_coords = global_max_coords;
1848 } else {
1849 // Fallback or warning if bboxlist is not available for global calculation
1850 LOG_ALLOW(LOCAL, LOG_WARNING, "DisplayBanner (Rank 0) - bboxlist not provided or num_mpi_procs <= 0; domain bounds shown below may be zero.\n");
1851 // global_min_coords = user->bbox.min_coords; // Use local bbox of rank 0 as fallback
1852 // global_max_coords = user->bbox.max_coords;
1853 }
1854
1855
1856 ierr = PetscPrintf(PETSC_COMM_SELF, "\n"); CHKERRQ(ierr);
1857 ierr = PetscPrintf(PETSC_COMM_SELF, "=============================================================\n"); CHKERRQ(ierr);
1858 ierr = PetscPrintf(PETSC_COMM_SELF, " CASE SUMMARY \n"); CHKERRQ(ierr);
1859 ierr = PetscPrintf(PETSC_COMM_SELF, "=============================================================\n"); CHKERRQ(ierr);
1860 ierr = PetscPrintf(PETSC_COMM_SELF, " Grid Points : %d X %d X %d\n", user->IM, user->JM, user->KM); CHKERRQ(ierr);
1861 ierr = PetscPrintf(PETSC_COMM_SELF, " Cells : %d X %d X %d\n", user->IM - 1, user->JM - 1, user->KM - 1); CHKERRQ(ierr);
1862 ierr = PetscPrintf(PETSC_COMM_SELF, " Global Domain Bounds (X) : %.6f to %.6f\n", (double)global_min_coords.x, (double)global_max_coords.x); CHKERRQ(ierr);
1863 ierr = PetscPrintf(PETSC_COMM_SELF, " Global Domain Bounds (Y) : %.6f to %.6f\n", (double)global_min_coords.y, (double)global_max_coords.y); CHKERRQ(ierr);
1864 ierr = PetscPrintf(PETSC_COMM_SELF, " Global Domain Bounds (Z) : %.6f to %.6f\n", (double)global_min_coords.z, (double)global_max_coords.z); CHKERRQ(ierr);
1865 ierr = PetscPrintf(PETSC_COMM_SELF, "-------------------- Boundary Conditions --------------------\n"); CHKERRQ(ierr);
1866 const int face_name_width = 17; // Adjusted for longer names (Zeta,Eta,Xi)
1867 for (PetscInt i_face = 0; i_face < 6; ++i_face) {
1868 BCFace current_face = (BCFace)i_face;
1869 // The BCFaceToString will now return the Xi, Eta, Zeta versions
1870 const char* face_str = BCFaceToString(current_face);
1871 const char* bc_type_str = BCTypeToString(user->boundary_faces[current_face].mathematical_type);
1872
1873 ierr = PetscPrintf(PETSC_COMM_SELF, " Face %-*s : %s\n",
1874 face_name_width, face_str, bc_type_str); CHKERRQ(ierr);
1875 }
1876 ierr = PetscPrintf(PETSC_COMM_SELF, "-------------------------------------------------------------\n"); CHKERRQ(ierr);
1877 ierr = PetscPrintf(PETSC_COMM_SELF, " Start Time : %.4f\n", (double)StartTime); CHKERRQ(ierr);
1878 ierr = PetscPrintf(PETSC_COMM_SELF, " Timestep Size : %.4f\n", (double)simCtx->dt); CHKERRQ(ierr);
1879 ierr = PetscPrintf(PETSC_COMM_SELF, " Starting Step : %d\n", StartStep); CHKERRQ(ierr);
1880 ierr = PetscPrintf(PETSC_COMM_SELF, " Total Steps to Run : %d\n", StepsToRun); CHKERRQ(ierr);
1881 ierr = PetscPrintf(PETSC_COMM_SELF, " Number of MPI Processes : %d\n", num_mpi_procs); CHKERRQ(ierr);
1882 ierr = PetscPrintf(PETSC_COMM_WORLD," Number of Particles : %d\n", total_num_particles); CHKERRQ(ierr);
1883 ierr = PetscPrintf(PETSC_COMM_WORLD," Reynolds Number : %le\n", simCtx->ren); CHKERRQ(ierr);
1884 ierr = PetscPrintf(PETSC_COMM_WORLD," Stanton Number : %le\n", simCtx->st); CHKERRQ(ierr);
1885 ierr = PetscPrintf(PETSC_COMM_WORLD," CFL Number : %le\n", simCtx->cfl); CHKERRQ(ierr);
1886 ierr = PetscPrintf(PETSC_COMM_WORLD," Von-Neumann Number : %le\n", simCtx->vnn); CHKERRQ(ierr);
1887 ierr = PetscPrintf(PETSC_COMM_SELF, " Particle Initialization Mode: %d\n", simCtx->ParticleInitialization); CHKERRQ(ierr);
1888 if (simCtx->ParticleInitialization == 0) {
1889 if (user->inletFaceDefined) {
1890 ierr = PetscPrintf(PETSC_COMM_SELF, " Particles Initialized At : %s (Enum Val: %d)\n", BCFaceToString(user->identifiedInletBCFace), user->identifiedInletBCFace); CHKERRQ(ierr);
1891 } else {
1892 ierr = PetscPrintf(PETSC_COMM_SELF, " Particles Initialized At : --- (No INLET face identified)\n"); CHKERRQ(ierr);
1893 }
1894 }
1895 ierr = PetscPrintf(PETSC_COMM_SELF, " Field Initialization Mode : %d\n", simCtx->FieldInitialization); CHKERRQ(ierr);
1896 if (simCtx->FieldInitialization == 1) {
1897 ierr = PetscPrintf(PETSC_COMM_SELF, " Constant Velocity : x - %.4f, y - %.4f, z - %.4f \n", (double)simCtx->InitialConstantContra.x,(double)simCtx->InitialConstantContra.y,(double)simCtx->InitialConstantContra.z ); CHKERRQ(ierr);
1898 }
1899
1900 ierr = PetscPrintf(PETSC_COMM_SELF, "=============================================================\n"); CHKERRQ(ierr);
1901 ierr = PetscPrintf(PETSC_COMM_SELF, "\n"); CHKERRQ(ierr);
1902 }
1903 PetscFunctionReturn(0);
1904}

◆ StringToBCFace()

PetscErrorCode StringToBCFace ( const char *  str,
BCFace *  face_out
)

Converts a string representation of a face to a BCFace enum.

Parameters
strThe input string (e.g., "-Xi", "+Zeta"). Case-insensitive.
[out]face_outThe resulting BCFace enum.
Returns
0 on success.

Definition at line 246 of file io.c.

246 {
247 if (strcasecmp(str, "-Xi") == 0) *face_out = BC_FACE_NEG_X;
248 else if (strcasecmp(str, "+Xi") == 0) *face_out = BC_FACE_POS_X;
249 else if (strcasecmp(str, "-Eta") == 0) *face_out = BC_FACE_NEG_Y;
250 else if (strcasecmp(str, "+Eta") == 0) *face_out = BC_FACE_POS_Y;
251 else if (strcasecmp(str, "-Zeta") == 0) *face_out = BC_FACE_NEG_Z;
252 else if (strcasecmp(str, "+Zeta") == 0) *face_out = BC_FACE_POS_Z;
253 else SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_UNKNOWN_TYPE, "Unknown face specifier: %s", str);
254 return 0;
255}

◆ StringToBCType()

PetscErrorCode StringToBCType ( const char *  str,
BCType *  type_out
)

Converts a string representation of a BC type to a BCType enum.

Parameters
strThe input string (e.g., "WALL", "INLET"). Case-insensitive.
[out]type_outThe resulting BCType enum.
Returns
0 on success.

Definition at line 263 of file io.c.

263 {
264 if (strcasecmp(str, "WALL") == 0) *type_out = WALL;
265 else if (strcasecmp(str, "SYMMETRY") == 0) *type_out = SYMMETRY;
266 else if (strcasecmp(str, "INLET") == 0) *type_out = INLET;
267 else if (strcasecmp(str, "OUTLET") == 0) *type_out = OUTLET;
268 else if (strcasecmp(str, "NOGRAD") == 0) *type_out = NOGRAD;
269 // ... add other BCTypes here ...
270 else SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_UNKNOWN_TYPE, "Unknown BC Type string: %s", str);
271 return 0;
272}

◆ StringToBCHandlerType()

PetscErrorCode StringToBCHandlerType ( const char *  str,
BCHandlerType *  handler_out
)

Converts a string representation of a handler to a BCHandlerType enum.

Parameters
strThe input string (e.g., "noslip", "constant_velocity"). Case-insensitive.
[out]handler_outThe resulting BCHandlerType enum.
Returns
0 on success.

Definition at line 280 of file io.c.

280 {
281 if (strcasecmp(str, "noslip") == 0) *handler_out = BC_HANDLER_WALL_NOSLIP;
282 else if (strcasecmp(str, "constant_velocity") == 0) *handler_out = BC_HANDLER_INLET_CONSTANT_VELOCITY;
283 else if (strcasecmp(str, "conservation") == 0) *handler_out = BC_HANDLER_OUTLET_CONSERVATION;
284 else if (strcasecmp(str, "allcopy") == 0) *handler_out = BC_HANDLER_NOGRAD_COPY_GHOST;
285 else if (strcasecmp(str, "parabolic") == 0) *handler_out = BC_HANDLER_INLET_PARABOLIC;
286 // ... add other BCHandlerTypes here ...
287 else SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_UNKNOWN_TYPE, "Unknown BC Handler string: %s", str);
288 return 0;
289}

◆ ValidateBCHandlerForBCType()

PetscErrorCode ValidateBCHandlerForBCType ( BCType  type,
BCHandlerType  handler 
)

Validates that a specific handler is compatible with a general BC type.

Parameters
typeThe general BCType.
handlerThe specific BCHandlerType.
Returns
0 if compatible, error code otherwise.

Definition at line 297 of file io.c.

297 {
298 switch (type) {
299 case NOGRAD:
300 if(handler != BC_HANDLER_NOGRAD_COPY_GHOST) return PETSC_ERR_ARG_WRONG;
301 break;
302 case WALL:
303 if (handler != BC_HANDLER_WALL_NOSLIP && handler != BC_HANDLER_WALL_MOVING) return PETSC_ERR_ARG_WRONG;
304 break;
305 case INLET:
306 if (handler != BC_HANDLER_INLET_CONSTANT_VELOCITY && handler != BC_HANDLER_INLET_PARABOLIC) return PETSC_ERR_ARG_WRONG;
307 break;
308 // ... add other validation cases here ...
309 default: break;
310 }
311 return 0; // Combination is valid
312}
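
Taken together, the three string converters plus this validator turn one token triple from bcs.dat into checked enums, which is exactly what ParseAllBoundaryConditions does per line. A standalone sketch:

    PetscErrorCode ierr;
    BCFace        face;
    BCType        type;
    BCHandlerType handler;
    ierr = StringToBCFace("-Zeta", &face);                        CHKERRQ(ierr);
    ierr = StringToBCType("INLET", &type);                        CHKERRQ(ierr);
    ierr = StringToBCHandlerType("constant_velocity", &handler);  CHKERRQ(ierr);
    ierr = ValidateBCHandlerForBCType(type, handler);             CHKERRQ(ierr); /* INLET + constant_velocity is a valid pair */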

◆ FreeBC_ParamList()

void FreeBC_ParamList ( BC_Param *  head)

Frees the memory allocated for a linked list of BC_Param structs.

Parameters
headA pointer to the head of the linked list to be freed.

Definition at line 229 of file io.c.

229 {
230 BC_Param *current = head;
231 while (current != NULL) {
232 BC_Param *next = current->next;
233 PetscFree(current->key);
234 PetscFree(current->value);
235 PetscFree(current);
236 current = next;
237 }
238}

◆ ParseAllBoundaryConditions()

PetscErrorCode ParseAllBoundaryConditions ( UserCtx *  user,
const char *  bcs_input_filename 
)

Parses the boundary conditions file to configure the type, handler, and any associated parameters for all 6 global faces of the domain.

This function performs the following steps:

  1. On MPI rank 0, it reads the specified configuration file line-by-line.
  2. It parses each line for <Face> <Type> <Handler> [param=value]... format.
  3. It validates the parsed strings and stores the configuration, including a linked list of parameters, in a temporary array.
  4. It then serializes this configuration and broadcasts it to all other MPI ranks.
  5. All ranks (including rank 0) then deserialize the broadcasted data to populate their local user->boundary_faces array identically.
  6. It also sets legacy fields in UserCtx for compatibility with other modules.
Parameters
[in,out]userThe main UserCtx struct where the final configuration for all ranks will be stored.
[in]bcs_input_filenameThe path to the boundary conditions configuration file.
Returns
PetscErrorCode 0 on success, error code on failure.

Definition at line 339 of file io.c.

340{
341 PetscErrorCode ierr;
342 PetscMPIInt rank;
343
344 // Temporary storage for rank 0 to build the configuration before broadcasting.
345 BoundaryFaceConfig configs_rank0[6];
346
347 PetscFunctionBeginUser;
348 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
349
350 if (rank == 0) {
351 FILE *file;
352 char line_buffer[1024];
353
354 // Initialize the temporary config array with safe defaults on rank 0.
355 for (int i = 0; i < 6; i++) {
356 configs_rank0[i].face_id = (BCFace)i;
357 configs_rank0[i].mathematical_type = WALL;
358 configs_rank0[i].handler_type = BC_HANDLER_WALL_NOSLIP;
359 configs_rank0[i].params = NULL;
360 configs_rank0[i].handler = NULL; // Handler object is not created here.
361 }
362
363 LOG_ALLOW(GLOBAL, LOG_INFO, "Parsing BC configuration from '%s' on rank 0... \n", bcs_input_filename);
364 file = fopen(bcs_input_filename, "r");
365 if (!file) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_FILE_OPEN, "Could not open BCs file '%s'.", bcs_input_filename);
366
367 while (fgets(line_buffer, sizeof(line_buffer), file)) {
368 char *current_pos = line_buffer;
369 while (isspace((unsigned char)*current_pos)) current_pos++; // Skip leading whitespace
370 if (*current_pos == '#' || *current_pos == '\0' || *current_pos == '\n' || *current_pos == '\r') continue;
371
372 char *face_str = strtok(current_pos, " \t\n\r");
373 char *type_str = strtok(NULL, " \t\n\r");
374 char *handler_str = strtok(NULL, " \t\n\r");
375
376 if (!face_str || !type_str || !handler_str) {
377 LOG_ALLOW(GLOBAL, LOG_WARNING, "Malformed line in bcs.dat, skipping: %s \n", line_buffer);
378 continue;
379 }
380
381 BCFace face_enum;
382 BCType type_enum;
383 BCHandlerType handler_enum;
384 const char* handler_name_for_log;
385
386 // --- Convert strings to enums and validate ---
387 ierr = StringToBCFace(face_str, &face_enum); CHKERRQ(ierr);
388 ierr = StringToBCType(type_str, &type_enum); CHKERRQ(ierr);
389 ierr = StringToBCHandlerType(handler_str, &handler_enum); CHKERRQ(ierr);
390 ierr = ValidateBCHandlerForBCType(type_enum, handler_enum);
391 if (ierr) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, "Validation failed: Handler '%s' is not valid for Type '%s' on Face '%s'.\n", handler_str, type_str, face_str);
392
393 // Store the core types for the corresponding face
394 configs_rank0[face_enum].mathematical_type = type_enum;
395 configs_rank0[face_enum].handler_type = handler_enum;
396 handler_name_for_log = BCHandlerTypeToString(handler_enum); // Assumes this utility exists
397 LOG_ALLOW(GLOBAL, LOG_DEBUG, " Parsed Face '%s': Type=%s, Handler=%s \n", face_str, type_str, handler_name_for_log);
398
399 // --- Parse optional key=value parameters for this face ---
400 FreeBC_ParamList(configs_rank0[face_enum].params); // Clear any previous (default) params
401 configs_rank0[face_enum].params = NULL;
402 BC_Param **param_next_ptr = &configs_rank0[face_enum].params; // Pointer to the 'next' pointer to build the list
403
404 char* token;
405 while ((token = strtok(NULL, " \t\n\r")) != NULL) {
406 char* equals_ptr = strchr(token, '=');
407 if (!equals_ptr) {
408 LOG_ALLOW(GLOBAL, LOG_WARNING, "Malformed parameter '%s' on face '%s', skipping. \n", token, face_str);
409 continue;
410 }
411
412 *equals_ptr = '\0'; // Temporarily split the string at '=' to separate key and value
413 char* key_str = token;
414 char* value_str = equals_ptr + 1;
415
416 BC_Param *new_param;
417 ierr = PetscMalloc1(1, &new_param); CHKERRQ(ierr);
418 ierr = PetscStrallocpy(key_str, &new_param->key); CHKERRQ(ierr);
419 ierr = PetscStrallocpy(value_str, &new_param->value); CHKERRQ(ierr);
420 new_param->next = NULL;
421
422 *param_next_ptr = new_param;
423 param_next_ptr = &new_param->next;
424 LOG_ALLOW(GLOBAL, LOG_DEBUG, " - Found param: [%s] = [%s] \n", new_param->key, new_param->value);
425 }
426 }
427 fclose(file);
428 }
429
430 // =========================================================================
431 // BROADCASTING THE CONFIGURATION FROM RANK 0
432 // =========================================================================
433 // This is a critical step to ensure all processes have the same configuration.
434
435 LOG_ALLOW(LOCAL, LOG_DEBUG, "Rank %d broadcasting/receiving BC configuration.\n", rank);
436
437 for (int i = 0; i < 6; i++) {
438 // --- Broadcast simple enums ---
439 if (rank == 0) {
440 user->boundary_faces[i] = configs_rank0[i]; // Rank 0 populates its final struct
441 }
442 ierr = MPI_Bcast(&user->boundary_faces[i].mathematical_type, 1, MPI_INT, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
443 ierr = MPI_Bcast(&user->boundary_faces[i].handler_type, 1, MPI_INT, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
444
445 // --- Serialize and Broadcast the parameter linked list ---
446 PetscInt n_params = 0;
447 if (rank == 0) { // On rank 0, count the number of parameters to send
448 for (BC_Param *p = user->boundary_faces[i].params; p; p = p->next) n_params++;
449 }
450 ierr = MPI_Bcast(&n_params, 1, MPIU_INT, 0, PETSC_COMM_WORLD);CHKERRQ(ierr); /* n_params is a PetscInt */
451
452 if (rank != 0) { // Non-root ranks need to receive and build the list
453 FreeBC_ParamList(user->boundary_faces[i].params); // Ensure list is empty before building
454 user->boundary_faces[i].params = NULL;
455 }
456
457 BC_Param **param_next_ptr = &user->boundary_faces[i].params;
458
459 for (int j = 0; j < n_params; j++) {
460 char key_buf[256] = {0}, val_buf[256] = {0};
461 if (rank == 0) {
462 // On rank 0, navigate to the j-th param and copy its data to buffers
463 BC_Param *p = user->boundary_faces[i].params;
464 for (int k = 0; k < j; k++) p = p->next;
465 strncpy(key_buf, p->key, 255);
466 strncpy(val_buf, p->value, 255);
467 }
468
469 ierr = MPI_Bcast(key_buf, 256, MPI_CHAR, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
470 ierr = MPI_Bcast(val_buf, 256, MPI_CHAR, 0, PETSC_COMM_WORLD); CHKERRQ(ierr);
471
472 if (rank != 0) {
473 // On non-root ranks, deserialize: create a new node and append it
474 BC_Param *new_param;
475 ierr = PetscMalloc1(1, &new_param); CHKERRQ(ierr);
476 ierr = PetscStrallocpy(key_buf, &new_param->key); CHKERRQ(ierr);
477 ierr = PetscStrallocpy(val_buf, &new_param->value); CHKERRQ(ierr);
478 new_param->next = NULL;
479 *param_next_ptr = new_param;
480 param_next_ptr = &new_param->next;
481 } else {
482 // On rank 0, just advance the pointer for the next iteration
483 param_next_ptr = &((*param_next_ptr)->next);
484 }
485 }
486 user->boundary_faces[i].face_id = (BCFace)i; // Ensure face_id is set on all ranks
487 }
488
489 // --- Set legacy fields for compatibility with particle system ---
490 user->inletFaceDefined = PETSC_FALSE;
491 for (int i=0; i<6; i++) {
492 if (user->boundary_faces[i].mathematical_type == INLET) {
493 user->inletFaceDefined = PETSC_TRUE;
494 user->identifiedInletBCFace = (BCFace)i;
495 LOG_ALLOW(GLOBAL, LOG_INFO, "Inlet face for particle initialization identified as Face %d.\n", i);
496 break; // Found the first one, stop looking
497 }
498 }
499
500 if (rank == 0) {
501 // Rank 0 can now free the linked lists it created, as they have been broadcast.
502 // Or, if user->boundary_faces was used directly, this is not needed.
503 }
504
505 PetscFunctionReturn(0);
506}
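
For reference, a small bcs.dat in the <Face> <Type> <Handler> [param=value]... format the parser expects. The faces, types, and handler names below come from the converters above; the parameter names (vx, vy, vz) are illustrative, since which keys a handler reads is up to that handler:

    # <Face>   <Type>    <Handler>            [param=value] ...
    -Zeta      INLET     constant_velocity    vx=0.0 vy=0.0 vz=1.0
    +Zeta      OUTLET    conservation
    -Xi        WALL      noslip
    +Xi        WALL      noslip
    -Eta       WALL      noslip
    +Eta       WALL      noslip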

◆ TrimWhitespace()

void TrimWhitespace ( char *  str)

Helper function to trim leading/trailing whitespace from a string.

Parameters
strThe string to trim in-place.

Definition at line 1910 of file io.c.

1910 {
1911  char *start = str, *end;
1912  // Trim leading space: shift the string in place so the caller's buffer is updated
1913  while(isspace((unsigned char)*start)) start++;
1914  if (start != str) memmove(str, start, strlen(start) + 1);
1915  if(*str == 0) return; // All spaces?
1916  // Trim trailing space
1917  end = str + strlen(str) - 1;
1918  while(end > str && isspace((unsigned char)*end)) end--;
1919  end[1] = '\0';
1920 }

◆ ParsePostProcessingSettings()

PetscErrorCode ParsePostProcessingSettings ( SimCtx *  simCtx)

Initializes post-processing settings from a config file and command-line overrides.

This function establishes the configuration for a post-processing run by:

  1. Setting hardcoded default values in the PostProcessParams struct.
  2. Reading a configuration file to override the defaults.
  3. Parsing command-line options (-startTime, -endTime, etc.) which can override both the defaults and the file settings.
Parameters
simCtxThe pointer to the simulation context that contains the postprocessing file and struct.
Returns
PetscErrorCode


Definition at line 1935 of file io.c.

1936{
1937 FILE *file;
1938 char line[1024];
1939 PetscBool startTimeSet, endTimeSet, timeStepSet;
1940
1941 PetscFunctionBeginUser;
1942
1943 if (!simCtx || !simCtx->pps) {
1944 SETERRQ(PETSC_COMM_WORLD, PETSC_ERR_ARG_NULL, "SimCtx or its pps member is NULL in ParsePostProcessingSettings.");
1945 }
1946
1947 char *configFile = simCtx->PostprocessingControlFile;
1948 PostProcessParams *pps = simCtx->pps;
1949
1950
1951 // --- 1. Set Sane Defaults First ---
1952 pps->startTime = 0;
1953 pps->endTime = 0;
1954 pps->timeStep = 1;
1955 pps->outputParticles = PETSC_FALSE;
1956 pps->particle_output_freq = simCtx->LoggingFrequency; // Default to logging frequency;
1957 strcpy(pps->process_pipeline, "");
1958 strcpy(pps->output_fields_instantaneous, "Ucat,P");
1959 strcpy(pps->output_fields_averaged, "");
1960 strcpy(pps->output_prefix, "results/viz");
1961 strcpy(pps->particle_output_prefix,"results/viz");
1962 strcpy(pps->particle_fields,"velocity,CellID,weight,pid");
1963 strcpy(pps->particle_pipeline,"");
1964 strcpy(pps->particleExt,"dat"); // The input file format for particles.
1965 strcpy(pps->eulerianExt,"dat"); // The input file format for Eulerian fields.
1966
1967 // --- 2. Parse the Configuration File (overrides defaults) ---
1968 file = fopen(configFile, "r");
1969 if (file) {
1970 LOG_ALLOW(GLOBAL, LOG_INFO, "Parsing post-processing config file: %s\n", configFile);
1971 while (fgets(line, sizeof(line), file)) {
1972 char *key, *value, *comment;
1973 comment = strchr(line, '#'); if (comment) *comment = '\0';
1974 TrimWhitespace(line); if (strlen(line) == 0) continue;
1975 key = strtok(line, "="); value = strtok(NULL, "=");
1976 if (key && value) {
1977 TrimWhitespace(key); TrimWhitespace(value);
1978 if (strcmp(key, "startTime") == 0) pps->startTime = atoi(value);
1979 else if (strcmp(key, "endTime") == 0) pps->endTime = atoi(value);
1980 else if (strcmp(key, "timeStep") == 0) pps->timeStep = atoi(value);
1981 else if (strcmp(key, "output_particles") == 0) {
1982 if (strcasecmp(value, "true") == 0) pps->outputParticles = PETSC_TRUE;
1983 }
1984 else if (strcasecmp(key, "process_pipeline") == 0) {
1985 strncpy(pps->process_pipeline, value, MAX_PIPELINE_LENGTH - 1);
1986 pps->process_pipeline[MAX_PIPELINE_LENGTH - 1] = '\0'; // Ensure null-termination
1987 } else if (strcasecmp(key, "output_fields_instantaneous") == 0) {
1988 strncpy(pps->output_fields_instantaneous, value, MAX_FIELD_LIST_LENGTH - 1);
1989 pps->output_fields_instantaneous[MAX_FIELD_LIST_LENGTH - 1] = '\0';
1990 } else if (strcasecmp(key, "output_prefix") == 0) {
1991 strncpy(pps->output_prefix, value, MAX_FILENAME_LENGTH - 1);
1992 pps->output_prefix[MAX_FILENAME_LENGTH - 1] = '\0';
1993 } else if (strcasecmp(key, "particle_output_prefix") == 0) {
1994 strncpy(pps->particle_output_prefix, value, MAX_FILENAME_LENGTH - 1);
1995 pps->particle_output_prefix[MAX_FILENAME_LENGTH - 1] = '\0';
1996 } else if (strcasecmp(key, "particle_fields_instantaneous") == 0) {
1997 strncpy(pps->particle_fields, value, MAX_FIELD_LIST_LENGTH - 1);
1998 pps->particle_fields[MAX_FIELD_LIST_LENGTH - 1] = '\0';
1999 } else if (strcasecmp(key, "particle_pipeline") == 0) {
2000 strncpy(pps->particle_pipeline, value, MAX_PIPELINE_LENGTH - 1);
2001 pps->particle_pipeline[MAX_PIPELINE_LENGTH - 1] = '\0';
2002 } else if (strcasecmp(key, "particle_output_freq") == 0) {
2003 pps->particle_output_freq = atoi(value);
2004 } else {
2005 LOG_ALLOW(GLOBAL, LOG_WARNING, "Unknown key '%s' in post-processing config file. Ignoring.\n", key);
2006 }
2007 // Add parsing for pipeline, fields, etc. in later phases
2008 }
2009 }
2010 fclose(file);
2011 } else {
2012 LOG_ALLOW(GLOBAL, LOG_WARNING, "Could not open post-processing config file '%s'. Using defaults and command-line overrides.\n", configFile);
2013 }
2014
2015 // --- 3. Parse Command-Line Options (overrides file settings and defaults) ---
2016 PetscOptionsGetInt(NULL, NULL, "-startTime", &pps->startTime, &startTimeSet);
2017 PetscOptionsGetInt(NULL, NULL, "-endTime", &pps->endTime, &endTimeSet);
2018 PetscOptionsGetInt(NULL, NULL, "-timeStep", &pps->timeStep, &timeStepSet);
2019 PetscOptionsGetBool(NULL, NULL, "-output_particles", &pps->outputParticles, NULL);
2020
2021 // If only startTime is given on command line, run for a single step
2022 if (startTimeSet && !endTimeSet) {
2023 pps->endTime = pps->startTime;
2024 }
2025
2026 LOG_ALLOW(GLOBAL, LOG_INFO, "Post-processing configured to run from t=%d to t=%d with step %d. Particle output: %s.\n",
2027 pps->startTime, pps->endTime, pps->timeStep, pps->outputParticles ? "TRUE" : "FALSE");
2028
2029 LOG_ALLOW(GLOBAL, LOG_INFO, "Process Pipeline: %s\n", pps->process_pipeline);
2030 LOG_ALLOW(GLOBAL, LOG_INFO, "Instantaneous Output Fields: %s\n", pps->output_fields_instantaneous);
2031 LOG_ALLOW(GLOBAL, LOG_INFO, "Output Prefix: %s\n", pps->output_prefix);
2032 LOG_ALLOW(GLOBAL, LOG_INFO, "Particle Output Prefix: %s\n", pps->particle_output_prefix);
2033 LOG_ALLOW(GLOBAL, LOG_INFO, "Particle Fields: %s\n", pps->particle_fields);
2034 LOG_ALLOW(GLOBAL, LOG_INFO, "Particle Pipeline: %s\n", pps->particle_pipeline);
2035 LOG_ALLOW(GLOBAL, LOG_INFO, "Particle Output Frequency: %d\n", pps->particle_output_freq);
2036 PetscFunctionReturn(0);
2037}
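
For reference, a control file exercising the keys this parser recognizes (values are illustrative; '#' starts a comment, and unknown keys are warned about and ignored):

    # post-processing control file
    startTime = 0
    endTime   = 1000
    timeStep  = 50
    output_particles = true
    output_fields_instantaneous = Ucat,P
    output_prefix = results/viz
    particle_fields_instantaneous = velocity,CellID,weight,pid
    particle_output_freq = 10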