PICurv 0.1.0
A Parallel Particle-In-Cell Solver for Curvilinear LES
Particle-to-Grid Scattering

Functions for scattering particle data (scalar or vector) onto Eulerian grid fields by averaging contributions within each cell. More...


Modules

 Internal Scattering Helpers
 Lower-level functions used by the main scattering routines.
 

Macros

#define __FUNCT__   "InterpolateAllFieldsToSwarm"
 Interpolates a cell-centered field (scalar or vector) onto DMSwarm particles, converting the cell-center data to corner data first, then looping over particles.
 
#define __FUNCT__   "ScatterAllParticleFieldsToEulerFields"
 Scatters a predefined set of particle fields to their corresponding Eulerian fields.
 

Functions

PetscErrorCode GetScatterTargetInfo (UserCtx *user, const char *particleFieldName, DM *targetDM, PetscInt *expected_dof)
 Determines the target Eulerian DM and expected DOF for scattering a given particle field.
 
PetscErrorCode ScatterParticleFieldToEulerField (UserCtx *user, const char *particleFieldName, Vec eulerFieldAverageVec)
 Scatters a particle field (scalar or vector) to the corresponding Eulerian field average.
 
PetscErrorCode ScatterAllParticleFieldsToEulerFields (UserCtx *user)
 Scatters a predefined set of particle fields to their corresponding Eulerian fields.
 
PetscErrorCode CalculateParticleCountPerCell (UserCtx *user)
 Counts particles in each cell of the DMDA 'da' and stores the result in user->ParticleCount.
 

Detailed Description

Functions for scattering particle data (scalar or vector) onto Eulerian grid fields by averaging contributions within each cell.

This file provides a modular set of functions to perform particle-to-grid projection, specifically calculating cell-averaged quantities from particle properties. It assumes a PETSc environment using DMDA for the grids and DMSwarm for particles.
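In schematic form, for a particle-carried quantity q_p, the scatter produces a per-cell average (a sketch of the accumulate-then-normalize design; the symbol N_{ijk} denotes the entry of user->ParticleCount for cell (i,j,k)):

\[
\bar{q}_{ijk} \;=\; \frac{1}{N_{ijk}} \sum_{p \,\in\, \mathrm{cell}(i,j,k)} q_p
\]

How cells with N_{ijk} = 0 are treated is decided inside the internal normalization helper.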

Key Features:

• GetScatterTargetInfo() encapsulates the policy that maps particle field names to target DMs and DOFs.
• ScatterParticleFieldToEulerField() scatters a single scalar or vector field and stores the cell-averaged result in place.
• ScatterAllParticleFieldsToEulerFields() is a convenience wrapper that scatters a predefined set of fields.
• CalculateParticleCountPerCell() computes the per-cell particle counts used for normalization.

Dependencies:

• PETSc (DMDA for the Eulerian grids, DMSwarm for the particles).
• The project's logging utilities (LOG_ALLOW and related macros).
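A minimal end-to-end sketch of the intended call order, assuming a fully initialized UserCtx; the loop bound nsteps and the particle-advection step are illustrative placeholders, not part of this module:

    PetscErrorCode ierr;
    PetscInt       step, nsteps = 10; /* illustrative */
    for (step = 0; step < nsteps; ++step) {
        /* ... advect/relocate particles; refresh "DMSwarm_CellID" ... */
        ierr = CalculateParticleCountPerCell(user); CHKERRQ(ierr);         /* counts must be current */
        ierr = ScatterAllParticleFieldsToEulerFields(user); CHKERRQ(ierr); /* cell averages into user->Psi, ... */
    }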

Macro Definition Documentation

◆ __FUNCT__ [1/2]

#define __FUNCT__   "InterpolateAllFieldsToSwarm"

Interpolates a cell-centered field (scalar or vector) onto DMSwarm particles, converting the cell-center data to corner data first, then looping over particles.

Steps:

1. Check that the Vec has blockSize = 1 or 3 (scalar vs. vector).
2. Map the cell-centered Vec to a local array (fieldGlobal -> localPtr).
3. Allocate a corner array (cornerPtr) via Allocate3DArray(...), sized (zm+1, ym+1, xm+1).
4. Convert from cell centers to corners via InterpolateFieldFromCenterToCorner(...).
5. Restore the cell-centered array.
6. Retrieve the DMSwarm fields "DMSwarm_CellID", "weight", and swarmOutFieldName.
7. Loop over local particles, clamp i/j/k, skip or zero out if out of range, and read (a1,a2,a3).
8. Call InterpolateEulerFieldToSwarmForParticle(...) with cornerPtr to do the final interpolation.
9. Restore the swarm fields and free the corner array.

Parameters
[in] user User context with:
  • user->da (cell-centered DMDA),
  • user->swarm (DMSwarm).
[in] fieldGlobal Vec with blockSize = 1 or 3, storing the cell-centered field.
[in] fieldName Human-readable field name for logging (e.g., "velocity").
[in] swarmOutFieldName Name of the DMSwarm field where the interpolation results go.
Returns
PetscErrorCode 0 on success, non-zero on error.

Definition at line 1647 of file interpolation.c.

◆ __FUNCT__ [2/2]

#define __FUNCT__   "ScatterAllParticleFieldsToEulerFields"

Scatters a predefined set of particle fields to their corresponding Eulerian fields; see ScatterAllParticleFieldsToEulerFields() below for details.

Definition at line 1647 of file interpolation.c.

Function Documentation

◆ GetScatterTargetInfo()

PetscErrorCode GetScatterTargetInfo ( UserCtx *  user,
const char *  particleFieldName,
DM *  targetDM,
PetscInt *  expected_dof 
)

Determines the target Eulerian DM and expected DOF for scattering a given particle field.

Based on hardcoded rules mapping particle field names ("Psi", "Nvert", "Ucat", "Ucont") to user context DMs (user->da or user->fda). This function encapsulates the policy of where different fields should be scattered; modify it to add rules for custom fields.

Parameters
[in] user Pointer to the UserCtx containing the required DMs (da, fda).
[in] particleFieldName Name of the particle field (e.g., "Psi", "Ucat").
[out] targetDM Pointer to store the determined target DM (user->da or user->fda).
[out] expected_dof Pointer to store the expected DOF (1 or 3) for this field.
Returns
PetscErrorCode Returns 0 on success. Error codes:
  • PETSC_ERR_ARG_NULL if required inputs are NULL.
  • PETSC_ERR_ARG_WRONG if particleFieldName is not recognized.

Definition at line 1784 of file interpolation.c.

1786{
1787 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
1788 PetscFunctionBeginUser;
1789
1790 // --- Input Validation ---
1791 // Check for NULL pointers in essential inputs
1792 if (!user) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx pointer is NULL.");
1793 if (!user->da) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->da is NULL.");
1794 if (!user->fda) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->fda is NULL."); // Needed for vector fields
1795 if (!particleFieldName) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "particleFieldName is NULL.");
1796 if (!targetDM) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output targetDM pointer is NULL.");
1797 if (!expected_dof) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output expected_dof pointer is NULL.");
1798
1799 // --- Determine Target DM and DOF based on Field Name ---
1800 // Compare the input field name with known scalar fields targeting 'da'
1801 if (strcmp(particleFieldName, "Psi") == 0 || strcmp(particleFieldName, "Nvert") == 0) {
1802 *expected_dof = 1; // Scalar fields have DOF 1
1803 *targetDM = user->da; // Target the primary scalar DMDA
1804 LOG_ALLOW(GLOBAL, LOG_DEBUG, "GetScatterTargetInfo: Field '%s' targets DM 'da' (DOF=1).\n", particleFieldName);
1805 }
1806 // Compare with known vector fields targeting 'fda'
1807 else if (strcmp(particleFieldName, "Ucat") == 0 || strcmp(particleFieldName, "Ucont") == 0) {
1808 *expected_dof = 3; // Vector fields have DOF 3
1809 *targetDM = user->fda; // Target the vector DMDA (often node-based)
1810 LOG_ALLOW(GLOBAL, LOG_DEBUG, "GetScatterTargetInfo: Field '%s' targets DM 'fda' (DOF=3).\n", particleFieldName);
1811 }
1812 // --- Add rules for other fields here ---
1813 // else if (strcmp(particleFieldName, "SomeOtherScalar") == 0) { *expected_dof = 1; *targetDM = user->da; }
1814 // else if (strcmp(particleFieldName, "SomeOtherVector") == 0) { *expected_dof = 3; *targetDM = user->someOtherDM; }
1815 else {
1816 // The provided field name doesn't match any known rules
1817 *targetDM = NULL; // Indicate failure
1818 *expected_dof = 0;
1819 // Format the error message manually
1820 PetscSNPrintf(msg, sizeof(msg), "GetScatterTargetInfo: Field name '%s' is not recognized for automatic DM selection.", particleFieldName);
1821 // Use SETERRQ with the formatted message and appropriate error code
1822 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); // Use WRONG argument error code
1823 }
1824
1825 PetscFunctionReturn(0);
1826}
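A minimal usage sketch, assuming an initialized UserCtx named user whose swarm carries a "Ucat" field; creating the average vector from the returned DM is illustrative, not required by the API:

    PetscErrorCode ierr;
    DM             targetDM = NULL;
    PetscInt       dof      = 0;
    Vec            avgVec;

    ierr = GetScatterTargetInfo(user, "Ucat", &targetDM, &dof); CHKERRQ(ierr); /* targetDM == user->fda, dof == 3 */
    ierr = DMCreateGlobalVector(targetDM, &avgVec); CHKERRQ(ierr);             /* compatible with targetDM by construction */
    ierr = VecSet(avgVec, 0.0); CHKERRQ(ierr);
    ierr = ScatterParticleFieldToEulerField(user, "Ucat", avgVec); CHKERRQ(ierr);
    ierr = VecDestroy(&avgVec); CHKERRQ(ierr);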

◆ ScatterParticleFieldToEulerField()

PetscErrorCode ScatterParticleFieldToEulerField ( UserCtx *  user,
const char *  particleFieldName,
Vec  eulerFieldAverageVec 
)

Scatters a particle field (scalar or vector) to the corresponding Eulerian field average.

This is the primary user-facing function for scattering a single field. It determines the target Eulerian DM (user->da or user->fda) based on particleFieldName, validates that the provided eulerFieldAverageVec is compatible with that target DM, and then orchestrates the scatter operation by calling the internal helper ScatterParticleFieldToEulerField_Internal. The final averaged result is stored in place in eulerFieldAverageVec.

Parameters
[in] user Pointer to UserCtx containing da, fda, swarm, and ParticleCount.
[in] particleFieldName Name of the field in the DMSwarm (e.g., "Psi", "Ucat").
[in,out] eulerFieldAverageVec Pre-created Vec associated with the correct target DM (implicitly da or fda); the result is stored here. Zero this vector before the call if desired (e.g., using VecSet); the wrapper ScatterAllParticleFieldsToEulerFields handles that zeroing.
Returns
PetscErrorCode 0 on success. Errors on NULL input, an unrecognized field name, an incompatible target vector, or a missing particle field.

Definition at line 2151 of file interpolation.c.

2154{
2155 PetscErrorCode ierr;
2156 DM targetDM = NULL; // Will point to user->da or user->fda
2157 PetscInt expected_dof = 0; // Will be 1 or 3
2158 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
2159
2160 PetscFunctionBeginUser;
2161
2162 // --- Essential Input Validation ---
2163 if (!user) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx pointer is NULL.");
2164 if (!user->swarm) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->swarm is NULL.");
2165 if (!user->ParticleCount) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->ParticleCount is NULL.");
2166 if (!eulerFieldAverageVec) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output eulerFieldAverageVec is NULL.");
2167 // particleFieldName validity checked within GetScatterTargetInfo
2168
2169 // --- Determine Target DM & DOF using the helper function ---
2170 ierr = GetScatterTargetInfo(user, particleFieldName, &targetDM, &expected_dof); CHKERRQ(ierr);
2171 // If GetScatterTargetInfo failed (e.g., unknown name), CHKERRQ would have returned.
2172
2173 // --- Validate the provided Target Vec's Compatibility ---
2174 DM vec_dm;
2175 PetscInt vec_dof;
2176 // Check that the provided average vector has a DM associated with it
2177 ierr = VecGetDM(eulerFieldAverageVec, &vec_dm); CHKERRQ(ierr);
2178 if (!vec_dm) {
2179 PetscSNPrintf(msg, sizeof(msg), "Provided eulerFieldAverageVec for field '%s' does not have an associated DM.", particleFieldName);
2180 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, msg);
2181 }
2182 // Get the block size (DOF) of the provided vector
2183 ierr = VecGetBlockSize(eulerFieldAverageVec, &vec_dof); CHKERRQ(ierr);
2184 // Compare the vector's associated DM with the one determined by the field name
2185 if (vec_dm != targetDM) {
2186 const char *target_dm_name = "targetDM", *vec_dm_name = "vec_dm";
2187 // Get actual names if possible for a more informative error message
2188 PetscObjectGetName((PetscObject)targetDM, &target_dm_name);
2189 PetscObjectGetName((PetscObject)vec_dm, &vec_dm_name);
2190 PetscSNPrintf(msg, sizeof(msg), "Provided eulerFieldAverageVec associated with DM '%s', but field '%s' requires scatter to DM '%s'.", vec_dm_name, particleFieldName, target_dm_name);
2191 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_INCOMP, msg);
2192 }
2193 // Compare the vector's DOF with the one expected for the field name
2194 if (vec_dof != expected_dof) {
2195 PetscSNPrintf(msg, sizeof(msg), "Field '%s' requires DOF %d, but provided eulerFieldAverageVec has DOF %d.", particleFieldName, expected_dof, vec_dof);
2196 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_INCOMP, msg);
2197 }
2198
2199 // --- Perform Scatter using Internal Helper ---
2200 // Log intent before calling the core logic
2201 LOG_ALLOW(GLOBAL, LOG_DEBUG, "ScatterParticleFieldToEulerField: Scattering field '%s' (DOF=%d).\n", particleFieldName, expected_dof);
2202 ierr = ScatterParticleFieldToEulerField_Internal(user, // Pass user context
2203 particleFieldName, // Name of particle field
2204 targetDM, // Determined target DM (da or fda)
2205 expected_dof, // Determined DOF (1 or 3)
2206 eulerFieldAverageVec); // The output vector
2207 CHKERRQ(ierr); // Handle potential errors from the internal function
2208
2209 LOG_ALLOW(GLOBAL, LOG_INFO, "ScatterParticleFieldToEulerField: Successfully scattered field '%s'.\n", particleFieldName);
2210 PetscFunctionReturn(0);
2211}
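A short call-site sketch, assuming user->Psi exists and is associated with user->da:

    PetscErrorCode ierr;
    ierr = CalculateParticleCountPerCell(user); CHKERRQ(ierr);              /* prerequisite: up-to-date counts */
    ierr = VecSet(user->Psi, 0.0); CHKERRQ(ierr);                           /* zero before accumulating, if desired */
    ierr = ScatterParticleFieldToEulerField(user, "Psi", user->Psi); CHKERRQ(ierr);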

◆ ScatterAllParticleFieldsToEulerFields()

PetscErrorCode ScatterAllParticleFieldsToEulerFields ( UserCtx *  user)

Scatters a predefined set of particle fields to their corresponding Eulerian fields.

This convenience function calls the unified ScatterParticleFieldToEulerField for a standard set of fields (currently just "Psi"; others are commented out in the source). It assumes the target Eulerian Vec objects (e.g., user->Psi, user->Ucat) exist in the UserCtx structure and are correctly associated with their respective DMs (user->da or user->fda), and it zeros each target Vec before scattering.

IMPORTANT: Ensure user->ParticleCount has been computed accurately before calling this function, reflecting the current particle distribution.

Parameters
[in,out] user Pointer to the UserCtx structure containing all required DMs, Vecs (ParticleCount and target Eulerian fields such as Psi, Ucat), and the swarm.
Returns
PetscErrorCode 0 on success. Errors if prerequisites (such as ParticleCount) are missing or if an underlying scatter call fails (e.g., a particle field is missing).

Definition at line 2235 of file interpolation.c.

2236{
2237 PetscErrorCode ierr;
2238 PetscFunctionBeginUser;
2239 PROFILE_FUNCTION_BEGIN;
2240
2241 LOG_ALLOW(GLOBAL, LOG_INFO, "Starting scattering of specified particle fields to Eulerian grids.\n");
2242
2243 // --- Pre-computation Check: Ensure Particle Counts are Ready ---
2244 if (!user->ParticleCount) {
2245 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, "UserCtx->ParticleCount is NULL. Compute counts before calling ScatterAllParticleFieldsToEulerFields.");
2246 }
2247
2248 // --- Scatter Particle Field "Psi" -> Eulerian Field user->Psi (on da) ---
2249 // Check if the target Eulerian vector 'user->Psi' exists.
2250 if (user->Psi) {
2251
2252 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Psi' to user->Psi.\n");
2253 // Zero the target vector before accumulating the new average for this step/call.
2254
2255 // Debug Verification ------------------------------------------------
2256 Vec swarm_Psi;
2257 PetscReal Avg_Psi,Avg_swarm_Psi;
2258
2259 ierr = VecMean(user->Psi,&Avg_Psi); CHKERRQ(ierr);
2260 LOG_ALLOW(GLOBAL,LOG_DEBUG," Average of Scalar(Psi) before scatter: %.4f.\n",Avg_Psi);
2261
2262 ierr = DMSwarmCreateGlobalVectorFromField(user->swarm,"Psi",&swarm_Psi); CHKERRQ(ierr);
2263 ierr = VecMean(swarm_Psi,&Avg_swarm_Psi); CHKERRQ(ierr);
2264
2265 LOG_ALLOW(GLOBAL,LOG_DEBUG," Average of Particle Scalar(Psi): %.4f.\n",Avg_swarm_Psi);
2266
2267 ierr = DMSwarmDestroyGlobalVectorFromField(user->swarm,"Psi",&swarm_Psi); CHKERRQ(ierr);
2268 // Debug----------------------------------------------------------------
2269
2270 //ierr = VecSet(user->Psi, 0.0); CHKERRQ(ierr);
2271 // Call the unified scatter function. It will handle DM determination and validation.
2272 // It will also error out if the *particle* field "Psi" doesn't exist in the swarm.
2273 ierr = ScatterParticleFieldToEulerField(user, "Psi", user->Psi); CHKERRQ(ierr);
2274 ierr = VecMean(user->Psi,&Avg_Psi); CHKERRQ(ierr);
2275
2276 LOG_ALLOW(GLOBAL,LOG_DEBUG," Average of Scalar(Psi) after scatter: %.4f.\n",Avg_Psi);
2277 } else {
2278 // Only log a warning if the target Eulerian field is missing in the context.
2279 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Psi': UserCtx->Psi is NULL.\n");
2280 }
2281
2282 // --- (Commented Out) Scatter Other Fields ---
2283 // To enable scattering for other fields, uncomment the relevant block
2284 // AND ensure:
2285 // 1. The corresponding particle field (e.g., "Nvert", "Ucat") exists in user->swarm.
2286 // 2. The corresponding target Eulerian Vec (e.g., user->Nvert, user->Ucat) exists in the UserCtx.
2287 // 3. The target Eulerian Vec is associated with the correct DM (da for Nvert, fda for Ucat).
2288
2289 /*
2290 // Scatter Particle Field "Nvert" -> Eulerian Field user->Nvert (on da)
2291 if (user->Nvert) {
2292 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Nvert' to user->Nvert.\n");
2293 ierr = VecSet(user->Nvert, 0.0); CHKERRQ(ierr);
2294 ierr = ScatterParticleFieldToEulerField(user, "Nvert", user->Nvert); CHKERRQ(ierr);
2295 } else {
2296 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Nvert': UserCtx->Nvert is NULL.\n");
2297 }
2298 */
2299
2300 /*
2301 // Scatter Particle Field "Ucat" -> Eulerian Field user->Ucat (on fda)
2302 if (user->Ucat) {
2303 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Ucat' to user->Ucat.\n");
2304 ierr = VecSet(user->Ucat, 0.0); CHKERRQ(ierr);
2305 ierr = ScatterParticleFieldToEulerField(user, "Ucat", user->Ucat); CHKERRQ(ierr);
2306 } else {
2307 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Ucat': UserCtx->Ucat is NULL.\n");
2308 }
2309 */
2310
2311 /*
2312 // Scatter Particle Field "Ucont" -> Eulerian Field user->Ucont (on fda)
2313 if (user->Ucont) {
2314 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Ucont' to user->Ucont.\n");
2315 ierr = VecSet(user->Ucont, 0.0); CHKERRQ(ierr);
2316 ierr = ScatterParticleFieldToEulerField(user, "Ucont", user->Ucont); CHKERRQ(ierr);
2317 } else {
2318 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Ucont': UserCtx->Ucont is NULL.\n");
2319 }
2320 */
2321
2322 // Add more fields as needed following the pattern above...
2323
2324 LOG_ALLOW(GLOBAL, LOG_INFO, "Finished scattering specified particle fields.\n");
2325 PROFILE_FUNCTION_END;
2326 PetscFunctionReturn(0);
2327}
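To scatter an additional field, the pattern suggested by the commented-out blocks is two edits; "MyScalar" below is a hypothetical particle field name and user->MyScalar a hypothetical target Vec on da:

    /* (1) In GetScatterTargetInfo(): add a mapping rule. */
    else if (strcmp(particleFieldName, "MyScalar") == 0) { *expected_dof = 1; *targetDM = user->da; }

    /* (2) In ScatterAllParticleFieldsToEulerFields(): add a scatter block. */
    if (user->MyScalar) {
        ierr = VecSet(user->MyScalar, 0.0); CHKERRQ(ierr);
        ierr = ScatterParticleFieldToEulerField(user, "MyScalar", user->MyScalar); CHKERRQ(ierr);
    } else {
        LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'MyScalar': UserCtx->MyScalar is NULL.\n");
    }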

◆ CalculateParticleCountPerCell()

PetscErrorCode CalculateParticleCountPerCell ( UserCtx *  user)

Counts particles in each cell of the DMDA 'da' and stores the result in user->ParticleCount.

Zeros the user->ParticleCount vector, then iterates through local particles. Reads the GLOBAL cell index (I, J, K) stored in the "DMSwarm_CellID" field. Uses DMDAVecGetArray to access the local portion of the count vector and increments the count at the global index (I, J, K) if it belongs to the local patch (including ghosts).

Parameters
[in,out] user Pointer to the UserCtx structure containing da, swarm, and ParticleCount.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.

Definition at line 714 of file ParticleMotion.c.

714 {
715 PetscErrorCode ierr;
716 DM da = user->da;
717 DM swarm = user->swarm;
718 Vec countVec = user->ParticleCount;
719 PetscInt nlocal, p;
720 PetscInt *global_cell_id_arr; // Read GLOBAL cell IDs
721 PetscScalar ***count_arr_3d; // Use 3D accessor
722 PetscInt64 *PID_arr;
723 PetscMPIInt rank;
724 char msg[ERROR_MSG_BUFFER_SIZE];
725 PetscInt particles_counted_locally = 0;
726
727 PetscFunctionBeginUser;
728 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
729
730 // --- Input Validation ---
731 if (!da) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->da is NULL.");
732 if (!swarm) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->swarm is NULL.");
733 if (!countVec) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->ParticleCount is NULL.");
734 // Check DOF of da
735 PetscInt count_dof;
736 ierr = DMDAGetInfo(da, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &count_dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
737 if (count_dof != 1) { PetscSNPrintf(msg, sizeof(msg), "countDM must have DOF=1, got %d.", count_dof); SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); }
738
739 // --- Zero the count vector ---
740 ierr = VecSet(countVec, 0.0); CHKERRQ(ierr);
741
742 // --- Get Particle Data ---
743 LOG_ALLOW(GLOBAL,LOG_DEBUG, "CalculateParticleCountPerCell: Accessing particle data.\n");
744 ierr = DMSwarmGetLocalSize(swarm, &nlocal); CHKERRQ(ierr);
745 ierr = DMSwarmGetField(swarm,"DMSwarm_CellID", NULL, NULL, (void **)&global_cell_id_arr); CHKERRQ(ierr);
746 ierr = DMSwarmGetField(swarm,"DMSwarm_pid",NULL,NULL,(void **)&PID_arr);CHKERRQ(ierr);
747
748 // --- Get Grid Vector Array using DMDA accessor ---
749 LOG_ALLOW(GLOBAL, LOG_DEBUG, "CalculateParticleCountPerCell: Accessing ParticleCount vector array (using DMDAVecGetArray).\n");
750 ierr = DMDAVecGetArray(da, countVec, &count_arr_3d); CHKERRQ(ierr);
751
752 // Get local owned range for validation/logging if needed, but not for indexing with DMDAVecGetArray
753 PetscInt xs, ys, zs, xm, ym, zm;
754 ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm); CHKERRQ(ierr);
755
756 // --- Accumulate Counts Locally ---
757 LOG_ALLOW(LOCAL, LOG_DEBUG, "CalculateParticleCountPerCell (Rank %d): Processing %d local particles using GLOBAL CellIDs.\n",rank,nlocal);
758 for (p = 0; p < nlocal; p++) {
759 // Read the GLOBAL indices stored for this particle
760 PetscInt64 i = global_cell_id_arr[p * 3 + 0]; // Global i index
761 PetscInt64 j = global_cell_id_arr[p * 3 + 1]; // Global j index
762 PetscInt64 k = global_cell_id_arr[p * 3 + 2]; // Global k index
763
764 // *** Bounds check is implicitly handled by DMDAVecGetArray for owned+ghost region ***
765 // However, accessing outside this region using global indices WILL cause an error.
766 // A preliminary check might still be wise if global IDs could be wild.
767 // We rely on LocateAllParticles to provide valid global indices [0..IM-1] etc.
768 // *** ADD PRINTF HERE ***
769 LOG_LOOP_ALLOW(LOCAL,LOG_DEBUG,p,100,"[Rank %d CalcCount] Read CellID for p=%d, PID = %ld: (%ld, %ld, %ld)\n", rank, p,PID_arr[p],i, j, k);
770 // *** END PRINTF ***
771 // *** Access the local array using GLOBAL indices ***
772 // DMDAVecGetArray allows this, mapping global (I,J,K) to the correct
773 // location within the local ghosted array segment.
774 // This assumes (I,J,K) corresponds to a cell within the local owned+ghost region.
775 // If (I,J,K) is for a cell owned by *another* rank (and not in the ghost region
776 // of this rank), accessing count_arr_3d[K][J][I] will likely lead to a crash
777 // or incorrect results. This highlights why storing LOCAL indices is preferred
778 // for parallel runs. But proceeding with GLOBAL as requested:
779 if (i >= xs && i < xs + xm &&
780 j >= ys && j < ys + ym && // Adjust based on actual ghost width
781 k >= zs && k < zs + zm ) // This check prevents definite crashes but doesn't guarantee ownership
782 {
783 // Increment count at the location corresponding to GLOBAL index (I,J,K)
784 // LOG_ALLOW(LOCAL, LOG_DEBUG, "CalculateParticleCountPerCell (Rank %d): Particle %d with global CellID (%d, %d, %d) incremented with a particle.\n",rank, p, i, j, k);
785 count_arr_3d[k][j][i] += 1.0;
786 particles_counted_locally++;
787 } else {
788 // This particle's global ID is likely outside the range this rank handles (even ghosts)
789 LOG_ALLOW(LOCAL, LOG_DEBUG, "CalculateParticleCountPerCell (Rank %d): Skipping particle %ld with global CellID (%ld, %ld, %ld) - likely outside local+ghost range.\n",rank, PID_arr[p] , i, j, k);
790 }
791 }
792 LOG_ALLOW(LOCAL, LOG_DEBUG, "CalculateParticleCountPerCell (Rank %d): Local counting finished. Processed %d particles locally.\n", rank, particles_counted_locally);
793
794 // --- Restore Access ---
795 ierr = DMDAVecRestoreArray(da, countVec, &count_arr_3d); CHKERRQ(ierr);
796 ierr = DMSwarmRestoreField(swarm,"DMSwarm_CellID", NULL, NULL, (void **)&global_cell_id_arr); CHKERRQ(ierr);
797 ierr = DMSwarmRestoreField(swarm,"DMSwarm_pid",NULL,NULL,(void **)&PID_arr);CHKERRQ(ierr);
798
799 // --- Assemble Global Vector ---
800 LOG_ALLOW(GLOBAL, LOG_DEBUG, "CalculateParticleCountPerCell: Assembling global ParticleCount vector.\n");
801 ierr = VecAssemblyBegin(countVec); CHKERRQ(ierr);
802 ierr = VecAssemblyEnd(countVec); CHKERRQ(ierr);
803
804 // --- Verification Logging ---
805 PetscReal total_counted_particles = 0.0, max_count_in_cell = 0.0;
806 ierr = VecSum(countVec, &total_counted_particles); CHKERRQ(ierr);
807 PetscInt max_idx_global = -1;
808 ierr = VecMax(countVec, &max_idx_global, &max_count_in_cell); CHKERRQ(ierr);
809 LOG_ALLOW(GLOBAL, LOG_INFO, "CalculateParticleCountPerCell: Total counted globally = %.0f, Max count in cell = %.0f\n",
810 total_counted_particles, max_count_in_cell);
811
812 // --- ADD THIS DEBUGGING BLOCK ---
813 if (max_idx_global >= 0) { // Check if VecMax found a location
814 // Need to convert the flat global index back to 3D global index (I, J, K)
815 // Get global grid dimensions (Nodes, NOT Cells IM/JM/KM)
816 PetscInt M, N, P;
817 ierr = DMDAGetInfo(da, NULL, &M, &N, &P, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
818 // Note: Assuming DOF=1 for countVec, index mapping uses node dimensions M,N,P from DMDA creation (IM+1, etc)
819 // Re-check if your DMDA uses cell counts (IM) or node counts (IM+1) for Vec layout. Let's assume Node counts M,N,P.
820 PetscInt Kmax = max_idx_global / (M * N);
821 PetscInt Jmax = (max_idx_global % (M * N)) / M;
822 PetscInt Imax = max_idx_global % M;
823 LOG_ALLOW(GLOBAL, LOG_INFO, " -> Max count located at global index (I,J,K) = (%d, %d, %d) [Flat index: %d]\n",
824 (int)Imax, (int)Jmax, (int)Kmax, (int)max_idx_global);
825
826 // Also, let's explicitly check the count at (0,0,0)
827 PetscScalar count_at_origin = 0.0;
828 PetscScalar ***count_arr_for_check;
829 ierr = DMDAVecGetArrayRead(da, countVec, &count_arr_for_check); CHKERRQ(ierr);
830 // Check bounds before accessing - crucial if using global indices
831 PetscInt xs, ys, zs, xm, ym, zm;
832 ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm); CHKERRQ(ierr);
833 if (0 >= xs && 0 < xs+xm && 0 >= ys && 0 < ys+ym && 0 >= zs && 0 < zs+zm) {
834 count_at_origin = count_arr_for_check[0][0][0]; // Access using global index (0,0,0)
835 } else {
836 // Origin is not on this rank (relevant for parallel, but check anyway)
837 count_at_origin = -999.0; // Indicate it wasn't accessible locally
838 }
839 ierr = DMDAVecRestoreArrayRead(da, countVec, &count_arr_for_check); CHKERRQ(ierr);
840 LOG_ALLOW(GLOBAL, LOG_INFO, " -> Count at global index (0,0,0) = %.1f\n", count_at_origin);
841
842 } else {
843 LOG_ALLOW(GLOBAL, LOG_WARNING, " -> VecMax did not return a location for the maximum value.\n");
844 }
845 // --- END DEBUGGING BLOCK ---
846
847 LOG_ALLOW(GLOBAL, LOG_INFO, "CalculateParticleCountPerCell: Particle counting complete.\n");
848
849 PetscFunctionReturn(0);
850}
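A read-only inspection sketch of the resulting counts, assuming the same DMDA layout used above; the empty-cell branch is illustrative:

    PetscErrorCode ierr;
    PetscScalar    ***cnt;
    PetscInt       xs, ys, zs, xm, ym, zm, i, j, k;

    ierr = DMDAGetCorners(user->da, &xs, &ys, &zs, &xm, &ym, &zm); CHKERRQ(ierr);
    ierr = DMDAVecGetArrayRead(user->da, user->ParticleCount, &cnt); CHKERRQ(ierr);
    for (k = zs; k < zs + zm; k++)
        for (j = ys; j < ys + ym; j++)
            for (i = xs; i < xs + xm; i++)
                if (cnt[k][j][i] == 0.0) { /* empty cell: no particles contribute here */ }
    ierr = DMDAVecRestoreArrayRead(user->da, user->ParticleCount, &cnt); CHKERRQ(ierr);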