PICurv 0.1.0
A Parallel Particle-In-Cell Solver for Curvilinear LES
Particle-to-Grid Scattering

Functions for scattering particle data (scalar or vector) onto Eulerian grid fields by averaging contributions within each cell. More...

Collaboration diagram for Particle-to-Grid Scattering:

Modules

 Internal Scattering Helpers
 Lower-level functions used by the main scattering routines.
 

Functions

PetscErrorCode GetScatterTargetInfo (UserCtx *user, const char *particleFieldName, DM *targetDM, PetscInt *expected_dof)
 Determines the target Eulerian DM and expected DOF for scattering a given particle field.
 
PetscErrorCode ScatterParticleFieldToEulerField (UserCtx *user, const char *particleFieldName, Vec eulerFieldAverageVec)
 Scatters a particle field (scalar or vector) to the corresponding Eulerian field average.
 
PetscErrorCode ScatterAllParticleFieldsToEulerFields (UserCtx *user)
 Scatters a predefined set of particle fields to their corresponding Eulerian fields.
 
PetscErrorCode CalculateParticleCountPerCell (UserCtx *user)
 Counts particles in each cell of the DMDA 'da' and stores the result in user->ParticleCount.
 

Detailed Description

Functions for scattering particle data (scalar or vector) onto Eulerian grid fields by averaging contributions within each cell.

This file provides a modular set of functions to perform particle-to-grid projection, specifically calculating cell-averaged quantities from particle properties. It assumes a PETSc environment using DMDA for the grids and DMSwarm for particles.
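The core projection is an accumulate-then-normalize scheme: each particle adds its value to a running sum for its cell, and the cell average is the sum divided by the particle count. A minimal stand-alone sketch of this scheme, using plain C arrays as stand-ins for the PETSc Vec and DMSwarm objects (the function name `scatter_average` is illustrative, not part of this codebase):

```c
#include <assert.h>
#include <stddef.h>

/* Accumulate-then-normalize cell averaging, sketched with flat arrays.
 * Cells that receive no particles keep an average of zero. */
void scatter_average(const int *cell_of_particle, /* cell index per particle */
                     const double *value,         /* particle field values   */
                     size_t nparticles,
                     double *cell_avg,            /* out: per-cell averages  */
                     int *cell_count,             /* out: particles per cell */
                     size_t ncells)
{
    for (size_t c = 0; c < ncells; c++) { cell_avg[c] = 0.0; cell_count[c] = 0; }
    for (size_t p = 0; p < nparticles; p++) {   /* accumulate per-cell sums */
        int c = cell_of_particle[p];
        cell_avg[c] += value[p];
        cell_count[c] += 1;
    }
    for (size_t c = 0; c < ncells; c++)         /* normalize sums to averages */
        if (cell_count[c] > 0) cell_avg[c] /= cell_count[c];
}
```

In the real routines the count array corresponds to user->ParticleCount, which is why it must be computed before any scatter call.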

Key Features:

Dependencies:

Function Documentation

◆ GetScatterTargetInfo()

PetscErrorCode GetScatterTargetInfo ( UserCtx user,
const char *  particleFieldName,
DM *  targetDM,
PetscInt *  expected_dof 
)

Determines the target Eulerian DM and expected DOF for scattering a given particle field.

Based on hardcoded rules mapping particle field names ("Psi", "Nvert", "Ucat", "Ucont") to user context DMs (user->da or user->fda). This function encapsulates the policy of where different fields should be scattered. Modify this function to add rules for custom fields.

Parameters
[in]userPointer to the UserCtx containing required DMs (da, fda).
[in]particleFieldNameName of the particle field.
[out]targetDMPointer to store the determined target DM (user->da or user->fda).
[out]expected_dofPointer to store the expected DOF (1 or 3) for this field.
Returns
PetscErrorCode Returns 0 on success. Error codes:
  • PETSC_ERR_ARG_NULL if required inputs are NULL.
  • PETSC_ERR_ARG_WRONG if particleFieldName is not recognized.
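The policy is a simple name lookup. A stand-alone sketch of the same mapping, with the two DMs replaced by opaque character tags since PETSc types are not available here (the function name `scatter_target_info` and the 'D'/'F' tags are illustrative assumptions):

```c
#include <assert.h>
#include <string.h>

/* Field-name -> (target grid, DOF) policy, mirroring the hardcoded rules
 * in GetScatterTargetInfo. 'D' stands in for the scalar da, 'F' for the
 * vector fda. Returns 0 on success, -1 for an unrecognized name. */
int scatter_target_info(const char *field, char *target_tag, int *dof)
{
    if (strcmp(field, "Psi") == 0 || strcmp(field, "Nvert") == 0) {
        *target_tag = 'D'; *dof = 1;   /* scalar fields -> da,  DOF 1 */
    } else if (strcmp(field, "Ucat") == 0 || strcmp(field, "Ucont") == 0) {
        *target_tag = 'F'; *dof = 3;   /* vector fields -> fda, DOF 3 */
    } else {
        return -1;                     /* unknown field: caller must error out */
    }
    return 0;
}
```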

Definition at line 1426 of file interpolation.c.

1428{
1429 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
1430 PetscFunctionBeginUser;
1431
1432 PROFILE_FUNCTION_BEGIN;
1433
1434 // --- Input Validation ---
1435 // Check for NULL pointers in essential inputs
1436 if (!user) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx pointer is NULL.");
1437 if (!user->da) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->da is NULL.");
1438 if (!user->fda) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->fda is NULL."); // Needed for vector fields
1439 if (!particleFieldName) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "particleFieldName is NULL.");
1440 if (!targetDM) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output targetDM pointer is NULL.");
1441 if (!expected_dof) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output expected_dof pointer is NULL.");
1442
1443 // --- Determine Target DM and DOF based on Field Name ---
1444 // Compare the input field name with known scalar fields targeting 'da'
1445 if (strcmp(particleFieldName, "Psi") == 0 || strcmp(particleFieldName, "Nvert") == 0) {
1446 *expected_dof = 1; // Scalar fields have DOF 1
1447 *targetDM = user->da; // Target the primary scalar DMDA
1448 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Field '%s' targets DM 'da' (DOF=1).\n", particleFieldName);
1449 }
1450 // Compare with known vector fields targeting 'fda'
1451 else if (strcmp(particleFieldName, "Ucat") == 0 || strcmp(particleFieldName, "Ucont") == 0) {
1452 *expected_dof = 3; // Vector fields have DOF 3
1453 *targetDM = user->fda; // Target the vector DMDA (often node-based)
1454 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Field '%s' targets DM 'fda' (DOF=3).\n", particleFieldName);
1455 }
1456 // --- Add rules for other fields here ---
1457 // else if (strcmp(particleFieldName, "SomeOtherScalar") == 0) { *expected_dof = 1; *targetDM = user->da; }
1458 // else if (strcmp(particleFieldName, "SomeOtherVector") == 0) { *expected_dof = 3; *targetDM = user->someOtherDM; }
1459 else {
1460 // The provided field name doesn't match any known rules
1461 *targetDM = NULL; // Indicate failure
1462 *expected_dof = 0;
1463 // Format the error message manually
1464 PetscSNPrintf(msg, sizeof(msg), "Field name '%s' is not recognized for automatic DM selection.", particleFieldName);
1465 // Use SETERRQ with the formatted message and appropriate error code
1466 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); // Use WRONG argument error code
1467 }
1468
1469 PROFILE_FUNCTION_END;
1470
1471 PetscFunctionReturn(0);
1472}
#define ERROR_MSG_BUFFER_SIZE
#define GLOBAL
Scope for global logging across all processes.
Definition logging.h:46
#define LOG_ALLOW(scope, level, fmt,...)
Logging macro that checks both the log level and whether the calling function is in the allowed-funct...
Definition logging.h:200
#define PROFILE_FUNCTION_END
Marks the end of a profiled code block.
Definition logging.h:746
@ LOG_DEBUG
Detailed debugging information.
Definition logging.h:32
#define PROFILE_FUNCTION_BEGIN
Marks the beginning of a profiled code block (typically a function).
Definition logging.h:737

◆ ScatterParticleFieldToEulerField()

PetscErrorCode ScatterParticleFieldToEulerField ( UserCtx user,
const char *  particleFieldName,
Vec  eulerFieldAverageVec 
)

Scatters a particle field (scalar or vector) to the corresponding Eulerian field average.


This is the primary user-facing function for scattering a single field. It determines the target Eulerian DM (user->da or user->fda) based on the particleFieldName, validates that the provided eulerFieldAverageVec is compatible with that target DM, and then orchestrates the scatter operation by calling an internal helper function. The final averaged result is stored IN-PLACE in the eulerFieldAverageVec.

Parameters
[in]userPointer to UserCtx containing da, fda, swarm, ParticleCount.
[in]particleFieldNameName of the field in the DMSwarm (e.g., "Psi", "Ucat").
[in,out]eulerFieldAverageVecPre-created Vec associated with the correct target DM (implicitly da or fda). Result stored here. This vector should be zeroed by the caller if desired before this function (e.g., using VecSet). The wrapper ScatterAllParticleFieldsToEulerFields handles this zeroing.
Returns
PetscErrorCode 0 on success. Errors on NULL input, unrecognized field name, incompatible target vector, or if the particle field doesn't exist.

Definition at line 2022 of file interpolation.c.

2025{
2026 PetscErrorCode ierr;
2027 DM targetDM = NULL; // Will point to user->da or user->fda
2028 PetscInt expected_dof = 0; // Will be 1 or 3
2029 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
2030
2031 PetscFunctionBeginUser;
2032
2033 PROFILE_FUNCTION_BEGIN;
2034
2035 // --- Essential Input Validation ---
2036 if (!user) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx pointer is NULL.");
2037 if (!user->swarm) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->swarm is NULL.");
2038 if (!user->ParticleCount) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->ParticleCount is NULL.");
2039 if (!eulerFieldAverageVec) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output eulerFieldAverageVec is NULL.");
2040 // particleFieldName validity checked within GetScatterTargetInfo
2041
2042 // --- Determine Target DM & DOF using the helper function ---
2043 ierr = GetScatterTargetInfo(user, particleFieldName, &targetDM, &expected_dof); CHKERRQ(ierr);
2044 // If GetScatterTargetInfo failed (e.g., unknown name), CHKERRQ would have returned.
2045
2046 // --- Validate the provided Target Vec's Compatibility ---
2047 DM vec_dm;
2048 PetscInt vec_dof;
2049 // Check that the provided average vector has a DM associated with it
2050 ierr = VecGetDM(eulerFieldAverageVec, &vec_dm); CHKERRQ(ierr);
2051 if (!vec_dm) {
2052 PetscSNPrintf(msg, sizeof(msg), "Provided eulerFieldAverageVec for field '%s' does not have an associated DM.", particleFieldName);
2053 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, msg);
2054 }
2055 // Get the block size (DOF) of the provided vector
2056 ierr = VecGetBlockSize(eulerFieldAverageVec, &vec_dof); CHKERRQ(ierr);
2057 // Compare the vector's associated DM with the one determined by the field name
2058 if (vec_dm != targetDM) {
2059 const char *target_dm_name = "targetDM", *vec_dm_name = "vec_dm";
2060 // Get actual names if possible for a more informative error message
2061 PetscObjectGetName((PetscObject)targetDM, &target_dm_name);
2062 PetscObjectGetName((PetscObject)vec_dm, &vec_dm_name);
2063 PetscSNPrintf(msg, sizeof(msg), "Provided eulerFieldAverageVec associated with DM '%s', but field '%s' requires scatter to DM '%s'.", vec_dm_name, particleFieldName, target_dm_name);
2064 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_INCOMP, msg);
2065 }
2066 // Compare the vector's DOF with the one expected for the field name
2067 if (vec_dof != expected_dof) {
2068 PetscSNPrintf(msg, sizeof(msg), "Field '%s' requires DOF %d, but provided eulerFieldAverageVec has DOF %d.", particleFieldName, expected_dof, vec_dof);
2069 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_INCOMP, msg);
2070 }
2071
2072 // --- Perform Scatter using Internal Helper ---
2073 // Log intent before calling the core logic
2074 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering field '%s' (DOF=%d).\n", particleFieldName, expected_dof);
2075 ierr = ScatterParticleFieldToEulerField_Internal(user, // Pass user context
2076 particleFieldName, // Name of particle field
2077 targetDM, // Determined target DM (da or fda)
2078 expected_dof, // Determined DOF (1 or 3)
2079 eulerFieldAverageVec); // The output vector
2080 CHKERRQ(ierr); // Handle potential errors from the internal function
2081
2082 LOG_ALLOW(GLOBAL, LOG_INFO, "Successfully scattered field '%s'.\n", particleFieldName);
2083
2084 PROFILE_FUNCTION_END;
2085
2086 PetscFunctionReturn(0);
2087}
PetscErrorCode GetScatterTargetInfo(UserCtx *user, const char *particleFieldName, DM *targetDM, PetscInt *expected_dof)
Determines the target Eulerian DM and expected DOF for scattering a given particle field.
static PetscErrorCode ScatterParticleFieldToEulerField_Internal(UserCtx *user, const char *particleFieldName, DM targetDM, PetscInt expected_dof, Vec eulerFieldAverageVec)
Internal helper function to orchestrate the scatter operation (accumulate + normalize).
@ LOG_INFO
Informational messages about program execution.
Definition logging.h:31
Vec ParticleCount
Definition variables.h:800

◆ ScatterAllParticleFieldsToEulerFields()

PetscErrorCode ScatterAllParticleFieldsToEulerFields ( UserCtx user)

Scatters a predefined set of particle fields to their corresponding Eulerian fields.


This convenience function calls the unified ScatterParticleFieldToEulerField for a standard set of fields (currently just "Psi", others commented out). It assumes the target Eulerian Vec objects (e.g., user->Psi, user->Ucat) exist in the UserCtx structure and are correctly associated with their respective DMs (user->da or user->fda).

IMPORTANT: It zeros the target Vecs before scattering. Ensure user->ParticleCount has been computed accurately before calling this function, reflecting the current particle distribution.

Parameters
[in,out]userPointer to the UserCtx structure containing all required DMs, Vecs (ParticleCount, target Eulerian fields like P, Ucat), and swarm.
Returns
PetscErrorCode 0 on success. Errors if prerequisites (like ParticleCount) are missing or if underlying scatter calls fail (e.g., particle field missing).

Definition at line 2111 of file interpolation.c.

2112{
2113 PetscErrorCode ierr;
2114 PetscFunctionBeginUser;
2115 PROFILE_FUNCTION_BEGIN;
2116
2117 LOG_ALLOW(GLOBAL, LOG_INFO, "Starting scattering of specified particle fields to Eulerian grids.\n");
2118
2119 // --- Pre-computation Check: Ensure Particle Counts are Ready ---
2120 if (!user->ParticleCount) {
2121 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, "UserCtx->ParticleCount is NULL. Compute counts before calling ScatterAllParticleFieldsToEulerFields.");
2122 }
2123
2124 // --- Scatter Particle Field "Psi" -> Eulerian Field user->Psi (on da) ---
2125 // Check if the target Eulerian vector 'user->Psi' exists.
2126 if (user->Psi) {
2127
2128 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Psi' to user->Psi.\n");
2129 // Zero the target vector before accumulating the new average for this step/call.
2130
2131 // Debug Verification ------------------------------------------------
2132 Vec swarm_Psi;
2133 PetscReal Avg_Psi,Avg_swarm_Psi;
2134
2135 ierr = VecMean(user->Psi,&Avg_Psi); CHKERRQ(ierr);
2136 LOG_ALLOW(GLOBAL,LOG_DEBUG," Average of Scalar(Psi) before scatter: %.4f.\n",Avg_Psi);
2137
2138 ierr = DMSwarmCreateGlobalVectorFromField(user->swarm,"Psi",&swarm_Psi); CHKERRQ(ierr);
2139 ierr = VecMean(swarm_Psi,&Avg_swarm_Psi); CHKERRQ(ierr);
2140
2141 LOG_ALLOW(GLOBAL,LOG_DEBUG," Average of Particle Scalar(Psi): %.4f.\n",Avg_swarm_Psi);
2142
2143 ierr = DMSwarmDestroyGlobalVectorFromField(user->swarm,"Psi",&swarm_Psi); CHKERRQ(ierr);
2144 // Debug----------------------------------------------------------------
2145
2146 //ierr = VecSet(user->Psi, 0.0); CHKERRQ(ierr);
2147 // Call the unified scatter function. It will handle DM determination and validation.
2148 // It will also error out if the *particle* field "Psi" doesn't exist in the swarm.
2149 ierr = ScatterParticleFieldToEulerField(user, "Psi", user->Psi); CHKERRQ(ierr);
2150 ierr = VecMean(user->Psi,&Avg_Psi); CHKERRQ(ierr);
2151
2152 LOG_ALLOW(GLOBAL,LOG_DEBUG," Average of Scalar(Psi) after scatter: %.4f.\n",Avg_Psi);
2153 } else {
2154 // Only log a warning if the target Eulerian field is missing in the context.
2155 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Psi': UserCtx->Psi is NULL.\n");
2156 }
2157
2158 // --- (Commented Out) Scatter Other Fields ---
2159 // To enable scattering for other fields, uncomment the relevant block
2160 // AND ensure:
2161 // 1. The corresponding particle field (e.g., "Nvert", "Ucat") exists in user->swarm.
2162 // 2. The corresponding target Eulerian Vec (e.g., user->Nvert, user->Ucat) exists in user->ctx.
2163 // 3. The target Eulerian Vec is associated with the correct DM (da for Nvert, fda for Ucat).
2164
2165 /*
2166 // Scatter Particle Field "Nvert" -> Eulerian Field user->Nvert (on da)
2167 if (user->Nvert) {
2168 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Nvert' to user->Nvert.\n");
2169 ierr = VecSet(user->Nvert, 0.0); CHKERRQ(ierr);
2170 ierr = ScatterParticleFieldToEulerField(user, "Nvert", user->Nvert); CHKERRQ(ierr);
2171 } else {
2172 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Nvert': UserCtx->Nvert is NULL.\n");
2173 }
2174 */
2175
2176 /*
2177 // Scatter Particle Field "Ucat" -> Eulerian Field user->Ucat (on fda)
2178 if (user->Ucat) {
2179 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Ucat' to user->Ucat.\n");
2180 ierr = VecSet(user->Ucat, 0.0); CHKERRQ(ierr);
2181 ierr = ScatterParticleFieldToEulerField(user, "Ucat", user->Ucat); CHKERRQ(ierr);
2182 } else {
2183 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Ucat': UserCtx->Ucat is NULL.\n");
2184 }
2185 */
2186
2187 /*
2188 // Scatter Particle Field "Ucont" -> Eulerian Field user->Ucont (on fda)
2189 if (user->Ucont) {
2190 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Ucont' to user->Ucont.\n");
2191 ierr = VecSet(user->Ucont, 0.0); CHKERRQ(ierr);
2192 ierr = ScatterParticleFieldToEulerField(user, "Ucont", user->Ucont); CHKERRQ(ierr);
2193 } else {
2194 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Ucont': UserCtx->Ucont is NULL.\n");
2195 }
2196 */
2197
2198 // Add more fields as needed following the pattern above...
2199
2200 LOG_ALLOW(GLOBAL, LOG_INFO, "Finished scattering specified particle fields.\n");
2201 PROFILE_FUNCTION_END;
2202 PetscFunctionReturn(0);
2203}
PetscErrorCode ScatterParticleFieldToEulerField(UserCtx *user, const char *particleFieldName, Vec eulerFieldAverageVec)
Scatters a particle field (scalar or vector) to the corresponding Eulerian field average.
@ LOG_WARNING
Non-critical issues that warrant attention.
Definition logging.h:29
Vec Psi
Definition variables.h:801

◆ CalculateParticleCountPerCell()

PetscErrorCode CalculateParticleCountPerCell ( UserCtx user)

Counts particles in each cell of the DMDA 'da' and stores the result in user->ParticleCount.

Zeros the user->ParticleCount vector, then iterates through local particles. Reads the GLOBAL cell index (I, J, K) stored in the "DMSwarm_CellID" field. Uses DMDAVecGetArray to access the local portion of the count vector and increments the count at the global index (I, J, K) if it belongs to the local patch (including ghosts).

Parameters
[in,out]userPointer to the UserCtx structure containing da, swarm, and ParticleCount.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.
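Two index conventions used in this function are worth making explicit: the +1 shift from a particle's geometric cell index to the cell-centered storage index, and the recovery of (I, J, K) from the flat index that VecMax reports in the verification block. A stand-alone sketch, assuming natural (I-fastest) ordering and using M, N for the grid's node counts in x and y (function names here are illustrative, not part of the codebase):

```c
#include <assert.h>

/* Recover (I, J, K) from a flat index, assuming natural ordering
 * flat = K*(M*N) + J*M + I with M, N the x and y node counts. */
void flat_to_ijk(int flat, int M, int N, int *I, int *J, int *K)
{
    *K = flat / (M * N);
    *J = (flat % (M * N)) / M;
    *I = flat % M;
}

/* The +1 shift from a geometric cell index to the cell-centered
 * storage index used by ParticleCount in this codebase. */
int cell_centered_index(int geom) { return geom + 1; }
```

Note that in parallel, VecMax returns an index in PETSc's global ordering, which for a distributed DMDA need not coincide with natural ordering; the code listing below flags the same caveat.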

Definition at line 601 of file ParticleMotion.c.

601 {
602 PetscErrorCode ierr;
603 DM da = user->da;
604 DM swarm = user->swarm;
605 Vec countVec = user->ParticleCount;
606 Vec localcountVec = user->lParticleCount;
607 PetscInt nlocal, p;
608 PetscInt *global_cell_id_arr; // Read GLOBAL cell IDs
609 PetscScalar ***count_arr_3d; // Use 3D accessor
610 PetscInt64 *PID_arr;
611 PetscMPIInt rank;
612 char msg[ERROR_MSG_BUFFER_SIZE];
613 PetscInt particles_counted_locally = 0;
614
615 PetscFunctionBeginUser;
616 PROFILE_FUNCTION_BEGIN;
617 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
618
619 // --- Input Validation ---
620 if (!da) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->da is NULL.");
621 if (!swarm) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->swarm is NULL.");
622 if (!countVec) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->ParticleCount is NULL.");
623 // Check DOF of da
624 PetscInt count_dof;
625 ierr = DMDAGetInfo(da, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &count_dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
626 if (count_dof != 1) { PetscSNPrintf(msg, sizeof(msg), "countDM must have DOF=1, got %d.", count_dof); SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); }
627
628 // --- Zero the local count vector ---
629 ierr = VecSet(localcountVec, 0.0); CHKERRQ(ierr);
630
631 // --- Get Particle Data ---
632 LOG_ALLOW(GLOBAL,LOG_DEBUG, "Accessing particle data.\n");
633 ierr = DMSwarmGetLocalSize(swarm, &nlocal); CHKERRQ(ierr);
634 ierr = DMSwarmGetField(swarm,"DMSwarm_CellID", NULL, NULL, (void **)&global_cell_id_arr); CHKERRQ(ierr);
635 ierr = DMSwarmGetField(swarm,"DMSwarm_pid",NULL,NULL,(void **)&PID_arr);CHKERRQ(ierr);
636
637 // --- Get Grid Vector Array using DMDA accessor ---
638 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Accessing ParticleCount vector array (using DMDAVecGetArray).\n");
639 ierr = DMDAVecGetArray(da, localcountVec, &count_arr_3d); CHKERRQ(ierr);
640
641 // Get local owned + ghosted range for writing into ghost slots.
642 PetscInt gxs, gys, gzs, gxm, gym, gzm;
643 ierr = DMDAGetGhostCorners(da, &gxs, &gys, &gzs, &gxm, &gym, &gzm); CHKERRQ(ierr);
644
645 // --- Accumulate Counts Locally ---
646 LOG_ALLOW(LOCAL, LOG_DEBUG, "CalculateParticleCountPerCell (Rank %d): Processing %d local particles using GLOBAL CellIDs.\n",rank,nlocal);
647 for (p = 0; p < nlocal; p++) {
648 // Read the GLOBAL indices stored for this particle
649 PetscInt i_geom = global_cell_id_arr[p * 3 + 0]; // Global i index
650 PetscInt j_geom = global_cell_id_arr[p * 3 + 1]; // Global j index
651 PetscInt k_geom = global_cell_id_arr[p * 3 + 2]; // Global k index
652
653 // Apply the shift to ensure ParticleCount follows the indexing convention for cell-centered data in this codebase.
654 PetscInt i = (PetscInt)i_geom + 1; // Shift for cell-centered
655 PetscInt j = (PetscInt)j_geom + 1; // Shift for cell-centered
656 PetscInt k = (PetscInt)k_geom + 1; // Shift for cell-centered
657
658 // *** Bounds check is implicitly handled by DMDAVecGetArray for owned+ghost region ***
659 // However, accessing outside this region using global indices WILL cause an error.
660 // A preliminary check might still be wise if global IDs could be wild.
661 // We rely on LocateAllParticles to provide valid global indices [0..IM-1] etc.
662
663 LOG_LOOP_ALLOW(LOCAL,LOG_VERBOSE,p,100,"[Rank %d] Read CellID for p=%d, PID = %ld: (%ld, %ld, %ld)\n", rank, p,PID_arr[p],i, j, k);
664
665 // Check if the global index (i,j,k) falls within the local + ghost range
666 if (i >= gxs && i < gxs + gxm &&
667 j >= gys && j < gys + gym && // Adjust based on actual ghost width
668 k >= gzs && k < gzs + gzm ) // This check prevents definite crashes but doesn't guarantee ownership
669 {
670
671 // Increment count at the location corresponding to GLOBAL index (I,J,K)
672 // LOG_ALLOW(LOCAL, LOG_DEBUG, "CalculateParticleCountPerCell (Rank %d): Particle %d with global CellID (%d, %d, %d) incremented with a particle.\n",rank, p, i, j, k);
673 count_arr_3d[k][j][i] += 1.0;
674 particles_counted_locally++;
675 } else {
676 // This particle's global ID is likely outside the range this rank handles (even ghosts)
677 // note: this is not necessarily an error if the particle is legitimately outside the local+ghost region
678 LOG_ALLOW(LOCAL, LOG_VERBOSE, "(Rank %d): Skipping particle %ld with global CellID (%ld, %ld, %ld) - likely outside local+ghost range.\n",rank, PID_arr[p] , i, j, k);
679 }
680 }
681 LOG_ALLOW(LOCAL, LOG_DEBUG, "(Rank %d): Local counting finished. Processed %d particles locally.\n", rank, particles_counted_locally);
682
683 // --- Restore Access ---
684 ierr = DMDAVecRestoreArray(da, localcountVec, &count_arr_3d); CHKERRQ(ierr);
685 ierr = DMSwarmRestoreField(swarm,"DMSwarm_CellID", NULL, NULL, (void **)&global_cell_id_arr); CHKERRQ(ierr);
686 ierr = DMSwarmRestoreField(swarm,"DMSwarm_pid",NULL,NULL,(void **)&PID_arr);CHKERRQ(ierr);
687
688 // --- Assemble Global Vector ---
689 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Assembling global ParticleCount vector.\n");
690 ierr = VecZeroEntries(countVec); CHKERRQ(ierr); // Ensure global vector is zeroed before accumulation
691 ierr = DMLocalToGlobalBegin(da, localcountVec, ADD_VALUES, countVec); CHKERRQ(ierr);
692 ierr = DMLocalToGlobalEnd(da, localcountVec, ADD_VALUES, countVec); CHKERRQ(ierr);
693 /*
694 * OPTIONAL: Synchronize Ghosts for Stencil Operations
695 * If a future function needs to read ParticleCount from neighbor cells (e.g., density smoothing
696 * or gradient calculations), uncomment the following lines to update the ghost slots
697 * in user->lParticleCount with the final summed values.
698 *
699 ierr = UpdateLocalGhosts(user,"ParticleCount"); CHKERRQ(ierr);
700 */
701
702 // --- Verification Logging ---
703 PetscReal total_counted_particles = 0.0, max_count_in_cell = 0.0;
704 ierr = VecSum(countVec, &total_counted_particles); CHKERRQ(ierr);
705 PetscInt max_idx_global = -1;
706 ierr = VecMax(countVec, &max_idx_global, &max_count_in_cell); CHKERRQ(ierr);
707 LOG_ALLOW(GLOBAL, LOG_INFO, "Total counted globally = %.0f, Max count in cell = %.0f\n",
708 total_counted_particles, max_count_in_cell);
709
710 // --- Debugging block: locate the maximum count ---
711 if (max_idx_global >= 0) { // Check if VecMax found a location
712 // Need to convert the flat global index back to 3D global index (I, J, K)
713 // Get global grid dimensions (Nodes, NOT Cells IM/JM/KM)
714 PetscInt M, N, P;
715 ierr = DMDAGetInfo(da, NULL, &M, &N, &P, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
716 // Note: Assuming DOF=1 for countVec, index mapping uses node dimensions M,N,P from DMDA creation (IM+1, etc)
717 // Re-check if your DMDA uses cell counts (IM) or node counts (IM+1) for Vec layout. Let's assume Node counts M,N,P.
718 PetscInt Kmax = max_idx_global / (M * N);
719 PetscInt Jmax = (max_idx_global % (M * N)) / M;
720 PetscInt Imax = max_idx_global % M;
721 LOG_ALLOW(GLOBAL, LOG_INFO, " -> Max count located at global index (I,J,K) = (%d, %d, %d) [Flat index: %d]\n",
722 (int)Imax, (int)Jmax, (int)Kmax, (int)max_idx_global);
723
724 // Also, let's explicitly check the count at (0,0,0)
725 PetscScalar count_at_origin = 0.0;
726 PetscScalar ***count_arr_for_check;
727 ierr = DMDAVecGetArrayRead(da, countVec, &count_arr_for_check); CHKERRQ(ierr);
728 // Check bounds before accessing - crucial if using global indices
729 PetscInt xs, ys, zs, xm, ym, zm;
730 ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm); CHKERRQ(ierr);
731 if (0 >= xs && 0 < xs+xm && 0 >= ys && 0 < ys+ym && 0 >= zs && 0 < zs+zm) {
732 count_at_origin = count_arr_for_check[0][0][0]; // Access using global index (0,0,0)
733 } else {
734 // Origin is not on this rank (relevant for parallel, but check anyway)
735 count_at_origin = -999.0; // Indicate it wasn't accessible locally
736 }
737 ierr = DMDAVecRestoreArrayRead(da, countVec, &count_arr_for_check); CHKERRQ(ierr);
738 LOG_ALLOW(GLOBAL, LOG_INFO, " -> Count at global index (0,0,0) = %.1f\n", count_at_origin);
739
740 } else {
741 LOG_ALLOW(GLOBAL, LOG_WARNING, " -> VecMax did not return a location for the maximum value.\n");
742 }
743 // --- End debugging block ---
744
745 LOG_ALLOW(GLOBAL, LOG_INFO, "Particle counting complete.\n");
746
747
748 PROFILE_FUNCTION_END;
749 PetscFunctionReturn(0);
750}
#define ERROR_MSG_BUFFER_SIZE
#define LOG_LOOP_ALLOW(scope, level, iterVar, interval, fmt,...)
Logs a message inside a loop, but only every interval iterations.
Definition logging.h:312
#define LOCAL
Logging scope definitions for controlling message output.
Definition logging.h:45
@ LOG_VERBOSE
Extremely detailed logs, typically for development use only.
Definition logging.h:34
Vec lParticleCount
Definition variables.h:800