PICurv 0.1.0
A Parallel Particle-In-Cell Solver for Curvilinear LES
Particle-to-Grid Scattering

Functions for scattering particle data (scalar or vector) onto Eulerian grid fields by averaging contributions within each cell.


Modules

 Internal Scattering Helpers
 Lower-level functions used by the main scattering routines.
 

Functions

PetscErrorCode GetScatterTargetInfo (UserCtx *user, const char *particleFieldName, DM *targetDM, PetscInt *expected_dof)
 Determines the target Eulerian DM and expected DOF for scattering a given particle field.
 
PetscErrorCode ScatterParticleFieldToEulerField (UserCtx *user, const char *particleFieldName, Vec eulerFieldAverageVec)
 Scatters a particle field (scalar or vector) to the corresponding Eulerian field average.
 
PetscErrorCode ScatterAllParticleFieldsToEulerFields (UserCtx *user)
 Scatters a predefined set of particle fields to their corresponding Eulerian fields.
 
PetscErrorCode CalculateParticleCountPerCell (UserCtx *user)
 Counts particles in each cell of the DMDA 'da' and stores the result in user->ParticleCount.
 

Detailed Description

Functions for scattering particle data (scalar or vector) onto Eulerian grid fields by averaging contributions within each cell.

This file provides a modular set of functions to perform particle-to-grid projection, specifically calculating cell-averaged quantities from particle properties. It assumes a PETSc environment using DMDA for the grids and DMSwarm for particles.

Key Features:

  • Unified, name-driven scattering of scalar (DOF=1) and vector (DOF=3) particle fields.
  • Cell-averaged projection: contributions are accumulated per cell and normalized by the per-cell particle count.

Dependencies:

  • PETSc, using DMDA for the Eulerian grids and DMSwarm for the particles.
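
Putting these routines together, a minimal driver-level sketch might look as follows. This is an illustrative example only, not code from the library; the header names and the assumption of a fully initialized UserCtx are placeholders.

#include <petsc.h>
#include "interpolation.h"     /* assumed header declaring the scatter routines */
#include "ParticleMotion.h"    /* assumed header declaring CalculateParticleCountPerCell */

/* Illustrative sketch: one particle-to-grid projection pass.
 * Assumes a fully initialized UserCtx (swarm, da, fda, ParticleCount, Psi, ...). */
static PetscErrorCode ProjectParticlesToGrid(UserCtx *user)
{
  PetscErrorCode ierr;
  PetscFunctionBeginUser;
  /* 1. Refresh the per-cell particle counts; they normalize the averages below. */
  ierr = CalculateParticleCountPerCell(user); CHKERRQ(ierr);
  /* 2. Scatter the predefined particle fields to their Eulerian counterparts. */
  ierr = ScatterAllParticleFieldsToEulerFields(user); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}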

Function Documentation

◆ GetScatterTargetInfo()

PetscErrorCode GetScatterTargetInfo ( UserCtx *  user,
const char *  particleFieldName,
DM *  targetDM,
PetscInt *  expected_dof 
)

Determines the target Eulerian DM and expected DOF for scattering a given particle field.

Based on hardcoded rules mapping particle field names ("Psi", "Nvert", "Ucat", "Ucont") to user context DMs (user->da or user->fda). This function encapsulates the policy of where different fields should be scattered. Modify this function to add rules for custom fields.

Parameters
[in]   user               Pointer to the UserCtx containing the required DMs (da, fda).
[in]   particleFieldName  Name of the particle field (e.g., "Psi", "Ucat").
[out]  targetDM           Pointer to store the determined target DM (user->da or user->fda).
[out]  expected_dof       Pointer to store the expected DOF (1 or 3) for this field.
Returns
PetscErrorCode Returns 0 on success. Error codes:
  • PETSC_ERR_ARG_NULL if required inputs are NULL.
  • PETSC_ERR_ARG_WRONG if particleFieldName is not recognized.
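
For illustration only (not part of the library), the helper can be used to select the correct DM before allocating a compatible target vector; an initialized UserCtx *user is assumed to be in scope:

/* Illustrative sketch: pick the scatter target for a field, then create a matching Vec. */
DM             targetDM = NULL;
PetscInt       dof      = 0;
Vec            avgVec   = NULL;
PetscErrorCode ierr;

ierr = GetScatterTargetInfo(user, "Ucat", &targetDM, &dof); CHKERRQ(ierr);
/* For "Ucat" this yields targetDM == user->fda and dof == 3. */
ierr = DMCreateGlobalVector(targetDM, &avgVec); CHKERRQ(ierr);  /* Vec inherits the DM */
ierr = VecSet(avgVec, 0.0); CHKERRQ(ierr);                      /* start from a clean state */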

Definition at line 1395 of file interpolation.c.

1397{
1398 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
1399 PetscFunctionBeginUser;
1400
 1401  PROFILE_FUNCTION_BEGIN;
 1402
1403 // --- Input Validation ---
1404 // Check for NULL pointers in essential inputs
1405 if (!user) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx pointer is NULL.");
1406 if (!user->da) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->da is NULL.");
1407 if (!user->fda) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->fda is NULL."); // Needed for vector fields
1408 if (!particleFieldName) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "particleFieldName is NULL.");
1409 if (!targetDM) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output targetDM pointer is NULL.");
1410 if (!expected_dof) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output expected_dof pointer is NULL.");
1411
1412 // --- Determine Target DM and DOF based on Field Name ---
1413 // Compare the input field name with known scalar fields targeting 'da'
1414 if (strcmp(particleFieldName, "Psi") == 0 || strcmp(particleFieldName, "Nvert") == 0) {
1415 *expected_dof = 1; // Scalar fields have DOF 1
1416 *targetDM = user->da; // Target the primary scalar DMDA
1417 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Field '%s' targets DM 'da' (DOF=1).\n", particleFieldName);
1418 }
1419 // Compare with known vector fields targeting 'fda'
1420 else if (strcmp(particleFieldName, "Ucat") == 0 || strcmp(particleFieldName, "Ucont") == 0) {
1421 *expected_dof = 3; // Vector fields have DOF 3
1422 *targetDM = user->fda; // Target the vector DMDA (often node-based)
1423 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Field '%s' targets DM 'fda' (DOF=3).\n", particleFieldName);
1424 }
1425 // --- Add rules for other fields here ---
1426 // else if (strcmp(particleFieldName, "SomeOtherScalar") == 0) { *expected_dof = 1; *targetDM = user->da; }
1427 // else if (strcmp(particleFieldName, "SomeOtherVector") == 0) { *expected_dof = 3; *targetDM = user->someOtherDM; }
1428 else {
1429 // The provided field name doesn't match any known rules
1430 *targetDM = NULL; // Indicate failure
1431 *expected_dof = 0;
1432 // Format the error message manually
1433 PetscSNPrintf(msg, sizeof(msg), "Field name '%s' is not recognized for automatic DM selection.", particleFieldName);
1434 // Use SETERRQ with the formatted message and appropriate error code
1435 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); // Use WRONG argument error code
1436 }
1437
 1438  PROFILE_FUNCTION_END;
 1439
1440 PetscFunctionReturn(0);
1441}
#define ERROR_MSG_BUFFER_SIZE
#define GLOBAL
Scope for global logging across all processes.
Definition logging.h:47
#define LOG_ALLOW(scope, level, fmt,...)
Logging macro that checks both the log level and whether the calling function is in the allowed-funct...
Definition logging.h:201
#define PROFILE_FUNCTION_END
Marks the end of a profiled code block.
Definition logging.h:740
@ LOG_DEBUG
Detailed debugging information.
Definition logging.h:33
#define PROFILE_FUNCTION_BEGIN
Marks the beginning of a profiled code block (typically a function).
Definition logging.h:731

◆ ScatterParticleFieldToEulerField()

PetscErrorCode ScatterParticleFieldToEulerField ( UserCtx *  user,
const char *  particleFieldName,
Vec  eulerFieldAverageVec 
)

Scatters a particle field (scalar or vector) to the corresponding Eulerian field average.

This is the primary user-facing function for scattering a single field. It determines the target Eulerian DM (user->da or user->fda) based on particleFieldName, validates that the provided eulerFieldAverageVec is compatible with that target DM, and then orchestrates the scatter operation by calling the internal helper ScatterParticleFieldToEulerField_Internal. The final averaged result is stored IN-PLACE in eulerFieldAverageVec.

Parameters
[in]      user                  Pointer to UserCtx containing da, fda, swarm, and ParticleCount.
[in]      particleFieldName     Name of the field in the DMSwarm (e.g., "Psi", "Ucat").
[in,out]  eulerFieldAverageVec  Pre-created Vec associated with the correct target DM (implicitly da or fda). The result is stored here. The caller should zero this vector beforehand if desired (e.g., using VecSet); the wrapper ScatterAllParticleFieldsToEulerFields handles this zeroing.
Returns
PetscErrorCode 0 on success. Errors on NULL input, unrecognized field name, incompatible target vector, or if the particle field does not exist in the swarm.
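
An illustrative sketch of scattering a single scalar field (assuming the swarm registers a field named "Psi", that user->da is its target DM as encoded in GetScatterTargetInfo, and that an initialized UserCtx *user is in scope):

/* Illustrative sketch: scatter the particle scalar "Psi" into a freshly created cell-average Vec. */
Vec            psiAvg;
PetscErrorCode ierr;

ierr = CalculateParticleCountPerCell(user); CHKERRQ(ierr);      /* counts must be current */
ierr = DMCreateGlobalVector(user->da, &psiAvg); CHKERRQ(ierr);  /* DM must match the field */
ierr = VecSet(psiAvg, 0.0); CHKERRQ(ierr);                      /* zero before accumulating */
ierr = ScatterParticleFieldToEulerField(user, "Psi", psiAvg); CHKERRQ(ierr);
/* ... use the cell-averaged field ... */
ierr = VecDestroy(&psiAvg); CHKERRQ(ierr);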

Definition at line 1798 of file interpolation.c.

1801{
1802 PetscErrorCode ierr;
1803 DM targetDM = NULL; // Will point to user->da or user->fda
1804 PetscInt expected_dof = 0; // Will be 1 or 3
1805 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
1806
1807 PetscFunctionBeginUser;
1808
 1809  PROFILE_FUNCTION_BEGIN;
 1810
1811 // --- Essential Input Validation ---
1812 if (!user) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx pointer is NULL.");
1813 if (!user->swarm) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->swarm is NULL.");
1814 if (!user->ParticleCount) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->ParticleCount is NULL.");
1815 if (!eulerFieldAverageVec) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output eulerFieldAverageVec is NULL.");
1816 // particleFieldName validity checked within GetScatterTargetInfo
1817
1818 // --- Determine Target DM & DOF using the helper function ---
1819 ierr = GetScatterTargetInfo(user, particleFieldName, &targetDM, &expected_dof); CHKERRQ(ierr);
1820 // If GetScatterTargetInfo failed (e.g., unknown name), CHKERRQ would have returned.
1821
1822 // --- Validate the provided Target Vec's Compatibility ---
1823 DM vec_dm;
1824 PetscInt vec_dof;
1825 // Check that the provided average vector has a DM associated with it
1826 ierr = VecGetDM(eulerFieldAverageVec, &vec_dm); CHKERRQ(ierr);
1827 if (!vec_dm) {
1828 PetscSNPrintf(msg, sizeof(msg), "Provided eulerFieldAverageVec for field '%s' does not have an associated DM.", particleFieldName);
1829 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, msg);
1830 }
1831 // Get the block size (DOF) of the provided vector
1832 ierr = VecGetBlockSize(eulerFieldAverageVec, &vec_dof); CHKERRQ(ierr);
1833 // Compare the vector's associated DM with the one determined by the field name
1834 if (vec_dm != targetDM) {
1835 const char *target_dm_name = "targetDM", *vec_dm_name = "vec_dm";
1836 // Get actual names if possible for a more informative error message
1837 PetscObjectGetName((PetscObject)targetDM, &target_dm_name);
1838 PetscObjectGetName((PetscObject)vec_dm, &vec_dm_name);
1839 PetscSNPrintf(msg, sizeof(msg), "Provided eulerFieldAverageVec associated with DM '%s', but field '%s' requires scatter to DM '%s'.", vec_dm_name, particleFieldName, target_dm_name);
1840 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_INCOMP, msg);
1841 }
1842 // Compare the vector's DOF with the one expected for the field name
1843 if (vec_dof != expected_dof) {
1844 PetscSNPrintf(msg, sizeof(msg), "Field '%s' requires DOF %d, but provided eulerFieldAverageVec has DOF %d.", particleFieldName, expected_dof, vec_dof);
1845 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_INCOMP, msg);
1846 }
1847
1848 // --- Perform Scatter using Internal Helper ---
1849 // Log intent before calling the core logic
1850 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering field '%s' (DOF=%d).\n", particleFieldName, expected_dof);
1851 ierr = ScatterParticleFieldToEulerField_Internal(user, // Pass user context
1852 particleFieldName, // Name of particle field
1853 targetDM, // Determined target DM (da or fda)
1854 expected_dof, // Determined DOF (1 or 3)
1855 eulerFieldAverageVec); // The output vector
1856 CHKERRQ(ierr); // Handle potential errors from the internal function
1857
1858 LOG_ALLOW(GLOBAL, LOG_INFO, "Successfully scattered field '%s'.\n", particleFieldName);
1859
 1860  PROFILE_FUNCTION_END;
 1861
1862 PetscFunctionReturn(0);
1863}
PetscErrorCode GetScatterTargetInfo(UserCtx *user, const char *particleFieldName, DM *targetDM, PetscInt *expected_dof)
Determines the target Eulerian DM and expected DOF for scattering a given particle field.
static PetscErrorCode ScatterParticleFieldToEulerField_Internal(UserCtx *user, const char *particleFieldName, DM targetDM, PetscInt expected_dof, Vec eulerFieldAverageVec)
Internal helper function to orchestrate the scatter operation (accumulate + normalize).
@ LOG_INFO
Informational messages about program execution.
Definition logging.h:32
Vec ParticleCount
Definition variables.h:729

◆ ScatterAllParticleFieldsToEulerFields()

PetscErrorCode ScatterAllParticleFieldsToEulerFields ( UserCtx *  user)

Scatters a predefined set of particle fields to their corresponding Eulerian fields.

This convenience function calls the unified ScatterParticleFieldToEulerField for a standard set of fields (currently only "Psi"; blocks for "Nvert", "Ucat", and "Ucont" are provided but commented out). It assumes the target Eulerian Vec objects (e.g., user->Psi, user->Ucat) exist in the UserCtx structure and are correctly associated with their respective DMs (user->da or user->fda).

IMPORTANT: The target Vecs are zeroed before scattering. Ensure user->ParticleCount has been computed accurately before calling this function, so that it reflects the current particle distribution.

Parameters
[in,out]  user  Pointer to the UserCtx structure containing all required DMs, Vecs (ParticleCount and target Eulerian fields such as Psi, Ucat), and the swarm.
Returns
PetscErrorCode 0 on success. Errors if prerequisites (such as ParticleCount) are missing or if an underlying scatter call fails (e.g., a particle field is absent from the swarm).

Definition at line 1887 of file interpolation.c.

1888{
1889 PetscErrorCode ierr;
1890 PetscFunctionBeginUser;
 1891  PROFILE_FUNCTION_BEGIN;
 1892
1893 LOG_ALLOW(GLOBAL, LOG_INFO, "Starting scattering of specified particle fields to Eulerian grids.\n");
1894
1895 // --- Pre-computation Check: Ensure Particle Counts are Ready ---
1896 if (!user->ParticleCount) {
1897 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, "UserCtx->ParticleCount is NULL. Compute counts before calling ScatterAllParticleFieldsToEulerFields.");
1898 }
1899
1900 // --- Scatter Particle Field "Psi" -> Eulerian Field user->Psi (on da) ---
1901 // Check if the target Eulerian vector 'user->Psi' exists.
1902 if (user->Psi) {
1903
1904 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Psi' to user->Psi.\n");
1905 // Zero the target vector before accumulating the new average for this step/call.
1906
1907 // Debug Verification ------------------------------------------------
1908 Vec swarm_Psi;
1909 PetscReal Avg_Psi,Avg_swarm_Psi;
1910
1911 ierr = VecMean(user->Psi,&Avg_Psi);
1912 LOG_ALLOW(GLOBAL,LOG_DEBUG," Average of Scalar(Psi) before scatter: %.4f.\n",Avg_Psi);
1913
1914 ierr = DMSwarmCreateGlobalVectorFromField(user->swarm,"Psi",&swarm_Psi);
1915 ierr = VecMean(swarm_Psi,&Avg_swarm_Psi);
1916
1917 LOG_ALLOW(GLOBAL,LOG_DEBUG," Average of Particle Scalar(Psi): %.4f.\n",Avg_swarm_Psi);
1918
1919 ierr = DMSwarmDestroyGlobalVectorFromField(user->swarm,"Psi",&swarm_Psi);
1920 // Debug----------------------------------------------------------------
1921
1922 //ierr = VecSet(user->P, 0.0); CHKERRQ(ierr);
1923 // Call the unified scatter function. It will handle DM determination and validation.
1924 // It will also error out if the *particle* field "Psi" doesn't exist in the swarm.
1925 ierr = ScatterParticleFieldToEulerField(user, "Psi", user->Psi); CHKERRQ(ierr);
1926 ierr = VecMean(user->Psi,&Avg_Psi);
1927
1928 LOG_ALLOW(GLOBAL,LOG_DEBUG," Average of Scalar(Psi) after scatter: %.4f.\n",Avg_Psi);
1929 } else {
1930 // Only log a warning if the target Eulerian field is missing in the context.
1931 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Psi': UserCtx->Psi is NULL.\n");
1932 }
1933
1934 // --- (Commented Out) Scatter Other Fields ---
1935 // To enable scattering for other fields, uncomment the relevant block
1936 // AND ensure:
1937 // 1. The corresponding particle field (e.g., "Nvert", "Ucat") exists in user->swarm.
 1938 // 2. The corresponding target Eulerian Vec (e.g., user->Nvert, user->Ucat) exists in the UserCtx.
1939 // 3. The target Eulerian Vec is associated with the correct DM (da for Nvert, fda for Ucat).
1940
1941 /*
1942 // Scatter Particle Field "Nvert" -> Eulerian Field user->Nvert (on da)
1943 if (user->Nvert) {
1944 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Nvert' to user->Nvert.\n");
1945 ierr = VecSet(user->Nvert, 0.0); CHKERRQ(ierr);
1946 ierr = ScatterParticleFieldToEulerField(user, "Nvert", user->Nvert); CHKERRQ(ierr);
1947 } else {
1948 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Nvert': UserCtx->Nvert is NULL.\n");
1949 }
1950 */
1951
1952 /*
1953 // Scatter Particle Field "Ucat" -> Eulerian Field user->Ucat (on fda)
1954 if (user->Ucat) {
1955 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Ucat' to user->Ucat.\n");
1956 ierr = VecSet(user->Ucat, 0.0); CHKERRQ(ierr);
1957 ierr = ScatterParticleFieldToEulerField(user, "Ucat", user->Ucat); CHKERRQ(ierr);
1958 } else {
1959 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Ucat': UserCtx->Ucat is NULL.\n");
1960 }
1961 */
1962
1963 /*
1964 // Scatter Particle Field "Ucont" -> Eulerian Field user->Ucont (on fda)
1965 if (user->Ucont) {
1966 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Scattering particle field 'Ucont' to user->Ucont.\n");
1967 ierr = VecSet(user->Ucont, 0.0); CHKERRQ(ierr);
1968 ierr = ScatterParticleFieldToEulerField(user, "Ucont", user->Ucont); CHKERRQ(ierr);
1969 } else {
1970 LOG_ALLOW(GLOBAL, LOG_WARNING, "Skipping scatter for 'Ucont': UserCtx->Ucont is NULL.\n");
1971 }
1972 */
1973
1974 // Add more fields as needed following the pattern above...
1975
1976 LOG_ALLOW(GLOBAL, LOG_INFO, "Finished scattering specified particle fields.\n");
 1977  PROFILE_FUNCTION_END;
 1978  PetscFunctionReturn(0);
1979}
PetscErrorCode ScatterParticleFieldToEulerField(UserCtx *user, const char *particleFieldName, Vec eulerFieldAverageVec)
Scatters a particle field (scalar or vector) to the corresponding Eulerian field average.
@ LOG_WARNING
Non-critical issues that warrant attention.
Definition logging.h:30
Vec Psi
Definition variables.h:730

◆ CalculateParticleCountPerCell()

PetscErrorCode CalculateParticleCountPerCell ( UserCtx *  user)

Counts particles in each cell of the DMDA 'da' and stores the result in user->ParticleCount.

Zeros the user->ParticleCount vector, then iterates through local particles. Reads the GLOBAL cell index (I, J, K) stored in the "DMSwarm_CellID" field. Uses DMDAVecGetArray to access the local portion of the count vector and increments the count at the global index (I, J, K) if it belongs to the local patch (including ghosts).

Parameters
[in,out]  user  Pointer to the UserCtx structure containing da, swarm, and ParticleCount.
Returns
PetscErrorCode Returns 0 on success, non-zero on failure.
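
For illustration, a hypothetical helper (not part of the library) that reads a count back using the same +1 cell-centered shift applied when the counts are accumulated:

/* Sketch (hypothetical helper): print the stored count for geometric cell (gi,gj,gk),
 * applying the +1 shift from the geometric cell index to the cell-centered layout. */
static PetscErrorCode ReportCellCount(UserCtx *user, PetscInt gi, PetscInt gj, PetscInt gk)
{
  PetscErrorCode ierr;
  PetscScalar ***cnt;
  PetscInt xs, ys, zs, xm, ym, zm;
  PetscInt i = gi + 1, j = gj + 1, k = gk + 1;   /* shift to cell-centered indexing */

  PetscFunctionBeginUser;
  ierr = DMDAGetCorners(user->da, &xs, &ys, &zs, &xm, &ym, &zm); CHKERRQ(ierr);
  ierr = DMDAVecGetArrayRead(user->da, user->ParticleCount, &cnt); CHKERRQ(ierr);
  if (i >= xs && i < xs + xm && j >= ys && j < ys + ym && k >= zs && k < zs + zm) {
    ierr = PetscPrintf(PETSC_COMM_SELF, "Cell (%d,%d,%d): %g particles\n",
                       (int)gi, (int)gj, (int)gk, (double)cnt[k][j][i]); CHKERRQ(ierr);
  }
  ierr = DMDAVecRestoreArrayRead(user->da, user->ParticleCount, &cnt); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}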

Definition at line 485 of file ParticleMotion.c.

485 {
486 PetscErrorCode ierr;
487 DM da = user->da;
488 DM swarm = user->swarm;
489 Vec countVec = user->ParticleCount;
490 PetscInt nlocal, p;
491 PetscInt *global_cell_id_arr; // Read GLOBAL cell IDs
492 PetscScalar ***count_arr_3d; // Use 3D accessor
493 PetscInt64 *PID_arr;
494 PetscMPIInt rank;
495 char msg[ERROR_MSG_BUFFER_SIZE];
496 PetscInt particles_counted_locally = 0;
497
498 PetscFunctionBeginUser;
 499  PROFILE_FUNCTION_BEGIN;
 500  ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
501
502 // --- Input Validation ---
503 if (!da) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->da is NULL.");
504 if (!swarm) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->swarm is NULL.");
505 if (!countVec) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->ParticleCount is NULL.");
506 // Check DOF of da
507 PetscInt count_dof;
508 ierr = DMDAGetInfo(da, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &count_dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
509 if (count_dof != 1) { PetscSNPrintf(msg, sizeof(msg), "countDM must have DOF=1, got %d.", count_dof); SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); }
510
511 // --- Zero the count vector ---
512 ierr = VecSet(countVec, 0.0); CHKERRQ(ierr);
513
514 // --- Get Particle Data ---
515 LOG_ALLOW(GLOBAL,LOG_DEBUG, "Accessing particle data.\n");
516 ierr = DMSwarmGetLocalSize(swarm, &nlocal); CHKERRQ(ierr);
517 ierr = DMSwarmGetField(swarm,"DMSwarm_CellID", NULL, NULL, (void **)&global_cell_id_arr); CHKERRQ(ierr);
518 ierr = DMSwarmGetField(swarm,"DMSwarm_pid",NULL,NULL,(void **)&PID_arr);CHKERRQ(ierr);
519
520 // --- Get Grid Vector Array using DMDA accessor ---
521 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Accessing ParticleCount vector array (using DMDAVecGetArray).\n");
522 ierr = DMDAVecGetArray(da, countVec, &count_arr_3d); CHKERRQ(ierr);
523
524 // Get local owned range for validation/logging if needed, but not for indexing with DMDAVecGetArray
525 PetscInt xs, ys, zs, xm, ym, zm;
526 ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm); CHKERRQ(ierr);
527
528 // --- Accumulate Counts Locally ---
529 LOG_ALLOW(LOCAL, LOG_DEBUG, "CalculateParticleCountPerCell (Rank %d): Processing %d local particles using GLOBAL CellIDs.\n",rank,nlocal);
530 for (p = 0; p < nlocal; p++) {
531 // Read the GLOBAL indices stored for this particle
532 PetscInt64 i_geom = global_cell_id_arr[p * 3 + 0]; // Global i index
533 PetscInt64 j_geom = global_cell_id_arr[p * 3 + 1]; // Global j index
534 PetscInt64 k_geom = global_cell_id_arr[p * 3 + 2]; // Global k index
535
536 // Apply the shift to ensure ParticleCount follows the indexing convention for cell-centered data in this codebase.
537 PetscInt i = (PetscInt)i_geom + 1; // Shift for cell-centered
538 PetscInt j = (PetscInt)j_geom + 1; // Shift for cell-centered
539 PetscInt k = (PetscInt)k_geom + 1; // Shift for cell-centered
540
541 // *** Bounds check is implicitly handled by DMDAVecGetArray for owned+ghost region ***
542 // However, accessing outside this region using global indices WILL cause an error.
543 // A preliminary check might still be wise if global IDs could be wild.
544 // We rely on LocateAllParticles to provide valid global indices [0..IM-1] etc.
545
546 LOG_LOOP_ALLOW(LOCAL,LOG_VERBOSE,p,100,"[Rank %d] Read CellID for p=%d, PID = %ld: (%ld, %ld, %ld)\n", rank, p,PID_arr[p],i, j, k);
547
548 // *** Access the local array using GLOBAL indices ***
549 // DMDAVecGetArray allows this, mapping global (I,J,K) to the correct
550 // location within the local ghosted array segment.
551 // This assumes (I,J,K) corresponds to a cell within the local owned+ghost region.
552 // If (I,J,K) is for a cell owned by *another* rank (and not in the ghost region
553 // of this rank), accessing count_arr_3d[K][J][I] will likely lead to a crash
554 // or incorrect results. This highlights why storing LOCAL indices is preferred
555 // for parallel runs. But proceeding with GLOBAL as requested:
556 if (i >= xs && i < xs + xm &&
557 j >= ys && j < ys + ym && // Adjust based on actual ghost width
558 k >= zs && k < zs + zm ) // This check prevents definite crashes but doesn't guarantee ownership
559 {
560
561 // Increment count at the location corresponding to GLOBAL index (I,J,K)
562 // LOG_ALLOW(LOCAL, LOG_DEBUG, "CalculateParticleCountPerCell (Rank %d): Particle %d with global CellID (%d, %d, %d) incremented with a particle.\n",rank, p, i, j, k);
563 count_arr_3d[k][j][i] += 1.0;
564 particles_counted_locally++;
565 } else {
566 // This particle's global ID is likely outside the range this rank handles (even ghosts)
567 LOG_ALLOW(LOCAL, LOG_DEBUG, "CalculateParticleCountPerCell (Rank %d): Skipping particle %ld with global CellID (%ld, %ld, %ld) - likely outside local+ghost range.\n",rank, PID_arr[p] , i, j, k);
568 }
569 }
570 LOG_ALLOW(LOCAL, LOG_DEBUG, "CalculateParticleCountPerCell (Rank %d): Local counting finished. Processed %d particles locally.\n", rank, particles_counted_locally);
571
572 // --- Restore Access ---
573 ierr = DMDAVecRestoreArray(da, countVec, &count_arr_3d); CHKERRQ(ierr);
574 ierr = DMSwarmRestoreField(swarm,"DMSwarm_CellID", NULL, NULL, (void **)&global_cell_id_arr); CHKERRQ(ierr);
575 ierr = DMSwarmRestoreField(swarm,"DMSwarm_pid",NULL,NULL,(void **)&PID_arr);CHKERRQ(ierr);
576
577 // --- Assemble Global Vector ---
578 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Assembling global ParticleCount vector.\n");
579 ierr = VecAssemblyBegin(countVec); CHKERRQ(ierr);
580 ierr = VecAssemblyEnd(countVec); CHKERRQ(ierr);
581
582 // --- Verification Logging ---
583 PetscReal total_counted_particles = 0.0, max_count_in_cell = 0.0;
584 ierr = VecSum(countVec, &total_counted_particles); CHKERRQ(ierr);
585 PetscInt max_idx_global = -1;
586 ierr = VecMax(countVec, &max_idx_global, &max_count_in_cell); CHKERRQ(ierr);
587 LOG_ALLOW(GLOBAL, LOG_INFO, "Total counted globally = %.0f, Max count in cell = %.0f\n",
588 total_counted_particles, max_count_in_cell);
589
590 // --- ADD THIS DEBUGGING BLOCK ---
591 if (max_idx_global >= 0) { // Check if VecMax found a location
592 // Need to convert the flat global index back to 3D global index (I, J, K)
593 // Get global grid dimensions (Nodes, NOT Cells IM/JM/KM)
594 PetscInt M, N, P;
595 ierr = DMDAGetInfo(da, NULL, &M, &N, &P, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
596 // Note: Assuming DOF=1 for countVec, index mapping uses node dimensions M,N,P from DMDA creation (IM+1, etc)
597 // Re-check if your DMDA uses cell counts (IM) or node counts (IM+1) for Vec layout. Let's assume Node counts M,N,P.
598 PetscInt Kmax = max_idx_global / (M * N);
599 PetscInt Jmax = (max_idx_global % (M * N)) / M;
600 PetscInt Imax = max_idx_global % M;
601 LOG_ALLOW(GLOBAL, LOG_INFO, " -> Max count located at global index (I,J,K) = (%d, %d, %d) [Flat index: %d]\n",
602 (int)Imax, (int)Jmax, (int)Kmax, (int)max_idx_global);
603
604 // Also, let's explicitly check the count at (0,0,0)
605 PetscScalar count_at_origin = 0.0;
606 PetscScalar ***count_arr_for_check;
607 ierr = DMDAVecGetArrayRead(da, countVec, &count_arr_for_check); CHKERRQ(ierr);
608 // Check bounds before accessing - crucial if using global indices
609 PetscInt xs, ys, zs, xm, ym, zm;
610 ierr = DMDAGetCorners(da, &xs, &ys, &zs, &xm, &ym, &zm); CHKERRQ(ierr);
611 if (0 >= xs && 0 < xs+xm && 0 >= ys && 0 < ys+ym && 0 >= zs && 0 < zs+zm) {
612 count_at_origin = count_arr_for_check[0][0][0]; // Access using global index (0,0,0)
613 } else {
614 // Origin is not on this rank (relevant for parallel, but check anyway)
615 count_at_origin = -999.0; // Indicate it wasn't accessible locally
616 }
617 ierr = DMDAVecRestoreArrayRead(da, countVec, &count_arr_for_check); CHKERRQ(ierr);
618 LOG_ALLOW(GLOBAL, LOG_INFO, " -> Count at global index (0,0,0) = %.1f\n", count_at_origin);
619
620 } else {
621 LOG_ALLOW(GLOBAL, LOG_WARNING, " -> VecMax did not return a location for the maximum value.\n");
622 }
623 // --- END DEBUGGING BLOCK ---
624
625 LOG_ALLOW(GLOBAL, LOG_INFO, "Particle counting complete.\n");
626
627
 628  PROFILE_FUNCTION_END;
 629  PetscFunctionReturn(0);
630}
#define ERROR_MSG_BUFFER_SIZE
#define LOG_LOOP_ALLOW(scope, level, iterVar, interval, fmt,...)
Logs a message inside a loop, but only every interval iterations.
Definition logging.h:313
#define LOCAL
Logging scope definitions for controlling message output.
Definition logging.h:46
@ LOG_VERBOSE
Extremely detailed logs, typically for development use only.
Definition logging.h:35