PICurv 0.1.0
A Parallel Particle-In-Cell Solver for Curvilinear LES
Internal Scattering Helpers

Lower-level functions used by the main scattering routines.


Macros

#define __FUNCT__   "InterpolateFieldFromCornerToCenter_Vector"
 
#define __FUNCT__   "InterpolateFieldFromCornerToCenter_Scalar"
 
#define __FUNCT__   "InterpolateFieldFromCenterToCorner_Vector_Petsc"
 
#define __FUNCT__   "InterpolateFieldFromCenterToCorner_Scalar_Petsc"
 
#define __FUNCT__   "InterpolateEulerFieldToSwarm"
 
#define __FUNCT__   "InterpolateAllFieldsToSwarm"
 
#define __FUNCT__   "GetScatterTargetInfo"
 
#define __FUNCT__   "AccumulateParticleField"
 
#define __FUNCT__   "NormalizeGridVectorByCount"
 
#define __FUNCT__   "ScatterParticleFieldToEulerField_Internal"
 
#define __FUNCT__   "ScatterParticleFieldToEulerField"
 
#define __FUNCT__   "ScatterAllParticleFieldsToEulerFields"
 
#define __FUNCT__   "InterpolateCornerToFaceCenter_Scalar"
 
#define __FUNCT__   "InterpolateCornerToFaceCenter_Vector"
 

Functions

PetscErrorCode AccumulateParticleField (DM swarm, const char *particleFieldName, DM gridSumDM, Vec gridSumVec)
 Accumulates a particle field (scalar or vector) into a target grid sum vector.
 
PetscErrorCode NormalizeGridVectorByCount (DM countDM, Vec countVec, DM dataDM, Vec sumVec, Vec avgVec)
 Normalizes a grid vector of sums by a grid vector of counts to produce an average.
 
PetscErrorCode GetScatterTargetInfo (UserCtx *user, const char *particleFieldName, DM *targetDM, PetscInt *expected_dof)
 Determines the target Eulerian DM and expected DOF for scattering a given particle field.
 
static PetscErrorCode ScatterParticleFieldToEulerField_Internal (UserCtx *user, const char *particleFieldName, DM targetDM, PetscInt expected_dof, Vec eulerFieldAverageVec)
 Internal helper function to orchestrate the scatter operation (accumulate + normalize).
 

Detailed Description

Lower-level functions used by the main scattering routines.
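Taken together, these helpers implement the scatter pipeline: pick the target grid, sum particle values into cells, count particles per cell, and divide. The sketch below shows a typical caller-side sequence. It is a minimal example assuming a UserCtx populated as elsewhere in this module; the function name ScatterOneFieldSketch and the local Vec names are illustrative only, not part of the library.

static PetscErrorCode ScatterOneFieldSketch(UserCtx *user, const char *fieldName)
{
  PetscErrorCode ierr;
  DM             targetDM = NULL;
  PetscInt       dof = 0;
  Vec            sumVec = NULL, avgVec = NULL;

  PetscFunctionBeginUser;
  /* 1. Decide which Eulerian DM receives this field and with how many components. */
  ierr = GetScatterTargetInfo(user, fieldName, &targetDM, &dof); CHKERRQ(ierr);

  /* 2. Create the destination and working vectors on that DM; the sum must start at zero. */
  ierr = DMCreateGlobalVector(targetDM, &avgVec); CHKERRQ(ierr);
  ierr = VecDuplicate(avgVec, &sumVec); CHKERRQ(ierr);
  ierr = VecSet(sumVec, 0.0); CHKERRQ(ierr);

  /* 3. Sum particle contributions into cells, then count particles per cell. */
  ierr = AccumulateParticleField(user->swarm, fieldName, targetDM, sumVec); CHKERRQ(ierr);
  ierr = CalculateParticleCountPerCell(user); CHKERRQ(ierr);

  /* 4. Divide sums by counts (cells without particles are set to zero). */
  ierr = NormalizeGridVectorByCount(user->da, user->ParticleCount, targetDM, sumVec, avgVec); CHKERRQ(ierr);

  /* 5. Clean up; in real use avgVec would be handed back to the caller instead of destroyed. */
  ierr = VecDestroy(&sumVec); CHKERRQ(ierr);
  ierr = VecDestroy(&avgVec); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}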

Macro Definition Documentation

◆ __FUNCT__ [1/14]

#define __FUNCT__   "InterpolateFieldFromCornerToCenter_Vector"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [2/14]

#define __FUNCT__   "InterpolateFieldFromCornerToCenter_Scalar"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [3/14]

#define __FUNCT__   "InterpolateFieldFromCenterToCorner_Vector_Petsc"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [4/14]

#define __FUNCT__   "InterpolateFieldFromCenterToCorner_Scalar_Petsc"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [5/14]

#define __FUNCT__   "InterpolateEulerFieldToSwarm"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [6/14]

#define __FUNCT__   "InterpolateAllFieldsToSwarm"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [7/14]

#define __FUNCT__   "GetScatterTargetInfo"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [8/14]

#define __FUNCT__   "AccumulateParticleField"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [9/14]

#define __FUNCT__   "NormalizeGridVectorByCount"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [10/14]

#define __FUNCT__   "ScatterParticleFieldToEulerField_Internal"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [11/14]

#define __FUNCT__   "ScatterParticleFieldToEulerField"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [12/14]

#define __FUNCT__   "ScatterAllParticleFieldsToEulerFields"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [13/14]

#define __FUNCT__   "InterpolateCornerToFaceCenter_Scalar"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [14/14]

#define __FUNCT__   "InterpolateCornerToFaceCenter_Vector"

Definition at line 20 of file interpolation.c.

Function Documentation

◆ AccumulateParticleField()

PetscErrorCode AccumulateParticleField ( DM  swarm,
const char *  particleFieldName,
DM  gridSumDM,
Vec  gridSumVec 
)

Accumulates a particle field (scalar or vector) into a target grid sum vector.

This function iterates through local particles, identifies their cell using the "DMSwarm_CellID" field, and adds the particle's field value (particleFieldName) to the corresponding cell location in gridSumVec. It handles both scalar (DOF=1) and vector (DOF=3) fields automatically, based on the DOF of gridSumDM.

IMPORTANT: The caller must ensure gridSumVec is zeroed before calling this function if a fresh sum calculation is desired.

Parameters
[in] swarm - The DMSwarm containing particles.
[in] particleFieldName - Name of the field on the particles (must match the DOF of gridSumDM).
[in] gridSumDM - The DMDA associated with gridSumVec. Its DOF determines how many components are accumulated.
[in,out] gridSumVec - The Vec (associated with gridSumDM) to accumulate sums into.
Returns
PetscErrorCode 0 on success. Errors if fields don't exist, DMs are incompatible, or memory access fails.
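For reference, here is a small hedged sketch of the particle-side data this routine reads: the field named particleFieldName (PetscReal, one or three components per particle) and the three-integer "DMSwarm_CellID" field holding the local (i, j, k) cell index. The helper name InspectParticleCellIDs is illustrative only.

static PetscErrorCode InspectParticleCellIDs(DM swarm)
{
  PetscErrorCode  ierr;
  PetscInt        nlocal, p;
  const PetscInt *cellIDs = NULL;

  PetscFunctionBeginUser;
  ierr = DMSwarmGetLocalSize(swarm, &nlocal); CHKERRQ(ierr);
  ierr = DMSwarmGetField(swarm, "DMSwarm_CellID", NULL, NULL, (void **)&cellIDs); CHKERRQ(ierr);
  for (p = 0; p < nlocal; ++p) {
    /* Three contiguous integers per particle, exactly as the accumulation loop assumes. */
    PetscInt i = cellIDs[3 * p + 0];
    PetscInt j = cellIDs[3 * p + 1];
    PetscInt k = cellIDs[3 * p + 2];
    if (i < 0 || j < 0 || k < 0) {
      /* Unlocated particles would be skipped (and logged) by AccumulateParticleField. */
    }
  }
  ierr = DMSwarmRestoreField(swarm, "DMSwarm_CellID", NULL, NULL, (void **)&cellIDs); CHKERRQ(ierr);
  PetscFunctionReturn(0);
}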

Definition at line 1466 of file interpolation.c.

1468{
1469 PetscErrorCode ierr;
1470 PetscInt dof; // DOF determined from gridSumDM
1471 PetscInt nlocal, p; // Local particle count and loop index
1472 const PetscReal *particle_arr = NULL; // Pointer to particle field data array (assuming Real)
1473 const PetscInt *cell_id_arr = NULL; // Pointer to particle cell ID array ("DMSwarm_CellID", Int)
1474 PetscScalar *sum_arr_ptr = NULL; // Pointer to grid sum vector data array (Scalar)
1475 PetscInt gxs, gys, gzs; // Start indices of local ghosted patch (often 0)
1476 PetscInt gxm, gym, gzm; // Dimensions of local ghosted patch (including ghosts)
1477 PetscMPIInt rank; // MPI rank for logging
1478 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
1479
1480 PetscFunctionBeginUser;
1481
1483
1484 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
1485
1486 // --- Get DOF from the target grid DM ---
1487 ierr = DMDAGetInfo(gridSumDM, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
1488 // Validate that the DOF is supported (currently 1 or 3)
1489 if (dof != 1 && dof != 3) {
1490 PetscSNPrintf(msg, sizeof(msg), "gridSumDM DOF must be 1 or 3, got %d.", dof);
1491 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg);
1492 }
1493
1494 // --- Get Particle Data Arrays ---
1495 // DMSwarmGetField will return an error if the field doesn't exist, caught by CHKERRQ.
1496 ierr = DMSwarmGetField(swarm, particleFieldName, NULL, NULL, (void **)&particle_arr); CHKERRQ(ierr);
1497 ierr = DMSwarmGetField(swarm, "DMSwarm_CellID", NULL, NULL, (void **)&cell_id_arr); CHKERRQ(ierr);
1498
1499 // Get number of local particles *after* successfully getting field pointers
1500 ierr = DMSwarmGetLocalSize(swarm, &nlocal); CHKERRQ(ierr);
1501
1502 // --- Get Grid Sum Vector Array & Dimensions ---
1503 ierr = VecGetArray(gridSumVec, &sum_arr_ptr); CHKERRQ(ierr);
1504 // Get dimensions needed for calculating flat index within the local ghosted array
1505 ierr = DMDAGetGhostCorners(gridSumDM, &gxs, &gys, &gzs, &gxm, &gym, &gzm); CHKERRQ(ierr);
1506
1507 // --- Accumulate Locally ---
1508 LOG_ALLOW(LOCAL, LOG_DEBUG, "(Rank %d): Accumulating '%s' (DOF=%d) from %d particles using CellID field 'DMSwarm_CellID'.\n", rank, particleFieldName, dof, nlocal);
1509 // Loop over all particles owned by this process
1510 for (p = 0; p < nlocal; ++p) {
1511 // Extract local cell indices (relative to start of ghosted patch, [0..gxm-1], etc.)
1512 // Assumes DMSwarm_CellID stores (i, j, k) contiguously for each particle.
1513 PetscInt pidx_geom = cell_id_arr[p * 3 + 0]; // Local i-index
1514 PetscInt pidy_geom = cell_id_arr[p * 3 + 1]; // Local j-index
1515 PetscInt pidz_geom = cell_id_arr[p * 3 + 2]; // Local k-index
1516
1517 // Apply the index shift to convert from geometric to cell-centered field indexing
1518 PetscInt pidx_field = pidx_geom + 1;
1519 PetscInt pidy_field = pidy_geom + 1;
1520 PetscInt pidz_field = pidz_geom + 1;
1521
1522 // Bounds Check: Ensure the particle's cell index is within the valid local ghosted region
1523 if (pidx_field >= 0 && pidx_field < gxm && pidy_field >= 0 && pidy_field < gym && pidz_field >= 0 && pidz_field < gzm)
1524 {
1525 // Calculate the flat 1D index for this cell within the linear ghosted array
1526 // Uses PETSc's standard C-style row-major ordering (k-slowest, j-middle, i-fastest)
1527 // Corrected: k (slowest), j, i (fastest)
1528 PetscInt cell_flat_idx = (pidz_field * gym + pidy_field) * gxm + pidx_field;
1529
1530 // Calculate the base index for this particle's data in particle_arr
1531 PetscInt particle_base_idx = p * dof;
1532 // Calculate the base index for this cell's data in sum_arr_ptr
1533 PetscInt grid_base_idx = cell_flat_idx * dof;
1534
1535 // Add particle components to the grid sum vector components
1536 for (PetscInt c = 0; c < dof; ++c) {
1537 sum_arr_ptr[grid_base_idx + c] += particle_arr[particle_base_idx + c];
1538 }
1539 } else {
1540 // Log a warning if a particle's CellID is outside the expected local region.
1541 // This might indicate particles needing migration or boundary issues.
1542 LOG_ALLOW(LOCAL, LOG_WARNING, "(Rank %d): Particle %d (field '%s') has out-of-bounds CellID (%d, %d, %d). Ghosted dims: %dx%dx%d. Skipping.\n",
1543 rank, p, particleFieldName, pidx_field, pidy_field, pidz_field, gxm, gym, gzm);
1544 }
1545 } // End of particle loop
1546
1547 // --- Restore Access to Arrays ---
1548 ierr = VecRestoreArray(gridSumVec, &sum_arr_ptr); CHKERRQ(ierr);
1549 ierr = DMSwarmRestoreField(swarm, particleFieldName, NULL, NULL, (void **)&particle_arr); CHKERRQ(ierr);
1550 ierr = DMSwarmRestoreField(swarm, "DMSwarm_CellID", NULL, NULL, (void **)&cell_id_arr); CHKERRQ(ierr);
1551
1552 // --- Assemble Global Sum Vector ---
1553 // Crucial for parallel execution: sums contributions for cells shared across process boundaries.
1554 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Assembling global sum vector for '%s'.\n", particleFieldName);
1555 ierr = VecAssemblyBegin(gridSumVec); CHKERRQ(ierr);
1556 ierr = VecAssemblyEnd(gridSumVec); CHKERRQ(ierr);
1557
1559
1560 PetscFunctionReturn(0);
1561}
Symbols referenced in this listing (see logging.h):
  ERROR_MSG_BUFFER_SIZE
  LOCAL - Logging scope definitions for controlling message output. (logging.h:46)
  GLOBAL - Scope for global logging across all processes. (logging.h:47)
  LOG_ALLOW(scope, level, fmt, ...) - Logging macro that checks both the log level and whether the calling function is in the allowed-funct... (logging.h:201)
  LOG_WARNING - Non-critical issues that warrant attention. (logging.h:30)
  LOG_DEBUG - Detailed debugging information. (logging.h:33)
  PROFILE_FUNCTION_BEGIN - Marks the beginning of a profiled code block (typically a function). (logging.h:731)
  PROFILE_FUNCTION_END - Marks the end of a profiled code block. (logging.h:740)

◆ NormalizeGridVectorByCount()

PetscErrorCode NormalizeGridVectorByCount ( DM  countDM,
Vec  countVec,
DM  dataDM,
Vec  sumVec,
Vec  avgVec 
)

Normalizes a grid vector of sums by a grid vector of counts to produce an average.

Calculates avgVec[i] = sumVec[i] / countVec[i] for each component of each cell owned by the current process, provided countVec[i] > 0; otherwise sets avgVec[i] = 0. Handles both scalar (DOF=1) and vector (DOF=3) data fields based on the DOF of dataDM. Uses the DMDA multi-dimensional array accessors (DMDAVecGetArray...) for safe and convenient indexing.

Parameters
[in] countDM - The DMDA associated with countVec (must have DOF=1).
[in] countVec - The Vec containing particle counts per cell (read-only).
[in] dataDM - The DMDA associated with sumVec and avgVec (must have DOF=1 or DOF=3).
[in] sumVec - The Vec containing the accumulated sums per cell (read-only).
[in,out] avgVec - The Vec where the calculated averages will be stored (overwritten). Must be associated with dataDM.
Returns
PetscErrorCode 0 on success. Errors on incompatible DMs or memory access failure.
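A compact sketch of the typical pairing, assuming this module's conventions (counts on user->da in user->ParticleCount, vector data on user->fda); the wrapper name AverageVectorSumsSketch is illustrative only, and sumVec/avgVec are assumed to have been created on the data DM beforehand.

static PetscErrorCode AverageVectorSumsSketch(UserCtx *user, Vec sumVec, Vec avgVec)
{
  PetscErrorCode ierr;
  PetscFunctionBeginUser;
  /* Refresh the per-cell particle counts (stored in user->ParticleCount on user->da). */
  ierr = CalculateParticleCountPerCell(user); CHKERRQ(ierr);
  /* Counts are DOF=1 on user->da; the data in this example is DOF=3 on user->fda. */
  ierr = NormalizeGridVectorByCount(user->da, user->ParticleCount, user->fda, sumVec, avgVec); CHKERRQ(ierr);
  /* Cells with a zero count receive avg = 0 rather than a division by zero. */
  PetscFunctionReturn(0);
}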

Definition at line 1584 of file interpolation.c.

1586{
1587 PetscErrorCode ierr;
1588 PetscInt data_dof;
1589 PetscInt count_dof;
1590 PetscMPIInt rank;
1591 char msg[ERROR_MSG_BUFFER_SIZE];
1592
1593 // Pointers for DMDA array accessors - declare specific types
1594 PetscScalar ***count_arr_3d = NULL; // For DOF=1 count vector (3D DMDA)
1595 PetscScalar ***sum_arr_scalar = NULL; // For DOF=1 sum vector (3D DMDA)
1596 PetscScalar ***avg_arr_scalar = NULL; // For DOF=1 avg vector (3D DMDA)
1597 PetscScalar ***sum_arr_vector = NULL; // For DOF=3 sum vector (3D DMDA + DOF)
1598 PetscScalar ***avg_arr_vector = NULL; // For DOF=3 avg vector (3D DMDA + DOF)
1599
1600
1601 PetscFunctionBeginUser;
1602
1604
1605 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
1606
1607 // --- Validation ---
1608 ierr = DMDAGetInfo(countDM, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &count_dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
1609 ierr = DMDAGetInfo(dataDM, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &data_dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
1610 if (count_dof != 1) { PetscSNPrintf(msg, sizeof(msg), "countDM must have DOF=1, got %d.", count_dof); SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); }
1611 if (data_dof != 1 && data_dof != 3) { PetscSNPrintf(msg, sizeof(msg), "dataDM DOF must be 1 or 3, got %d.", data_dof); SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); }
1612
1613 // --- Get Array Access using appropriate DMDA accessors ---
1614 ierr = DMDAVecGetArrayRead(countDM, countVec, &count_arr_3d); CHKERRQ(ierr);
1615
1616 if (data_dof == 1) {
1617 ierr = DMDAVecGetArrayRead(dataDM, sumVec, &sum_arr_scalar); CHKERRQ(ierr);
1618 ierr = DMDAVecGetArray(dataDM, avgVec, &avg_arr_scalar); CHKERRQ(ierr);
1619 } else { // data_dof == 3
1620 ierr = DMDAVecGetArrayDOFRead(dataDM, sumVec, &sum_arr_vector); CHKERRQ(ierr);
1621 ierr = DMDAVecGetArrayDOF(dataDM, avgVec, &avg_arr_vector); CHKERRQ(ierr);
1622 }
1623
1624 // Get the corners (global start indices) and dimensions of the *local owned* region
1625 PetscInt xs, ys, zs, xm, ym, zm;
1626 ierr = DMDAGetCorners(countDM, &xs, &ys, &zs, &xm, &ym, &zm); CHKERRQ(ierr);
1627
1628 // --- Normalize Over Owned Cells ---
1629 LOG_ALLOW(LOCAL, LOG_DEBUG, "(Rank %d): Normalizing DOF=%d data over owned range [%d:%d, %d:%d, %d:%d].\n",
1630 rank, data_dof, xs, xs+xm, ys, ys+ym, zs, zs+zm);
1631
1632 // Loop using GLOBAL indices (i, j, k) over the range owned by this process
1633 for (PetscInt k = zs; k < zs + zm; ++k) {
1634 for (PetscInt j = ys; j < ys + ym; ++j) {
1635 for (PetscInt i = xs; i < xs + xm; ++i) {
1636
1637 // Access the count using standard 3D indexing
1638 PetscScalar count = count_arr_3d[k][j][i];
1639
1640 if (PetscRealPart(count) > 0.5) { // Use tolerance for float comparison
1641 if (data_dof == 1) {
1642 // Access scalar sum/avg using standard 3D indexing
1643 avg_arr_scalar[k][j][i] = sum_arr_scalar[k][j][i] / count;
1644 } else { // data_dof == 3
1645 // Access vector components using DOF indexing on the last dimension
1646 for (PetscInt c = 0; c < data_dof; ++c) {
1647 avg_arr_vector[k][j][i * data_dof + c] = sum_arr_vector[k][j][i * data_dof + c] / count;
1648 }
1649 }
1650 } else { // count is zero or negative
1651 // Set average to zero
1652 if (data_dof == 1) {
1653 avg_arr_scalar[k][j][i] = 0.0;
1654 } else { // data_dof == 3
1655 for (PetscInt c = 0; c < data_dof; ++c) {
1656 avg_arr_vector[k][j][i * data_dof + c] = 0.0;
1657 }
1658 }
1659 } // end if count > 0.5
1660 } // end i loop
1661 } // end j loop
1662 } // end k loop
1663
1664 // --- Restore Arrays using appropriate functions ---
1665 ierr = DMDAVecRestoreArrayRead(countDM, countVec, &count_arr_3d); CHKERRQ(ierr);
1666 if (data_dof == 1) {
1667 ierr = DMDAVecRestoreArrayRead(dataDM, sumVec, &sum_arr_scalar); CHKERRQ(ierr);
1668 ierr = DMDAVecRestoreArray(dataDM, avgVec, &avg_arr_scalar); CHKERRQ(ierr);
1669 } else { // data_dof == 3
1670 ierr = DMDAVecRestoreArrayDOFRead(dataDM, sumVec, &sum_arr_vector); CHKERRQ(ierr);
1671 ierr = DMDAVecRestoreArrayDOF(dataDM, avgVec, &avg_arr_vector); CHKERRQ(ierr);
1672 }
1673
1674 // --- Assemble Final Average Vector ---
1675 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Assembling final average vector (DOF=%d).\n", data_dof);
1676 ierr = VecAssemblyBegin(avgVec); CHKERRQ(ierr);
1677 ierr = VecAssemblyEnd(avgVec); CHKERRQ(ierr);
1678
1679
1681
1682 PetscFunctionReturn(0);
1683}

◆ GetScatterTargetInfo()

PetscErrorCode GetScatterTargetInfo ( UserCtx *  user,
const char *  particleFieldName,
DM *  targetDM,
PetscInt *  expected_dof 
)

Determines the target Eulerian DM and expected DOF for scattering a given particle field.

Based on hardcoded rules mapping particle field names ("Psi", "Nvert", "Ucat", "Ucont") to user-context DMs (user->da or user->fda). This function encapsulates the policy of where different fields should be scattered. Modify this function to add rules for custom fields.

Parameters
[in] user - Pointer to the UserCtx containing the required DMs (da, fda).
[in] particleFieldName - Name of the particle field.
[out] targetDM - Pointer to store the determined target DM (user->da or user->fda).
[out] expected_dof - Pointer to store the expected DOF (1 or 3) for this field.
Returns
PetscErrorCode 0 on success. Error codes:
  • PETSC_ERR_ARG_NULL if required inputs are NULL.
  • PETSC_ERR_ARG_WRONG if particleFieldName is not recognized.
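A minimal sketch of how the returned DM and DOF are typically used to allocate the destination vector, given a valid UserCtx *user; the variable names are illustrative, and "Ucat" is one of the recognized field names.

DM             targetDM = NULL;
PetscInt       expected_dof = 0;
Vec            avgVec = NULL;
PetscErrorCode ierr;

ierr = GetScatterTargetInfo(user, "Ucat", &targetDM, &expected_dof); CHKERRQ(ierr); /* -> user->fda, DOF=3 */
ierr = DMCreateGlobalVector(targetDM, &avgVec); CHKERRQ(ierr);
/* An unrecognized field name fails with PETSC_ERR_ARG_WRONG; add a rule inside
   GetScatterTargetInfo to route additional particle fields. */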

Definition at line 1395 of file interpolation.c.

1397{
1398 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
1399 PetscFunctionBeginUser;
1400
1402
1403 // --- Input Validation ---
1404 // Check for NULL pointers in essential inputs
1405 if (!user) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx pointer is NULL.");
1406 if (!user->da) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->da is NULL.");
1407 if (!user->fda) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->fda is NULL."); // Needed for vector fields
1408 if (!particleFieldName) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "particleFieldName is NULL.");
1409 if (!targetDM) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output targetDM pointer is NULL.");
1410 if (!expected_dof) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output expected_dof pointer is NULL.");
1411
1412 // --- Determine Target DM and DOF based on Field Name ---
1413 // Compare the input field name with known scalar fields targeting 'da'
1414 if (strcmp(particleFieldName, "Psi") == 0 || strcmp(particleFieldName, "Nvert") == 0) {
1415 *expected_dof = 1; // Scalar fields have DOF 1
1416 *targetDM = user->da; // Target the primary scalar DMDA
1417 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Field '%s' targets DM 'da' (DOF=1).\n", particleFieldName);
1418 }
1419 // Compare with known vector fields targeting 'fda'
1420 else if (strcmp(particleFieldName, "Ucat") == 0 || strcmp(particleFieldName, "Ucont") == 0) {
1421 *expected_dof = 3; // Vector fields have DOF 3
1422 *targetDM = user->fda; // Target the vector DMDA (often node-based)
1423 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Field '%s' targets DM 'fda' (DOF=3).\n", particleFieldName);
1424 }
1425 // --- Add rules for other fields here ---
1426 // else if (strcmp(particleFieldName, "SomeOtherScalar") == 0) { *expected_dof = 1; *targetDM = user->da; }
1427 // else if (strcmp(particleFieldName, "SomeOtherVector") == 0) { *expected_dof = 3; *targetDM = user->someOtherDM; }
1428 else {
1429 // The provided field name doesn't match any known rules
1430 *targetDM = NULL; // Indicate failure
1431 *expected_dof = 0;
1432 // Format the error message manually
1433 PetscSNPrintf(msg, sizeof(msg), "Field name '%s' is not recognized for automatic DM selection.", particleFieldName);
1434 // Use SETERRQ with the formatted message and appropriate error code
1435 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); // Use WRONG argument error code
1436 }
1437
1439
1440 PetscFunctionReturn(0);
1441}

◆ ScatterParticleFieldToEulerField_Internal()

static PetscErrorCode ScatterParticleFieldToEulerField_Internal ( UserCtx *  user,
const char *  particleFieldName,
DM  targetDM,
PetscInt  expected_dof,
Vec  eulerFieldAverageVec 
)
static

Internal helper function to orchestrate the scatter operation (accumulate + normalize).

Manages the temporary sum vector and calls the accumulation and normalization functions. Assumes the caller has already determined the target DM and DOF. The explicit particle-field existence check is currently disabled (see the commented-out block in the listing below); a missing field is instead reported by the DMSwarmGetField call inside AccumulateParticleField.

Parameters
[in] user - Pointer to UserCtx containing swarm, ParticleCount, da.
[in] particleFieldName - Name of the field in the DMSwarm.
[in] targetDM - The DMDA where the final average and intermediate sum reside.
[in] expected_dof - The expected DOF (1 or 3) for the targetDM and field.
[in,out] eulerFieldAverageVec - The pre-created Vec associated with targetDM to store the result.
Returns
PetscErrorCode 0 on success. Errors if particle field doesn't exist or underlying helpers fail.

Definition at line 1713 of file interpolation.c.

1718{
1719 PetscErrorCode ierr;
1720 Vec sumVec = NULL;
1721 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
1722
1723 PetscFunctionBeginUser;
1724
1726
1727 if (!user || !user->swarm || !user->ParticleCount || !particleFieldName || !targetDM || !eulerFieldAverageVec)
1728 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "NULL input provided to ScatterParticleFieldToEulerField_Internal.");
1729
1730 // --- Check if Particle Field Exists ---
1731 // Attempt a GetField call; if it fails, the field doesn't exist.
1732 // We let CHKERRQ handle the error directly if the field doesn't exist OR
1733 // we catch it specifically to provide a more tailored message.
1734
1735 /*
1736 LOG_ALLOW(GLOBAL,LOG_DEBUG,"Field %s being accessed to check existence \n",particleFieldName);
1737 ierr = DMSwarmGetField(user->swarm, particleFieldName, NULL, NULL, NULL);
1738 if (ierr) { // If GetField returns an error
1739 PetscSNPrintf(msg, sizeof(msg), "Particle field '%s' not found in DMSwarm for scattering.", particleFieldName);
1740 // Directly set the error, overwriting the one from GetField
1741 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, msg);
1742 }
1743 ierr = DMSwarmRestoreField(user->swarm, particleFieldName, NULL, NULL, NULL);
1744 */
1745
1746 // --- Setup Temporary Sum Vector ---
1747 ierr = VecDuplicate(eulerFieldAverageVec, &sumVec); CHKERRQ(ierr);
1748 ierr = VecSet(sumVec, 0.0); CHKERRQ(ierr);
1749 ierr = PetscSNPrintf(msg, sizeof(msg), "TempSum_%s", particleFieldName); CHKERRQ(ierr);
1750 ierr = PetscObjectSetName((PetscObject)sumVec, msg); CHKERRQ(ierr);
1751
1752 // --- Accumulate ---
1753 // This will call DMSwarmGetField again. If it failed above, it will likely fail here too,
1754 // unless the error was cleared somehow between the check and here (unlikely).
1755 // If the check above was skipped (Option 1), this is where the error for non-existent
1756 // field will be caught by CHKERRQ.
1757 ierr = AccumulateParticleField(user->swarm, particleFieldName, targetDM, sumVec); CHKERRQ(ierr);
1758
1759 // Calculate the number of particles per cell.
1760 ierr = CalculateParticleCountPerCell(user);
1761 // --- Normalize ---
1762 ierr = NormalizeGridVectorByCount(user->da, user->ParticleCount, targetDM, sumVec, eulerFieldAverageVec); CHKERRQ(ierr);
1763
1764 // --- Cleanup ---
1765 ierr = VecDestroy(&sumVec); CHKERRQ(ierr);
1766
1767
1769
1770 PetscFunctionReturn(0);
1771}
Symbols referenced in this listing:
  NormalizeGridVectorByCount(DM countDM, Vec countVec, DM dataDM, Vec sumVec, Vec avgVec) - Normalizes a grid vector of sums by a grid vector of counts to produce an average.
  AccumulateParticleField(DM swarm, const char *particleFieldName, DM gridSumDM, Vec gridSumVec) - Accumulates a particle field (scalar or vector) into a target grid sum vector.
  CalculateParticleCountPerCell(UserCtx *user) - Counts particles in each cell of the DMDA 'da' and stores the result in user->ParticleCount.
  ParticleCount (Vec) - Definition: variables.h:729