PICurv 0.1.0
A Parallel Particle-In-Cell Solver for Curvilinear LES
Internal Scattering Helpers

Lower-level functions used by the main scattering routines.


Functions

PetscErrorCode AccumulateParticleField (DM swarm, const char *particleFieldName, DM gridSumDM, Vec gridSumVec)
 Accumulates a particle field (scalar or vector) into a target grid sum vector.
 
PetscErrorCode NormalizeGridVectorByCount (DM countDM, Vec countVec, DM dataDM, Vec sumVec, Vec avgVec)
 Normalizes a grid vector of sums by a grid vector of counts to produce an average.
 
PetscErrorCode GetScatterTargetInfo (UserCtx *user, const char *particleFieldName, DM *targetDM, PetscInt *expected_dof)
 Determines the target Eulerian DM and expected DOF for scattering a given particle field.
 
static PetscErrorCode ScatterParticleFieldToEulerField_Internal (UserCtx *user, const char *particleFieldName, DM targetDM, PetscInt expected_dof, Vec eulerFieldAverageVec)
 Internal helper function to orchestrate the scatter operation (accumulate + normalize).
 

Detailed Description

Lower-level functions used by the main scattering routines.

Function Documentation

◆ AccumulateParticleField()

PetscErrorCode AccumulateParticleField ( DM  swarm,
const char *  particleFieldName,
DM  gridSumDM,
Vec  gridSumVec 
)

Accumulates a particle field (scalar or vector) into a target grid sum vector.

This function iterates through local particles, identifies their cell using the "DMSwarm_CellID" field, and adds the particle's field value (particleFieldName) to the corresponding cell location in the gridSumVec. It handles both scalar (DOF=1) and vector (DOF=3) fields automatically based on the DOF of gridSumDM.

IMPORTANT: The caller must ensure gridSumVec is zeroed before calling this function if a fresh sum calculation is desired.

Parameters
[in]     swarm              The DMSwarm containing particles.
[in]     particleFieldName  Name of the field on the particles (must match the DOF of gridSumDM).
[in]     gridSumDM          The DMDA associated with gridSumVec. Its DOF determines how many components are accumulated.
[in,out] gridSumVec         The Vec (associated with gridSumDM) to accumulate sums into.
Returns
PetscErrorCode 0 on success. Errors if fields don't exist, the DMs are incompatible, or memory access fails.
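Example (a minimal caller sketch, assuming a UserCtx 'user' with 'swarm' and 'fda' members as used elsewhere in this module, and an existing average vector 'ucatAvgVec' created on user->fda; these variable names are illustrative, not part of the API):

    Vec            sumVec;
    PetscErrorCode ierr;

    ierr = VecDuplicate(ucatAvgVec, &sumVec); CHKERRQ(ierr);  /* same layout as the target grid vector */
    ierr = VecSet(sumVec, 0.0); CHKERRQ(ierr);                /* required: start from a zeroed sum     */
    ierr = AccumulateParticleField(user->swarm, "Ucat", user->fda, sumVec); CHKERRQ(ierr);
    /* sumVec now holds per-cell sums of "Ucat"; normalize with NormalizeGridVectorByCount(),
       then destroy sumVec when done. */
    ierr = VecDestroy(&sumVec); CHKERRQ(ierr);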


Definition at line 1849 of file interpolation.c.

1851{
1852 PetscErrorCode ierr;
1853 PetscInt dof; // DOF determined from gridSumDM
1854 PetscInt nlocal, p; // Local particle count and loop index
1855 const PetscReal *particle_arr = NULL; // Pointer to particle field data array (assuming Real)
1856 const PetscInt *cell_id_arr = NULL; // Pointer to particle cell ID array ("DMSwarm_CellID", Int)
1857 PetscScalar *sum_arr_ptr = NULL; // Pointer to grid sum vector data array (Scalar)
1858 PetscInt gxs, gys, gzs; // Start indices of local ghosted patch (often 0)
1859 PetscInt gxm, gym, gzm; // Dimensions of local ghosted patch (including ghosts)
1860 PetscMPIInt rank; // MPI rank for logging
1861 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
1862
1863 PetscFunctionBeginUser;
1864 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
1865
1866 // --- Get DOF from the target grid DM ---
1867 ierr = DMDAGetInfo(gridSumDM, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
1868 // Validate that the DOF is supported (currently 1 or 3)
1869 if (dof != 1 && dof != 3) {
1870 PetscSNPrintf(msg, sizeof(msg), "AccumulateParticleField: gridSumDM DOF must be 1 or 3, got %d.", dof);
1871 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg);
1872 }
1873
1874 // --- Get Particle Data Arrays ---
1875 // DMSwarmGetField will return an error if the field doesn't exist, caught by CHKERRQ.
1876 ierr = DMSwarmGetField(swarm, particleFieldName, NULL, NULL, (void **)&particle_arr); CHKERRQ(ierr);
1877 ierr = DMSwarmGetField(swarm, "DMSwarm_CellID", NULL, NULL, (void **)&cell_id_arr); CHKERRQ(ierr);
1878
1879 // Get number of local particles *after* successfully getting field pointers
1880 ierr = DMSwarmGetLocalSize(swarm, &nlocal); CHKERRQ(ierr);
1881
1882 // --- Get Grid Sum Vector Array & Dimensions ---
1883 ierr = VecGetArray(gridSumVec, &sum_arr_ptr); CHKERRQ(ierr);
1884 // Get dimensions needed for calculating flat index within the local ghosted array
1885 ierr = DMDAGetGhostCorners(gridSumDM, &gxs, &gys, &gzs, &gxm, &gym, &gzm); CHKERRQ(ierr);
1886
1887 // --- Accumulate Locally ---
1888 LOG_ALLOW(LOCAL, LOG_DEBUG, "AccumulateParticleField (Rank %d): Accumulating '%s' (DOF=%d) from %d particles using CellID field 'DMSwarm_CellID'.\n", rank, particleFieldName, dof, nlocal);
1889 // Loop over all particles owned by this process
1890 for (p = 0; p < nlocal; ++p) {
1891 // Extract local cell indices (relative to start of ghosted patch, [0..gxm-1], etc.)
1892 // Assumes DMSwarm_CellID stores (i, j, k) contiguously for each particle.
1893 PetscInt pidx = cell_id_arr[p * 3 + 0]; // Local i-index
1894 PetscInt pidy = cell_id_arr[p * 3 + 1]; // Local j-index
1895 PetscInt pidz = cell_id_arr[p * 3 + 2]; // Local k-index
1896
1897 // Bounds Check: Ensure the particle's cell index is within the valid local ghosted region
1898 if (pidx >= 0 && pidx < gxm && pidy >= 0 && pidy < gym && pidz >= 0 && pidz < gzm)
1899 {
1900 // Calculate the flat 1D index for this cell within the linear ghosted array
1901 // Uses PETSc's standard C-style row-major ordering (k slowest, j middle, i fastest),
1902 // i.e. cell_flat_idx = (pidz*gym + pidy)*gxm + pidx within the ghosted patch.
1903 PetscInt cell_flat_idx = (pidz * gym + pidy) * gxm + pidx;
1904
1905 // Calculate the base index for this particle's data in particle_arr
1906 PetscInt particle_base_idx = p * dof;
1907 // Calculate the base index for this cell's data in sum_arr_ptr
1908 PetscInt grid_base_idx = cell_flat_idx * dof;
1909
1910 // Add particle components to the grid sum vector components
1911 for (PetscInt c = 0; c < dof; ++c) {
1912 sum_arr_ptr[grid_base_idx + c] += particle_arr[particle_base_idx + c];
1913 }
1914 } else {
1915 // Log a warning if a particle's CellID is outside the expected local region.
1916 // This might indicate particles needing migration or boundary issues.
1917 LOG_ALLOW(LOCAL, LOG_WARNING, "AccumulateParticleField (Rank %d): Particle %d (field '%s') has out-of-bounds CellID (%d, %d, %d). Ghosted dims: %dx%dx%d. Skipping.\n",
1918 rank, p, particleFieldName, pidx, pidy, pidz, gxm, gym, gzm);
1919 }
1920 } // End of particle loop
1921
1922 // --- Restore Access to Arrays ---
1923 ierr = VecRestoreArray(gridSumVec, &sum_arr_ptr); CHKERRQ(ierr);
1924 ierr = DMSwarmRestoreField(swarm, particleFieldName, NULL, NULL, (void **)&particle_arr); CHKERRQ(ierr);
1925 ierr = DMSwarmRestoreField(swarm, "DMSwarm_CellID", NULL, NULL, (void **)&cell_id_arr); CHKERRQ(ierr);
1926
1927 // --- Assemble Global Sum Vector ---
1928 // Crucial for parallel execution: sums contributions for cells shared across process boundaries.
1929 LOG_ALLOW(GLOBAL, LOG_DEBUG, "AccumulateParticleField: Assembling global sum vector for '%s'.\n", particleFieldName);
1930 ierr = VecAssemblyBegin(gridSumVec); CHKERRQ(ierr);
1931 ierr = VecAssemblyEnd(gridSumVec); CHKERRQ(ierr);
1932
1933 PetscFunctionReturn(0);
1934}
References: ERROR_MSG_BUFFER_SIZE; LOCAL (logging scope definitions for controlling message output, logging.h:44); GLOBAL (scope for global logging across all processes, logging.h:45); LOG_ALLOW(scope, level, fmt, ...) (logging macro that checks both the log level and whether the calling function is in the allowed-funct..., logging.h:207); LOG_WARNING (non-critical issues that warrant attention, logging.h:30); LOG_DEBUG (detailed debugging information, logging.h:33).

◆ NormalizeGridVectorByCount()

PetscErrorCode NormalizeGridVectorByCount ( DM  countDM,
Vec  countVec,
DM  dataDM,
Vec  sumVec,
Vec  avgVec 
)

Normalizes a grid vector of sums by a grid vector of counts to produce an average.

Calculates avgVec[i] = sumVec[i] / countVec[i] for each component of each cell owned by the current process where countVec[i] > 0, and sets avgVec[i] = 0 otherwise. Handles both scalar (DOF=1) and vector (DOF=3) data fields based on the DOF of dataDM. Uses the DMDA multi-dimensional array accessors (DMDAVecGetArray and variants) for safe and convenient indexing.

Parameters
[in]     countDM   The DMDA associated with countVec (must have DOF=1).
[in]     countVec  The Vec containing particle counts per cell (read-only).
[in]     dataDM    The DMDA associated with sumVec and avgVec (must have DOF=1 or DOF=3).
[in]     sumVec    The Vec containing the accumulated sums per cell (read-only).
[in,out] avgVec    The Vec where the calculated averages will be stored (overwritten). Must be associated with dataDM.
Returns
PetscErrorCode 0 on success. Errors on incompatible DMs or memory access failure.

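For example, a cell containing 4 particles with vector sum (8, 4, 0) receives the average (2, 1, 0), while a cell with a zero count receives (0, 0, 0). A minimal sketch of the call, assuming user->ParticleCount already holds per-cell counts on user->da and 'sumVec' holds DOF=3 sums on user->fda (variable names illustrative):

    Vec            avgVec;
    PetscErrorCode ierr;

    ierr = VecDuplicate(sumVec, &avgVec); CHKERRQ(ierr);   /* average lives on the same DM as the sums */
    ierr = NormalizeGridVectorByCount(user->da, user->ParticleCount,
                                      user->fda, sumVec, avgVec); CHKERRQ(ierr);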

Definition at line 1954 of file interpolation.c.

1956{
1957 PetscErrorCode ierr;
1958 PetscInt data_dof;
1959 PetscInt count_dof;
1960 PetscMPIInt rank;
1961 char msg[ERROR_MSG_BUFFER_SIZE];
1962
1963 // Pointers for DMDA array accessors - declare specific types
1964 PetscScalar ***count_arr_3d = NULL; // For DOF=1 count vector (3D DMDA)
1965 PetscScalar ***sum_arr_scalar = NULL; // For DOF=1 sum vector (3D DMDA)
1966 PetscScalar ***avg_arr_scalar = NULL; // For DOF=1 avg vector (3D DMDA)
1967 PetscScalar ****sum_arr_vector = NULL; // For DOF=3 sum vector (3D DMDA + DOF, 4D access)
1968 PetscScalar ****avg_arr_vector = NULL; // For DOF=3 avg vector (3D DMDA + DOF, 4D access)
1969
1970
1971 PetscFunctionBeginUser;
1972 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);
1973
1974 // --- Validation ---
1975 ierr = DMDAGetInfo(countDM, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &count_dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
1976 ierr = DMDAGetInfo(dataDM, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &data_dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
1977 if (count_dof != 1) { PetscSNPrintf(msg, sizeof(msg), "countDM must have DOF=1, got %d.", count_dof); SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); }
1978 if (data_dof != 1 && data_dof != 3) { PetscSNPrintf(msg, sizeof(msg), "dataDM DOF must be 1 or 3, got %d.", data_dof); SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); }
1979
1980 // --- Get Array Access using appropriate DMDA accessors ---
1981 ierr = DMDAVecGetArrayRead(countDM, countVec, &count_arr_3d); CHKERRQ(ierr);
1982
1983 if (data_dof == 1) {
1984 ierr = DMDAVecGetArrayRead(dataDM, sumVec, &sum_arr_scalar); CHKERRQ(ierr);
1985 ierr = DMDAVecGetArray(dataDM, avgVec, &avg_arr_scalar); CHKERRQ(ierr);
1986 } else { // data_dof == 3
1987 ierr = DMDAVecGetArrayDOFRead(dataDM, sumVec, &sum_arr_vector); CHKERRQ(ierr);
1988 ierr = DMDAVecGetArrayDOF(dataDM, avgVec, &avg_arr_vector); CHKERRQ(ierr);
1989 }
1990
1991 // Get the corners (global start indices) and dimensions of the *local owned* region
1992 PetscInt xs, ys, zs, xm, ym, zm;
1993 ierr = DMDAGetCorners(countDM, &xs, &ys, &zs, &xm, &ym, &zm); CHKERRQ(ierr);
1994
1995 // --- Normalize Over Owned Cells ---
1996 LOG_ALLOW(LOCAL, LOG_DEBUG, "NormalizeGridVectorByCount (Rank %d): Normalizing DOF=%d data over owned range [%d:%d, %d:%d, %d:%d].\n",
1997 rank, data_dof, xs, xs+xm, ys, ys+ym, zs, zs+zm);
1998
1999 // Loop using GLOBAL indices (i, j, k) over the range owned by this process
2000 for (PetscInt k = zs; k < zs + zm; ++k) {
2001 for (PetscInt j = ys; j < ys + ym; ++j) {
2002 for (PetscInt i = xs; i < xs + xm; ++i) {
2003
2004 // Access the count using standard 3D indexing
2005 PetscScalar count = count_arr_3d[k][j][i];
2006
2007 if (PetscRealPart(count) > 0.5) { // Use tolerance for float comparison
2008 if (data_dof == 1) {
2009 // Access scalar sum/avg using standard 3D indexing
2010 avg_arr_scalar[k][j][i] = sum_arr_scalar[k][j][i] / count;
2011 } else { // data_dof == 3
2012 // Access vector components using DOF indexing on the last dimension
2013 for (PetscInt c = 0; c < data_dof; ++c) {
2014 avg_arr_vector[k][j][i][c] = sum_arr_vector[k][j][i][c] / count;
2015 }
2016 }
2017 } else { // count is zero or negative
2018 // Set average to zero
2019 if (data_dof == 1) {
2020 avg_arr_scalar[k][j][i] = 0.0;
2021 } else { // data_dof == 3
2022 for (PetscInt c = 0; c < data_dof; ++c) {
2023 avg_arr_vector[k][j][i][c] = 0.0;
2024 }
2025 }
2026 } // end if count > 0.5
2027 } // end i loop
2028 } // end j loop
2029 } // end k loop
2030
2031 // --- Restore Arrays using appropriate functions ---
2032 ierr = DMDAVecRestoreArrayRead(countDM, countVec, &count_arr_3d); CHKERRQ(ierr);
2033 if (data_dof == 1) {
2034 ierr = DMDAVecRestoreArrayRead(dataDM, sumVec, &sum_arr_scalar); CHKERRQ(ierr);
2035 ierr = DMDAVecRestoreArray(dataDM, avgVec, &avg_arr_scalar); CHKERRQ(ierr);
2036 } else { // data_dof == 3
2037 ierr = DMDAVecRestoreArrayDOFRead(dataDM, sumVec, &sum_arr_vector); CHKERRQ(ierr);
2038 ierr = DMDAVecRestoreArrayDOF(dataDM, avgVec, &avg_arr_vector); CHKERRQ(ierr);
2039 }
2040
2041 // --- Assemble Final Average Vector ---
2042 LOG_ALLOW(GLOBAL, LOG_DEBUG, "NormalizeGridVectorByCount: Assembling final average vector (DOF=%d).\n", data_dof);
2043 ierr = VecAssemblyBegin(avgVec); CHKERRQ(ierr);
2044 ierr = VecAssemblyEnd(avgVec); CHKERRQ(ierr);
2045
2046 PetscFunctionReturn(0);
2047}

◆ GetScatterTargetInfo()

PetscErrorCode GetScatterTargetInfo ( UserCtx *  user,
const char *  particleFieldName,
DM *  targetDM,
PetscInt *  expected_dof 
)

Determines the target Eulerian DM and expected DOF for scattering a given particle field.

Based on hardcoded rules mapping particle field names ("Psi", "Nvert", "Ucat", "Ucont") to user-context DMs (user->da or user->fda). This function encapsulates the policy of where different fields are scattered; modify it to add rules for custom fields.

Parameters
[in]  user               Pointer to the UserCtx containing the required DMs (da, fda).
[in]  particleFieldName  Name of the particle field.
[out] targetDM           Pointer to store the determined target DM (user->da or user->fda).
[out] expected_dof       Pointer to store the expected DOF (1 or 3) for this field.
Returns
PetscErrorCode 0 on success. Error codes:
  • PETSC_ERR_ARG_NULL if required inputs are NULL.
  • PETSC_ERR_ARG_WRONG if particleFieldName is not recognized.
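A short usage sketch (the surrounding variables are illustrative):

    DM             targetDM = NULL;
    PetscInt       dof      = 0;
    PetscErrorCode ierr;

    ierr = GetScatterTargetInfo(user, "Ucat", &targetDM, &dof); CHKERRQ(ierr);
    /* Here targetDM == user->fda and dof == 3; an unrecognized field name would
       instead raise PETSC_ERR_ARG_WRONG. */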

Definition at line 1784 of file interpolation.c.

1786{
1787 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
1788 PetscFunctionBeginUser;
1789
1790 // --- Input Validation ---
1791 // Check for NULL pointers in essential inputs
1792 if (!user) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx pointer is NULL.");
1793 if (!user->da) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->da is NULL.");
1794 if (!user->fda) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->fda is NULL."); // Needed for vector fields
1795 if (!particleFieldName) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "particleFieldName is NULL.");
1796 if (!targetDM) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output targetDM pointer is NULL.");
1797 if (!expected_dof) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output expected_dof pointer is NULL.");
1798
1799 // --- Determine Target DM and DOF based on Field Name ---
1800 // Compare the input field name with known scalar fields targeting 'da'
1801 if (strcmp(particleFieldName, "Psi") == 0 || strcmp(particleFieldName, "Nvert") == 0) {
1802 *expected_dof = 1; // Scalar fields have DOF 1
1803 *targetDM = user->da; // Target the primary scalar DMDA
1804 LOG_ALLOW(GLOBAL, LOG_DEBUG, "GetScatterTargetInfo: Field '%s' targets DM 'da' (DOF=1).\n", particleFieldName);
1805 }
1806 // Compare with known vector fields targeting 'fda'
1807 else if (strcmp(particleFieldName, "Ucat") == 0 || strcmp(particleFieldName, "Ucont") == 0) {
1808 *expected_dof = 3; // Vector fields have DOF 3
1809 *targetDM = user->fda; // Target the vector DMDA (often node-based)
1810 LOG_ALLOW(GLOBAL, LOG_DEBUG, "GetScatterTargetInfo: Field '%s' targets DM 'fda' (DOF=3).\n", particleFieldName);
1811 }
1812 // --- Add rules for other fields here ---
1813 // else if (strcmp(particleFieldName, "SomeOtherScalar") == 0) { *expected_dof = 1; *targetDM = user->da; }
1814 // else if (strcmp(particleFieldName, "SomeOtherVector") == 0) { *expected_dof = 3; *targetDM = user->someOtherDM; }
1815 else {
1816 // The provided field name doesn't match any known rules
1817 *targetDM = NULL; // Indicate failure
1818 *expected_dof = 0;
1819 // Format the error message manually
1820 PetscSNPrintf(msg, sizeof(msg), "GetScatterTargetInfo: Field name '%s' is not recognized for automatic DM selection.", particleFieldName);
1821 // Use SETERRQ with the formatted message and appropriate error code
1822 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, msg); // Use WRONG argument error code
1823 }
1824
1825 PetscFunctionReturn(0);
1826}

◆ ScatterParticleFieldToEulerField_Internal()

static PetscErrorCode ScatterParticleFieldToEulerField_Internal ( UserCtx *  user,
const char *  particleFieldName,
DM  targetDM,
PetscInt  expected_dof,
Vec  eulerFieldAverageVec 
)
static

Internal helper function to orchestrate the scatter operation (accumulate + normalize).

Manages the temporary sum vector and calls the accumulation and normalization functions. Assumes the caller has already determined the target DM and DOF. The explicit particle-field existence check is currently commented out; a missing field is instead reported by the DMSwarmGetField call inside AccumulateParticleField and caught by CHKERRQ. NOTE: if the explicit check is re-enabled, its SETERRQ overwrites the original error from DMSwarmGetField.

Parameters
[in]     user                  Pointer to the UserCtx containing swarm, ParticleCount, and da.
[in]     particleFieldName     Name of the field in the DMSwarm.
[in]     targetDM              The DMDA where the final average and intermediate sum reside.
[in]     expected_dof          The expected DOF (1 or 3) for the targetDM and field.
[in,out] eulerFieldAverageVec  The pre-created Vec associated with targetDM to store the result.
Returns
PetscErrorCode 0 on success. Errors if the particle field doesn't exist or underlying helpers fail.
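For orientation, a public wrapper could combine GetScatterTargetInfo() with this helper roughly as follows (a hedged sketch; the wrapper name and the assumption that avgVec was created on the target DM are illustrative, not necessarily the actual public API):

    PetscErrorCode ScatterParticleFieldToEulerField(UserCtx *user, const char *fieldName, Vec avgVec)
    {
      DM             targetDM = NULL;
      PetscInt       dof      = 0;
      PetscErrorCode ierr;

      PetscFunctionBeginUser;
      ierr = GetScatterTargetInfo(user, fieldName, &targetDM, &dof); CHKERRQ(ierr);
      /* avgVec must have been created by the caller on targetDM. */
      ierr = ScatterParticleFieldToEulerField_Internal(user, fieldName, targetDM, dof, avgVec); CHKERRQ(ierr);
      PetscFunctionReturn(0);
    }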

Definition at line 2074 of file interpolation.c.

2079{
2080 PetscErrorCode ierr;
2081 Vec sumVec = NULL;
2082 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
2083
2084 PetscFunctionBeginUser;
2085
2086 if (!user || !user->swarm || !user->ParticleCount || !particleFieldName || !targetDM || !eulerFieldAverageVec)
2087 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "NULL input provided to ScatterParticleFieldToEulerField_Internal.");
2088
2089 // --- Check if Particle Field Exists ---
2090 // Attempt a GetField call; if it fails, the field doesn't exist.
2091 // We let CHKERRQ handle the error directly if the field doesn't exist OR
2092 // we catch it specifically to provide a more tailored message.
2093
2094 /*
2095 LOG_ALLOW(GLOBAL,LOG_DEBUG,"Field %s being accessed to check existence \n",particleFieldName);
2096 ierr = DMSwarmGetField(user->swarm, particleFieldName, NULL, NULL, NULL);
2097 if (ierr) { // If GetField returns an error
2098 PetscSNPrintf(msg, sizeof(msg), "Particle field '%s' not found in DMSwarm for scattering.", particleFieldName);
2099 // Directly set the error, overwriting the one from GetField
2100 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, msg);
2101 }
2102 ierr = DMSwarmRestoreField(user->swarm, particleFieldName, NULL, NULL, NULL);
2103 */
2104
2105 // --- Setup Temporary Sum Vector ---
2106 ierr = VecDuplicate(eulerFieldAverageVec, &sumVec); CHKERRQ(ierr);
2107 ierr = VecSet(sumVec, 0.0); CHKERRQ(ierr);
2108 ierr = PetscSNPrintf(msg, sizeof(msg), "TempSum_%s", particleFieldName); CHKERRQ(ierr);
2109 ierr = PetscObjectSetName((PetscObject)sumVec, msg); CHKERRQ(ierr);
2110
2111 // --- Accumulate ---
2112 // This will call DMSwarmGetField again. If it failed above, it will likely fail here too,
2113 // unless the error was cleared somehow between the check and here (unlikely).
2114 // If the check above was skipped (Option 1), this is where the error for non-existent
2115 // field will be caught by CHKERRQ.
2116 ierr = AccumulateParticleField(user->swarm, particleFieldName, targetDM, sumVec); CHKERRQ(ierr);
2117
2118 // Calculate the number of particles per cell.
2119 ierr = CalculateParticleCountPerCell(user); CHKERRQ(ierr);
2120 // --- Normalize ---
2121 ierr = NormalizeGridVectorByCount(user->da, user->ParticleCount, targetDM, sumVec, eulerFieldAverageVec); CHKERRQ(ierr);
2122
2123 // --- Cleanup ---
2124 ierr = VecDestroy(&sumVec); CHKERRQ(ierr);
2125
2126 PetscFunctionReturn(0);
2127}
References: NormalizeGridVectorByCount() (normalizes a grid vector of sums by a grid vector of counts to produce an average); AccumulateParticleField() (accumulates a particle field, scalar or vector, into a target grid sum vector); CalculateParticleCountPerCell() (counts particles in each cell of the DMDA 'da' and stores the result in user->ParticleCount); ParticleCount (Vec, variables.h:697).