PICurv 0.1.0
A Parallel Particle-In-Cell Solver for Curvilinear LES
Internal Scattering Helpers

Lower-level functions used by the main scattering routines. More...

Collaboration diagram for Internal Scattering Helpers:

Macros

#define __FUNCT__   "InterpolateFieldFromCornerToCenter_Vector"
 
#define __FUNCT__   "InterpolateFieldFromCornerToCenter_Scalar"
 
#define __FUNCT__   "TestCornerToCenterInterpolation"
 
#define __FUNCT__   "InterpolateFieldFromCenterToCorner_Vector"
 
#define __FUNCT__   "InterpolateFieldFromCenterToCorner_Scalar"
 
#define __FUNCT__   "InterpolateEulerFieldFromCenterToSwarm"
 
#define __FUNCT__   "InterpolateEulerFieldFromCornerToSwarm"
 
#define __FUNCT__   "InterpolateEulerFieldToSwarm"
 
#define __FUNCT__   "InterpolateAllFieldsToSwarm"
 
#define __FUNCT__   "GetScatterTargetInfo"
 
#define __FUNCT__   "GetPersistentLocalVector"
 
#define __FUNCT__   "AccumulateParticleField"
 
#define __FUNCT__   "NormalizeGridVectorByCount"
 
#define __FUNCT__   "ScatterParticleFieldToEulerField_Internal"
 
#define __FUNCT__   "ScatterParticleFieldToEulerField"
 
#define __FUNCT__   "ScatterAllParticleFieldsToEulerFields"
 
#define __FUNCT__   "InterpolateCornerToFaceCenter_Scalar"
 
#define __FUNCT__   "InterpolateCornerToFaceCenter_Vector"
 

Functions

PetscErrorCode AccumulateParticleField (DM swarm, const char *particleFieldName, DM gridSumDM, Vec gridSumVec)
 Accumulates a particle field (scalar or vector) into a target grid sum vector.
 
PetscErrorCode NormalizeGridVectorByCount (DM countDM, Vec countVec, DM dataDM, Vec sumVec, Vec avgVec)
 Normalizes a grid vector of sums by a grid vector of counts to produce an average.
 
PetscErrorCode GetScatterTargetInfo (UserCtx *user, const char *particleFieldName, DM *targetDM, PetscInt *expected_dof)
 Determines the target Eulerian DM and expected DOF for scattering a given particle field.
 
PetscErrorCode GetPersistentLocalVector (UserCtx *user, const char *fieldName, Vec *localVec)
 Retrieves the persistent ghosted local vector (e.g., lPsi, lUcat) for a given field name.
 

Detailed Description

Lower-level functions used by the main scattering routines.

Macro Definition Documentation

◆ __FUNCT__ [1/18]

#define __FUNCT__   "InterpolateFieldFromCornerToCenter_Vector"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [2/18]

#define __FUNCT__   "InterpolateFieldFromCornerToCenter_Scalar"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [3/18]

#define __FUNCT__   "TestCornerToCenterInterpolation"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [4/18]

#define __FUNCT__   "InterpolateFieldFromCenterToCorner_Vector"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [5/18]

#define __FUNCT__   "InterpolateFieldFromCenterToCorner_Scalar"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [6/18]

#define __FUNCT__   "InterpolateEulerFieldFromCenterToSwarm"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [7/18]

#define __FUNCT__   "InterpolateEulerFieldFromCornerToSwarm"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [8/18]

#define __FUNCT__   "InterpolateEulerFieldToSwarm"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [9/18]

#define __FUNCT__   "InterpolateAllFieldsToSwarm"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [10/18]

#define __FUNCT__   "GetScatterTargetInfo"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [11/18]

#define __FUNCT__   "GetPersistentLocalVector"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [12/18]

#define __FUNCT__   "AccumulateParticleField"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [13/18]

#define __FUNCT__   "NormalizeGridVectorByCount"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [14/18]

#define __FUNCT__   "ScatterParticleFieldToEulerField_Internal"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [15/18]

#define __FUNCT__   "ScatterParticleFieldToEulerField"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [16/18]

#define __FUNCT__   "ScatterAllParticleFieldsToEulerFields"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [17/18]

#define __FUNCT__   "InterpolateCornerToFaceCenter_Scalar"

Definition at line 20 of file interpolation.c.

◆ __FUNCT__ [18/18]

#define __FUNCT__   "InterpolateCornerToFaceCenter_Vector"

Definition at line 20 of file interpolation.c.

Function Documentation

◆ AccumulateParticleField()

PetscErrorCode AccumulateParticleField (DM swarm, const char *particleFieldName, DM gridSumDM, Vec gridSumVec)

Accumulates a particle field (scalar or vector) into a target grid sum vector.

This function iterates through the local particles, identifies each one's cell via the "DMSwarm_CellID" field, and adds the particle's field value (particleFieldName) to the corresponding cell location in gridSumVec. Both scalar (DOF=1) and vector (DOF=3) fields are handled automatically, based on the DOF of gridSumDM.

IMPORTANT: gridSumVec must be a ghosted *local* vector obtained from gridSumDM (the function verifies its size), and the caller must zero it before calling if a fresh sum is desired.

Parameters
[in]	swarm	The DMSwarm containing particles.
[in]	particleFieldName	Name of the field on the particles (its DOF must match gridSumDM's).
[in]	gridSumDM	The DMDA associated with gridSumVec. Its DOF determines how many components are accumulated.
[in,out]	gridSumVec	The ghosted local Vec (associated with gridSumDM) to accumulate sums into.
Returns
PetscErrorCode 0 on success. Errors if fields don't exist or the DMs are incompatible.

Definition at line 1579 of file interpolation.c.

{
 PetscErrorCode ierr;
 PetscInt dof;
 PetscInt nlocal, p;
 const PetscReal *particle_arr = NULL;
 const PetscInt *cell_id_arr = NULL;

 // DMDA accessors
 PetscScalar ***arr_1d = NULL; // For scalar fields (DOF=1)
 PetscScalar ****arr_3d = NULL; // For vector fields (DOF=3)

 // Ghosted dimension variables
 PetscInt gxs, gys, gzs, gxm, gym, gzm;
 PetscMPIInt rank;
 char msg[ERROR_MSG_BUFFER_SIZE];

 PetscFunctionBeginUser;
 PROFILE_FUNCTION_BEGIN;

 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);

 // --- 1. Validation & Setup ---
 if (!swarm || !gridSumDM || !gridSumVec)
 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Null input in AccumulateParticleField.");

 // Get DMDA information
 ierr = DMDAGetInfo(gridSumDM, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
 // Get ghosted corners (global indices of the ghosted patch start, and its dimensions)
 ierr = DMDAGetGhostCorners(gridSumDM, &gxs, &gys, &gzs, &gxm, &gym, &gzm); CHKERRQ(ierr);

 // --- 2. Verify Vector Type (Global vs Local) ---
 {
 PetscInt vecSize;
 PetscInt expectedLocalSize = gxm * gym * gzm * dof;
 ierr = VecGetSize(gridSumVec, &vecSize); CHKERRQ(ierr);

 if (vecSize != expectedLocalSize) {
 PetscSNPrintf(msg, sizeof(msg),
 "Vector dimension mismatch! Expected ghosted local vector size %d (gxm*gym*gzm*dof), got %d. "
 "Did you pass a global vector instead of a local vector?",
 expectedLocalSize, vecSize);
 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, "%s", msg);
 }
 }

 // --- 3. Acquire Particle Data ---
 ierr = DMSwarmGetLocalSize(swarm, &nlocal); CHKERRQ(ierr);
 // These calls fail cleanly if the field doesn't exist
 ierr = DMSwarmGetField(swarm, particleFieldName, NULL, NULL, (void **)&particle_arr); CHKERRQ(ierr);
 ierr = DMSwarmGetField(swarm, "DMSwarm_CellID", NULL, NULL, (void **)&cell_id_arr); CHKERRQ(ierr);

 // --- 4. Acquire Grid Accessors ---
 // DMDAVecGetArray* handles the mapping from global (i,j,k) to the underlying local array index
 if (dof == 1) {
 ierr = DMDAVecGetArray(gridSumDM, gridSumVec, &arr_1d); CHKERRQ(ierr);
 } else if (dof == 3) {
 ierr = DMDAVecGetArrayDOF(gridSumDM, gridSumVec, &arr_3d); CHKERRQ(ierr);
 } else {
 PetscSNPrintf(msg, sizeof(msg), "Unsupported DOF=%d. AccumulateParticleField supports DOF 1 or 3.", dof);
 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_SUP, "%s", msg);
 }

 // --- 5. Accumulation Loop ---
 // Iterate through all local particles
 for (p = 0; p < nlocal; ++p) {
 // Retrieve geometric grid index (0-based)
 PetscInt i_geom = cell_id_arr[p * 3 + 0];
 PetscInt j_geom = cell_id_arr[p * 3 + 1];
 PetscInt k_geom = cell_id_arr[p * 3 + 2];

 // Apply shift (+1) to match the memory layout (index 0 is boundary/ghost)
 PetscInt i = i_geom + 1;
 PetscInt j = j_geom + 1;
 PetscInt k = k_geom + 1;

 // Bounds check: ensure (i,j,k) falls within the local ghosted patch.
 // This allows writing to ghost slots, which are later reduced to the owner rank.
 if (i >= gxs && i < gxs + gxm &&
 j >= gys && j < gys + gym &&
 k >= gzs && k < gzs + gzm)
 {
 if (dof == 1) {
 arr_1d[k][j][i] += particle_arr[p];
 } else {
 // For DOF=3, unroll the loop for a slight optimization
 arr_3d[k][j][i][0] += particle_arr[p * 3 + 0];
 arr_3d[k][j][i][1] += particle_arr[p * 3 + 1];
 arr_3d[k][j][i][2] += particle_arr[p * 3 + 2];
 }
 }
 // Note: particles outside the ghost layer are skipped. This is expected behavior
 // if particles have not yet been migrated or localized correctly.
 }

 // --- 6. Restore Arrays and Fields ---
 if (dof == 1) {
 ierr = DMDAVecRestoreArray(gridSumDM, gridSumVec, &arr_1d); CHKERRQ(ierr);
 } else {
 ierr = DMDAVecRestoreArrayDOF(gridSumDM, gridSumVec, &arr_3d); CHKERRQ(ierr);
 }

 ierr = DMSwarmRestoreField(swarm, particleFieldName, NULL, NULL, (void **)&particle_arr); CHKERRQ(ierr);
 ierr = DMSwarmRestoreField(swarm, "DMSwarm_CellID", NULL, NULL, (void **)&cell_id_arr); CHKERRQ(ierr);

 PROFILE_FUNCTION_END;
 PetscFunctionReturn(0);
}

◆ NormalizeGridVectorByCount()

PetscErrorCode NormalizeGridVectorByCount (DM countDM, Vec countVec, DM dataDM, Vec sumVec, Vec avgVec)

Normalizes a grid vector of sums by a grid vector of counts to produce an average.

Calculates avgVec[i] = sumVec[i] / countVec[i] for each component of each OWNED cell where countVec[i] > 0. Sets avgVec[i] = 0 otherwise. Handles both scalar (DOF=1) and vector (DOF=3) data fields based on dataDM. Uses basic VecGetArray/VecGetArrayRead and manual index calculation.

Parameters
[in]	countDM	The DMDA associated with countVec (must have DOF=1).
[in]	countVec	The Vec containing particle counts per cell (read-only).
[in]	dataDM	The DMDA associated with sumVec and avgVec (must have DOF=1 or DOF=3).
[in]	sumVec	The Vec containing the accumulated sums per cell (read-only).
[in,out]	avgVec	The Vec where the calculated averages are stored (overwritten).
Returns
PetscErrorCode 0 on success.

Definition at line 1693 of file interpolation.c.

{
 PetscErrorCode ierr;
 PetscInt data_dof;
 PetscInt count_dof;
 PetscMPIInt rank;
 char msg[ERROR_MSG_BUFFER_SIZE];

 // Pointers for DMDA array accessors - declare specific types
 PetscScalar ***count_arr_3d = NULL; // For DOF=1 count vector (3D DMDA)
 PetscScalar ***sum_arr_scalar = NULL; // For DOF=1 sum vector (3D DMDA)
 PetscScalar ***avg_arr_scalar = NULL; // For DOF=1 avg vector (3D DMDA)
 PetscScalar ****sum_arr_vector = NULL; // For DOF=3 sum vector (3D DMDA + DOF)
 PetscScalar ****avg_arr_vector = NULL; // For DOF=3 avg vector (3D DMDA + DOF)

 PetscFunctionBeginUser;
 PROFILE_FUNCTION_BEGIN;

 ierr = MPI_Comm_rank(PETSC_COMM_WORLD, &rank); CHKERRQ(ierr);

 // --- Validation ---
 ierr = DMDAGetInfo(countDM, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &count_dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
 ierr = DMDAGetInfo(dataDM, NULL, NULL, NULL, NULL, NULL, NULL, NULL, &data_dof, NULL, NULL, NULL, NULL, NULL); CHKERRQ(ierr);
 if (count_dof != 1) { PetscSNPrintf(msg, sizeof(msg), "countDM must have DOF=1, got %d.", count_dof); SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, "%s", msg); }
 if (data_dof != 1 && data_dof != 3) { PetscSNPrintf(msg, sizeof(msg), "dataDM DOF must be 1 or 3, got %d.", data_dof); SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, "%s", msg); }

 // --- Get Array Access using appropriate DMDA accessors ---
 ierr = DMDAVecGetArrayRead(countDM, countVec, &count_arr_3d); CHKERRQ(ierr);

 if (data_dof == 1) {
 ierr = DMDAVecGetArrayRead(dataDM, sumVec, &sum_arr_scalar); CHKERRQ(ierr);
 ierr = DMDAVecGetArray(dataDM, avgVec, &avg_arr_scalar); CHKERRQ(ierr);
 } else { // data_dof == 3
 ierr = DMDAVecGetArrayDOFRead(dataDM, sumVec, &sum_arr_vector); CHKERRQ(ierr);
 ierr = DMDAVecGetArrayDOF(dataDM, avgVec, &avg_arr_vector); CHKERRQ(ierr);
 }

 // Get the corners (global start indices) and dimensions of the *locally owned* region
 PetscInt xs, ys, zs, xm, ym, zm;
 ierr = DMDAGetCorners(countDM, &xs, &ys, &zs, &xm, &ym, &zm); CHKERRQ(ierr);

 // --- Normalize Over Owned Cells ---
 LOG_ALLOW(LOCAL, LOG_DEBUG, "(Rank %d): Normalizing DOF=%d data over owned range [%d:%d, %d:%d, %d:%d].\n",
 rank, data_dof, xs, xs+xm, ys, ys+ym, zs, zs+zm);

 // Loop using GLOBAL indices (i, j, k) over the range owned by this process
 for (PetscInt k = zs; k < zs + zm; ++k) {
 for (PetscInt j = ys; j < ys + ym; ++j) {
 for (PetscInt i = xs; i < xs + xm; ++i) {

 // Access the count using standard 3D indexing
 PetscScalar count = count_arr_3d[k][j][i];

 if (PetscRealPart(count) > 0.5) { // Tolerance for floating-point comparison
 if (data_dof == 1) {
 // Access scalar sum/avg using standard 3D indexing
 avg_arr_scalar[k][j][i] = sum_arr_scalar[k][j][i] / count;
 } else { // data_dof == 3
 // Access vector components using DOF indexing on the last dimension
 for (PetscInt c = 0; c < data_dof; ++c) {
 avg_arr_vector[k][j][i][c] = sum_arr_vector[k][j][i][c] / count;
 }
 }
 } else { // count is zero: set the average to zero
 if (data_dof == 1) {
 avg_arr_scalar[k][j][i] = 0.0;
 } else { // data_dof == 3
 for (PetscInt c = 0; c < data_dof; ++c) {
 avg_arr_vector[k][j][i][c] = 0.0;
 }
 }
 } // end if count > 0.5
 } // end i loop
 } // end j loop
 } // end k loop

 // --- Restore Arrays using appropriate functions ---
 ierr = DMDAVecRestoreArrayRead(countDM, countVec, &count_arr_3d); CHKERRQ(ierr);
 if (data_dof == 1) {
 ierr = DMDAVecRestoreArrayRead(dataDM, sumVec, &sum_arr_scalar); CHKERRQ(ierr);
 ierr = DMDAVecRestoreArray(dataDM, avgVec, &avg_arr_scalar); CHKERRQ(ierr);
 } else { // data_dof == 3
 ierr = DMDAVecRestoreArrayDOFRead(dataDM, sumVec, &sum_arr_vector); CHKERRQ(ierr);
 ierr = DMDAVecRestoreArrayDOF(dataDM, avgVec, &avg_arr_vector); CHKERRQ(ierr);
 }

 // --- Assemble Final Average Vector ---
 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Assembling final average vector (DOF=%d).\n", data_dof);
 ierr = VecAssemblyBegin(avgVec); CHKERRQ(ierr);
 ierr = VecAssemblyEnd(avgVec); CHKERRQ(ierr);

 PROFILE_FUNCTION_END;
 PetscFunctionReturn(0);
}

◆ GetScatterTargetInfo()

PetscErrorCode GetScatterTargetInfo (UserCtx *user, const char *particleFieldName, DM *targetDM, PetscInt *expected_dof)

Determines the target Eulerian DM and expected DOF for scattering a given particle field.

Based on hardcoded rules mapping particle field names to user context DMs (da/fda). This function encapsulates the policy of where different fields should be scattered.

Parameters
[in]	user	Pointer to the UserCtx containing da and fda.
[in]	particleFieldName	Name of the particle field (e.g., "Psi", "Ucat").
[out]	targetDM	Pointer to store the determined target DM (da or fda).
[out]	expected_dof	Pointer to store the expected DOF (1 or 3) for this field.
Returns
PetscErrorCode Returns 0 on success, PETSC_ERR_ARG_WRONG if the field name is not recognized, or PETSC_ERR_ARG_NULL for NULL inputs.

Definition at line 1486 of file interpolation.c.

{
 char msg[ERROR_MSG_BUFFER_SIZE]; // Buffer for formatted error messages
 PetscFunctionBeginUser;
 PROFILE_FUNCTION_BEGIN;

 // --- Input Validation ---
 // Check for NULL pointers in essential inputs
 if (!user) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx pointer is NULL.");
 if (!user->da) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->da is NULL.");
 if (!user->fda) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "UserCtx->fda is NULL."); // Needed for vector fields
 if (!particleFieldName) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "particleFieldName is NULL.");
 if (!targetDM) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output targetDM pointer is NULL.");
 if (!expected_dof) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Output expected_dof pointer is NULL.");

 // --- Determine Target DM and DOF based on Field Name ---
 // Known scalar fields target 'da'
 if (strcmp(particleFieldName, "Psi") == 0 || strcmp(particleFieldName, "Nvert") == 0) {
 *expected_dof = 1; // Scalar fields have DOF 1
 *targetDM = user->da; // Target the primary scalar DMDA
 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Field '%s' targets DM 'da' (DOF=1).\n", particleFieldName);
 }
 // Known vector fields target 'fda'
 else if (strcmp(particleFieldName, "Ucat") == 0 || strcmp(particleFieldName, "Ucont") == 0) {
 *expected_dof = 3; // Vector fields have DOF 3
 *targetDM = user->fda; // Target the vector DMDA (often node-based)
 LOG_ALLOW(GLOBAL, LOG_DEBUG, "Field '%s' targets DM 'fda' (DOF=3).\n", particleFieldName);
 }
 // --- Add rules for other fields here ---
 // else if (strcmp(particleFieldName, "SomeOtherScalar") == 0) { *expected_dof = 1; *targetDM = user->da; }
 // else if (strcmp(particleFieldName, "SomeOtherVector") == 0) { *expected_dof = 3; *targetDM = user->someOtherDM; }
 else {
 // The provided field name doesn't match any known rule
 *targetDM = NULL; // Indicate failure
 *expected_dof = 0;
 // Format the error message manually
 PetscSNPrintf(msg, sizeof(msg), "Field name '%s' is not recognized for automatic DM selection.", particleFieldName);
 // Use SETERRQ with the formatted message and the WRONG-argument error code
 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, "%s", msg);
 }

 PROFILE_FUNCTION_END;
 PetscFunctionReturn(0);
}

◆ GetPersistentLocalVector()

PetscErrorCode GetPersistentLocalVector (UserCtx *user, const char *fieldName, Vec *localVec)

Retrieves the persistent ghosted local vector (e.g., lPsi, lUcat) for a given field name.

Local to this translation unit.

Definition at line 1540 of file interpolation.c.

{
 PetscFunctionBeginUser;

 if (!user || !fieldName || !localVec)
 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_NULL, "Null input in GetPersistentLocalVector");

 if (strcmp(fieldName, "Psi") == 0) {
 if (!user->lPsi) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, "user->lPsi is not initialized");
 *localVec = user->lPsi;
 }
 else if (strcmp(fieldName, "Ucat") == 0) {
 if (!user->lUcat) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, "user->lUcat is not initialized");
 *localVec = user->lUcat;
 }
 else if (strcmp(fieldName, "Ucont") == 0) {
 if (!user->lUcont) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, "user->lUcont is not initialized");
 *localVec = user->lUcont;
 }
 else if (strcmp(fieldName, "Nvert") == 0) {
 if (!user->lNvert) SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONGSTATE, "user->lNvert is not initialized");
 *localVec = user->lNvert;
 }
 else {
 char msg[ERROR_MSG_BUFFER_SIZE];
 PetscSNPrintf(msg, sizeof(msg), "Field '%s' does not have a mapped persistent local vector.", fieldName);
 SETERRQ(PETSC_COMM_SELF, PETSC_ERR_ARG_WRONG, "%s", msg);
 }

 PetscFunctionReturn(0);
}