Numerical Solvers
The RatelSolverType determines how the composite CeedOperator objects are built and used to set the appropriate DMSNES or DMTS options.
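The sketch below is illustrative only: the enum, context, and callback names are hypothetical stand-ins, not Ratel's API. It shows the underlying PETSc pattern this sentence refers to, where a quasistatic solver registers residual/Jacobian callbacks through DMSNES while a dynamic solver registers an implicit function and Jacobian through DMTS.

```c
/* Sketch only (assumed names): routing callbacks to DMSNES or DMTS depending on a
 * solver-type flag. None of these identifiers are Ratel's. */
#include <petsc.h>

typedef enum {SKETCH_SOLVER_STATIC, SKETCH_SOLVER_DYNAMIC} SketchSolverType; /* hypothetical */

/* trivial stub callbacks so the sketch compiles; real callbacks would apply the
   composite CeedOperators for residual and Jacobian evaluation */
static PetscErrorCode SketchResidual(SNES snes, Vec U, Vec F, void *ctx) { return PETSC_SUCCESS; }
static PetscErrorCode SketchJacobian(SNES snes, Vec U, Mat J, Mat Jpre, void *ctx) { return PETSC_SUCCESS; }
static PetscErrorCode SketchIFunction(TS ts, PetscReal t, Vec U, Vec U_t, Vec F, void *ctx) { return PETSC_SUCCESS; }
static PetscErrorCode SketchIJacobian(TS ts, PetscReal t, Vec U, Vec U_t, PetscReal shift, Mat J, Mat Jpre, void *ctx) { return PETSC_SUCCESS; }

static PetscErrorCode SketchSetSolverCallbacks(DM dm, SketchSolverType type, void *ctx) {
  PetscFunctionBeginUser;
  if (type == SKETCH_SOLVER_STATIC) {
    /* SNES path: quasistatic residual and Jacobian */
    PetscCall(DMSNESSetFunction(dm, SketchResidual, ctx));
    PetscCall(DMSNESSetJacobian(dm, SketchJacobian, ctx));
  } else {
    /* TS path: implicit function and Jacobian for time stepping */
    PetscCall(DMTSSetIFunction(dm, SketchIFunction, ctx));
    PetscCall(DMTSSetIJacobian(dm, SketchIJacobian, ctx));
  }
  PetscFunctionReturn(PETSC_SUCCESS);
}
```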
- PetscClassId PCPMG_CLASSID
- PetscLogEvent PCPMG_Setup
- PetscLogEvent RATEL_Prolong[RATEL_MAX_MULTIGRID_LEVELS]
- PetscLogEvent RATEL_Prolong_CeedOp[RATEL_MAX_MULTIGRID_LEVELS]
- PetscLogEvent RATEL_Restrict[RATEL_MAX_MULTIGRID_LEVELS]
- PetscLogEvent RATEL_Restrict_CeedOp[RATEL_MAX_MULTIGRID_LEVELS]
- static PetscErrorCode RatelRegisterPMultigridLogEvents(PetscInt num_multigrid_levels)
  Register p-multigrid Ratel log events.
  Not collective across MPI processes.
  - Parameters:
    - num_multigrid_levels – [in] Number of multigrid levels to register
  - Returns:
    An error code: 0 - success, otherwise - failure
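For context, the snippet below sketches the standard PETSc idiom such a registration routine follows; it is not Ratel's implementation, and the names and level cap are assumptions. Registering one prolongation and one restriction event per level lets `-log_view` report per-level timings.

```c
/* Sketch: register a class id once, then per-level log events. */
#include <petsc.h>

#define SKETCH_MAX_LEVELS 8 /* stand-in for RATEL_MAX_MULTIGRID_LEVELS */

static PetscClassId  sketch_classid = 0;
static PetscLogEvent sketch_prolong[SKETCH_MAX_LEVELS], sketch_restrict[SKETCH_MAX_LEVELS];

static PetscErrorCode SketchRegisterLogEvents(PetscInt num_levels) {
  char name[PETSC_MAX_PATH_LEN];

  PetscFunctionBeginUser;
  if (!sketch_classid) PetscCall(PetscClassIdRegister("pMG Sketch", &sketch_classid));
  for (PetscInt l = 0; l < num_levels && l < SKETCH_MAX_LEVELS; l++) {
    PetscCall(PetscSNPrintf(name, sizeof(name), "Prolong L%" PetscInt_FMT, l));
    PetscCall(PetscLogEventRegister(name, sketch_classid, &sketch_prolong[l]));
    PetscCall(PetscSNPrintf(name, sizeof(name), "Restrict L%" PetscInt_FMT, l));
    PetscCall(PetscLogEventRegister(name, sketch_classid, &sketch_restrict[l]));
  }
  PetscFunctionReturn(PETSC_SUCCESS);
}
```

The registered events would then be wrapped in PetscLogEventBegin/PetscLogEventEnd pairs around the prolongation and restriction applications.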
- PetscErrorCode RatelPCSetFromOptions_PMG(PC pc, PetscOptionItems PetscOptionsObject)
  Get PCpMG options.
  Collective across MPI processes.
  Note: This is a PETSc interface, thus the odd signature.
  Note: The name of the second parameter must be PetscOptionsObject due to abuse of PETSc macros.
  - Parameters:
    - pc – [inout] PCpMG object
    - PetscOptionsObject – [in] Command line option items
  - Returns:
    An error code: 0 - success, otherwise - failure
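The note about the parameter name can be made concrete with a small sketch. The PETSc option-query macros (`PetscOptionsHeadBegin`, `PetscOptionsBool`, ...) expand to calls that reference a variable literally named `PetscOptionsObject`, so the argument cannot be renamed. The option string and context type below are hypothetical, not Ratel's.

```c
/* Sketch of a PCShell SetFromOptions routine following the PETSc macro convention. */
#include <petsc.h>

typedef struct { PetscBool reuse_coarse_mat; } SketchPMGCtx; /* hypothetical context */

static PetscErrorCode SketchPCSetFromOptions_PMG(PC pc, PetscOptionItems PetscOptionsObject) {
  SketchPMGCtx *ctx;

  PetscFunctionBeginUser;
  PetscCall(PCShellGetContext(pc, &ctx));
  PetscOptionsHeadBegin(PetscOptionsObject, "Sketch pMG options");
  /* the macro below implicitly uses the variable `PetscOptionsObject`;
     renaming the second function argument would break the build */
  PetscCall(PetscOptionsBool("-sketch_pmg_reuse_coarse_mat", "Reuse coarse grid matrix between resets",
                             NULL, ctx->reuse_coarse_mat, &ctx->reuse_coarse_mat, NULL));
  PetscOptionsHeadEnd();
  PetscFunctionReturn(PETSC_SUCCESS);
}
```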
- static PetscErrorCode RatelSetupMultigridLevel(Ratel ratel, DM dm_level, Vec M_loc, PetscInt level, CeedOperator op_jacobian_fine, CeedOperator op_jacobian, CeedOperator op_prolong, CeedOperator op_restrict)
  Set up CeedOperator for multigrid prolongation, restriction, and coarse grid Jacobian evaluation.
  Collective across MPI processes.
  - Parameters:
    - ratel – [in] Ratel context
    - dm_level – [in] DMPlex for multigrid level to set up
    - M_loc – [in] PETSc local vector holding multiplicity data
    - level – [in] Multigrid level to set up
    - op_jacobian_fine – [inout] Composite CeedOperator for fine grid Jacobian
    - op_jacobian – [inout] Composite CeedOperator for Jacobian
    - op_prolong – [inout] Composite CeedOperator for prolongation
    - op_restrict – [inout] Composite CeedOperator for restriction
  - Returns:
    An error code: 0 - success, otherwise - failure
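As a rough structural sketch (not Ratel's code), each multigrid level carries composite libCEED operators for the Jacobian, prolongation, and restriction, and a routine like RatelSetupMultigridLevel populates them with sub-operators via CeedCompositeOperatorAddSub. The loop below is illustrative and omits error checking, as libCEED examples commonly do.

```c
/* Conceptual sketch of per-level composite operators; libCEED calls return an int
 * error code, which is ignored here for brevity. */
#include <ceed.h>

static int SketchBuildLevelOperators(Ceed ceed, CeedInt num_levels) {
  for (CeedInt level = 0; level < num_levels; level++) {
    CeedOperator op_jacobian, op_prolong, op_restrict;

    /* one composite operator per role on this level */
    CeedCompositeOperatorCreate(ceed, &op_jacobian);
    CeedCompositeOperatorCreate(ceed, &op_prolong);
    CeedCompositeOperatorCreate(ceed, &op_restrict);

    /* a routine like RatelSetupMultigridLevel would add one sub-operator per
       field/material here, e.g. CeedCompositeOperatorAddSub(op_jacobian, sub_op) */

    CeedOperatorDestroy(&op_jacobian);
    CeedOperatorDestroy(&op_prolong);
    CeedOperatorDestroy(&op_restrict);
  }
  return CEED_ERROR_SUCCESS;
}
```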
- static PetscErrorCode RatelPCPMGCreateMGHeirarchy(PC pc)
  Create MatCeed objects for the pMG hierarchy.
  Collective across MPI processes.
  - Parameters:
    - pc – [inout] PCpMG object
  - Returns:
    An error code: 0 - success, otherwise - failure
- PetscErrorCode RatelPCPMGCreate(PC pc)
  Set up PCpMG preconditioner from Ratel context.
  Collective across MPI processes.
  - Parameters:
    - pc – [inout] PCpMG object
  - Returns:
    An error code: 0 - success, otherwise - failure
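A possible usage sketch, assuming RatelPCPMGCreate may be called on the PC obtained from the solver's KSP and that the public header is ratel.h; how the Ratel context reaches the PC is not shown and may be handled internally.

```c
/* Hypothetical usage sketch: attach the Ratel pMG preconditioner to a SNES solve. */
#include <petsc.h>
#include <ratel.h> /* assumed public header */

static PetscErrorCode SketchAttachPMG(SNES snes) {
  KSP ksp;
  PC  pc;

  PetscFunctionBeginUser;
  PetscCall(SNESGetKSP(snes, &ksp));
  PetscCall(KSPGetPC(ksp, &pc));
  PetscCall(PCSetType(pc, PCSHELL)); /* pMG is delivered as a PCShell; RatelPCPMGCreate may also set this */
  PetscCall(RatelPCPMGCreate(pc));   /* documented above: set up PCpMG from the Ratel context */
  PetscFunctionReturn(PETSC_SUCCESS);
}
```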
- PetscErrorCode RatelPCReset_PMG(PC pc)
  Reset Mat objects and sub-PC for reuse of PCpMG.
  Collective across MPI processes.
  - Parameters:
    - pc – [in] PCpMG object
  - Returns:
    An error code: 0 - success, otherwise - failure
- PetscErrorCode RatelPCSetUp_PMG(PC pc)
  Set up PCpMG. Reassembles the coarse operator.
  Collective across MPI processes.
  - Parameters:
    - pc – [inout] PC object to set up
  - Returns:
    An error code: 0 - success, otherwise - failure
- PetscErrorCode RatelPCPMGSetReuseCoarseMat(PC pc, PetscBool reuse_coarse_mat)
  Set whether the pMG should reuse the coarse grid matrix between resets.
  Not collective across MPI processes.
  Note: Only set this flag if you are certain that the nonzero pattern does not change between resets.
  - Parameters:
    - pc – [in] PC object, must be PCShell containing pMG
    - reuse_coarse_mat – [in] PetscBool describing whether to reuse the coarse grid matrix
  - Returns:
    An error code: 0 - success, otherwise - failure
- PetscErrorCode RatelPCPMGGetReuseCoarseMat(PC pc, PetscBool *reuse_coarse_mat)
  Get whether the pMG should reuse the coarse grid matrix between resets.
  Not collective across MPI processes.
  - Parameters:
    - pc – [in] PC object, must be PCShell containing pMG
    - reuse_coarse_mat – [out] PetscBool describing whether to reuse the coarse grid matrix
  - Returns:
    An error code: 0 - success, otherwise - failure
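A minimal usage sketch combining the two accessors above; it assumes pc is the PCShell containing the pMG preconditioner and that the public header is ratel.h. Only enable reuse when the coarse nonzero pattern is unchanged between resets, per the note above.

```c
#include <petsc.h>
#include <ratel.h> /* assumed public header */

static PetscErrorCode SketchConfigureCoarseReuse(PC pc) {
  PetscBool reuse;

  PetscFunctionBeginUser;
  PetscCall(RatelPCPMGSetReuseCoarseMat(pc, PETSC_TRUE));
  PetscCall(RatelPCPMGGetReuseCoarseMat(pc, &reuse));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Reuse coarse grid matrix: %s\n", reuse ? "yes" : "no"));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```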
- PetscErrorCode RatelPCView_PMG(PC pc, PetscViewer viewer)
  View PCpMG.
  Collective across MPI processes.
  - Parameters:
    - pc – [in] PC object to view
    - viewer – [inout] Visualization context
  - Returns:
    An error code: 0 - success, otherwise - failure
- PetscErrorCode RatelPCApply_PMG(PC pc, Vec X_in, Vec X_out)
  Apply PCpMG.
  Collective across MPI processes.
  - Parameters:
    - pc – [in] PC object to apply
    - X_in – [in] Input vector
    - X_out – [out] Output vector
  - Returns:
    An error code: 0 - success, otherwise - failure
- PetscErrorCode RatelPCApplyTranspose_PMG(PC pc, Vec X_in, Vec X_out)
  Apply PCpMG transpose.
  Collective across MPI processes.
  - Parameters:
    - pc – [in] PC object to apply
    - X_in – [in] Input vector
    - X_out – [out] Output vector
  - Returns:
    An error code: 0 - success, otherwise - failure
- PetscErrorCode RatelPCMatApply_PMG(PC pc, Mat X_in, Mat X_out)
  Apply PCpMG to multiple vectors stored as MATDENSE.
  Collective across MPI processes.
  - Parameters:
    - pc – [in] PC object to apply
    - X_in – [in] Input matrix
    - X_out – [out] Output matrix
  - Returns:
    An error code: 0 - success, otherwise - failure
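For multiple right-hand sides, PETSc's PCMatApply dispatches to the shell's MatApply callback, so a block of vectors stored as a dense matrix is preconditioned in one call. A sketch with illustrative sizes follows; it assumes pc has already been set up.

```c
#include <petsc.h>

static PetscErrorCode SketchBlockApply(PC pc, PetscInt n_local, PetscInt n_global, PetscInt n_rhs) {
  Mat X, Y;

  PetscFunctionBeginUser;
  /* one column per right-hand side, stored as MATDENSE */
  PetscCall(MatCreateDense(PetscObjectComm((PetscObject)pc), n_local, PETSC_DECIDE, n_global, n_rhs, NULL, &X));
  PetscCall(MatDuplicate(X, MAT_DO_NOT_COPY_VALUES, &Y));
  PetscCall(MatSetRandom(X, NULL));
  PetscCall(PCMatApply(pc, X, Y)); /* Y = M^{-1} X, column by column */
  PetscCall(MatDestroy(&X));
  PetscCall(MatDestroy(&Y));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```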
- PetscErrorCode RatelPCDestroy_PMG(PC pc)
  Destroy PCpMG context data.
  Collective across MPI processes.
  - Parameters:
    - pc – [inout] PC object to destroy
  - Returns:
    An error code: 0 - success, otherwise - failure
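The callbacks documented above have exactly the signatures PCShell expects, which suggests wiring along the lines of the sketch below. RatelPCPMGCreate presumably performs the equivalent setup internally; this block is illustrative, not Ratel's implementation, and context attachment (e.g. via PCShellSetContext) is omitted.

```c
#include <petsc.h>
#include <ratel.h> /* assumed public header */

static PetscErrorCode SketchWirePMGShell(PC pc) {
  PetscFunctionBeginUser;
  PetscCall(PCSetType(pc, PCSHELL));
  PetscCall(PCShellSetSetUp(pc, RatelPCSetUp_PMG));
  PetscCall(PCShellSetApply(pc, RatelPCApply_PMG));
  PetscCall(PCShellSetApplyTranspose(pc, RatelPCApplyTranspose_PMG));
  PetscCall(PCShellSetMatApply(pc, RatelPCMatApply_PMG));
  PetscCall(PCShellSetView(pc, RatelPCView_PMG));
  PetscCall(PCShellSetDestroy(pc, RatelPCDestroy_PMG));
  PetscFunctionReturn(PETSC_SUCCESS);
}
```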
- PetscErrorCode RatelPCPMGContextDestroy(RatelPMGContext pmg)
  Destroy Ratel pMG preconditioner context.
  - Parameters:
    - pmg – [inout] Ratel pMG context
  - Returns:
    An error code: 0 - success, otherwise - failure