Numerical Solvers¶
The RatelSolverType determines how the composite CeedOperator objects are built and used to set the appropriate DMSNES or DMTS options.
-
PetscClassId PCPMG_CLASSID¶
-
PetscLogEvent PCPMG_Setup¶
-
PetscLogEvent RATEL_Prolong[RATEL_MAX_MULTIGRID_LEVELS]¶
-
PetscLogEvent RATEL_Prolong_CeedOp[RATEL_MAX_MULTIGRID_LEVELS]¶
-
PetscLogEvent RATEL_Restrict[RATEL_MAX_MULTIGRID_LEVELS]¶
-
PetscLogEvent RATEL_Restrict_CeedOp[RATEL_MAX_MULTIGRID_LEVELS]¶
-
static PetscErrorCode RatelRegisterPMultigridLogEvents(PetscInt num_multigrid_levels)¶
Register p-multigrid Ratel log events.
Not collective across MPI processes.
- Parameters:
num_multigrid_levels – [in] Number of multigrid levels to register
- Returns:
An error code: 0 - success, otherwise - failure
-
static PetscErrorCode RatelPMGProcessCommandLineOptions(RatelPMGContext pmg, PetscInt pmg_field, const char *pc_prefix)¶
Get PCpMG options.
Collective across MPI processes.
- Parameters:
pmg – [inout] PCpMG context
pmg_field – [in] Index of field on the DM
pc_prefix – [in] Command line option prefix for PC
- Returns:
An error code: 0 - success, otherwise - failure
-
static PetscErrorCode RatelSetupMultigridLevel(Ratel ratel, DM dm_level, Vec M_loc, PetscInt level, CeedOperator op_jacobian_fine, CeedOperator op_jacobian, CeedOperator op_prolong, CeedOperator op_restrict)¶
Setup CeedOperator for multigrid prolongation, restriction, and coarse grid Jacobian evaluation.
Collective across MPI processes.
- Parameters:
ratel – [in] Ratel context
dm_level – [in] DMPlex for multigrid level to setup
M_loc – [in] PETSc local vector holding multiplicity data
level – [in] Multigrid level to set up
op_jacobian_fine – [inout] Composite CeedOperator for fine grid Jacobian
op_jacobian – [inout] Composite CeedOperator for Jacobian
op_prolong – [inout] Composite CeedOperator for prolongation
op_restrict – [inout] Composite CeedOperator for restriction
- Returns:
An error code: 0 - success, otherwise - failure
-
PetscErrorCode RatelPCPMGCreate(PC pc)¶
Setup PCpMG preconditioner from Ratel context.
Collective across MPI processes.
- Parameters:
pc – [inout] PCpMG object
- Returns:
An error code: 0 - success, otherwise - failure
-
PetscErrorCode RatelPCSetUp_PMG(PC pc)¶
SetUp for PCpMG.
Reassemble coarse operator.
Collective across MPI processes.
- Parameters:
pc – [inout] PC object to setup
- Returns:
An error code: 0 - success, otherwise - failure
-
PetscErrorCode RatelPCView_PMG(PC pc, PetscViewer viewer)¶
View PCpMG.
Collective across MPI processes.
- Parameters:
pc – [in] PC object to view
viewer – [inout] Visualization context
- Returns:
An error code: 0 - success, otherwise - failure
-
PetscErrorCode RatelPCApply_PMG(PC pc, Vec X_in, Vec X_out)¶
Apply PCpMG.
Collective across MPI processes.
- Parameters:
pc – [in] PC object to apply
X_in – [in] Input vector
X_out – [out] Output vector
- Returns:
An error code: 0 - success, otherwise - failure
-
PetscErrorCode RatelPCApplyTranspose_PMG(PC pc, Vec X_in, Vec X_out)¶
Apply PCpMG transpose.
Collective across MPI processes.
- Parameters:
pc – [in] PC object to apply
X_in – [in] Input vector
X_out – [out] Output vector
- Returns:
An error code: 0 - success, otherwise - failure
-
PetscErrorCode RatelPCMatApply_PMG(PC pc, Mat X_in, Mat X_out)¶
Apply PCpMG to multiple vectors stored as MATDENSE.
Collective across MPI processes.
- Parameters:
pc – [in] PC object to apply
X_in – [in] Input matrix
X_out – [out] Output matrix
- Returns:
An error code: 0 - success, otherwise - failure
-
PetscErrorCode RatelPCDestroy_PMG(PC pc)¶
Destroy PCpMG context data.
Collective across MPI processes.
- Parameters:
pc – [inout] PC object to destroy
- Returns:
An error code: 0 - success, otherwise - failure
-
PetscErrorCode RatelPCPMGContextDestroy(RatelPMGContext pmg)¶
Destroy Ratel pMG preconditioner context.
- Parameters:
pmg – [inout] Ratel pMG context
- Returns:
An error code: 0 - success, otherwise - failure