Numerical Solvers

The RatelSolverType determines how the composite CeedOperator objects are built and used to set the appropriate DMSNES or DMTS options.
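
As a rough illustration of that dispatch, the sketch below routes residual and Jacobian callbacks to either the DMSNES (quasistatic) or the DMTS (dynamic) interface. The enum values and the Form* callbacks are hypothetical placeholders, not Ratel's actual names; only the PETSc calls (DMSNESSetFunction, DMSNESSetJacobian, DMTSSetIFunction, DMTSSetIJacobian) are taken as given.

    #include <petscsnes.h>
    #include <petscts.h>

    // Hypothetical solver type enum and residual/Jacobian callbacks, for illustration only
    typedef enum {SOLVER_QUASISTATIC, SOLVER_DYNAMIC} SolverTypeSketch;
    extern PetscErrorCode FormResidual(SNES snes, Vec U, Vec R, void *ctx);
    extern PetscErrorCode FormJacobian(SNES snes, Vec U, Mat J, Mat J_pre, void *ctx);
    extern PetscErrorCode FormIResidual(TS ts, PetscReal t, Vec U, Vec U_t, Vec R, void *ctx);
    extern PetscErrorCode FormIJacobian(TS ts, PetscReal t, Vec U, Vec U_t, PetscReal shift, Mat J, Mat J_pre, void *ctx);

    static PetscErrorCode SetSolverCallbacksSketch(DM dm, SolverTypeSketch type, void *ctx) {
      PetscFunctionBeginUser;
      if (type == SOLVER_QUASISTATIC) {
        // Quasistatic: residual and Jacobian registered through the DMSNES interface
        PetscCall(DMSNESSetFunction(dm, FormResidual, ctx));
        PetscCall(DMSNESSetJacobian(dm, FormJacobian, ctx));
      } else {
        // Dynamic: implicit residual and Jacobian registered through the DMTS interface
        PetscCall(DMTSSetIFunction(dm, FormIResidual, ctx));
        PetscCall(DMTSSetIJacobian(dm, FormIJacobian, ctx));
      }
      PetscFunctionReturn(PETSC_SUCCESS);
    }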

PetscClassId PCPMG_CLASSID
PetscLogEvent PCPMG_Setup
PetscLogEvent RATEL_Prolong[RATEL_MAX_MULTIGRID_LEVELS]
PetscLogEvent RATEL_Prolong_CeedOp[RATEL_MAX_MULTIGRID_LEVELS]
PetscLogEvent RATEL_Restrict[RATEL_MAX_MULTIGRID_LEVELS]
PetscLogEvent RATEL_Restrict_CeedOp[RATEL_MAX_MULTIGRID_LEVELS]
static PetscErrorCode RatelRegisterPMultigridLogEvents(PetscInt num_multigrid_levels)

Register p-multigrid Ratel log events.

Not collective across MPI processes.

Parameters:
  • num_multigrid_levels[in] Number of multigrid levels to register

Returns:

An error code: 0 - success, otherwise - failure
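
A hedged sketch of what such a registration could look like with the standard PETSc logging API; the event name strings, the class name, and the level bound are illustrative, while PetscClassIdRegister and PetscLogEventRegister are the real PETSc calls.

    #include <petscsys.h>

    #define MAX_LEVELS_SKETCH 8 // illustrative bound, not Ratel's RATEL_MAX_MULTIGRID_LEVELS value

    static PetscClassId  pmg_classid;
    static PetscLogEvent log_prolong[MAX_LEVELS_SKETCH], log_restrict[MAX_LEVELS_SKETCH];

    static PetscErrorCode RegisterPMultigridLogEventsSketch(PetscInt num_multigrid_levels) {
      char name[PETSC_MAX_PATH_LEN];

      PetscFunctionBeginUser;
      PetscCall(PetscClassIdRegister("PCpMG", &pmg_classid));
      for (PetscInt level = 0; level < num_multigrid_levels; level++) {
        // One prolongation and one restriction event per multigrid level
        PetscCall(PetscSNPrintf(name, sizeof(name), "RatelProlong_%" PetscInt_FMT, level));
        PetscCall(PetscLogEventRegister(name, pmg_classid, &log_prolong[level]));
        PetscCall(PetscSNPrintf(name, sizeof(name), "RatelRestrict_%" PetscInt_FMT, level));
        PetscCall(PetscLogEventRegister(name, pmg_classid, &log_restrict[level]));
      }
      PetscFunctionReturn(PETSC_SUCCESS);
    }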

PetscErrorCode RatelPCSetFromOptions_PMG(PC pc, PetscOptionItems PetscOptionsObject)

Get PCpMG options.

Collective across MPI processes.

Note

This is a PETSc interface, thus the odd signature.

Note

The second parameter must be named PetscOptionsObject because the PETSc options macros reference it by that name.

Parameters:
  • pc[inout] PCpMG object

  • PetscOptionsObject[in] Command line option items

Returns:

An error code: 0 - success, otherwise - failure
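
The constraint in the notes above comes from the PETSc options macros: PetscOptionsBool and friends expand to calls that reference a variable literally named PetscOptionsObject, so the parameter must carry exactly that name. A minimal sketch of such an options body (the option string is hypothetical, not necessarily Ratel's):

    #include <petscksp.h>

    static PetscErrorCode PCSetFromOptions_PMG_Sketch(PC pc, PetscOptionItems PetscOptionsObject) {
      PetscBool reuse_coarse_mat = PETSC_FALSE, set = PETSC_FALSE;

      PetscFunctionBeginUser;
      // The macros below expand to *_Private(PetscOptionsObject, ...), hence the required name
      PetscOptionsHeadBegin(PetscOptionsObject, "Ratel p-multigrid options");
      PetscCall(PetscOptionsBool("-pc_pmg_reuse_coarse_mat", "Reuse coarse grid matrix between resets", NULL,
                                 reuse_coarse_mat, &reuse_coarse_mat, &set));
      PetscOptionsHeadEnd();
      PetscFunctionReturn(PETSC_SUCCESS);
    }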

static PetscErrorCode RatelSetupMultigridLevel(Ratel ratel, DM dm_level, Vec M_loc, PetscInt level, CeedOperator op_jacobian_fine, CeedOperator op_jacobian, CeedOperator op_prolong, CeedOperator op_restrict)

Set up CeedOperator objects for multigrid prolongation, restriction, and coarse grid Jacobian evaluation.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • dm_level[in] DMPlex for multigrid level to setup

  • M_loc[in] PETSc local vector holding multiplicity data

  • level[in] Multigrid level to set up

  • op_jacobian_fine[inout] Composite CeedOperator for fine grid Jacobian

  • op_jacobian[inout] Composite CeedOperator for Jacobian on this level

  • op_prolong[inout] Composite CeedOperator for prolongation

  • op_restrict[inout] Composite CeedOperator for restriction

Returns:

An error code: 0 - success, otherwise - failure
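
The coarse-Jacobian/prolongation/restriction triple for a level is the kind of output produced by libCEED's CeedOperatorMultigridLevelCreate. The sketch below shows the per-sub-operator loop such a setup might use; it assumes the fine grid multiplicity is already available as a CeedVector and that the coarse element restriction and basis for the level are in hand, and it is not Ratel's actual implementation.

    #include <ceed.h>

    // Hedged sketch: build one multigrid level from a composite fine grid Jacobian
    static int SetupLevelOperatorsSketch(CeedOperator op_jacobian_fine, CeedVector mult_fine,
                                         CeedElemRestriction rstr_coarse, CeedBasis basis_coarse,
                                         CeedOperator op_jacobian, CeedOperator op_prolong, CeedOperator op_restrict) {
      CeedInt       num_sub;
      CeedOperator *sub_operators;

      CeedCompositeOperatorGetNumSub(op_jacobian_fine, &num_sub);
      CeedCompositeOperatorGetSubList(op_jacobian_fine, &sub_operators);
      for (CeedInt i = 0; i < num_sub; i++) {
        CeedOperator op_coarse_sub, op_prolong_sub, op_restrict_sub;

        // Create coarse grid, prolongation, and restriction operators from each fine sub-operator
        CeedOperatorMultigridLevelCreate(sub_operators[i], mult_fine, rstr_coarse, basis_coarse,
                                         &op_coarse_sub, &op_prolong_sub, &op_restrict_sub);
        CeedCompositeOperatorAddSub(op_jacobian, op_coarse_sub);
        CeedCompositeOperatorAddSub(op_prolong, op_prolong_sub);
        CeedCompositeOperatorAddSub(op_restrict, op_restrict_sub);
        // The composite operators hold references, so release the local handles
        CeedOperatorDestroy(&op_coarse_sub);
        CeedOperatorDestroy(&op_prolong_sub);
        CeedOperatorDestroy(&op_restrict_sub);
      }
      return CEED_ERROR_SUCCESS;
    }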

static PetscErrorCode RatelPCPMGCreateMGHeirarchy(PC pc)

Create MatCeed objects for the pMG hierarchy.

Collective across MPI processes.

Parameters:
  • pc[inout] PCpMG object

Returns:

An error code: 0 - success, otherwise - failure
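
A hedged sketch of how per-level matrices and transfer operators map onto PETSc's PCMG infrastructure; the matrices are assumed to wrap the composite CeedOperators (for example as MatCeed objects), and whether Ratel structures the hierarchy exactly this way is not asserted here.

    #include <petscksp.h>

    // Hedged sketch: wire level Jacobian matrices and interpolations into a PCMG
    static PetscErrorCode BuildPMGHierarchySketch(PC pc_mg, PetscInt num_levels, Mat *mats_jacobian, Mat *mats_interp) {
      PetscFunctionBeginUser;
      PetscCall(PCSetType(pc_mg, PCMG));
      PetscCall(PCMGSetLevels(pc_mg, num_levels, NULL));
      for (PetscInt level = 0; level < num_levels; level++) {
        KSP ksp_level;

        // Each smoother solves with the Jacobian matrix of its level
        PetscCall(PCMGGetSmoother(pc_mg, level, &ksp_level));
        PetscCall(KSPSetOperators(ksp_level, mats_jacobian[level], mats_jacobian[level]));
        // Level 0 is the coarse grid; interpolation from level-1 to level is set for level >= 1
        if (level > 0) PetscCall(PCMGSetInterpolation(pc_mg, level, mats_interp[level]));
      }
      PetscFunctionReturn(PETSC_SUCCESS);
    }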

PetscErrorCode RatelPCPMGCreate(PC pc)

Set up PCpMG preconditioner from the Ratel context.

Collective across MPI processes.

Parameters:
  • pc[inout] PCpMG object

Returns:

An error code: 0 - success, otherwise - failure
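
The apply, setup, view, and destroy routines documented below have exactly the signatures PETSc's PCShell callbacks expect, and the reuse-flag routines note that the PC must be a PCShell containing the pMG. A sketch of how the wiring could look, assuming a context struct is already built and the Ratel prototypes are in scope; this is illustrative, not necessarily what RatelPCPMGCreate does internally:

    #include <petscksp.h>

    // Hedged sketch: attach the documented pMG routines as PCShell callbacks
    // (assumes the Ratel declarations for these routines are in scope)
    static PetscErrorCode AttachPMGCallbacksSketch(PC pc, void *pmg_context) {
      PetscFunctionBeginUser;
      PetscCall(PCSetType(pc, PCSHELL));
      PetscCall(PCShellSetContext(pc, pmg_context));
      PetscCall(PCShellSetName(pc, "pMG"));
      PetscCall(PCShellSetSetUp(pc, RatelPCSetUp_PMG));
      PetscCall(PCShellSetApply(pc, RatelPCApply_PMG));
      PetscCall(PCShellSetApplyTranspose(pc, RatelPCApplyTranspose_PMG));
      PetscCall(PCShellSetMatApply(pc, RatelPCMatApply_PMG));
      PetscCall(PCShellSetView(pc, RatelPCView_PMG));
      PetscCall(PCShellSetDestroy(pc, RatelPCDestroy_PMG));
      PetscFunctionReturn(PETSC_SUCCESS);
    }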

PetscErrorCode RatelPCReset_PMG(PC pc)

Reset Mat objects and sub-PC for reuse of PCpMG.

Collective across MPI processes.

Parameters:
  • pc[in] PCpMG object

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelPCSetUp_PMG(PC pc)

SetUp for PCpMG.

Reassembles the coarse grid operator.

Collective across MPI processes.

Parameters:
  • pc[inout] PC object to setup

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelPCPMGSetReuseCoarseMat(PC pc, PetscBool reuse_coarse_mat)

Set whether the pMG should reuse the coarse grid matrix between resets.

Not collective across MPI processes.

Note

Only set this flag if you are certain that the nonzero pattern does not change between resets.

Parameters:
  • pc[in] PC object, must be PCShell containing pMG

  • reuse_coarse_mat[in] PetscBool indicating whether to reuse the coarse grid matrix

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelPCPMGGetReuseCoarseMat(PC pc, PetscBool *reuse_coarse_mat)

Get whether the pMG should reuse the coarse grid matrix between resets.

Not collective across MPI processes.

Parameters:
  • pc[in] PC object, must be PCShell containing pMG

  • reuse_coarse_mat[out] PetscBool indicating whether the coarse grid matrix will be reused

Returns:

An error code: 0 - success, otherwise - failure
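
Usage sketch for the reuse flag, assuming pc is the PCShell containing the pMG preconditioner and the coarse matrix nonzero pattern is known not to change between resets:

    PetscBool reuse_coarse_mat;

    // Opt in to reusing the coarse grid matrix, then query the flag back
    PetscCall(RatelPCPMGSetReuseCoarseMat(pc, PETSC_TRUE));
    PetscCall(RatelPCPMGGetReuseCoarseMat(pc, &reuse_coarse_mat));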

PetscErrorCode RatelPCView_PMG(PC pc, PetscViewer viewer)

View PCpMG.

Collective across MPI processes.

Parameters:
  • pc[in] PC object to view

  • viewer[inout] Visualization context

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelPCApply_PMG(PC pc, Vec X_in, Vec X_out)

Apply PCpMG.

Collective across MPI processes.

Parameters:
  • pc[in] PC object to apply

  • X_in[in] Input vector

  • X_out[out] Output vector

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelPCApplyTranspose_PMG(PC pc, Vec X_in, Vec X_out)

Apply PCpMG transpose.

Collective across MPI processes.

Parameters:
  • pc[in] PC object to apply

  • X_in[in] Input vector

  • X_out[out] Output vector

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelPCMatApply_PMG(PC pc, Mat X_in, Mat X_out)

Apply PCpMG to multiple vectors stored as MATDENSE.

Collective across MPI processes.

Parameters:
  • pc[in] PC object to apply

  • X_in[in] Input matrix

  • X_out[out] Output matrix

Returns:

An error code: 0 - success, otherwise - failure
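
A sketch of driving this path from the caller's side with PETSc's PCMatApply, which forwards to the shell's MatApply callback when one is set; the sizes and the fill step are placeholders.

    #include <petscksp.h>

    // Hedged sketch: apply the preconditioner to many vectors stored as MATDENSE columns
    static PetscErrorCode ApplyToManyVectorsSketch(PC pc, PetscInt n_local, PetscInt n_global, PetscInt num_cols) {
      Mat X, Y;

      PetscFunctionBeginUser;
      PetscCall(MatCreateDense(PetscObjectComm((PetscObject)pc), n_local, PETSC_DECIDE, n_global, num_cols, NULL, &X));
      PetscCall(MatDuplicate(X, MAT_DO_NOT_COPY_VALUES, &Y));
      // ... fill the columns of X with the input vectors ...
      PetscCall(PCMatApply(pc, X, Y));
      PetscCall(MatDestroy(&X));
      PetscCall(MatDestroy(&Y));
      PetscFunctionReturn(PETSC_SUCCESS);
    }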

PetscErrorCode RatelPCDestroy_PMG(PC pc)

Destroy PCpMG context data.

Collective across MPI processes.

Parameters:
  • pc[inout] PC object to destroy

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelPCPMGContextDestroy(RatelPMGContext pmg)

Destroy Ratel pMG preconditioner context.

Parameters:
  • pmg[inout] Ratel pMG context

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelPCRegisterPMG(Ratel ratel)

Register PCpMG preconditioner.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

Returns:

An error code: 0 - success, otherwise - failure
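
New PC implementations are made selectable through PETSc's PCRegister, and RatelPCPMGCreate has the (PC) signature that PCRegister expects. The registered name "pmg" below is an assumption, not necessarily the string Ratel uses:

    // Hedged sketch: register the pMG creation routine so `-pc_type pmg` could select it
    // (the type name "pmg" is illustrative)
    PetscCall(PCRegister("pmg", RatelPCPMGCreate));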

PetscErrorCode RatelDMSetupSolver(Ratel ratel)

Set up solver contexts.

Collective across MPI processes.

Parameters:
  • ratel[inout] Ratel context

Returns:

An error code: 0 - success, otherwise - failure
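
A hedged sketch of how the contexts set up here are consumed downstream: a SNES created on the same DM picks up the residual and Jacobian callbacks registered through DMSNES (a TS would do the same through DMTS). The driver-level flow and variable names are illustrative, not Ratel's actual driver.

    #include <petscsnes.h>

    // Hedged sketch: solve with a SNES that inherits its callbacks from the DM
    static PetscErrorCode SolveWithDMSketch(MPI_Comm comm, DM dm, Vec U) {
      SNES snes;

      PetscFunctionBeginUser;
      PetscCall(SNESCreate(comm, &snes));
      PetscCall(SNESSetDM(snes, dm)); // the DM carries the DMSNES residual/Jacobian callbacks
      PetscCall(SNESSetFromOptions(snes));
      PetscCall(SNESSolve(snes, NULL, U));
      PetscCall(SNESDestroy(&snes));
      PetscFunctionReturn(PETSC_SUCCESS);
    }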