Core Ratel Functions#

These functions are publicly exposed for users.

enum RatelSolverType#

Specify solver type.

Values:

enumerator RATEL_SOLVER_STATIC#

Static solver, using a PETSc SNES.

enumerator RATEL_SOLVER_QUASISTATIC#

Quasistatic solver, using a PETSc TS.

enumerator RATEL_SOLVER_DYNAMIC#

Fully dynamic solver, using a PETSc TS.

enum RatelMethodType#

Specify numerical method.

Values:

enumerator RATEL_METHOD_FEM#

FEM method.

enumerator RATEL_METHOD_MPM#

MPM method.

enum RatelForcingType#

Specify forcing term.

Values:

enumerator RATEL_FORCING_NONE#

No forcing term.

enumerator RATEL_FORCING_BODY#

User-specified body force with interpolation.

enumerator RATEL_FORCING_MMS#

Forcing for linear elasticity manufactured solution.

enum RatelInitialConditionType#

Specify initial condition.

Values:

enumerator RATEL_INITIAL_CONDITION_ZERO#

Initial condition zero vector.

enumerator RATEL_INITIAL_CONDITION_CONTINUE#

Initial condition given by binary file.

enum RatelPMultigridCoarseningType#

Specify p-multigrid coarsening strategy.

Values:

enumerator RATEL_P_MULTIGRID_COARSENING_LOGARITHMIC#

P-multigrid, coarsen logarithmically, decreasing basis order to the next lowest power of 2.

enumerator RATEL_P_MULTIGRID_COARSENING_UNIFORM#

P-multigrid, coarsen uniformly, decreasing basis order by 1.

enumerator RATEL_P_MULTIGRID_COARSENING_USER#

P-multigrid, user-defined coarsening strategy.

enum RatelPointLocationType#

Specify initial point locations for MPM.

Values:

enumerator RATEL_POINT_LOCATION_GAUSS#

Initial point locations at Gauss quadrature points in each cell.

enumerator RATEL_POINT_LOCATION_UNIFORM#

Initial point locations uniformly spaced in each cell.

enumerator RATEL_POINT_LOCATION_CELL_RANDOM#

Initial point locations randomly distributed in each cell.

PetscErrorCode RatelHasMMS(Ratel ratel, PetscBool *has_mms)#

Determine if Ratel context has MMS solution.

Diagnostic quantities.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • has_mms[out] Boolean flag indicating if Ratel context has MMS solution

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelComputeMMSL2Error(Ratel ratel, Vec U, PetscInt *num_fields, PetscScalar **l2_error)#

Compute the L2 error from the manufactured solution.

Caller is responsible for freeing l2_error.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • U[in] Computed solution vector

  • num_fields[out] Number of L2 error values

  • l2_error[out] Computed L2 error for solution compared to MMS

Returns:

An error code: 0 - success, otherwise - failure
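
For example, a minimal sketch of typical usage (an assumption, not taken from the Ratel examples) that checks for a manufactured solution and reports the per-field L2 error, assuming ratel and the computed solution vector U are already available:

  PetscBool has_mms;

  PetscCall(RatelHasMMS(ratel, &has_mms));
  if (has_mms) {
    PetscInt     num_fields;
    PetscScalar *l2_error;

    // Compute the L2 error against the manufactured solution; the caller owns l2_error
    PetscCall(RatelComputeMMSL2Error(ratel, U, &num_fields, &l2_error));
    for (PetscInt i = 0; i < num_fields; i++) {
      PetscCall(PetscPrintf(PETSC_COMM_WORLD, "L2 error, field %d: %g\n", (int)i, (double)l2_error[i]));
    }
    PetscCall(PetscFree(l2_error));
  }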

PetscErrorCode RatelGetExpectedStrainEnergy(Ratel ratel, PetscBool *has_expected, PetscScalar *expected_strain_energy)#

Retrieve expected strain energy.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • has_expected[out] PETSC_TRUE if expected strain energy was set

  • expected_strain_energy[out] Expected strain energy

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelGetExpectedMaxDisplacement(Ratel ratel, PetscInt *num_components, PetscBool has_expected[], PetscScalar expected_max_displacement[])#

Retrieve expected max displacements.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • num_components[inout] Size of input arrays / number of output components

  • has_expected[out] Boolean array, PETSC_TRUE at index if expected value set for component

  • expected_max_displacement[out] Expected max displacement for component

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelGetExpectedFaceSurfaceForce(Ratel ratel, const char *face, PetscInt *num_components, PetscBool has_expected[], PetscScalar expected_surface_force[])#

Retrieve expected surface forces on face.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • face[in] Face name

  • num_components[inout] Size of input arrays / number of output components

  • has_expected[out] Boolean array, PETSC_TRUE at index if expected value set for component

  • expected_surface_force[out] Expected surface forces

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelGetExpectedFaceCentroid(Ratel ratel, const char *face, PetscInt *num_components, PetscBool has_expected[], PetscScalar expected_centroid[])#

Retrieve expected centroid of face.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • face[in] Face name

  • num_components[inout] Size of input arrays / number of output components

  • has_expected[out] Boolean array, PETSC_TRUE at index if expected value set for component

  • expected_centroid[out] Expected centroid values

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelComputeStrainEnergyError(Ratel ratel, PetscScalar strain_energy, PetscBool *has_expected, PetscScalar *strain_energy_error)#

Compute the error between the expected and final strain energy.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel Context

  • strain_energy[in] Computed final strain energy

  • has_expected[out] PETSC_TRUE if expected strain energy provided

  • strain_energy_error[out] Computed error in the final strain energy

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelComputeMaxDisplacementError(Ratel ratel, PetscInt num_components, const PetscScalar max_displacement[], PetscBool *has_any_expected, PetscScalar max_displacement_error[])#

Compute the error between the expected and final maximum displacements.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel Context

  • num_components[in] Number of components of the input and output arrays

  • max_displacement[in] Computed final maximum displacements

  • has_any_expected[out] PETSC_TRUE if expected maximum displacement provided for any component

  • max_displacement_error[out] Computed error in the final maximum displacements

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelComputeFaceForceErrors(Ratel ratel, const char *face, PetscInt num_components, const PetscScalar centroid[], const PetscScalar surface_force[], PetscBool *has_any_expected_centroid, PetscBool *has_any_expected_surface_force, PetscScalar centroid_error[], PetscScalar surface_force_error[])#

Compute the error between the expected and final face forces and centroids.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel Context

  • face[in] Face name

  • num_components[in] Number of components of the input and output arrays

  • centroid[in] Computed final centroid for face

  • surface_force[in] Computed final surface forces for face

  • has_any_expected_centroid[out] PETSC_TRUE if expected centroid provided for face

  • has_any_expected_surface_force[out] PETSC_TRUE if expected surface force provided for face

  • centroid_error[out] Computed error in the final centroid for face

  • surface_force_error[out] Computed error in the final surface force for face

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelComputeStrainEnergy(Ratel ratel, Vec U, PetscReal time, PetscScalar *strain_energy)#

Compute the final strain energy in the computed solution.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • U[in] Computed solution vector

  • time[in] Final time value, or 1.0 for SNES solution

  • strain_energy[out] Computed strain energy

Returns:

An error code: 0 - success, otherwise - failure
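
A minimal sketch combining RatelComputeStrainEnergy() with RatelComputeStrainEnergyError() above, assuming ratel, the converged solution U, and final_time are available:

  PetscScalar strain_energy, strain_energy_error;
  PetscBool   has_expected;

  PetscCall(RatelComputeStrainEnergy(ratel, U, final_time, &strain_energy));
  PetscCall(RatelComputeStrainEnergyError(ratel, strain_energy, &has_expected, &strain_energy_error));
  if (has_expected) {
    PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Strain energy error: %g\n", (double)strain_energy_error));
  }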

PetscErrorCode RatelComputeMaxVectorValues(Ratel ratel, DM dm, Vec U, PetscReal time, PetscInt *num_fields, char ***field_names, PetscInt **num_components, PetscScalar max_values[])#

Compute maximum values of global vector U.

Caller is responsible for freeing field_names and num_components.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • dm[in] DM for U

  • U[in] Global vector

  • time[in] Current solution time

  • num_fields[out] Number of fields

  • field_names[out] Names of the fields

  • num_components[out] Size of output buffer/number of components

  • max_values[out] Computed max values of all fields

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelComputeMaxSolutionValues(Ratel ratel, Vec U, PetscReal time, PetscInt *num_fields, char ***field_names, PetscInt **num_components, PetscScalar max_values[])#

Compute maximum values of solution.

Caller is responsible for freeing field_names and num_components.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • U[in] Computed solution vector

  • time[in] Current solution time

  • num_fields[out] Number of fields

  • field_names[out] Names of the fields

  • num_components[out] Size of output buffer/number of components

  • max_values[out] Computed max values of all fields

Returns:

An error code: 0 - success, otherwise - failure
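
A hedged sketch of typical usage; the fixed-size max_values buffer and the layout of its entries are assumptions, but the cleanup of field_names and num_components follows the note above:

  PetscInt     num_fields, *num_components;
  char       **field_names;
  PetscScalar  max_values[16]; // assumed large enough for all field components

  PetscCall(RatelComputeMaxSolutionValues(ratel, U, time, &num_fields, &field_names, &num_components, max_values));
  for (PetscInt i = 0; i < num_fields; i++) {
    PetscCall(PetscPrintf(PETSC_COMM_WORLD, "field %s: %d components\n", field_names[i], (int)num_components[i]));
  }
  // Caller is responsible for freeing the returned metadata arrays
  PetscCall(PetscFree(field_names));
  PetscCall(PetscFree(num_components));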

PetscErrorCode RatelViewMaxDiagnosticValuesByNameFromOptions(Ratel ratel, PetscReal time, Vec D)#

Compute maximum values of diagnostic quantities by name from options.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • time[in] Current time

  • D[in] Computed diagnostic quantities vector

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelGetSurfaceForceFaces(Ratel ratel, PetscInt *num_faces, const char **faces[])#

List mesh face numbers for diagnostic force computation.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • num_faces[out] Number of faces where forces are computed

  • faces[out] DMPlex mesh face numbers

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelComputeSurfaceForcesCellToFace(Ratel ratel, Vec U, PetscReal time, PetscInt *num_components, PetscScalar **surface_forces)#

Compute surface forces on mesh faces.

Note: The component i of the force for face f is at index surface_forces[num_comp * f + i].

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • U[in] Computed solution vector

  • time[in] Final time value, or 1.0 for SNES solution

  • num_components[out] Number of force components in output buffer

  • surface_forces[out] Computed face forces - caller is responsible for freeing

Returns:

An error code: 0 - success, otherwise - failure
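
A minimal sketch of the indexing described in the note above, pairing RatelGetSurfaceForceFaces() with RatelComputeSurfaceForcesCellToFace(); it assumes num_components matches the per-face num_comp used in the indexing formula and that ratel, U, and final_time are available:

  PetscInt     num_faces, num_comp;
  const char **faces;
  PetscScalar *surface_forces;

  PetscCall(RatelGetSurfaceForceFaces(ratel, &num_faces, &faces));
  PetscCall(RatelComputeSurfaceForcesCellToFace(ratel, U, final_time, &num_comp, &surface_forces));
  for (PetscInt f = 0; f < num_faces; f++) {
    for (PetscInt i = 0; i < num_comp; i++) {
      // Component i of the force for face f is at index num_comp * f + i
      PetscCall(PetscPrintf(PETSC_COMM_WORLD, "face %s, force[%d] = %g\n", faces[f], (int)i, (double)surface_forces[num_comp * f + i]));
    }
  }
  PetscCall(PetscFree(surface_forces)); // caller owns surface_forces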

PetscErrorCode RatelComputeSurfaceForces(Ratel ratel, Vec U, PetscReal time, PetscInt *num_components, PetscScalar **surface_forces)#

Compute surface forces on mesh faces.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • U[in] Computed solution vector

  • time[in] Final time value, or 1.0 for SNES solution

  • num_components[out] Number of force components in output buffer

  • surface_forces[out] Computed face surface forces - caller is responsible for freeing

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelComputeSurfaceCentroids(Ratel ratel, Vec U, PetscReal time, PetscInt *num_components, PetscScalar **surface_centroids)#

Compute centroids of mesh faces.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • U[in] Computed solution vector

  • time[in] Final time value, or 1.0 for SNES solution

  • num_components[out] Number of centroid components in output buffer

  • surface_centroids[out] Computed surface centroids - caller is responsible for freeing

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelComputeDiagnosticQuantities(Ratel ratel, Vec U, PetscReal time, Vec *D)#

Compute diagnostic quantities.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • U[in] Computed solution vector

  • time[in] Final time value, or 1.0 for SNES solution

  • D[out] Computed diagnostic quantities vector

Returns:

An error code: 0 - success, otherwise - failure
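
A short sketch of typical usage; destroying D in the caller is an assumption, and the "-diagnostic_view" option name is illustrative only:

  Vec D;

  PetscCall(RatelComputeDiagnosticQuantities(ratel, U, final_time, &D));
  PetscCall(VecViewFromOptions(D, NULL, "-diagnostic_view"));
  PetscCall(VecDestroy(&D));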

PetscErrorCode RatelTSMonitorStrainEnergy(TS ts, PetscInt steps, PetscReal time, Vec U, void *ctx)#

TSMonitor function for strain energy.

Collective across MPI processes.

Parameters:
  • ts[in] TS object to monitor

  • steps[in] Iteration number

  • time[in] Current time

  • U[in] Current state vector

  • ctx[in] Monitor context holding Ratel context object

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelTSMonitorSurfaceForceCellToFace(TS ts, PetscInt steps, PetscReal time, Vec U, void *ctx)#

TSMonitor function for cell-to-face surface forces.

Collective across MPI processes.

Parameters:
  • ts[in] TS object to monitor

  • steps[in] Iteration number

  • time[in] Current time

  • U[in] Current state vector

  • ctx[in] Monitor context holding Ratel context object

Returns:

An error code: 0 - success, otherwise - failure

static PetscErrorCode RatelViewerWriteSurfaceForce(RatelViewer viewer, PetscInt steps, PetscReal time, const char *name, PetscScalar centroid[3], PetscScalar force[3])#
PetscErrorCode RatelTSMonitorSurfaceForce(TS ts, PetscInt steps, PetscReal time, Vec U, void *ctx)#

TSMonitor function for face surface forces.

Collective across MPI processes.

Parameters:
  • ts[in] TS object to monitor

  • steps[in] Iteration number

  • time[in] Current time

  • U[in] Current state vector

  • ctx[in] Monitor context holding Ratel context object

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelTSMonitorSurfaceForcePerFace(TS ts, PetscInt steps, PetscReal time, Vec U, void *ctx)#

TSMonitor function for per-face surface forces.

Collective across MPI processes.

Parameters:
  • ts[in] TS object to monitor

  • steps[in] Iteration number

  • time[in] Current time

  • U[in] Current state vector

  • ctx[in] Monitor context holding RatelViewers context object

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelTSMonitorDiagnosticQuantities(TS ts, PetscInt steps, PetscReal time, Vec U, void *ctx)#

TSMonitor function for diagnostic quantities.

Collective across MPI processes.

Parameters:
  • ts[in] TS object to monitor

  • steps[in] Iteration number

  • time[in] Current time

  • U[in] Current state vector

  • ctx[in] Monitor context holding Ratel context object

Returns:

An error code: 0 - success, otherwise - failure

static PetscErrorCode RatelViewer_XDMF_WriteHeader(RatelViewer ratel_viewer)#

Write header for XDMF file.

Collective across MPI processes.

Parameters:
  • ratel_viewer[inout] Ratel viewer storing PetscViewerASCII object

Returns:

An error code: 0 - success, otherwise - failure

static PetscErrorCode RatelViewer_XDMF_WriteFooter(RatelViewer ratel_viewer)#

Write footer for XDMF file.

Collective across MPI processes.

Parameters:
  • ratel_viewer[inout] Ratel viewer storing PetscViewerASCII object

Returns:

An error code: 0 - success, otherwise - failure

static PetscErrorCode RatelDMSwarmView_WriteDMInfo(DM dm, RatelViewer ratel_viewer, PetscViewer binary_viewer, PetscInt *offset)#

Write time, topology, and geometry information for DMSwarm to XDMF file and store binary coordinate data.

Not collective across MPI processes.

Parameters:
  • dm[in] DMSwarm object

  • ratel_viewer[inout] Ratel viewer storing PetscViewerASCII object

  • binary_viewer[inout] PetscViewerBinary object to store topology and coordinate data

  • offset[out] Current offset in binary file

Returns:

An error code: 0 - success, otherwise - failure

static PetscErrorCode RatelDMSwarmView_WriteField(DM dm, const char *field, RatelViewer ratel_viewer, PetscViewer binary_viewer, PetscInt *offset)#

Write field data for DMSwarm to XDMF file and store binary field data.

Not collective across MPI processes.

Parameters:
  • dm[in] DMSwarm object

  • field[in] Field name to write

  • ratel_viewer[inout] Ratel viewer storing PetscViewerASCII object

  • binary_viewer[inout] PetscViewerBinary object to store field data

  • offset[out] Current offset in binary file

Returns:

An error code: 0 - success, otherwise - failure

static PetscErrorCode RatelDMSwarmSubViewerCreate(DM dm, RatelViewer ratel_viewer, PetscViewer *binary_viewer)#

Create a sub-viewer for DMSwarm to store binary data.

Collective across MPI processes.

Parameters:
  • dm[in] DMSwarm object

  • ratel_viewer[in] Ratel viewer storing PetscViewerASCII object

  • binary_viewer[out] Created PetscViewerBinary object for the current timestep

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelTSMonitorSwarm(TS ts, PetscInt steps, PetscReal time, Vec U, void *ctx)#

TSMonitor function for swarm data.

Collective across MPI processes.

Parameters:
  • ts[in] TS object to monitor

  • steps[in] Iteration number

  • time[in] Current time

  • U[in] Current state vector

  • ctx[in] Monitor context holding Ratel context object

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelTSMonitorCheckpoint(TS ts, PetscInt steps, PetscReal time, Vec U, void *ctx)#

TSMonitor function for binary checkpoints.

Collective across MPI processes.

Parameters:
  • ts[in] TS object to monitor

  • steps[in] Iteration number

  • time[in] Current time

  • U[in] Current state vector

  • ctx[in] Monitor context holding Ratel context object

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelCheckpointCtxCreateFromOptions(Ratel ratel, const char opt[], const char text[], RatelCheckpointCtx *ctx, PetscBool *set)#

Create a RatelCheckpointCtx object from PetscOptions command line handling.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • opt[in] TS monitor flag

  • text[in] Description of the flag

  • ctx[out] RatelCheckpointCtx object to create

  • set[out] PetscBool indicating if the flag was set

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelCheckpointCtxCreate(Ratel ratel, PetscInt interval, const char file_name[], RatelCheckpointCtx *ctx)#

Create a RatelCheckpointCtx object.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • interval[in] Checkpoint interval

  • file_name[in] Base file name for checkpoint files

  • ctx[out] RatelCheckpointCtx object to create

Returns:

An error code: 0 - success, otherwise - failure
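
A minimal create/destroy sketch; the interval and base file name are illustrative only:

  RatelCheckpointCtx checkpoint_ctx;

  // Checkpoint every 10 steps to files based on "checkpoint"
  PetscCall(RatelCheckpointCtxCreate(ratel, 10, "checkpoint", &checkpoint_ctx));
  // ... use the context, e.g. as a TS monitor context ...
  PetscCall(RatelCheckpointCtxDestroy(&checkpoint_ctx));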

PetscErrorCode RatelCheckpointCtxDestroy(RatelCheckpointCtx *ctx)#

Destroy RatelCheckpointCtx.

Not collective across MPI processes.

Parameters:
  • ctx[inout] RatelCheckpointCtx object to destroy

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelViewerCreateFromOptions(Ratel ratel, const char opt[], const char text[], RatelViewer *monitor_viewer, PetscBool *set)#

Create a RatelViewer object from PetscOptions command line handling.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • opt[in] TS monitor flag

  • text[in] Description of the flag

  • monitor_viewer[out] RatelViewer object to create

  • set[out] PetscBool indicating if the flag was set

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelViewerCreate(Ratel ratel, PetscViewer viewer, PetscViewerFormat viewer_format, PetscInt interval, RatelViewer *monitor_viewer)#

Create a RatelViewer object.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • viewer[in] PetscViewer object to use

  • viewer_format[in] PetscViewerFormat object to use

  • interval[in] Monitor interval

  • monitor_viewer[out] RatelViewer object to create

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelViewerDestroy(RatelViewer *monitor_viewer)#

Destroy viewer data for RatelViewer.

Not collective across MPI processes.

Parameters:
  • monitor_viewer[inout] RatelViewer object to destroy

Returns:

An error code: 0 - success, otherwise - failure
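
A minimal sketch of the options-driven create/destroy pattern; the "-my_monitor" flag and its description are hypothetical, not existing Ratel options:

  RatelViewer monitor_viewer;
  PetscBool   set;

  PetscCall(RatelViewerCreateFromOptions(ratel, "-my_monitor", "Example monitor output", &monitor_viewer, &set));
  if (set) {
    // ... pass monitor_viewer as a monitor context ...
    PetscCall(RatelViewerDestroy(&monitor_viewer));
  }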

PetscErrorCode RatelViewerShouldWrite(RatelViewer viewer, TS ts, PetscInt steps, PetscBool *should_write)#

Determine whether the RatelViewer should write at the current time step.

Not collective across MPI processes.

Parameters:
  • viewer[in] RatelViewer object

  • ts[in] TS object

  • steps[in] Current number of time steps

  • should_write[out] True if the viewer should write at the current time step

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelViewersCreateFromOptions(Ratel ratel, const char opt[], const char text[], PetscInt num_viewers, const char **names, RatelViewers *viewer_list, PetscBool *set)#

Create a RatelViewers object from PetscOptions command line handling.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • opt[in] TS monitor flag

  • text[in] Description of the flag

  • num_viewers[in] Number of viewers to create

  • names[in] Names for each viewer

  • viewer_list[out] RatelViewers object to create

  • set[out] PetscBool indicating if the flag was set

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelViewersCreate(Ratel ratel, PetscViewer viewer, PetscViewerFormat viewer_format, PetscInt interval, PetscInt num_viewers, const char **names, RatelViewers *viewer_list)#

Create a RatelViewers object.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • viewer[in] PetscViewer object to use

  • viewer_format[in] PetscViewerFormat object to use

  • interval[in] Monitor interval

  • num_viewers[in] Number of viewers to create

  • names[in] Names for each viewer

  • viewer_list[out] RatelViewers object to create

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelViewersDestroy(RatelViewers *viewer_list)#

Destroy viewer list data for RatelViewers.

Collective across MPI processes.

Parameters:
  • viewer_list[inout] RatelViewers object to destroy

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelTSSetup(Ratel ratel, TS ts)#

Set up default TS options, the DM, and options from the command line.

Solver management.

Note: Sets SNES defaults from RatelSNESSetup().

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • ts[inout] TS object to setup

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelTSSetFromOptions(Ratel ratel, TS ts)#

Set additional Ratel-specific TS options.

   `-ts_monitor_strain_energy viewer:filename.extension` sets TSMonitor function for strain energy
   `-ts_monitor_surface_force viewer:filename.extension` sets TSMonitor function for surface forces
   `-ts_monitor_surface_force_cell_to_face viewer:filename.extension` sets TSMonitor function for cell-to-face surface forces
   `-ts_monitor_diagnostic_quantities viewer:filename.extension` sets TSMonitor function for diagnostic quantities
   `-ts_monitor_checkpoint viewer:filename.extension` sets TSMonitor function for solution checkpoint binaries
Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • ts[inout] TS object to setup

Returns:

An error code: 0 - success, otherwise - failure

static PetscErrorCode RatelKSPSetup(Ratel ratel, KSP ksp)#

Set up default KSP options.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • ksp[inout] KSP object to setup

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelSNESSetup(Ratel ratel, SNES snes)#

Set up default SNES options, the DM, and options from the command line.

Note: Sets the default SNES line search to the critical point method.

Note: Sets the default KSP to CG with the natural norm for most problems, and to GMRES with the preconditioned norm for problems with multiple active fields or platen boundary conditions.

Note: Sets the default PC to PCPMG for most problems, to PCGAMG for problems with one material and linear elements, and to PCJACOBI for problems with multiple materials.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • snes[inout] SNES object to setup

Returns:

An error code: 0 - success, otherwise - failure

static inline PetscErrorCode RatelSetupZeroInitialCondition(Ratel ratel, Vec U)#

Set up a zero initial condition, with random noise if needed.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • U[out] Global vector to set with zero initial condition

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelTSSetupInitialCondition(Ratel ratel, TS ts, Vec U)#

Set up initial conditions for TS based upon command line options.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • ts[inout] TS to setup

  • U[out] Global vector to set with initial conditions

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelSNESSetupInitialCondition(Ratel ratel, SNES snes, Vec U)#

Set up initial conditions for SNES based upon command line options.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • snes[inout] SNES to setup

  • U[out] Global vector to set with initial conditions

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelTSCheckpointFinalSolutionFromOptions(Ratel ratel, TS ts, Vec U)#

Checkpoint final solution for TS.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • ts[in] TS to checkpoint

  • U[out] Global vector to checkpoint

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelSNESCheckpointFinalSolutionFromOptions(Ratel ratel, SNES snes, Vec U)#

Checkpoint final solution for SNES.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • snes[in] SNES to checkpoint

  • U[out] Global vector to checkpoint

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelGetVersion(int *major, int *minor, int *patch, PetscBool *release)#

Get Ratel library version info.

Ratel version numbers have the form major.minor.patch. Non-release versions may contain unstable interfaces.

Not collective across MPI processes.

The caller may pass NULL for any arguments that are not needed.

Parameters:
  • major[out] Major version of the library

  • minor[out] Minor version of the library

  • patch[out] Patch (subminor) version of the library

  • release[out] True for releases; false for development branches.

Returns:

An error code: 0 - success, otherwise - failure
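
For example, a short runtime query of the version (any argument not needed may be passed as NULL):

  int       major, minor, patch;
  PetscBool release;

  PetscCall(RatelGetVersion(&major, &minor, &patch, &release));
  PetscCall(PetscPrintf(PETSC_COMM_WORLD, "Ratel %d.%d.%d (%s)\n", major, minor, patch, release ? "release" : "development"));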

static PetscErrorCode RatelRegisterLogEvents()#

Register core Ratel log events.

Not collective across MPI processes.

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelInit(MPI_Comm comm, Ratel *ratel)#

Set up the Ratel context object.

Library context management.

Note: This function call initializes the libCEED context.

Collective across MPI processes.

Parameters:
  • comm[in] MPI communication object

  • ratel[out] Ratel context object

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelView(Ratel ratel, PetscViewer viewer)#

View a Ratel context.

Note: This function can be called with the command line option -ratel_view.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context to view

  • viewer[in] Optional visualization context

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelDestroy(Ratel *ratel)#

Destroy a Ratel context.

Collective across MPI processes.

Parameters:
  • ratel[inout] Ratel context object to destroy

Returns:

An error code: 0 - success, otherwise - failure

PetscErrorCode RatelDMCreate(Ratel ratel, RatelSolverType solver_type, DM *dm)#

Create DM with SNES or TS hooks from Ratel context.

User DM setup.

Collective across MPI processes.

Parameters:
  • ratel[in] Ratel context

  • solver_type[in] Solver type to use

  • dm[out] DMPlex object with SNES or TS hooks

Returns:

An error code: 0 - success, otherwise - failure
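
A heavily abridged sketch of how the library, DM, and solver setup calls might fit together in a quasistatic TS driver; the PETSc calls are standard, but the exact ordering Ratel requires is an assumption here:

  Ratel ratel;
  DM    dm;
  TS    ts;
  Vec   U;

  PetscCall(RatelInit(PETSC_COMM_WORLD, &ratel));
  PetscCall(RatelDMCreate(ratel, RATEL_SOLVER_QUASISTATIC, &dm));
  PetscCall(DMCreateGlobalVector(dm, &U));
  PetscCall(TSCreate(PETSC_COMM_WORLD, &ts));
  PetscCall(TSSetDM(ts, dm));
  PetscCall(RatelTSSetup(ratel, ts));
  PetscCall(RatelTSSetupInitialCondition(ratel, ts, U));
  PetscCall(TSSolve(ts, U));
  PetscCall(RatelTSCheckpointFinalSolutionFromOptions(ratel, ts, U));
  PetscCall(VecDestroy(&U));
  PetscCall(TSDestroy(&ts));
  PetscCall(DMDestroy(&dm));
  PetscCall(RatelDestroy(&ratel));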

PetscErrorCode RatelSetNonSPD(Ratel ratel)#

Set the Ratel context to not assume an SPD Jacobian.

Not collective across MPI processes.

Parameters:
  • ratel[in] Ratel context object

Returns:

An error code: 0 - success, otherwise - failure

RATEL_VERSION_MAJOR 0#

Ratel library major version number.

RATEL_VERSION_MINOR 3#

Ratel library minor version number.

RATEL_VERSION_PATCH 0#

Ratel library patch version number.

RATEL_VERSION_RELEASE false#

Flag indicating library is a release version.

RATEL_VERSION_GE(major, minor, patch) (!RATEL_VERSION_RELEASE || (RATEL_VERSION_MAJOR > major || (RATEL_VERSION_MAJOR == major && (RATEL_VERSION_MINOR > minor || (RATEL_VERSION_MINOR == minor && RATEL_VERSION_PATCH >= patch)))))#

Compile-time check that the current library version is at least as recent as the specified version.

This macro is typically used as follows:

#if RATEL_VERSION_GE(0, 1, 0)
  code path that needs at least 0.1.0
#else
  fallback code for older versions
#endif

A non-release version always compares as positive infinity.

Parameters:
  • major – Major version

  • minor – Minor version

  • patch – Patch (subminor) version

struct RatelUnits#
#include <ratel.h>

Context for internal unit scaling.

struct Ratel#
#include <ratel.h>

Library context created by RatelInit()