fedoo.problem.Linear.set_solver

Linear.set_solver(solver: str = 'direct', **kargs)

Define the solver for the linear system resolution.

Parameters:
  • solver (str or callable) –

    Type of solver. The possible choices are:

    • ‘direct’: direct solver. If pypardiso is installed, the pypardiso solver is used. Otherwise, if petsc is installed, the mumps solver is used. If neither is available, the function scipy.sparse.linalg.spsolve is used; in that case, if scikit-umfpack is installed, scipy uses the umfpack solver, which is significantly more efficient than the base scipy solver.

    • ‘cg’, ‘bicg’, ‘bicgstab’, ‘minres’, ‘gmres’, ‘lgmres’ or ‘gcrotmk’: use the corresponding iterative method from scipy.sparse.linalg. For instance, ‘cg’ is the conjugate gradient method based on the function scipy.sparse.linalg.cg.

    • ‘pardiso’: force the use of the pypardiso solver.

    • ‘direct_scipy’: force the use of the direct scipy solver (umfpack if installed).

    • ‘petsc’: use the petsc methods (iterative or direct). petsc4py should be installed.

    • function: a user-defined solve function with the signature res = solver(A, B, **kargs), where A is a scipy sparse matrix and B is a 1d numpy array. kargs may contain optional parameters (see the sketch after the parameter list).

  • kargs (optional parameters depending on the type of solver) –

    precond: bool

    Use precond = False to deactivate the diagonal matrix preconditioning for the scipy iterative methods. If this parameter is not given, preconditioning is activated whenever M is not given.

    M: {sparse matrix, ndarray, LinearOperator}

    Preconditioner for A used for scipy iterative methods.

    solver_type: str

    Petsc solver. The default is ‘bcgs’. See the petsc documentation for available solvers.

    pc_type: str

    Petsc type of preconditioner. The default is ‘eisenstat’. See the petsc documentation for available preconditioners.

    pc_factor_mat_solver_type: str

    Petsc solver for matrix factorization when applicable. See the petsc documentation for details about this parameter.
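
A user-defined solve function can also be passed directly as solver. Below is a minimal sketch, assuming pb is an existing fedoo problem; the name my_solver and its splu-based implementation are purely illustrative, the only requirement being the documented signature res = solver(A, B, **kargs).

>>> # Sketch of a custom solver callable (illustrative implementation,
>>> # not part of fedoo): any function res = solver(A, B, **kargs) works.
>>> from scipy.sparse.linalg import splu
>>> def my_solver(A, B, **kargs):
...     # A: scipy sparse matrix, B: 1d numpy array
...     return splu(A.tocsc()).solve(B)
>>> pb.set_solver(my_solver)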

Notes

  • To change any of the many available petsc options, import and modify the petsc4py.PETSc.Options dictionary (see the example below).

  • To use petsc with MPI parallelization (PCMPI), the mpi4py package needs to be installed, and the script should be launched from the command line using mpiexec. For instance:

    $ mpiexec -n 4 python petsc_examples.py -mpi_linear_solver_server -ksp_type cg -pc_type bjacobi
    

    A performance gain may be observed for very large problems, but this method should be avoided for problems of moderate size.

Examples

>>> # Use the scipy cg solver without preconditioner
>>> pb.set_solver('cg', precond=False, rtol=1e-4)
>>>
>>> # Use the petsc cg solver with a jacobi preconditioner and modify
>>> # the default rtol parameter.
>>> pb.set_solver('petsc', solver_type='cg', pc_type='jacobi')
>>> from petsc4py.PETSc import Options
>>> petsc_opt = Options()
>>> petsc_opt['ksp_rtol'] = 1e-4
>>>
>>> # Use the MUMPS direct solver with petsc
>>> pb.set_solver('petsc', solver_type='preonly', pc_type='lu', pc_factor_mat_solver_type='mumps')
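
The optional kargs for the scipy iterative solvers are passed to the underlying scipy.sparse.linalg function, as with rtol in the first example. The sketch below assumes this pass-through also holds for other scipy keywords; maxiter and atol are scipy option names, not fedoo parameters.

>>> # Pass additional scipy.sparse.linalg.gmres options through kargs
>>> # (maxiter and atol are scipy keywords, assumed to be forwarded unchanged).
>>> pb.set_solver('gmres', rtol=1e-6, atol=0.0, maxiter=500)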