============================
KSP: Solving a Linear System
============================

In this tutorial, we solve a tridiagonal linear system :math:`Ax=b` on a single processor. We express the system of linear equations using PETSc vector (``Vec``) and matrix (``Mat``) objects, precondition the system (``PC``), and find the solution using a linear solver (``KSP``).

The tutorial demonstrates how to:

- Create PETSc vector ``Vec`` and matrix ``Mat`` objects
- Precondition and solve a linear system
- View solver results and the error in the computed solution

Setting up a PETSc program
==========================

We will write this program in one ``main`` function, but more complex programs can be split across multiple functions and files.

::

   static char help[] = "Solves a tridiagonal linear system with KSP.\n\n";

   /*T
      Concepts: KSP^solving a system of linear equations
      Processors: 1
   T*/

   /*
     Include "petscksp.h" so that we can use KSP solvers.  Note that this
     file automatically includes:
        petscsys.h    - base PETSc routines
        petscvec.h    - vectors
        petscmat.h    - matrices
        petscpc.h     - preconditioners
        petscis.h     - index sets
        petscviewer.h - viewers

     Note: The corresponding parallel example is ex23.c
   */
   #include <petscksp.h>

   int main(int argc,char **args)
   {
     Vec            x, b, u;      /* approx solution, RHS, exact solution */
     Mat            A;            /* linear system matrix */
     KSP            ksp;          /* linear solver context */
     PC             pc;           /* preconditioner context */
     PetscReal      norm;         /* norm of solution error */
     PetscErrorCode ierr;
     PetscInt       i,n = 10,col[3],its;
     PetscMPIInt    size;
     PetscScalar    value[3];

Now we set up MPI using ``PetscInitialize``, which invokes ``MPI_Init`` and reads the command line for PETSc runtime options. We want to run this tutorial in serial, so we check the size of the MPI communicator and return an error if more than one rank is used. We also read the optional ``-n`` runtime option, which sets the size of the system (10 by default).

::

     ierr = PetscInitialize(&argc,&args,(char*)0,help);if (ierr) return ierr;
     ierr = MPI_Comm_size(PETSC_COMM_WORLD,&size);CHKERRQ(ierr);
     if (size != 1) SETERRQ(PETSC_COMM_WORLD,1,"This is a uniprocessor example only!");
     ierr = PetscOptionsGetInt(NULL,NULL,"-n",&n,NULL);CHKERRQ(ierr);

Creating PETSc vectors and matrices
===================================

Now we create the vector objects that we need: the right-hand side :math:`b`, space for the solution :math:`x`, and the exact solution :math:`u`. Here, we form one vector from scratch and duplicate it as needed.

::

     /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
            Compute the matrix and right-hand-side vector that define
            the linear system, Ax = b.
        - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */

     /*
        Create vectors.  Note that we form 1 vector from scratch and
        then duplicate as needed.
     */
     ierr = VecCreate(PETSC_COMM_WORLD,&x);CHKERRQ(ierr);
     ierr = PetscObjectSetName((PetscObject) x, "Solution");CHKERRQ(ierr);
     ierr = VecSetSizes(x,PETSC_DECIDE,n);CHKERRQ(ierr);
     ierr = VecSetFromOptions(x);CHKERRQ(ierr);
     ierr = VecDuplicate(x,&b);CHKERRQ(ierr);
     ierr = VecDuplicate(x,&u);CHKERRQ(ierr);
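While developing, it can be useful to print any of these objects to the screen. The snippet below is not part of the original example; it is a minimal sketch using ``VecView`` with the same standard-output viewer that we pass to ``KSPView`` later in this tutorial.

::

     /* Optional debugging aid (not in the original example): print the
        contents of the vector x to standard output.  MatView() does the
        same for a Mat once it has been assembled. */
     ierr = VecView(x,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);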
Next we create the matrix :math:`A`. When using ``MatCreate``, the matrix format can be specified at runtime.

::

     /*
        Create matrix.  When using MatCreate(), the matrix format can
        be specified at runtime.

        Performance tuning note:  For problems of substantial size,
        preallocation of matrix memory is crucial for attaining good
        performance. See the matrix chapter of the users manual for details.
     */
     ierr = MatCreate(PETSC_COMM_WORLD,&A);CHKERRQ(ierr);
     ierr = MatSetSizes(A,PETSC_DECIDE,PETSC_DECIDE,n,n);CHKERRQ(ierr);
     ierr = MatSetFromOptions(A);CHKERRQ(ierr);
     ierr = MatSetUp(A);CHKERRQ(ierr);

We assemble the tridiagonal matrix row by row, with :math:`2` on the diagonal and :math:`-1` on the sub- and superdiagonals; the first and last rows set only the two entries that fall inside the matrix. Once all values are inserted, ``MatAssemblyBegin`` and ``MatAssemblyEnd`` finalize the matrix. We then choose the vector of all ones as the exact solution :math:`u` and compute the corresponding right-hand side :math:`b = Au`.

::

     /*
        Assemble matrix
     */
     value[0] = -1.0; value[1] = 2.0; value[2] = -1.0;
     for (i=1; i<n-1; i++) {
       col[0] = i-1; col[1] = i; col[2] = i+1;
       ierr   = MatSetValues(A,1,&i,3,col,value,INSERT_VALUES);CHKERRQ(ierr);
     }
     i    = n - 1; col[0] = n - 2; col[1] = n - 1;
     ierr = MatSetValues(A,1,&i,2,col,value,INSERT_VALUES);CHKERRQ(ierr);
     i    = 0; col[0] = 0; col[1] = 1; value[0] = 2.0; value[1] = -1.0;
     ierr = MatSetValues(A,1,&i,2,col,value,INSERT_VALUES);CHKERRQ(ierr);
     ierr = MatAssemblyBegin(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
     ierr = MatAssemblyEnd(A,MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

     /*
        Set exact solution; then compute right-hand-side vector.
     */
     ierr = VecSet(u,1.0);CHKERRQ(ierr);
     ierr = MatMult(A,u,b);CHKERRQ(ierr);

Preconditioning and creating a linear solver
============================================

We create the linear solver context (``KSP``) and set the operators that define the linear system. Here the matrix that defines the system also serves as the preconditioning matrix. We then extract the preconditioner context (``PC``) from the solver, set it to Jacobi preconditioning, and set the relative convergence tolerance. All of these choices can be overridden from the command line.

::

     /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
               Create the linear solver and set various options
        - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
     ierr = KSPCreate(PETSC_COMM_WORLD,&ksp);CHKERRQ(ierr);

     /*
        Set operators. Here the matrix that defines the linear system
        also serves as the preconditioning matrix.
     */
     ierr = KSPSetOperators(ksp,A,A);CHKERRQ(ierr);

     /*
        Set linear solver defaults for this problem (optional).
        - By extracting the KSP and PC contexts from the KSP context,
          we can then directly call any KSP and PC routines to set
          various options.
        - The following statements are optional; all of these
          parameters could alternatively be specified at runtime via
          KSPSetFromOptions();
     */
     ierr = KSPGetPC(ksp,&pc);CHKERRQ(ierr);
     ierr = PCSetType(pc,PCJACOBI);CHKERRQ(ierr);
     ierr = KSPSetTolerances(ksp,1.e-5,PETSC_DEFAULT,PETSC_DEFAULT,PETSC_DEFAULT);CHKERRQ(ierr);

     /*
        Set runtime options, e.g.,
            -ksp_type <type> -pc_type <type> -ksp_monitor -ksp_rtol <rtol>
        These options will override those specified above as long as
        KSPSetFromOptions() is called _after_ any other customization
        routines.
     */
     ierr = KSPSetFromOptions(ksp);CHKERRQ(ierr);

Solve a linear system and check the solution
============================================

We are ready to solve the system! One call to ``KSPSolve`` solves the system with the options set above or given on the command line. Because the exact solution is the vector of ones, we can measure the error by computing :math:`x - u` with ``VecAXPY`` and taking its 2-norm with ``VecNorm``.

::

     /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
                           Solve the linear system
        - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
     ierr = KSPSolve(ksp,b,x);CHKERRQ(ierr);

     /*
        View solver info; we could instead use the option -ksp_view to
        print this info to the screen at the conclusion of KSPSolve().
     */
     ierr = KSPView(ksp,PETSC_VIEWER_STDOUT_WORLD);CHKERRQ(ierr);

     /* - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
                        Check the solution and clean up
        - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - */
     ierr = VecAXPY(x,-1.0,u);CHKERRQ(ierr);
     ierr = VecNorm(x,NORM_2,&norm);CHKERRQ(ierr);
     ierr = KSPGetIterationNumber(ksp,&its);CHKERRQ(ierr);
     ierr = PetscPrintf(PETSC_COMM_WORLD,"Norm of error %g, Iterations %D\n",(double)norm,its);CHKERRQ(ierr);

End a PETSc program
===================

It is good practice to free the memory used during our program. Finally, we call ``PetscFinalize``, which finalizes the PETSc libraries and MPI. It also provides summary and diagnostic information when used with certain runtime options such as ``-log_view``.

::

     /*
        Free work space.  All PETSc objects should be destroyed when they
        are no longer needed.
     */
     ierr = VecDestroy(&x);CHKERRQ(ierr); ierr = VecDestroy(&u);CHKERRQ(ierr);
     ierr = VecDestroy(&b);CHKERRQ(ierr); ierr = MatDestroy(&A);CHKERRQ(ierr);
     ierr = KSPDestroy(&ksp);CHKERRQ(ierr);

     /*
        Always call PetscFinalize() before exiting a program.  This routine
          - finalizes the PETSc libraries as well as MPI
          - provides summary and diagnostic information if certain runtime
            options are chosen (e.g., -log_view).
     */
     ierr = PetscFinalize();
     return ierr;
   }

Run a PETSc program
===================

We use the command line to run PETSc programs. Navigate to the directory containing ``ex1.c`` using the ``cd`` command. PETSc makes use of ``make`` to build tests and tutorials, so compiling ``ex1.c`` is straightforward and produces an executable called ``ex1``. We run this serial program with one MPI rank. Additional PETSc options, like ``-n`` to set the size of the system or ``-log_view`` to view logging information, can be added after the name of the executable.

.. code-block:: bash

   $ cd $PETSC_DIR/src/ksp/ksp/examples/tutorials
   $ make ex1
   $ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -np 1 ./ex1
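As a further sketch (not from the original tutorial), the runtime options discussed above can be combined in a single invocation. Here ``gmres`` and ``jacobi`` name standard PETSc solver and preconditioner types; the remaining flags (``-n``, ``-ksp_monitor``, ``-ksp_rtol``) were introduced earlier in this tutorial.

.. code-block:: bash

   # Example invocation: solve a 100-unknown system with GMRES and Jacobi
   # preconditioning, printing the residual norm at each iteration.
   $ $PETSC_DIR/$PETSC_ARCH/bin/mpiexec -np 1 ./ex1 -n 100 -ksp_type gmres -pc_type jacobi -ksp_monitor -ksp_rtol 1.e-7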