Parallel computing in phase-field benchmarks
Description
Running the phase-field benchmarks (e.g., k-regime) in parallel produces the errors below. The error occurs when the initial phase field is set from mesh node data:
```xml
<parameter>
    <name>phasefield_ic</name>
    <type>MeshNode</type>
    <field_name>phase-field</field_name>
</parameter>
```
If the initial phase field is instead set to a constant value over the entire domain, the error does not appear:
```xml
<parameter>
    <name>phasefield_ic</name>
    <type>Constant</type>
    <value>1</value>
</parameter>
```
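A plausible reason the constant IC avoids the crash is that it needs no per-node data lookup, while the `MeshNode` IC must map each partition's local node IDs back to the global field data; a wrong mapping there yields invalid indices. A toy sketch of the two lookup paths (hypothetical function names, not OGS code):

```python
# Toy illustration (hypothetical, not OGS code): how a Constant IC differs
# from a MeshNode IC once the mesh is split across MPI ranks.

def constant_ic(local_node_count, value):
    # Every local node gets the same value; no global lookup is needed.
    return [value] * local_node_count

def meshnode_ic(local_to_global, global_field):
    # Each local node must be mapped to its global ID to fetch its value;
    # a wrong or stale mapping here produces bad (e.g., negative) indices.
    return [global_field[g] for g in local_to_global]

# Example: rank 0 owns global nodes 0..2, rank 1 owns 3..5.
field = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
print(constant_ic(3, 1.0))            # [1.0, 1.0, 1.0]
print(meshnode_ic([3, 4, 5], field))  # [0.6, 0.8, 1.0]
```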
```
================================================
Linear solver cg with jacobi preconditioner
converged in 269 iterations (absolute convergence criterion fulfilled).
================================================
[0] info: [time] Linear solver took 5.00066 s.
[1] info: [time] Linear solver took 4.97686 s.
[0] info: Convergence criterion: |dx|=6.1706e-07, |x|=1.6077e-05, |dx|/|x|=3.8381e-02
[0] info: [time] Iteration #2 took 24.3248 s.
[1] info: Convergence criterion: |dx|=6.1706e-07, |x|=1.6077e-05, |dx|/|x|=3.8381e-02
[1] info: [time] Iteration #2 took 24.2764 s.
[0] info: [time] Assembly took 17.393 s.
[1] info: [time] Assembly took 17.4236 s.
[0] info: [time] Applying Dirichlet BCs took 0.416363 s.
[1] info: [time] Applying Dirichlet BCs took 0.413423 s.
================================================
Linear solver cg with jacobi preconditioner
converged in 0 iterations (absolute convergence criterion fulfilled).
================================================
[0] info: [time] Linear solver took 0.094611 s.
[1] info: [time] Linear solver took 0.066866 s.
[1] info: Convergence criterion: |dx|=0.0000e+00, |x|=1.6077e-05, |dx|/|x|=0.0000e+00
[1] info: [time] Iteration #3 took 17.9046 s.
[0] info: Convergence criterion: |dx|=0.0000e+00, |x|=1.6077e-05, |dx|/|x|=0.0000e+00
[0] info: [time] Iteration #3 took 17.9046 s.
[0] info: Integral of crack: 4.34388e-07
[0] info: Internal pressure: 115104 and Pressure error: 1.0000e+00
[0] info: [time] Solving process #0 took 62.0896 s in time step #1 coupling iteration #0
[1] info: Integral of crack: 4.34388e-07
[1] info: Internal pressure: 115104 and Pressure error: 1.0000e+00
[1] info: [time] Solving process #0 took 62.15 s in time step #1 coupling iteration #0
[0] info: [time] Applying Dirichlet BCs took 0.00012500000002546585 s.
[1] info: [time] Applying Dirichlet BCs took 0.0001700000000255386 s.
[0] info: [time] Assembly took 13.93766800000003 s.
[1] info: [time] Assembly took 13.93765899999994 s.
[0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[0]PETSC ERROR: Argument out of range
[0]PETSC ERROR: Row -4044 out of range [0,5304)
[0]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
[1]PETSC ERROR: Argument out of range
[1]PETSC ERROR: Row -1207 out of range [0,5304)
[1]PETSC ERROR: See https://petsc.org/release/faq/ for trouble shooting.
[1]PETSC ERROR: Petsc Release Version 3.16.3, unknown
[1]PETSC ERROR: /Users/mollaali/master_test/build_master/bin/ogs on a named modmon150.intranet.ufz.de by mollaali Tue Dec 6 11:32:01 2022
[1]PETSC ERROR: Configure options --download-f2cblaslapack=1 --prefix=/Users/mollaali/master_test/build_master/_ext/PETSc --download-hypre --with-debugging=0 --with-fc=0
[1]PETSC ERROR: #1 MatZeroRowsMapLocal_Private() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/utils/zerorows.c:24
[0]PETSC ERROR: Petsc Release Version 3.16.3, unknown
[0]PETSC ERROR: /Users/mollaali/master_test/build_master/bin/ogs on a named modmon150.intranet.ufz.de by mollaali Tue Dec 6 11:32:01 2022
[0]PETSC ERROR: Configure options --download-f2cblaslapack=1 --prefix=/Users/mollaali/master_test/build_master/_ext/PETSc --download-hypre --with-debugging=0 --with-fc=0
[0]PETSC ERROR: #1 MatZeroRowsMapLocal_Private() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/utils/zerorows.c:24
[0]PETSC ERROR: #2 MatZeroRows_MPIAIJ() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/impls/aij/mpi/mpiaij.c:788
[0]PETSC ERROR: #3 MatZeroRows() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/interface/matrix.c:6167
[1]PETSC ERROR: #2 MatZeroRows_MPIAIJ() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/impls/aij/mpi/mpiaij.c:788
[1]PETSC ERROR: #3 MatZeroRows() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/interface/matrix.c:6167
0 SNES Function norm 1.971786423759e+02
[0] info: [time] Assembly took 14.03416100000004 s.
[1] info: [time] Assembly took 14.034158000000161 s.
[0]PETSC ERROR: #4 MatZeroRowsMapLocal_Private() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/utils/zerorows.c:24
[0]PETSC ERROR: #5 MatZeroRows_MPIAIJ() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/impls/aij/mpi/mpiaij.c:788
[0]PETSC ERROR: #6 MatZeroRows() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/interface/matrix.c:6167
[1]PETSC ERROR: #4 MatZeroRowsMapLocal_Private() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/utils/zerorows.c:24
[1]PETSC ERROR: #5 MatZeroRows_MPIAIJ() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/impls/aij/mpi/mpiaij.c:788
[1]PETSC ERROR: #6 MatZeroRows() at /Users/mollaali/master_test/build_master/_ext/PETSc/src/PETSc/src/mat/interface/matrix.c:6167
```
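The negative row indices (-4044, -1207) passed to `MatZeroRows` suggest a global-to-local index conversion going wrong on the partitioned matrix: subtracting a rank's ownership offset from a global row that the rank does not actually own yields a negative local row. A minimal sketch of that failure mode (the ownership numbers are hypothetical, chosen only to reproduce the message shape):

```python
# Hypothetical sketch of how a negative local row index can arise when
# zeroing Dirichlet BC rows on a row-partitioned parallel matrix.

def to_local_row(global_row, ownership_start, local_rows):
    # Convert a global row index to this rank's local index.
    local = global_row - ownership_start
    if not (0 <= local < local_rows):
        # Mirrors the shape of the PETSc "Argument out of range" message.
        raise ValueError(f"Row {local} out of range [0,{local_rows})")
    return local

# Suppose rank 1 owns global rows [5304, 10608), i.e. local range [0, 5304).
ownership_start, local_rows = 5304, 5304

# A BC entry wrongly carries a global row that belongs to rank 0:
try:
    to_local_row(1260, ownership_start, local_rows)
except ValueError as e:
    print(e)  # Row -4044 out of range [0,5304)
```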
Steps to Reproduce the Problem
- Partition the mesh:
  ```shell
  /bin/partmesh -i quad_0deg_h0p01.vtu --ogs2metis
  /bin/partmesh -n 2 -m -i quad_0deg_h0p01.vtu -- bar_right.vtu bar_left.vtu bar_bottom.vtu bar_top.vtu bar_p_*.vtu
  ```
- Run the simulation:
  ```shell
  mpirun -n 2 ogs 2D_bm_0p01.prj
  ```
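For context, the partitioning step distributes the mesh over the requested number of ranks. Conceptually (a toy block split for illustration only; `partmesh` actually uses METIS graph partitioning):

```python
# Toy sketch of splitting node IDs over ranks. This naive contiguous block
# split is only illustrative; partmesh uses METIS to minimize interfaces.

def block_partition(num_nodes, num_ranks):
    base, rem = divmod(num_nodes, num_ranks)
    parts, start = [], 0
    for r in range(num_ranks):
        # Spread the remainder over the first `rem` ranks.
        size = base + (1 if r < rem else 0)
        parts.append(list(range(start, start + size)))
        start += size
    return parts

print(block_partition(7, 2))  # [[0, 1, 2, 3], [4, 5, 6]]
```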
Expected behavior: Both parallel and serial runs complete without error.
Actual behavior: When the initial phase field is set from mesh node data, the parallel run aborts with the PETSc errors shown above.
Specifications
- Version: 6.4.3-544-g1a707cb8 (obtained from `ogs --version`)
- Platform: macOS and Ubuntu