[CIG-SHORT] Question about PETSc

Brad Aagaard baagaard at usgs.gov
Wed Aug 23 13:01:11 PDT 2017


Tu Xiang,

As Matt mentioned, this appears to be a problem with your input file, 
although it could also be a problem with your environment. Are you able 
to run the examples? Have you tried recreating one of the example meshes 
with your copy of CUBIT and running PyLith on it? Also, have you tried 
viewing the Exodus file in question in ParaView? If so, do the blocks 
and nodesets look the way you expect?
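One additional low-tech check (an illustrative sketch, not a PyLith tool) is to compare the nodeset names that MeshIOCubit reports in your log against the labels your .cfg files reference; the two sets below are copied from your message:

```python
# Nodeset names reported by MeshIOCubit in the log quoted below.
mesh_nodesets = {"fault_edge", "front_bd", "back_bd", "left_bd",
                 "right_bd", "bottom_bd", "demo_fault"}

# Labels referenced by pylithapp.cfg and demo.cfg in the message.
cfg_labels = {"fault_edge", "demo_fault", "front_bd", "back_bd",
              "left_bd", "right_bd", "bottem_bd"}

# Any label with no matching nodeset in the Exodus file will make
# PyLith fail when it tries to look up that group.
missing = sorted(cfg_labels - mesh_nodesets)
print(missing)  # -> ['bottem_bd']
```

Any label that shows up in `missing` is worth double-checking against the nodeset names in your CUBIT journal file.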

Regards,
Brad


On 08/22/2017 07:39 PM, tu xiang wrote:
> I want to run a 3D dynamic simulation, so I constructed a fault and 
> embedded it in a 3D domain (Figure 1) using Cubit.
> 
> 
> Figure 1 Mesh grid model
> 
> I created a block for the material and nodesets for the fault and 
> boundary conditions with the following commands.
> 
> # ----------------------------------------------------------------------
> # Create blocks for materials
> # ----------------------------------------------------------------------
> block 20 volume v_domain
> block 20 name "crust"
> 
> # ----------------------------------------------------------------------
> # Create nodesets for faults and boundary conditions.
> # ----------------------------------------------------------------------
> group "fault_edge" add node in curve 14
> group "fault_edge" add node in curve 15
> group "fault_edge" add node in curve 16
> nodeset 30 group fault_edge
> nodeset 30 name "fault_edge"
> 
> group "front_bd" add node in surface 6
> nodeset 50 group front_bd
> nodeset 50 name "front_bd"
> 
> group "back_bd" add node in surface 4
> nodeset 51 group back_bd
> nodeset 51 name "back_bd"
> 
> group "left_bd" add node in surface 3
> nodeset 52 group left_bd
> nodeset 52 name "left_bd"
> 
> group "right_bd" add node in surface 5
> nodeset 53 group right_bd
> nodeset 53 name "right_bd"
> 
> group "bottom_bd" add node in surface 2
> nodeset 54 group bottom_bd
> nodeset 54 name "bottom_bd"
> 
> group "demo_fault" add node in surface 7
> nodeset 60 group demo_fault
> nodeset 60 name "demo_fault"
> 
> Next I create the configuration files.
> 
> pylithapp.cfg:
> 
> # ----------------------------------------------------------------------
> # journal
> # ----------------------------------------------------------------------
> # Turn on some journals to show progress.
> [pylithapp.journal.info]
> timedependent = 1
> petsc = 1
> solverlinear = 1
> meshiocubit = 1
> faultcohesivedyn = 1
> fiatsimplex = 1
> pylithapp = 1
> materials = 1
> 
> # ----------------------------------------------------------------------
> # mesh_generator
> # ----------------------------------------------------------------------
> [pylithapp.mesh_generator]
> # Change the default mesh reader to the CUBIT reader.
> reader = pylith.meshio.MeshIOCubit
> 
> [pylithapp.mesh_generator.reader]
> filename = mesh/mesh.exo
> coordsys.space_dim = 3
> 
> # ----------------------------------------------------------------------
> # problem
> # ----------------------------------------------------------------------
> [pylithapp.timedependent]
> dimension = 3
> 
> [pylithapp.problem.formulation.time_step]
> total_time = 80*s
> dt = 0.005*s
> 
> [pylithapp.timedependent]
> formulation = pylith.problems.Explicit
> 
> normalizer = spatialdata.units.NondimElasticDynamic
> [pylithapp.timedependent.normalizer]
> shear_wave_speed = 3.0*km/s
> mass_density = 3.0e+3*kg/m**3
> wave_period = 0.3*s
> 
> # ----------------------------------------------------------------------
> # materials
> # ----------------------------------------------------------------------
> [pylithapp.timedependent]
> materials = [crust_material]
> 
> [pylithapp.timedependent.materials]
> crust_material = pylith.materials.ElasticIsotropic3D
> 
> [pylithapp.timedependent.materials.crust_material]
> label = Elastic material
> id = 20
> #db_properties = spatialdata.spatialdb.SimpleDB
> db_properties.label = Crust properties
> db_properties.iohandler.filename = spatialdb/mat_crust.spatialdb
> 
> quadrature.cell = pylith.feassemble.FIATSimplex
> quadrature.cell.dimension = 3
> 
> # ----------------------------------------------------------------------
> # boundary conditions
> # ----------------------------------------------------------------------
> [pylithapp.problem]
> bc = [front,back,left,right,bottem]
> bc.front = pylith.bc.AbsorbingDampers
> bc.back = pylith.bc.AbsorbingDampers
> bc.left = pylith.bc.AbsorbingDampers
> bc.right = pylith.bc.AbsorbingDampers
> bc.bottem = pylith.bc.AbsorbingDampers
> 
> [pylithapp.problem.bc.front]
> label = front_bd
> db.label = Absorbing BC +x
> db.iohandler.filename = spatialdb/matprops.spatialdb
> quadrature.cell = pylith.feassemble.FIATSimplex
> quadrature.cell.dimension = 2
> #quadrature.cell.quad_order = 2
> 
> [pylithapp.problem.bc.back]
> label = back_bd
> db.label = Absorbing BC -x
> db.iohandler.filename = spatialdb/matprops.spatialdb
> quadrature.cell = pylith.feassemble.FIATSimplex
> quadrature.cell.dimension = 2
> #quadrature.cell.quad_order = 2
> 
> [pylithapp.problem.bc.left]
> label = left_bd
> db.label = Absorbing BC -y
> db.iohandler.filename = spatialdb/matprops.spatialdb
> quadrature.cell = pylith.feassemble.FIATSimplex
> quadrature.cell.dimension = 2
> #quadrature.cell.quad_order = 2
> 
> [pylithapp.problem.bc.right]
> label = right_bd
> db.label = Absorbing BC +y
> db.iohandler.filename = spatialdb/matprops.spatialdb
> quadrature.cell = pylith.feassemble.FIATSimplex
> quadrature.cell.dimension = 2
> #quadrature.cell.quad_order = 2
> 
> [pylithapp.problem.bc.bottem]
> label = bottem_bd
> db.label = Absorbing BC -z
> db.iohandler.filename = spatialdb/matprops.spatialdb
> quadrature.cell = pylith.feassemble.FIATSimplex
> quadrature.cell.dimension = 2
> #quadrature.cell.quad_order = 2
> 
> # End of file
> 
> 
> 
> demo.cfg:
> 
> [pylithapp]
> 
> # ----------------------------------------------------------------------
> # faults
> # ----------------------------------------------------------------------
> [pylithapp.problem]
> interfaces = [fault]
> 
> [pylithapp.problem.interfaces]
> fault = pylith.faults.FaultCohesiveDyn
> 
> [pylithapp.problem.interfaces.fault]
> label = demo_fault
> edge  = fault_edge
> id    = 100
> quadrature.cell = pylith.feassemble.FIATSimplex
> quadrature.cell.dimension = 2
> 
> [pylithapp.problem.interfaces.fault]
> # Specify zero tolerance for detecting slip. Must be larger than the
> # KSP absolute tolerance.
> open_free_surface = True
> zero_tolerance = 1.0e-10
> #friction.force_healing = True
> friction = pylith.friction.SlipWeakening
> friction.label = Slip weakening
> friction.db_properties = spatialdata.spatialdb.SimpleDB
> friction.db_properties.label = Slip weakening
> friction.db_properties.iohandler.filename = spatialdb/fault_friction.spatialdb
> friction.db_properties.query_type = linear
> 
> traction_perturbation = pylith.faults.TractPerturbation
> traction_perturbation.db_initial.label = Initial fault tractions
> traction_perturbation.db_initial = spatialdata.spatialdb.SimpleDB
> traction_perturbation.db_initial.iohandler.filename = spatialdb/fault_traction.spatialdb
> traction_perturbation.db_initial.query_type = linear
> 
> # ----------------------------------------------------------------------
> # output
> # ----------------------------------------------------------------------
> # Domain
> #[pylithapp.problem.formulation]
> #output = [domain]
> #[pylithapp.problem.formulation.output.domain]
> #output_freq = time_step
> #time_step = 0.9999999*s
> #writer = pylith.meshio.DataWriterHDF5Mesh
> #[pylithapp.problem.formulation.output.domain]
> #writer.filename = output/demo.h5
> #vertex_data_fields = [displacement,velocity]
> 
> # ----------------------------------------------------------------------
> # PETSc
> # ----------------------------------------------------------------------
> # NOTE: There are additional settings specific to fault friction.
> [pylithapp.petsc]
> # Friction sensitivity solve used to compute the increment in slip
> # associated with changes in the Lagrange multiplier imposed by the
> # fault constitutive model.
> friction_pc_type = asm
> friction_sub_pc_factor_shift_type = nonzero
> friction_ksp_max_it = 25
> friction_ksp_gmres_restart = 30
> 
> # Uncomment to view details of friction sensitivity solve.
> friction_ksp_monitor = true
> friction_ksp_view = true
> friction_ksp_converged_reason = true
> 
> # End of file
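The comment on `zero_tolerance` above says it must be larger than the KSP absolute tolerance; one way to make that relationship explicit is to set the tolerance in the same PETSc section (a sketch with illustrative values, not settings from the original message):

```cfg
[pylithapp.petsc]
# Keep the absolute tolerance well below zero_tolerance (1.0e-10 above)
# so that converged residual noise is not mistaken for fault slip.
ksp_rtol = 1.0e-12
ksp_atol = 1.0e-12
```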
> 
> After that, I ran this demo simulation with PyLith 2.2.0, but it 
> produced the following output:
> 
> $ pylith demo.cfg
>   >> /home/tux/pylith/pylith-2.2.0-linux-x86_64/lib/python2.7/site-packages/pylith/utils/PetscManager.py:64:initialize
>   -- petsc(info)
>   -- Initialized PETSc.
>   >> /home/tux/pylith/pylith-2.2.0-linux-x86_64/lib/python2.7/site-packages/pylith/apps/PyLithApp.py:103:main
>   -- pylithapp(info)
>   -- Running on 1 process(es).
>   >> /home/tux/pylith/pylith-2.2.0-linux-x86_64/lib/python2.7/site-packages/pylith/meshio/MeshIOObj.py:55:read
>   -- meshiocubit(info)
>   -- Reading finite-element mesh
>   >> ../../../pylith-2.2.0/libsrc/pylith/meshio/MeshIOCubit.cc:158:void pylith::meshio::MeshIOCubit::_readVertices(pylith::meshio::ExodusII&, pylith::scalar_array*, int*, int*) const
>   -- meshiocubit(info)
>   -- Reading 26725 vertices.
>   >> ../../../pylith-2.2.0/libsrc/pylith/meshio/MeshIOCubit.cc:217:void pylith::meshio::MeshIOCubit::_readCells(pylith::meshio::ExodusII&, pylith::int_array*, pylith::int_array*, int*, int*) const
>   -- meshiocubit(info)
>   -- Reading 147220 cells in 1 blocks.
>   >> ../../../pylith-2.2.0/libsrc/pylith/meshio/MeshIOCubit.cc:281:void pylith::meshio::MeshIOCubit::_readGroups(pylith::meshio::ExodusII&)
>   -- meshiocubit(info)
>   -- Found 7 node sets.
>   >> ../../../pylith-2.2.0/libsrc/pylith/meshio/MeshIOCubit.cc:309:void pylith::meshio::MeshIOCubit::_readGroups(pylith::meshio::ExodusII&)
>   -- meshiocubit(info)
>   -- Reading node set 'fault_edge' with id 30 containing 2 nodes.
>   >> ../../../pylith-2.2.0/libsrc/pylith/meshio/MeshIOCubit.cc:309:void pylith::meshio::MeshIOCubit::_readGroups(pylith::meshio::ExodusII&)
>   -- meshiocubit(info)
>   -- Reading node set 'front_bd' with id 50 containing 775 nodes.
>   >> ../../../pylith-2.2.0/libsrc/pylith/meshio/MeshIOCubit.cc:309:void pylith::meshio::MeshIOCubit::_readGroups(pylith::meshio::ExodusII&)
>   -- meshiocubit(info)
>   -- Reading node set 'back_bd' with id 51 containing 776 nodes.
>   >> ../../../pylith-2.2.0/libsrc/pylith/meshio/MeshIOCubit.cc:309:void pylith::meshio::MeshIOCubit::_readGroups(pylith::meshio::ExodusII&)
>   -- meshiocubit(info)
>   -- Reading node set 'left_bd' with id 52 containing 470 nodes.
>   >> ../../../pylith-2.2.0/libsrc/pylith/meshio/MeshIOCubit.cc:309:void pylith::meshio::MeshIOCubit::_readGroups(pylith::meshio::ExodusII&)
>   -- meshiocubit(info)
>   -- Reading node set 'right_bd' with id 53 containing 471 nodes.
>   >> ../../../pylith-2.2.0/libsrc/pylith/meshio/MeshIOCubit.cc:309:void pylith::meshio::MeshIOCubit::_readGroups(pylith::meshio::ExodusII&)
>   -- meshiocubit(info)
>   -- Reading node set 'bottom_bd' with id 54 containing 1162 nodes.
>   >> ../../../pylith-2.2.0/libsrc/pylith/meshio/MeshIOCubit.cc:309:void pylith::meshio::MeshIOCubit::_readGroups(pylith::meshio::ExodusII&)
>   -- meshiocubit(info)
>   -- Reading node set 'demo_fault' with id 60 containing 37 nodes.
> [0]PETSC ERROR: ------------------------------------------------------------------------
> [0]PETSC ERROR: Caught signal number 11 SEGV: Segmentation Violation, probably memory access out of range
> [0]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
> [0]PETSC ERROR: or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
> [0]PETSC ERROR: or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory corruption errors
> [0]PETSC ERROR: configure using --with-debugging=yes, recompile, link, and run
> [0]PETSC ERROR: to get more information on the crash.
> [0]PETSC ERROR: --------------------- Error Message --------------------------------------------------------------
> [0]PETSC ERROR: Signal received
> [0]PETSC ERROR: See http://www.mcs.anl.gov/petsc/documentation/faq.html for trouble shooting.
> [0]PETSC ERROR: Petsc Development GIT revision: v3.7.5-3660-g9beae12  GIT Date: 2017-03-23 13:26:44 -0500
> [0]PETSC ERROR: /home/tux/pylith/pylith-2.2.0-linux-x86_64/bin/mpinemesis on a arch-pylith named insar.geodyn by tux Wed Aug 23 02:28:24 2017
> [0]PETSC ERROR: Configure options --prefix=/home/brad/pylith-binary/dist --with-c2html=0 --with-x=0 --with-clanguage=C --with-mpicompilers=1 --with-debugging=0 --with-shared-libraries=1 --with-64-bit-points=1 --with-large-file-io=1 --download-chaco=1 --download-ml=1 --download-f2cblaslapack=1 --with-hwloc=0 --with-ssl=0 --with-x=0 --with-c2html=0 --with-lgrind=0 --with-hdf5=1 --with-hdf5-dir=/home/brad/pylith-binary/dist --with-zlib=1 --LIBS=-lz --with-fc=0 CPPFLAGS="-I/home/brad/pylith-binary/dist/include " LDFLAGS="-L/home/brad/pylith-binary/dist/lib " CFLAGS="-g -O2" CXXFLAGS="-g -O2 -DMPICH_IGNORE_CXX_SEEK" FCFLAGS= PETSC_DIR=/home/brad/pylith-binary/build/petsc-pylith PETSC_ARCH=arch-pylith
> [0]PETSC ERROR: #1 User provided function() line 0 in  unknown file
> application called MPI_Abort(MPI_COMM_WORLD, 59) - process 0
> /home/tux/pylith/pylith-2.2.0-linux-x86_64/bin/nemesis: mpirun: exit 59
> /home/tux/pylith/pylith-2.2.0-linux-x86_64/bin/pylith: /home/tux/pylith/pylith-2.2.0-linux-x86_64/bin/nemesis: exit 1
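The PETSc error message above suggests attaching a debugger. With PyLith, PETSc options can be passed through the [pylithapp.petsc] section, so one way to follow that suggestion is (a sketch; it assumes gdb and an X display are available on the machine):

```cfg
[pylithapp.petsc]
# Launch each process under gdb when PETSc initializes, so the
# backtrace at the segfault can be inspected.
start_in_debugger = gdb
```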
> 
> From the output, it appears that PyLith can read the mesh successfully 
> but crashes before solving the problem. I do not know how to deal with 
> this. Can anybody help me?
> 
> Best regards,
> 
> Tu xiang
> 
> 
> 


