[CIG-SHORT] debugging rate-state quasi-static simulation

Brad Aagaard baagaard at usgs.gov
Mon Dec 11 11:58:40 PST 2017


On 12/2/17 8:59 AM, Ekaterina Bolotskaya wrote:
> Dear Brad,
>
> I've tried running your simulation with a finer mesh, and I'm getting similar results in terms of fluctuations. I guess you are right and curvature is an issue.
>
> I'm sorry about the solver settings. Some are taken from the pylithapp.cfg file.
> Here's what I'm using from both files:
> pc_type = asm
> sub_pc_factor_shift_type = nonzero
>
> ksp_rtol = 1.0e-19
> ksp_atol = 1.0e-16
> ksp_max_it = 2500
> ksp_gmres_restart = 50
>
> snes_rtol = 1.0e-17
> snes_atol = 1.0e-14
> snes_max_it = 1500

Try replacing

pc_type = asm
sub_pc_factor_shift_type = nonzero

with the recommended solver settings in 
share/settings/solver_fault_fieldsplit.cfg. You can simply add this .cfg 
file to the command line or copy/paste the solver settings into your 
existing .cfg file. These settings should significantly improve the rate 
of convergence for the linear solver. For simulations with friction, I 
would set all of the relative tolerances to 1.0e-20 so that the 
absolute tolerances control convergence. You should be able to relax 
the absolute tolerances by a few orders of magnitude (just make sure 
you keep zero_tolerance and zero_tolerance_normal between the ksp and 
snes absolute tolerances).
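
As a concrete sketch (the tolerance values and the interface name 
"fault" are assumptions; adjust them to match your setup), the 
combined settings might look like:

[pylithapp.petsc]
# Relative tolerances set very small so the absolute tolerances
# control convergence.
ksp_rtol = 1.0e-20
ksp_atol = 1.0e-9
ksp_max_it = 1000
ksp_gmres_restart = 50

snes_rtol = 1.0e-20
snes_atol = 1.0e-7
snes_max_it = 500

[pylithapp.timedependent.interfaces.fault]
# Keep these between ksp_atol and snes_atol.
zero_tolerance = 1.0e-8
zero_tolerance_normal = 1.0e-8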

> The problem with my fault is this (and I would be incredibly happy if there's a way to solve it!):
> I want the fault to be through-going, but I also want the y-displacement fixed on the vertical block sides where the fault comes out, and PyLith doesn't allow me to apply any BC on nodes belonging to the fault. So I'm making the fault go all the way through except for the last node on each side and applying the BC I want there. Is there a way to overcome this and make the fault through-going while still applying a BC on the two side nodes?

We do not currently allow an overlap between the Dirichlet BC and the 
fault, so there is no way to constrain the y degree of freedom on the 
boundary at the fault vertices. The best workaround is to simply remove 
the fault vertices from the list of vertices on the boundary where you 
wish to apply the Dirichlet boundary condition. This will result in 
small edge effects but they should be confined to a region only a little 
larger than the size of your cells.
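
For example (the nodeset name below is hypothetical; use whatever 
labels your mesh actually defines), the Dirichlet BC would simply 
reference a nodeset built without the fault vertex:

[pylithapp.timedependent.bc.x_pos]
# Nodeset "boundary_xpos_nofault" is assumed to be the +x boundary
# with the fault vertex removed. bc_dof = [1] constrains the
# y degree of freedom.
bc_dof = [1]
label = boundary_xpos_nofault
db_initial.label = Dirichlet BC on +x boundary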

>
> The diagram is attached.
> The block is 2 m x 3 m; that's why my minimum cell size is 0.01 m.

The length scale for nondimensionalization should definitely be set to 
something around 0.01m. The shear modulus for nondimensionalization 
should be set to the average shear modulus of your material (the 
default is 3.0e+10 Pa). Additionally, given the dimensions of your 
domain, you will also need to set a value for the time scale for 
nondimensionalization. The default value is 1.0 year. Normally, in a 
quasistatic problem it is set to the shortest viscoelastic relaxation 
time. If you have a purely elastic problem, then the time scale should 
be somewhere between a time step and the duration of the simulation.
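
Putting those together, the normalizer settings might look something 
like this (the shear modulus and time scale values here are 
placeholders; substitute your material's shear modulus and a time 
scale appropriate for your time step):

[pylithapp.timedependent.normalizer]
length_scale = 0.01*m
shear_modulus = 30.0*GPa
relaxation_time = 0.25*year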

Regards,
Brad

> ________________________________________
> From: CIG-SHORT [cig-short-bounces at geodynamics.org] on behalf of Brad Aagaard [baagaard at usgs.gov]
> Sent: Saturday, November 25, 2017 22:10
> To: cig-short at geodynamics.org
> Subject: Re: [CIG-SHORT] debugging rate-state quasi-static simulation
>
> Ekaterina,
>
> The example in examples/2d/subduction/step06 uses a mesh that is
> relatively coarse. We set up the example to illustrate the basic
> concepts of rate-state friction while using a coarse mesh so that the
> simulation runs quickly. In a real research problem, I would use a
> significantly finer mesh. In this example, the fault is also curved, so
> the fault tractions are not as smooth as they would be if the fault was
> planar or had a fine mesh.
>
> I do not understand your solver settings. You specify tolerances for the
> main solve in your 53*.cfg file, but no corresponding solver parameters.
> If you have a through-going fault (as we have in
> examples/2d/subduction/step06), then I would use something like:
>
> [pylithapp.petsc]
> pc_type = asm
> sub_pc_factor_shift_type = nonzero
>
> ksp_rtol = 1.0e-20
> ksp_atol = 1.0e-9
> ksp_max_it = 1000
> ksp_gmres_restart = 50
>
> snes_rtol = 1.0e-20
> snes_atol = 1.0e-7
>
> snes_max_it = 500
>
> If you do not have a through-going fault (it ends before reaching both
> edges of the domain), then I strongly recommend the field split solver
> settings in share/settings/solver_fault_fieldsplit.cfg. I would use the
> tolerances shown above (with a zero_tolerance for friction of 1.0e-9,
> see step06.cfg for details). This will significantly improve the rate of
> convergence of the linear solve.
>
> I also don't understand why you use a scale of 0.01 m for the
> nondimensionalization. The length scale should be about equal to the size
> of your cells. Are your cells this small?
>
> I don't recall your sending a diagram illustrating the problem you are
> trying to solve.
>
> Other suggestions:
>
> 1. Start with a simulation using prescribed slip. Get the linear solve
> converging rapidly and iron out all other issues related to mesh
> quality, nondimensional scales, and boundary conditions.
>
> 2. Start with a static friction example with zero slip and make sure the
> linear solve and nonlinear solve converge as expected. Then start
> working towards the rate-state example you want to really solve.
>
> Regards,
> Brad
>
>
> On 11/24/17 6:31 PM, Ekaterina Bolotskaya wrote:
>> Dear Brad,
>>
>> PETSc: Attached is a .cfg file.
>>
>> "Fluctuations": I don't mean fluctuations from one timestep to
>> another, I mean spatial fluctuations for one timestep. Attached is an
>> image from Paraview where the slip magnitude is plotted along the top
>> of the fault. I'm just wondering why the slip fluctuates that much.
>> Shouldn't it look smoother for a homogeneous fault, if everything is
>> fine with the convergence? I haven't changed any parameters. This is
>> step06 from subduction example.
>>
>> Thanks!
>>
>> Best regards, Ekaterina Bolotskaya
>>
>> PhD in Geophysics, Earth, Atmospheric and Planetary Science
>> Department, Massachusetts Institute of Technology E-mail.
>> bolee at mit.edu Mob. +1 (857) 284-2805 +7 (963) 995-36-33
>>
>> ________________________________________ From: CIG-SHORT
>> [cig-short-bounces at geodynamics.org] on behalf of Brad Aagaard
>> [baagaard at usgs.gov] Sent: Tuesday, November 21, 2017 18:38 To:
>> cig-short at geodynamics.org Subject: Re: [CIG-SHORT] debugging
>> rate-state quasi-static simulation
>>
>> Ekaterina,
>>
>> I am still baffled by the behavior of the linear solver. The linear
>> (KSP) solve within the nonlinear solver (SNES) is taking several
>> hundred iterations and it looks like your absolute tolerance is
>> around 1.0e-17. This is where I would start your debugging. I forgot
>> that the JSON parameter file doesn't include the PETSc parameters.
>> Can you send your .cfg file with the solver parameters you are
>> using?
>>
>> I am not sure what you mean by "artifacts" and fluctuations. With
>> rate and state friction, I have seen the slip jump forward and
>> backward when the nonlinear solve has trouble converging. When this
>> happens, I think the solution is poorly resolved. Either the spatial
>> and/or temporal discretizations are too large and/or the rate-state
>> friction parameters need to be adjusted to help regularize the
>> solution. For example, using a laboratory value of the Dc parameter
>> with kilometer sized cells will almost certainly have convergence
>> problems due to inadequate resolution of the cohesive zone.
>>
>> Sorry for the delay in responding.
>>
>> Regards, Brad
>>
>>
>> On 11/14/2017 06:18 PM, Ekaterina Bolotskaya wrote:
>>> Dear Brad,
>>>
>>> Thanks for helping with this!
>>>
>>> Attached are the json parameter file and the log file for the same
>>> simulation run on pylith v2.2.1.
>>>
>>> I made the time step 4 times smaller: 1 year -> 0.25 year. I'm using
>>> friction_pc_type = asm, which seems to be appropriate for my case.
>>> Please correct me if I'm wrong. This run doesn't produce a
>>> convergence error, but it stops converging at the same time step as
>>> before; I just made the maximum time smaller to be able to look
>>> at the xmf result files.
>>>
>>> In this simulation, not only is convergence a problem; the
>>> results themselves seem to have a lot of computational artifacts
>>> and fluctuations where I wouldn't expect them (tractions and slip
>>> on the fault, state parameter).
>>>
>>> Although, when I look at the results of step06 in
>>> examples/2d/subduction, those seem to have even more fluctuations
>>> in the fault slip and traction (slabtop). Should I just
>>> interpolate mine and ignore the artifacts, or is there something
>>> wrong?
>>>
>>> Thanks!
>>>
>>>
>>> Best regards, Ekaterina Bolotskaya
>>>
>>>
>>> ________________________________________ From: CIG-SHORT
>>> [cig-short-bounces at geodynamics.org] on behalf of Brad Aagaard
>>> [baagaard at usgs.gov] Sent: Thursday, October 26, 2017 13:56 To:
>>> cig-short at geodynamics.org Subject: Re: [CIG-SHORT] debugging
>>> rate-state quasi-static simulation
>>>
>>> Ekaterina,
>>>
>>> Your 53_log.log file shows that the SNES solver converged for
>>> several time steps with 1 iteration (consistent with no slip) and
>>> then there were 14 time steps with multiple iterations that did
>>> converge (consistent with frictional slip starting). The time step
>>> before the SNES solve failed to converge in 400 iterations required
>>> 198 nonlinear iterations. I extracted this information using grep
>>> on the log file. So it looks like the simulation was proceeding
>>> normally. You may need to reduce your time step to get the
>>> nonlinear solve to converge in a reasonable number of iterations.
>>> You can also increase the number of SNES iterations allowed using
>>> snes_max_it.
>>>
>>> I also noticed that your linear solves are requiring about 600
>>> iterations. For a mesh with 20K cells I would expect much faster
>>> convergence in the linear solve IF you are using the solver and
>>> preconditioner parameters we recommend for real research problems
>>> (share/settings/solver_fault_fieldsplit.cfg).
>>>
>>> If you would like further help in debugging this simulation, please
>>> use PyLith version 2.2.0 or later (I suggest the latest version
>>> v2.2.1) and send the JSON parameter file that is automatically
>>> created by the run. This file includes all of the parameters in
>>> your .cfg files or command line and we can easily view it using the
>>> PyLith Parameter viewer. Please see Section 4.10 of the PyLith
>>> v2.2.1 manual for more information.
>>>
>>> For real research problems, we also strongly recommend using HDF5
>>> output rather than VTK as the I/O is much faster and they make
>>> post-processing with Matlab or Python much easier. You can load the
>>> corresponding Xdmf file into ParaView as described in the PyLith
>>> manual.
>>>
>>> Regards, Brad
>>>
>>>
>>>
>>>
>>> On 10/25/2017 08:35 PM, Ekaterina Bolotskaya wrote:
>>>> Dear Brad,
>>>>
>>>> I'm using PyLith 2.1.3 and my info outputs are .vtk. Please find
>>>> the two info files (fault and statevars) and the log attached.
>>>> Hope that helps.
>>>>
>>>> Thanks!
>>>>
>>>> Best regards, Ekaterina Bolotskaya
>>>>
>>>>
>>>> ________________________________________ From: CIG-SHORT
>>>> [cig-short-bounces at geodynamics.org] on behalf of Brad Aagaard
>>>> [baagaard at usgs.gov] Sent: Wednesday, October 25, 2017 19:31 To:
>>>> cig-short at geodynamics.org Subject: Re: [CIG-SHORT] CSEC
>>>> benchmarks pylith files on Github?
>>>>
>>>> On 10/25/17 3:58 PM, Ekaterina Bolotskaya wrote:
>>>>> Dear developers,
>>>>>
>>>>> I'm failing to find pylith files (I'm looking here:
>>>>> https://github.com/geodynamics/pylith_benchmarks) that were
>>>>> used to obtain the results shown here
>>>>> http://scecdata.usc.edu/cvws/cgi-bin/cvws.cgi
>>>>>
>>>>> Also is that correct that there's just one pylith
>>>>> rate-and-state friction benchmark on the SCEC website
>>>>> (http://scecdata.usc.edu/cvws/cgi-bin/cvws.cgi): tpv102 -
>>>>> Rate-state friction, ageing law, half-space?
>>>>
>>>> The files are in dynamic/scecdynrup/tpv102. These files may need
>>>> updating to work with PyLith v2.X.X.
>>>>
>>>>> I'm also looking at step06 in examples/2d/subduction and
>>>>> step14 in examples/3d/hex8. Are there any more rate-and-state
>>>>> friction examples I can look at? Especially m-scale (rather
>>>>> than km-scale) ones, as my m-scale rate-and-state simulation is
>>>>> not converging and I think scaling might be one of the problems
>>>>> (I'm using an appropriate normalizer.length_scale).
>>>>
>>>> I think the examples you list are the ones we have for
>>>> quasistatic modeling.
>>>>
>>>>> (My simulation is 2D, 2 m by 3 m, triangular mesh, buried fault
>>>>> with rate and state friction in the middle, I'm fixing vertical
>>>>> displacement along all boundaries and applying compression in
>>>>> both x and y direction and shear traction rates).
>>>>
>>>> Do you have sufficient Dirichlet BC to eliminate all rigid body
>>>> motion (translation and rotation)? If so, turn on the KSP and
>>>> SNES monitors and send your JSON parameter file (generated
>>>> automatically in v2.2.1) and your entire log.
>>>>
>>>> Regards, Brad
>>>>
>>>> _______________________________________________ CIG-SHORT mailing
>>>> list CIG-SHORT at geodynamics.org
>>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-short
>>>>
>>>
>>>
>>
>>
>
>


