[CIG-SHORT] Linear convergence for nonplanar faults

Matthew Knepley knepley at rice.edu
Fri Nov 11 07:43:16 PST 2016


On Fri, Nov 11, 2016 at 8:52 AM, Josimar Alves da Silva <
jsilva.mit at gmail.com> wrote:

> Matt,
>
> Thank you for taking the time to look at this.
>
> I went through your slides and the PETSc <https://www.mcs.anl.gov/petsc/>
> manual and found that the parameters shown at the bottom of this e-mail
> solve my problem. I am happy with them. However, would you be able to
> look at these parameters and let me know if you find anything strange
> about them?
>

That should be much slower than using a block decomposition, but I am glad
it works.
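
For reference, the block decomposition I mean is the field split used in
the example cfg files. It looks roughly like this (a sketch based on the
PyLith examples; the solver_fault_fieldsplit.cfg that ships with your
version is the authoritative source, and option names can vary between
versions):

[pylithapp.timedependent.formulation]
split_fields = True
matrix_type = aij

[pylithapp.petsc]
fs_pc_type = fieldsplit
fs_pc_use_amat = true
fs_pc_fieldsplit_type = multiplicative
# ML on the displacement block, Jacobi on the Lagrange multipliers.
fs_fieldsplit_displacement_ksp_type = preonly
fs_fieldsplit_displacement_pc_type = ml
fs_fieldsplit_lagrange_multiplier_ksp_type = preonly
fs_fieldsplit_lagrange_multiplier_pc_type = jacobi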


> Regarding your suggestions, I combined my parameters with the following:
>
> pylith no_fault.cfg fault.cfg ./solvers/solver_fault_schur_custompc.cfg >
> log.out
>
> pylith no_fault.cfg fault.cfg ./solvers/solver_fault_fieldsplit.cfg >
> log.out
>
> pylith no_fault.cfg fault.cfg ./solvers/solver_fault_exact.cfg > log.out
>

There is no log.out attached.


> and in all cases I get the following Error message:
>
> KSPSolve has not converged
>
> In this specific case, it seems that when I use the field split option,
> combined with the other parameters below, I can't get the linear solver
> to converge. However, I am not sure why this happens. Can you clarify?
> Can you also clarify whether my results would be jeopardized by not using
> the field split option that you suggest?
>

You should not mix solver options: use only the ones you need for a given
case. Layering your [pylithapp.petsc] settings below on top of the solver
cfg files is likely what is preventing convergence.

As long as you solve to a tight enough tolerance, the solvers are
indistinguishable.

   Matt


> Thank you,
> Josimar
>
>
>
> ##################################################### The parameters
> below work for me.
> ## Set the solver options.
> [pylithapp.petsc]
>
> ## Preconditioner settings.
> pc_type = asm
> sub_pc_type = lu
> sub_pc_factor_shift_type = nonzero
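> # (The nonzero shift keeps the LU factorization from failing on the
> # zero diagonal that the fault's Lagrange multipliers introduce.)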
>
> ## Linear solver settings.
> ksp_type = gmres
> ksp_max_it = 4000
> ksp_gmres_restart = 30
>
> ## Linear solver monitoring options.
> ksp_monitor = true
> ksp_monitor_true_residual = true
> ksp_converged_reason = true
> ksp_error_if_not_converged = true
> ksp_view = true
>
> # Nonlinear solver monitoring options.
> snes_max_it = 10000
> snes_monitor = true
> snes_monitor_short = true
> snes_linesearch_monitor = true
> snes_ls_monitor = true
> snes_converged_reason = true
> snes_error_if_not_converged = true
> snes_view = true
> snes_check_jacobian_view = true
> #snes_max_funcs = 3000
>
>
> # PETSc summary -- useful for performance information.
> log_view = true
>
> # Uncomment to launch gdb when starting PyLith.
> #start_in_debugger = true
>
>
> ######### Fault parameters
>
> ksp_rtol = 1.0e-20
> ksp_atol = 1.0e-11
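> # (With rtol this tight, the absolute tolerance is what actually
> # terminates the linear solve.)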
>
> snes_rtol = 1.0e-20
> snes_atol = 1.0e-8
>
> #########################
>
> ## ----------------------------------------------------------------------
> ## NOTE: There are additional settings specific to fault friction.
> [pylithapp.petsc]
>
> ### Friction sensitivity solve used to compute the increment in slip
> ### associated with changes in the Lagrange multiplier imposed by the
> ### fault constitutive model.
> friction_pc_type = asm
> friction_sub_pc_type = lu
> friction_sub_pc_factor_shift_type = nonzero
>
> friction_ksp_type = gmres
> friction_ksp_max_it = 500
> friction_ksp_gmres_restart = 30
>
>
> ### View details of friction sensitivity solve convergence.
> friction_ksp_converged_reason = true
>
> ############################################################
> ##################################
>
>
>
>
>
>
> On Thu, Nov 10, 2016 at 6:33 PM, Matthew Knepley <knepley at rice.edu> wrote:
>
>> On Wed, Nov 9, 2016 at 9:50 AM, Matthew Knepley <knepley at rice.edu> wrote:
>>
>>> On Wed, Nov 9, 2016 at 9:42 AM, Josimar Alves da Silva <
>>> jsilva.mit at gmail.com> wrote:
>>>
>>>> Brad,
>>>>
>>>> I changed the boundary conditions to Dirichlet BCs with constrained
>>>> displacement perpendicular to the surfaces, but the problem still
>>>> persists. I am starting to wonder if the nonplanar nature of the
>>>> fault surface may be causing the problem.
>>>>
>>>> The reason for this hypothesis is that I have the same setup but with
>>>> flat surfaces, and in that case I don't find any convergence issue.
>>>> Everything else was kept the same, except the fault geometry was
>>>> changed from slightly smoothed (the version I am having problems
>>>> with) to not smoothed.
>>>>
>>>
>>> You NEED to use the exact version of the solver and check for
>>> convergence, just as in the slides. This will tell us whether your
>>> formulation is screwed up.
>>>
>>
>> I want to follow up here. What I mean is that you need to run with
>> exact solvers for the two blocks of the Schur form, and I would test
>> without the custom PC. This is described in the slides and also given
>> in the schur_exact cfg file. Have you been able to run this?
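>>
>> For concreteness, the exact setup looks something like this (a sketch;
>> the schur_exact cfg file is the authoritative version, and the PETSc
>> option names can differ across versions):
>>
>> [pylithapp.petsc]
>> fs_pc_type = fieldsplit
>> fs_pc_use_amat = true
>> fs_pc_fieldsplit_type = schur
>> fs_pc_fieldsplit_schur_fact_type = full
>> # LU gives an exact solve on the displacement block; the Schur
>> # complement solve is then iterated to a tight tolerance.
>> fs_fieldsplit_displacement_ksp_type = preonly
>> fs_fieldsplit_displacement_pc_type = lu
>> fs_fieldsplit_lagrange_multiplier_ksp_rtol = 1.0e-11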
>>
>>   Thanks,
>>
>>      Matt
>>
>>
>>>    Matt
>>>
>>>
>>>> Let me know what you think about this.
>>>> Thank you
>>>> Josimar
>>>>
>>>>
>>>>
>>>>
>>>> On Wed, Nov 9, 2016 at 10:20 AM, Brad Aagaard <baagaard at usgs.gov>
>>>> wrote:
>>>>
>>>>> On 11/09/2016 07:15 AM, Josimar Alves da Silva wrote:
>>>>>
>>>>>> Dear Brad, Matt and Charles,
>>>>>>
>>>>>> Thank you again for the suggestions. After reviewing your comments I
>>>>>> performed the following task in order to solve my problem:
>>>>>>
>>>>>> Brad mentioned:
>>>>>>
>>>>>>  "1. Without a fault, check convergence with using ML for
>>>>>> the preconditioned."
>>>>>>
>>>>>>          Result: the solver converges fine without the fault and
>>>>>> using the specified preconditioner.
>>>>>>
>>>>>> "2. Add the fault and with zero prescribed slip (FaultCohesiveKin),
>>>>>> examine the convergence with the solver_fault_fieldsplit.cfg
>>>>>> parameters."
>>>>>>
>>>>>>          Results: no convergence for any solver. Error message:
>>>>>> Linear
>>>>>> solve did not converge due to DIVERGED_ITS iterations 10000
>>>>>>
>>>>>
>>>>> This suggests there is something wrong with your problem setup. Make
>>>>> sure you have sufficient Dirichlet BCs on each side of the fault to
>>>>> prevent rigid body motion.
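>>>>>
>>>>> For example, something along these lines (a sketch only; the nodeset
>>>>> labels and constrained DOFs depend on your mesh and geometry):
>>>>>
>>>>> [pylithapp.timedependent]
>>>>> bc = [x_neg, x_pos, z_neg]
>>>>>
>>>>> # Pin the x-displacement on the -x face; the +x and -z faces get
>>>>> # analogous constraints so neither side of the fault can move as a
>>>>> # rigid body.
>>>>> [pylithapp.timedependent.bc.x_neg]
>>>>> bc_dof = [0]
>>>>> label = boundary_xneg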
>>>>>
>>>>> Brad
>>>>>
>>>>>
>>>>
>>>>
>>>
>>>
>>
>

