<div dir="ltr">On Mon, Mar 11, 2013 at 12:33 PM, <span dir="ltr"><<a href="mailto:BOK10@pitt.edu" target="_blank">BOK10@pitt.edu</a>></span> wrote:<br><div class="gmail_extra"><div class="gmail_quote"><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
So, I just checked and both the linear and nonlinear solutions are<br>
converging. I'm not sure what you mean by different solution<br>
settings. Do you mean the following:<br>
<br>
# Preconditioner settings.<br>
pc_type = asm<br>
sub_pc_factor_shift_type = nonzero<br>
<br>
# Convergence parameters.<br>
ksp_rtol = 1.0e-20<br></blockquote><div><br></div><div style>These kinds of tolerances usually mean that something is not scaled right. You</div><div style>cannot really get meaningful information below 1.0e-12.</div><div style>
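As a rough illustration only (these values are assumptions, not a tested recommendation for your problem), tolerances in a more typical range would look like:<br>
<br>
ksp_rtol = 1.0e-8<br>
ksp_atol = 1.0e-12<br>
snes_rtol = 1.0e-8<br>
snes_atol = 1.0e-10<br>
<br>
That keeps the relative tolerances well above the ~1.0e-12 floor where floating-point noise dominates.<br>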
<br></div><div style>Brad, does this have something to do with the friction solve?</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
ksp_atol = 1.0e-13<br>
ksp_max_it = 1000000<br>
ksp_gmres_restart = 50<br></blockquote><div><br></div><div style>This restart is too small. If you have more memory, increase it to 100-200.</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
# Linear solver monitoring options.<br>
ksp_monitor = true<br>
ksp_view = true<br>
ksp_converged_reason = true<br>
<br>
# Nonlinear solver monitoring options.<br>
snes_rtol = 1.0e-20<br></blockquote><div><br></div><div style>Again, this seems way too small.</div><div style><br></div><div style> Thanks,</div><div style><br></div><div style> Matt</div><div> </div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex">
snes_atol = 1.0e-11<br>
snes_max_it = 1000000<br>
snes_monitor = true<br>
snes_view = true<br>
snes_converged_reason = true<br>
<br>
# PETSc summary -- useful for performance information.<br>
log_summary = true<br>
<br>
Bobby<br>
<br>
> Hi Bobby,<br>
><br>
> Are all of the solutions converging (both linear and nonlinear)? I would<br>
> have to look more at the different solution settings to see which of the<br>
> two you've shown is more reasonable.<br>
><br>
> Cheers,<br>
> Charles<br>
><br>
><br>
> On 9/03/2013, at 6:11 AM, <a href="mailto:BOK10@pitt.edu">BOK10@pitt.edu</a> wrote:<br>
><br>
>> Thanks! I tried a mixture of those suggestions, and I was able to<br>
>> reduce its runtime by half.<br>
>><br>
>> I do have a question regarding the most suitable ksp/snes tolerances<br>
>> though:<br>
>><br>
>> I ran two simulations. The first had the following:<br>
>><br>
>> for the fault zero_tolerance = 1e-12<br>
>> ksp_rtol/snes_rtol = 1e-20<br>
>> ksp_atol = 1e-13<br>
>> snes_atol = 1e-11<br>
>><br>
>> The second had this:<br>
>><br>
>> for the fault zero_tolerance = 1e-14<br>
>> ksp_rtol/snes_rtol = 1e-20<br>
>> ksp_atol = 1e-15<br>
>> snes_atol = 1e-13<br>
>><br>
>> What I'm trying to do is look at the time it takes for certain portions<br>
>> of<br>
>> my fault to rupture >1m. Running both simulations, I got wildly<br>
>> different<br>
>> results, and I'm not sure which to rely on at this point. Is there any<br>
>> insight you might be able to give me regarding the best tolerances to<br>
>> settle on?<br>
>><br>
>> Bobby<br>
>><br>
>><br>
>>> Hmm. It sounds to me as though you need to play with your parameters a<br>
>>> bit. I'm assuming you're using a frictional fault, which can<br>
>>> definitely<br>
>>> take a while to converge; however, you can do a few things to speed<br>
>>> things<br>
>>> up:<br>
>>><br>
>>> 1. Follow all of the suggestions Brad had from his previous e-mail.<br>
>>> 2. Make sure you have the highest quality mesh you can get. Just one<br>
>>> poorly formed element, especially on the fault, can really slow things<br>
>>> down.<br>
>>> 3. Possibly try reducing your time step size. It's possible that your<br>
>>> load increment per timestep is too high for reasonable convergence.<br>
>>><br>
>>> When you have a chance, I would see if you can build PyLith from source<br>
>>> on<br>
>>> your cluster. In addition to allowing parallel runs, this will let you<br>
>>> take advantage of any machine-specific tools (e.g., optimized math<br>
>>> libraries, etc.).<br>
>>><br>
>>> Let us know whether any of this helps.<br>
>>><br>
>>> Cheers,<br>
>>> Charles<br>
>>><br>
>>><br>
>>> On 7/03/2013, at 12:15 PM, <a href="mailto:BOK10@pitt.edu">BOK10@pitt.edu</a> wrote:<br>
>>><br>
>>>> Hi Charles,<br>
>>>><br>
>>>> I'm not running in parallel (I couldn't get the model to process in<br>
>>>> parallel on the cluster). Each time step takes ~80 minutes with loose<br>
>>>> tolerances, and about 2 hours for tighter tolerances.<br>
>>>><br>
>>>> The mesh itself is about 6,000 cells (2D). The faults have a 2 km<br>
>> discretization, and the boundaries have a 5 km discretization. I'm<br>
>> using<br>
>> ElasticPlaneStrain.<br>
>>>><br>
>>>> Bobby<br>
>>>><br>
>>>>> What sort of machine are you running on, and are you running in<br>
>>>>> parallel?<br>
>>>>> I'm not sure what your problem size is, but 80 time steps shouldn't<br>
>>>>> take<br>
>>>>> that long to run, unless it's a very nonlinear problem. How large is<br>
>>>>> your<br>
>>>>> mesh, and what sort of rheology are you using?<br>
>>>>><br>
>>>>> Cheers,<br>
>>>>> Charles<br>
>>>>><br>
>>>>><br>
>>>>> On 7/03/2013, at 11:36 AM, <a href="mailto:BOK10@pitt.edu">BOK10@pitt.edu</a> wrote:<br>
>>>>><br>
>>>>>> Hi Charles,<br>
>>>>>><br>
>>>>>> It takes pretty long for a simulation to finish processing, and I<br>
>>>>>> was<br>
>>>>>> hoping to split the simulation up into parts so I can come back to<br>
>>>>>> it<br>
>>>>>> later. It's not a necessity, but more a convenience issue.<br>
>>>>>><br>
>>>>>> I think I'll just continue on with running it overnight.<br>
>>>>>><br>
>>>>>> Thanks,<br>
>>>>>> Bobby<br>
>>>>>><br>
>>>>>><br>
>>>>>><br>
>>>>>>> Hi Bobby,<br>
>>>>>>><br>
>>>>>>> I'm not quite sure what you have in mind. If you're running any<br>
>>>>>>> sort<br>
>>>>>>> of<br>
>>>>>>> viscoelastic problem, you would need to save the entire state at<br>
>>>>>>> the<br>
>>>>>>> end<br>
>>>>>>> of each run. I don't see what benefit there would be from doing<br>
>>>>>>> this,<br>
>>>>>>> since you would still need to finish each run to get all the state<br>
>>>>>>> variables at the end of each chunk, and then feed them into the<br>
>>>>>>> next<br>
>>>>>>> simulation as initial state variables.<br>
>>>>>>><br>
>>>>>>> If your problem is completely elastic, I suppose you could run them<br>
>>>>>>> in<br>
>>>>>>> the<br>
>>>>>>> way you're suggesting, and then use linear superposition to obtain<br>
>>>>>>> the<br>
>>>>>>> final result. What is your reason for wanting to break up the<br>
>>>>>>> simulation?<br>
>>>>>>><br>
>>>>>>> Cheers,<br>
>>>>>>> Charles<br>
>>>>>>><br>
>>>>>>><br>
>>>>>>> On 7/03/2013, at 11:22 AM, <a href="mailto:BOK10@pitt.edu">BOK10@pitt.edu</a> wrote:<br>
>>>>>>><br>
>>>>>>>> Is it possible to split a simulation into parts? I'm running my<br>
>>>>>>>> model<br>
>>>>>>>> for<br>
>>>>>>>> 400 years at 5 year time intervals, but is it possible to split it<br>
>>>>>>>> to<br>
>>>>>>>> 100<br>
>>>>>>>> year chunks and run them serially?<br>
>>>>>>>><br>
>>>>>>>> Bobby<br>
>>>>>>>><br>
>>>>>>>><br>
>>>>>>>><br>
>>>>>>>><br>
>>>>>>>> _______________________________________________<br>
>>>>>>>> CIG-SHORT mailing list<br>
>>>>>>>> <a href="mailto:CIG-SHORT@geodynamics.org">CIG-SHORT@geodynamics.org</a><br>
>>>>>>>> <a href="http://geodynamics.org/cgi-bin/mailman/listinfo/cig-short" target="_blank">http://geodynamics.org/cgi-bin/mailman/listinfo/cig-short</a><br>
>>>>>>><br>
>>>>>>> Charles A. Williams<br>
>>>>>>> Scientist<br>
>>>>>>> GNS Science<br>
>>>>>>> 1 Fairway Drive, Avalon<br>
>>>>>>> PO Box 30368<br>
>>>>>>> Lower Hutt 5040<br>
>>>>>>> New Zealand<br>
>>>>>>> ph (office): 0064-4570-4566<br>
>>>>>>> fax (office): 0064-4570-4600<br>
>>>>>>> <a href="mailto:C.Williams@gns.cri.nz">C.Williams@gns.cri.nz</a><br>
>>>>>>><br>
>>>>>>><br>
>>>>>>> Notice: This email and any attachments are confidential.<br>
>>>>>>> If received in error please destroy and immediately notify us.<br>
>>>>>>> Do not copy or disclose the contents.<br>
>>>>>>><br>
>>>>>><br>
>>>>>><br>
>>>>><br>
>>>><br>
>>>><br>
>>><br>
>><br>
>><br>
><br>
<br>
<br>
</blockquote></div><br><br clear="all"><div><br></div>-- <br>What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.<br>
-- Norbert Wiener
</div></div>