[aspect-devel] New feature that helps with convergence issues due to viscosity jumps

Magali Billen mibillen at ucdavis.edu
Wed Jun 27 23:32:33 PDT 2018


Hello Diogo,
Thanks for your response. Even though you are not using a nonlinear solver, the other information is helpful.
I’m simultaneously running models with nonlinear rheology and models with linear rheology that have similar
viscosity variations (I’m even running the nonlinear rheology with the linear solver, so I have the
same strong gradients). I’m hoping this will help me separate the issues caused by the viscosity gradients
alone from those caused by the iterative process and by finer mesh refinement. 

I’m moving my code modifications for the material model into a separate module so I can update ASPECT
and try the new solver restart length parameter. There have also been some recent changes to accelerate the Newton nonlinear
solver. I’ll let you know once I’ve run some tests (probably early next week).
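
For concreteness, the parameter block I plan to test looks roughly like the following. This is only a sketch: the parameter names are the ones discussed in this thread and in the manual, the exact subsection nesting may differ between ASPECT versions, and the values are just my starting guesses.

```
# Sketch of the settings discussed in this thread -- values illustrative.
set Nonlinear solver scheme = Newton Stokes

subsection Material model
  set Material averaging = harmonic average
end

subsection Solver parameters
  subsection Stokes solver parameters
    set GMRES solver restart length = 200
  end
end
```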
Cheers,
Magali
 


> On Jun 27, 2018, at 9:14 PM, Diogo Louro Lourenco <dlourenco at ucdavis.edu> wrote:
> 
> Hi Magali,
> 
> I think we are running quite different setups: I am running viscous whole-mantle convection models, while I believe you are running regional (subduction?) models with complex rheology. This means that in my case I have no need for a nonlinear solver. I am sorry, I don't think I can be of much help here. It is nevertheless very informative to share experiences on this! If it matters, I am using a linear solver tolerance of 1e-7 and 20 cheap Stokes solver steps.
> 
> The resolution I am using is an initial global refinement of 3 with an initial adaptive refinement of 2. Because I am running 3D whole-mantle models for hundreds of millions of years, it is hard to go above this. But yes, if I remember correctly, it was the case that convergence gets harder as the refinement level increases. To my mind this makes sense: it is harder to reach the solver tolerance when small-scale heterogeneities are taken into account (that is, for example, the power of multigrid approaches, where different wavelength components are solved on different scales). In my case, however, the residual would decrease, but too slowly to reach a good solution. Did you try increasing the GMRES solver restart length? I wonder if it will help achieve convergence in this case as well.
> 
> Cheers,
> Diogo
> 
> On Mon, Jun 25, 2018 at 11:46 AM Magali Billen <mibillen at ucdavis.edu <mailto:mibillen at ucdavis.edu>> wrote:
> Hi Diogo (others),
> 
> Can I ask a couple of other things?  
> - Which nonlinear solver are you using (Iterated Stokes or Newton Stokes)? I’ve noticed a big improvement with
> Newton (and using the harmonic average) on my test cases.
> - What solver tolerance are you using (1e-4, 1e-5)? 
> - How many nonlinear iterations do you find you need (10, 30, 100…)?
> - What sort of resolution do your models use (what is the highest level of refinement)? As I watch the solver
> (before changing the restart length), I notice that in the 0th time step, as the refinement level increases, the number of iterations needed to reach the tolerance (or not reach it, in some cases) also increases. At level 7 it doesn’t converge, and I can tell it won’t converge because the residual sometimes increases rather than steadily decreasing. Is this the kind of convergence problem you had encountered?
> 
> I really appreciate you sharing your experience with this. I have years of experience getting to know the solver
> in Citcom (but I’m new to ASPECT), and I know that it’s very helpful to learn how to interpret the output of the solver to understand what needs to be changed (either in the model set-up or in the solver parameters).
> 
> Cheers,
> Magali
> 
> P.S. I found the section of the manual called “Making ASPECT run faster” (4.6) really helpful - that’s where
> I found out about the material averaging (section 4.6.7 points to the cookbook on this). You might want to add
> a short paragraph to the manual pointing out the importance of the GMRES solver restart length (maybe just
> combining and shortening the text from your last two emails on this). 
> 
>> On Jun 25, 2018, at 7:02 AM, Diogo Louro Lourenco <dlourenco at ucdavis.edu <mailto:dlourenco at ucdavis.edu>> wrote:
>> 
>> ASPECT uses the GMRES (generalized minimal residual method) solver, an iterative method for the numerical solution of non-symmetric systems of linear equations. The method generates a sequence of orthogonal vectors and approximates the solution by the vector with minimal residual. Because the method works for non-symmetric systems, all previously computed vectors in the orthogonal sequence have to be saved. This is a disadvantage, because the amount of work and storage required per iteration rises rapidly and can make the cost prohibitive. This is why ASPECT uses a "restarted" version of the GMRES method. What does that mean?
>> 
>> After a chosen number of iterations (our "GMRES solver restart length"), the accumulated data is cleared and the intermediate result is used as the initial guess for the next "GMRES solver restart length" iterations. This procedure is repeated until convergence is achieved. But it leaves us with a difficulty: choosing an appropriate value for "GMRES solver restart length". If the value is too small, the solver may fail to converge, as was happening in my case, because there is no "memory" of the previously computed attempted solutions. Increasing the value of the "GMRES solver restart length" solves this. But, as I mentioned before, a higher value involves more storage and work per iteration. So this feature does not necessarily make the solver converge more quickly; rather, it makes the solver converge when it otherwise wouldn't.
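
The restart behaviour described above can be experimented with outside ASPECT. The sketch below uses SciPy's `gmres`, whose `restart` argument plays the role of the "GMRES solver restart length"; the random matrix is purely illustrative, not a Stokes system.

```python
import numpy as np
from scipy.sparse.linalg import gmres

# A small random non-symmetric system as a stand-in; ASPECT's Stokes
# systems are far larger and much worse conditioned.
rng = np.random.default_rng(42)
n = 200
A = np.eye(n) + 0.5 * rng.standard_normal((n, n)) / np.sqrt(n)
b = rng.standard_normal(n)

# 'restart' is the number of Krylov basis vectors kept before the
# accumulated data is discarded -- the analogue of ASPECT's
# "GMRES solver restart length". Larger values keep more "memory" of
# past search directions at the price of more storage and work per
# iteration; on badly conditioned systems (e.g. with strong viscosity
# jumps), too small a value can make the solver stagnate.
x, info = gmres(A, b, restart=200)

print("converged:", info == 0)
print("relative residual:", np.linalg.norm(A @ x - b) / np.linalg.norm(b))
```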
>> 
>> (I'm sure other people in the mailing list know much more about this than me, so they can complement my answer if they find necessary.)
>> 
>> I'm not sure whether turning the Material averaging parameter on or off helps in all situations, sorry. In my case it helps; I am using "Material averaging = harmonic average".
>> 
>> I think that in principle this new feature should help with your convergence problems due to strong viscosity gradients. Try it with a model where you don't get convergence and change the "GMRES solver restart length" to a higher value like 100 or 200. If it helps, you should get convergence and the model should run smoothly (although a bit slower). 
>> 
>> Hope this helps! Let me know if I can help further.
>> Diogo
>> 
>> On Sat, Jun 23, 2018 at 3:09 AM Magali Billen <mibillen at ucdavis.edu <mailto:mibillen at ucdavis.edu>> wrote:
>> This sounds very interesting. Can you explain in simple terms what this change does - that is, why or how does it allow the solver to converge more quickly with strong viscosity jumps (or gradients)?
>> 
>> Also, do you recommend using this with the Material averaging parameter on or off ? 
>> 
>> I ask because I am currently building subduction models with strong viscosity gradients and also have convergence challenges (although I am just at the beginning of trying different parameters, etc.).
>> 
>> Sent from my iPhone
>> 
>> > On Jun 23, 2018, at 1:14 AM, Diogo Louro Lourenco <dlourenco at ucdavis.edu <mailto:dlourenco at ucdavis.edu>> wrote:
>> > 
>> > Hi all,
>> > 
>> > There is a new feature in ASPECT that helps if you are experiencing solver convergence issues. I was having trouble with convergence due to viscosity jumps of 100x in the mid-mantle while using GPlates to prescribe surface velocities. Even for (unrealistic) viscosity jumps of more than 1000x across 660 km, ASPECT will still converge now. It should also help with other problems that involve localized viscosity jumps in the domain.
>> > 
>> > To use it, add "GMRES solver restart length" to the Solver parameters section of your .prm file. This parameter sets the number of iterations after which the GMRES solver restarts. The default in ASPECT has so far been 50 (and remains the default if you don't change it). Using a value of 200 fixed my problem. Keep in mind that the higher the value, the more memory and per-iteration work the solver needs, so the simulation will run somewhat more slowly.
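
For reference, in an input file this would look roughly like the snippet below; the exact subsection nesting may vary between ASPECT versions, and the value of 200 is simply the one that worked in my case.

```
subsection Solver parameters
  subsection Stokes solver parameters
    # Default is 50; 200 fixed the convergence failures in my models.
    set GMRES solver restart length = 200
  end
end
```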
>> > 
>> > Thanks to Rene for finding the solution to the problem!
>> > Hope it helps those who run into the same issue!
>> > Diogo
>> > _______________________________________________
>> > Aspect-devel mailing list
>> > Aspect-devel at geodynamics.org <mailto:Aspect-devel at geodynamics.org>
>> > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel <http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel>

____________________________________________________________
Professor of Geophysics 
Earth & Planetary Sciences Dept., UC Davis
Davis, CA 95616
2129 Earth & Physical Sciences Bldg.
Office Phone: (530) 752-4169
http://magalibillen.faculty.ucdavis.edu

Currently on Sabbatical at Munich University (LMU)
Department of Geophysics (PST + 9 hr)

Avoid implicit bias - check before you submit: 
http://www.tomforth.co.uk/genderbias/
___________________________________________________________
