[aspect-devel] "Not converge" problem with variable viscosity.

Timo Heister heister at clemson.edu
Tue Sep 1 06:39:50 PDT 2015


Maybe we should take that patch over into master for analyzing situations
like this? The patch dumps the solver history to a file when the solver
doesn't converge.

We have been fighting with convergence issues of the (extended) Stokes
system in the melt transport work too, and noticed:
1. Sometimes the preconditioner introduces large changes in the
residual that lie in the pressure null space (adding a constant to
every pressure entry). This ruins the convergence. One hack is to fix
a single pressure unknown, but ideally one would make sure that
applying the preconditioner stays in the orthogonal complement of the
null space. I am not sure what the best way to do this is (maybe we
need to compute the null space vector, which depends on the pressure
element used, and remove that component in the preconditioner
application). I had one example where I went from thousands of
iterations down to ~10, but that might be specific to melt transport.
Worth investigating. Here is the hack on the branch:
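Independently of that branch, the projection idea can be sketched in a few
lines of NumPy. This is only an illustration of the concept, not ASPECT's
actual deal.II implementation; the vector layout and names are assumptions:

```python
import numpy as np

def remove_pressure_nullspace(z_p, nullvec):
    """Remove the component of a preconditioned pressure vector that
    lies in the span of the (unit-norm) null space vector."""
    return z_p - np.dot(nullvec, z_p) * nullvec

# For a constant-pressure null space the null vector is the normalized
# constant mode; for other pressure elements it would have to be
# computed from the element's basis functions.
n = 5
nullvec = np.ones(n) / np.sqrt(n)

# A pressure vector polluted by a constant component:
z_p = np.array([3.0, 3.5, 2.5, 4.0, 2.0])
z_clean = remove_pressure_nullspace(z_p, nullvec)
# z_clean has zero mean, i.e. no component left in the null space
```

Applying this projection after every preconditioner application would keep
the GMRES iterates in the subspace the outer solver expects.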
2. I also had to play with the restart length of GMRES, which
sometimes made a large difference (no convergence after thousands of
iterations vs. convergence in a couple hundred). Hack:
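The effect of the restart length can be tried outside of ASPECT, for
example with SciPy's GMRES. The matrix below is a small nonsymmetric test
problem chosen for illustration, not a Stokes system; `restart` and
`maxiter` are SciPy's parameter names, not ASPECT's:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import gmres

# Small nonsymmetric tridiagonal test problem (illustrative only).
n = 200
A = diags([-1.0, 2.0, -0.5], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# GMRES builds a Krylov basis of size `restart` before restarting;
# longer restarts cost more memory and work per cycle, but can rescue
# an otherwise stagnating iteration.
for restart in (10, 50):
    x, info = gmres(A, b, restart=restart, maxiter=1000)
    residual = np.linalg.norm(b - A @ x)
    # info == 0 means the solver reached the requested tolerance
```

Comparing iteration counts and wall time across restart lengths on a
problem like this is one way to start benchmarking the cost trade-off.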
Of course a longer restart length might become really expensive (how
would we benchmark that?). I am hesitant to introduce dozens of solver
parameters, but maybe we need to look into this some more.

I think it would be great to have a set of "hard" problems to play
with (that don't require a large parallel computation).

I will look into all of this soon, but I am super busy with other
stuff these days.

On Tue, Sep 1, 2015 at 9:25 AM, Wolfgang Bangerth <bangerth at tamu.edu> wrote:
> Shangxin,
>> Yes. Now I can get the result for the simple four-layer viscosity with 3
>> global refinements and a tolerance of 10^-5. But for the Steinberger radial
>> viscosity profile, the solver does not converge even at a tolerance of
>> 10^-4. I've tried running the case with 500 and 600 CPUs and a 24-hour
>> wall time limit, and it still cannot finish. I even tried 250 CPUs with a
>> 6-day limit, but it still failed.
> That's a lot of CPU time wasted :-(
>> Because the depth-dependent viscosity profile is a basic ingredient of
>> mantle convection models, I suppose it's worth our while to think about
>> solving this long-standing problem. If this works, then we can add other
>> things, such as an ocean/continent difference in the scaling parameter,
>> to improve the model.
> There are a number of things that can go wrong. I don't think there is a bug
> per se that makes this converge so slowly (though you never know); it is more
> a question of how one sets this up.
> So let me ask a few questions:
> 1/ Are you using material model averaging?
> 2/ Are you limiting the viscosity from above and below? If not, what
>    are the maximal and minimal values you encounter for the viscosity
>    in your model? If they are too far apart, would it make a difference
>    if you limited values? For example, if you have parts of your model
>    with a viscosity of 10^18 and others where the viscosity varies between
>    10^23 and 10^26, then there is likely going to be no harm in simply
>    cutting it off at 10^23: yes, the original models would have places
>    where the velocity is going to be on the order of 10^-5 to 10^-8 of
>    the maximal velocity, but you are still getting essentially the same
>    answer if in all of these places the velocity is just 10^-5 of the max.
>    On the other hand, you just improved the conditioning of the system
>    matrix substantially.
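The cutoff argument in point 2 is easy to check with numbers. The viscosity
values below are hypothetical (in Pa s), chosen only to mirror the example:

```python
import numpy as np

# Hypothetical viscosity samples spanning eight orders of magnitude.
eta = np.array([1e18, 5e19, 1e23, 3e24, 1e26])

# Cap the viscosity at 10^23 as suggested above; contrasts below the
# cap are preserved, only the stiffest regions are clipped.
eta_capped = np.minimum(eta, 1e23)

# The spread that drives the conditioning of the system matrix shrinks
# from 10^8 to 10^5, i.e. by three orders of magnitude.
spread_before = eta.max() / eta.min()
spread_after = eta_capped.max() / eta_capped.min()
```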
> 3/ Rather than trying to get the large model to converge right away, look
>    at a *sequence* of smaller models first. For example, consider 2 and 3
>    global refinements and look at how the number of iterations changes. If,
>    for example, the number of iterations grows from 100 to 1000 for these
>    two refinement levels, then you can expect it to become even larger on
>    refinement level 4. There may then not even be a point in trying to
>    run at level 4 because it's going to be prohibitively expensive -- no
>    need to try. In your case, I'd be interested in seeing the
>    output/statistics file from your runs on coarser meshes. That may give
>    us an indication whether it's the outer solver, or one of the inner
>    solvers that is running into trouble in your case.
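The extrapolation in point 3 is just a growth-factor estimate; with the
hypothetical iteration counts from the example above:

```python
# Iteration counts at refinement levels 2 and 3 (hypothetical numbers
# from the example above).
iters = {2: 100, 3: 1000}

# Observed growth factor per refinement level.
growth = iters[3] / iters[2]

# If the trend continues, level 4 would need roughly ten thousand
# iterations -- a strong hint that running it as-is would be
# prohibitively expensive without improving the solver first.
predicted_level_4 = iters[3] * growth
```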
> I don't think that there's a magic bullet for difficult cases like yours.
> There are just many small steps one can do to improve the condition number
> of the various matrices involved.
> Best
>  W.
> --
> ------------------------------------------------------------------------
> Wolfgang Bangerth               email:            bangerth at math.tamu.edu
>                                 www: http://www.math.tamu.edu/~bangerth/
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel

Timo Heister
