[aspect-devel] "Not converge" problem with variable viscosity.
Shangxin Liu
sxliu at vt.edu
Tue Sep 1 15:44:43 PDT 2015
Hi Wolfgang,
For now I can get results at 5 global refinements with a tolerance of 1e-4.
Taking the Steinberger radial viscosity profile (see attachment) as an
example, the run took 5.09e+04 s (~14 hours) on 192 processes with 30+53
iterations. I also found that the results for tolerances of 1e-3 and 1e-4
are nearly the same, so I suppose 1e-4 might be enough. Getting results at
the default tolerance of 1e-7 seems impossible. By the way, I didn't find a
detailed explanation of how this tolerance is defined in ASPECT. What are
the units of this tolerance, and to which quantity does it apply? Does it
mean the accuracy of the scaled Stokes system as in deal.II step-32?
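For reference, the only solver setting I change in my .prm is the one below
(assuming "Linear solver tolerance" really is the parameter this tolerance
corresponds to):

  # Top level of my .prm file (parameter name as I understand it; the
  # default of 1e-7 is the case that never finishes for me).
  set Linear solver tolerance = 1e-4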
Yes, I'm using material model averaging. I use arithmetic averaging and
didn't see a notable improvement with other schemes such as log averaging.
Besides the Steinberger radial viscosity profile, I also use a simple
four-layer viscosity model with a maximum of 1e24 and a minimum of 1e19
(see attachment). For that model the abrupt, discontinuous viscosity jumps
seem to make things harder for the solver: at 5 global refinements it needs
30+191 iterations and takes around 4 hours on 600 processes at a tolerance
of 1e-4.
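For completeness, the averaging part of my .prm looks roughly like this
(section and value names quoted from memory, so treat them as approximate):

  subsection Material model
    # cell-wise averaging of the material properties fed to the solver;
    # 'log average' is the alternative I tried without a notable difference
    set Material averaging = arithmetic average
  end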
So I can now indeed get results on finer meshes at a tolerance of 1e-4, but
convergence is still quite slow (I need long run times and several hundred
processes). I haven't yet tried more complicated viscosity structures, such
as adding the lateral variations of the Steinberger model.
Best,
Shangxin
>
>
> Shangxin,
>
> > Yes. Now I can get the result of the simple four-layer viscosity with
> > 3 global refinements at a tolerance of 10^-5. But for the Steinberger
> > radial viscosity profile, even 10^-4 cannot converge. I've tried running
> > the case with 500 and 600 CPUs and a 24-hour wall time limit; it still
> > cannot finish. I even tried 250 CPUs with 6 days but still failed.
>
> That's a lot of CPU time wasted :-(
>
>
> > Because the depth-dependent viscosity profile is a basic ingredient of
> > mantle convection, I suppose it's worth thinking about how to solve this
> > long-standing problem. If this works, then we can add other things, such
> > as an ocean/continent distinction in the scaling parameter, to improve
> > the model.
>
> There are a number of things that can go wrong. I don't think there is a
> bug per se that makes this converge so slowly (though you never know) but
> more a question of how one sets this up.
>
> So let me ask a few questions:
>
> 1/ Are you using material model averaging?
> 2/ Are you limiting the viscosity from above and below? If not, what
>    are the maximal and minimal values you encounter for the viscosity
>    in your model? If they are too far apart, would it make a difference
>    if you limited the values? For example, if you have parts of your model
>    with a viscosity of 10^18 and others where the viscosity varies between
>    10^23 and 10^26, then there is likely going to be no harm in simply
>    cutting it off at 10^23: yes, the original models would have places
>    where the velocity is going to be on the order of 10^-5 to 10^-8 of
>    the maximal velocity, but you're still getting essentially the same
>    answer if in all of these places the velocity is just 10^-5 of the max.
>    On the other hand, you just improved the conditioning of the system
>    matrix substantially. (See the small clamping sketch after this list.)
> 3/ Rather than trying to get the large model to converge right away, look
>    at a *sequence* of smaller models first. For example, consider 2 and 3
>    global refinements and look at how the number of iterations changes. If,
>    for example, the number of iterations grows from 100 to 1000 for these
>    two refinement levels, then you can expect it to become even larger on
>    refinement level 4. There may then not even be a point in trying to
>    run at level 4 because it's going to be prohibitively expensive -- no
>    need to try. In your case, I'd be interested in seeing the
>    output/statistics file from your runs on coarser meshes. That may give
>    us an indication whether it's the outer solver, or one of the inner
>    solvers, that is running into trouble in your case.
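>
> To make 2/ concrete, the cutoff can be as simple as a small clamping
> function applied to whatever viscosity the material model computes, before
> it enters the Stokes system (just a sketch; the name and the bounds are
> only illustrative):
>
>   #include <algorithm>
>
>   // Sketch only: clamp a computed viscosity into [eta_min, eta_max].
>   // The bounds 1e19 and 1e23 are merely the example values from above.
>   double clamped_viscosity (const double eta,
>                             const double eta_min = 1e19,
>                             const double eta_max = 1e23)
>   {
>     return std::min (std::max (eta, eta_min), eta_max);
>   }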
>
> I don't think that there's a magic bullet for difficult cases like yours.
> There are just many small steps one can take to improve the condition
> number of the various matrices involved.
>
> Best
> W.
>
> --
> ------------------------------------------------------------------------
> Wolfgang Bangerth email: bangerth at math.tamu.edu
> www: http://www.math.tamu.edu/~bangerth/
>
>
>
> ------------------------------
>
> Message: 2
> Date: Tue, 1 Sep 2015 09:39:50 -0400
> From: Timo Heister <heister at clemson.edu>
> To: aspect-devel <aspect-devel at geodynamics.org>
> Subject: Re: [aspect-devel] "Not converge" problem with variable viscosity.
> Message-ID: <CAMRj59GBoEeutsVLhbnEJC3_iet14ihFJc0wu9M19Td3XM8PKA at mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
>
> Wolfgang,
>
> maybe we should take over
>
> https://github.com/tjhei/aspect/commit/b5dbe7440b25df57b1b01cb3c057742fc30cb1d1
> into master for analyzing situations like this? The patch dumps the
> solver history to a file when the solver doesn't converge.
>
> We have been fighting with convergence issues of the (extended) Stokes
> system on the melt transport too and noticed:
> 1. Sometimes the preconditioner introduces large changes in the
> residual which is in fact in the pressure null space (adding a
> constant to every entry). This screws up the convergence. One hack is
> to fix a pressure unknown but in reality one would like to make sure
> that applying the preconditioner stays in the same subspace. Not sure
> what the best way to do this is (maybe we need to compute the null
> space vector, depending on the pressure element used, and remove that
> part in the preconditioner application). I had one example where I
> went from thousands to ~10 iterations but it might be more related to
> melt transport. Worth investigating. Here is the hack on the branch:
>
> https://github.com/tjhei/aspect/commit/b631ce5d425178eb2e4e953d04e684a44d9b9af2
> 2. I had to play with restart lengths of GMRES which sometimes made a
> large difference (no convergence after thousands of iterations vs a
> couple hundred). Hack:
>
> https://github.com/tjhei/aspect/commit/743d86bf02b3fbe4e91e877df81b81c70a9bef65
> Of course this might become really expensive (how to benchmark that?).
> I am hesitant to introduce dozens of solver parameters, but maybe we
> need to look into it some more.
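>
> Purely as an illustration (not necessarily what the commit above does),
> the restart length in deal.II's GMRES is the AdditionalData knob, along
> these lines:
>
>   #include <deal.II/lac/solver_control.h>
>   #include <deal.II/lac/solver_gmres.h>
>   #include <deal.II/lac/trilinos_parallel_block_vector.h>
>
>   using namespace dealii;
>
>   // Sketch: raise the GMRES restart length from the default of 30
>   // temporary vectors to 100. More memory, but sometimes many fewer
>   // iterations on badly conditioned Stokes systems.
>   SolverControl control (1000, 1e-7);
>   SolverGMRES<TrilinosWrappers::MPI::BlockVector>::AdditionalData
>     gmres_data (/* max_n_tmp_vectors = */ 100);
>   SolverGMRES<TrilinosWrappers::MPI::BlockVector> solver (control, gmres_data);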
>
> I think it would be great to have a set of "hard" problems to play
> with (that don't require a large parallel computation).
>
> I will look into all of this soon, but I am super busy with other
> stuff these days.
>
> --
> Timo Heister
> http://www.math.clemson.edu/~heister/
>
>
> ------------------------------
>
> Message: 3
> Date: Tue, 1 Sep 2015 11:02:35 -0500
> From: Wolfgang Bangerth <bangerth at tamu.edu>
> To: aspect-devel at geodynamics.org
> Subject: Re: [aspect-devel] "Not converge" problem with variable viscosity.
> Message-ID: <55E5CC1B.2010600 at tamu.edu>
> Content-Type: text/plain; charset=utf-8; format=flowed
>
>
> > maybe we should take over
> > https://github.com/tjhei/aspect/commit/b5dbe7440b25df57b1b01cb3c057742fc30cb1d1
> > into master for analyzing situations like this? The patch dumps the
> > solver history to a file when the solver doesn't converge.
>
> Yes, I think something along these lines is useful.
>
> You don't need the member variable of the class, nor the member
> function. Something of the kind
>
>   namespace
>   {
>     void record_solver_state (const unsigned int iteration,
>                               const double check_value,
>                               const LinearAlgebra::BlockVector &current_iterate,
>                               std::vector<double> &solver_history)
>     { ...as before... }
>   }
>
> should work, if you connect it via
>
>   solver.connect (std_cxx11::bind (&record_solver_state,
>                                    std_cxx11::_1,
>                                    std_cxx11::_2,
>                                    std_cxx11::_3,
>                                    std_cxx11::ref(solver_history)));
>
> This way, solver_history can simply be a local variable to the function
> you're in.
>
> The problem Shangxin is seeing is of course that not only does it not
> converge, it simply never terminates either. Though presumably one could
> make it terminate by setting the maximal number of iterations to
> something reasonable, and if it exceeds that, see it fail.
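>
> Concretely, that just means giving the SolverControl that drives the
> outer solver a finite iteration limit; the numbers below are purely
> illustrative:
>
>   #include <deal.II/lac/solver_control.h>
>
>   using namespace dealii;
>
>   // Sketch: allow at most 5000 outer iterations. If the residual has not
>   // dropped below the tolerance by then, the solver throws a
>   // SolverControl::NoConvergence exception rather than running forever.
>   SolverControl solver_control (5000, 1e-7);
>
> That way a diverging run fails quickly instead of eating a 24-hour
> allocation.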
>
>
> > We have been fighting with convergence issues of the (extended) Stokes
> > system on the melt transport too and noticed:
> > 1. Sometimes the preconditioner introduces large changes in the
> > residual which is in fact in the pressure null space (adding a
> > constant to every entry). This screws up the convergence. One hack is
> > to fix a pressure unknown but in reality one would like to make sure
> > that applying the preconditioner stays in the same subspace. Not sure
> > what the best way to do this is (maybe we need to compute the null
> > space vector, depending on the pressure element used, and remove that
> > part in the preconditioner application). I had one example where I
> > went from thousands to ~10 iterations but it might be more related to
> > melt transport. Worth investigating. Here is the hack on the branch:
> > https://github.com/tjhei/aspect/commit/b631ce5d425178eb2e4e953d04e684a44d9b9af2
>
> I think that might generally be useful (but don't you have to restrict
> it to the incompressible case?). We normalize the pressure later anyway.
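>
> For the incompressible case, removing that component after each
> preconditioner application would amount to nothing more than subtracting
> the mean of the pressure block, roughly like the sketch below (the
> function name and the block index are only illustrative):
>
>   #include <deal.II/lac/trilinos_parallel_block_vector.h>
>
>   using namespace dealii;
>
>   // Sketch: project the constant mode out of the pressure block (block 1
>   // here) of a distributed block vector. Only meaningful if the pressure
>   // null space really is the constant function, i.e. incompressible flow.
>   void remove_constant_pressure_mode (TrilinosWrappers::MPI::BlockVector &v)
>   {
>     v.block(1).add (-v.block(1).mean_value());
>   }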
>
>
> > 2. I had to play with restart lengths of GMRES which sometimes made a
> > large difference (no convergence after thousands of iterations vs a
> > couple hundred). Hack:
> > https://github.com/tjhei/aspect/commit/743d86bf02b3fbe4e91e877df81b81c70a9bef65
> > Of course this might become really expensive (how to benchmark that?).
> > I am hesitant to introduce dozens of solver parameters, but maybe we
> > need to look into it some more.
>
> Yes, I think it would be worthwhile playing with this.
>
> Best
> W.
>
>
> --
> ------------------------------------------------------------------------
> Wolfgang Bangerth email: bangerth at math.tamu.edu
> www: http://www.math.tamu.edu/~bangerth/
>
>
-------------- next part --------------
Attachments:
  steinberger-radial-visc.eps (application/postscript, 29528 bytes)
  <http://lists.geodynamics.org/pipermail/aspect-devel/attachments/20150901/c9f63769/attachment-0002.eps>
  four_layers.eps (application/postscript, 15910 bytes)
  <http://lists.geodynamics.org/pipermail/aspect-devel/attachments/20150901/c9f63769/attachment-0003.eps>