[aspect-devel] "Not converge" problem with variable viscosity.

Shangxin Liu sxliu at vt.edu
Thu Aug 6 13:10:04 PDT 2015


Hi Jacky,

Yes. I can now get a result for the simple four-layer viscosity case with 3
global refinements and a tolerance of 10^-5. But for the Steinberger radial
viscosity profile, even 10^-4 does not converge. I've tried running the case
with 500 and 600 CPUs and a 24-hour wall time limit, and it still cannot
finish. I even tried 250 CPUs for 6 days, but it still failed. I know CitcomS
does not need such a long run time or such a large number of CPUs for a
single time step, so I suspect there is a problem in the computational
algorithm of ASPECT's Stokes solver. Also, even if a tolerance of 1e-4 or
1e-5 eventually works, we cannot be sure it leads to a reasonable result,
since a safe tolerance for mantle convection is normally on the order of
10^-6.

Because a depth-dependent viscosity profile is a basic ingredient of mantle
convection models, I think it is worth our effort to solve this long-standing
problem. Once this works, we can add other features, such as an
ocean/continent distinction in the scaling parameter, to improve the model.

Actually, I'm not sure whether this convergence problem is a deal.II or an
ASPECT issue. But at the hackathon Ying He mentioned to me that deal.II has
some shortcomings in dealing with jumps or discontinuous viscosity
structures, and that she was working on an improvement; it is related to the
discontinuous Galerkin method, if I recall correctly.

Another possible temporary solution is a new cell material averaging scheme
based on a log (geometric) average, which I'm working on now. Hopefully this
will help with the complicated viscosity structure.
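
To make concrete what I mean by log averaging: instead of taking the
arithmetic mean of the viscosities evaluated at the quadrature points of a
cell, average them in log space (a geometric mean), so that a single value
that is orders of magnitude larger does not dominate the cell average. A
minimal C++ sketch of the idea follows; it only illustrates the averaging
rule, not the actual ASPECT implementation, and the function name and
interface are made up:

    #include <cmath>
    #include <vector>

    // Geometric ("log") average of the viscosities evaluated at the
    // quadrature points of one cell. Hypothetical helper, only meant to
    // illustrate the averaging rule, not ASPECT's real code.
    double log_average_viscosity (const std::vector<double> &eta_q)
    {
      double sum_log = 0.0;
      for (const double eta : eta_q)
        sum_log += std::log (eta);               // accumulate log(eta) over the cell
      return std::exp (sum_log / eta_q.size ()); // exp of mean log = geometric mean
    }

For example, viscosities of 1e20 and 1e24 average to 1e22 instead of the
arithmetic mean of about 5e23, which compresses the viscosity contrast that
the Stokes preconditioner has to handle.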

Best,

Shangxin

On Thu, Aug 6, 2015 at 3:05 PM, Austermann, Jacqueline <
jaustermann at fas.harvard.edu> wrote:

> Hi Shangxin,
>
> I agree with you that the step that takes long is probably the Stokes
> solver, not the initialization or post-processing, and that convergence
> becomes much slower with more complex viscosity profiles. However, I've run
> global models with global resolution 4, a viscosity profile similar to
> yours, and a linear solver tolerance of 10^-4, and they converge. I had one
> run for 30 or so timesteps (the first ones take the longest) with a CFL
> number of 0.2. I don't recall the exact number of CPUs and runtime for
> that, but probably something like 600 CPUs for a day or so.
> Could you specify how many CPUs you are using and for how long you are
> running it before it times out?
>
> Thanks!
> Jacky
>
>
> On Aug 6, 2015, at 2:44 PM, Shangxin Liu wrote:
>
> Hi;
>
> Just following up on the old problem that our model runs very slowly and
> cannot converge with variable viscosity at high global refinement (> 2). We
> used to think this was due to the time-consuming computation of the S40RTS
> initial condition, but I have tried other initial conditions and the model
> still does not converge at 3 or higher global refinement. We are using
> several depth-dependent viscosity models, such as the radial viscosity
> profile of Steinberger (only the radial viscosity part of the Steinberger
> model) and a simple four-layer viscosity profile from the lithosphere to
> the CMB. The common behavior is that they all converge quite fast up to 2
> global refinements. However, when we make the mesh denser with 3 or higher
> global refinement, the code just keeps running and never finishes
> (converges).
>
> I have tried loosening the tolerance to 10^-4, but with 3 or higher global
> refinement the code still does not finish before the wall time limit. I
> have also ruled out the large number of postprocessors as the cause,
> because when I run the code with constant viscosity, or prescribe a narrow
> viscosity range of 1e20-1e22, it converges very fast. I also used the cell
> material averaging operation in the Material Model subsection, but that
> does not solve the problem either.
>
> I suppose this is also why the Steinberger material model with higher (> 2)
> global refinement takes a very long time and cannot finish.
>
> Attached are our prm file and the two viscosity profiles we're using. Any
> suggestions for solving this long-standing problem?
>
> Best,
>
> Shangxin
>

