[aspect-devel] problem with ASPECT - the iterative Stokes solver did not converge
Petar Glisovic
pglisovic at gmail.com
Wed Aug 6 05:18:36 PDT 2014
Hi Timo,
It works very well without adaptive refinement:
> set Initial global refinement = 2
> set Initial adaptive refinement = 0
> set Strategy = viscosity
> set Time steps between mesh refinement = 0
>
For example, I ran a test using 12 CPUs and a 2-hr wall time, and the code
processed ~5 Myr with no problems.
> Timestep 72: t=5.13532e+06 years
> Solving temperature system... 15 iterations.
> Rebuilding Stokes preconditioner...
> Solving Stokes system... 30+14 iterations.
>
I have, however, noticed that the code crashes without an error message when
using a larger Initial global refinement (=4). Following your comment, "I
assume the job gets killed because it uses too much memory", I ran another
test using 48 CPUs, and here are my observations (some of them are trivial,
I hope you don't mind):
1. The number of active cells and the number of degrees of freedom for
Initial global refinement=4 are much greater than those for Initial global
refinement=2
> Initial global refinement=2
> Number of active cells: 6144 (on 3 levels)
> Number of degrees of freedom: 215962 (156774+6930+52258)
>
> Initial global refinement=4
> Number of active cells: 393216 (on 5 levels)
> Number of degrees of freedom: 13185610 (9585030+405570+3195010)
>
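As a sanity check on point 1, the growth in problem size follows directly from uniform 3-D refinement: each additional global refinement level splits every hexahedral cell into 8 children. A small sketch (my own illustration, not ASPECT code; the function name is made up) against the numbers above:

```python
# Rough sketch (not ASPECT code): each global refinement level in 3D
# splits every hexahedral cell into 8 children, so two extra levels
# multiply the active cell count by 8^2 = 64.

def cells_at_level(base_cells: int, base_level: int, target_level: int) -> int:
    """Active cells after uniformly refining from base_level to target_level."""
    return base_cells * 8 ** (target_level - base_level)

# Numbers reported above: 6144 cells at Initial global refinement=2.
print(cells_at_level(6144, 2, 4))  # 393216, matching the refinement-4 run
```

The degrees of freedom grow by nearly the same factor (13185610 / 215962 ≈ 61), which is consistent with Timo's suspicion that the refinement=4 job simply ran out of memory.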
2. The results for the first time step (Timestep=0) are quite different
> Initial global refinement=2
> RMS, max velocity: 0.185 m/year, 0.678 m/year
> Temperature min/avg/max: 300 K, 1923 K, 3565 K
> Heat fluxes through boundary parts: -1.685e+13 W, 1.59e+13 W
>
> Initial global refinement=4
> RMS, max velocity: 0.0216 m/year, 0.0846 m/year
> Temperature min/avg/max: 300 K, 1977 K, 3565 K
> Heat fluxes through boundary parts: -6.404e+13 W, 6.162e+13 W
>
3. There is a long delay in postprocessing for Initial global refinement=4
> -rw-r--r-- 1 glisovic sxf-902-01    30 Aug  6 06:51 solution.visit
> -rw-r--r-- 1 glisovic sxf-902-01 29928 Aug  6 07:19 depth_average.gnuplot
>
i.e., there is a difference of ~30 min between writing the solution file and
the depth-average file.
4. A simulation with Initial global refinement=4 advances only ~1.7 Myr in 2
time steps over 2 hrs of wall time
> Timestep 2: t=1.70568e+06 years
> Solving temperature system... 16 iterations.
> Rebuilding Stokes preconditioner...
> Solving Stokes system... 30+23 iterations.
>
This implies a significant difference in time step size between these two
global refinement levels.
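For what it's worth, ASPECT's time step is CFL-limited, roughly dt ~ C * h / v_max, so both the smaller cells and the much smaller velocities at refinement 4 enter. A back-of-the-envelope check (my own sketch, assuming the cell size halves with each refinement level) using the Timestep 0 maximum velocities quoted above:

```python
# Back-of-the-envelope CFL estimate (my own sketch, not ASPECT code):
# dt ~ C * h / v_max, so between the two runs
#     dt_4 / dt_2 = (h_4 / h_2) / (v_4 / v_2).

h_ratio = 0.5 ** (4 - 2)      # two extra refinement levels halve h twice -> 1/4
v_ratio = 0.0846 / 0.678      # max velocities reported at Timestep 0

dt_ratio = h_ratio / v_ratio
print(round(dt_ratio, 2))     # ~2.0
```

So the much smaller velocities alone would allow a CFL step about twice as large despite the finer mesh; the rest of the difference in step size presumably comes from how the flow evolves after Timestep 0.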
These are all for a constant viscosity profile (=1e21) with the temperature
dependence set to 0, i.e., for the following Material model
> subsection Material model
>   set Model name = Steinberger
>   subsection Steinberger model
>     set Bilinear interpolation = true
>     set Compressible = true
>     set Data directory =
> /RQusagers/glisovic/aspect/data/material-model/steinberger/test-steinberger-compressible/
>     set Latent heat = false
>     set Lateral viscosity file name = test-viscosity-prefactor.txt
>     set Material file names = pyr-ringwood88.txt
>     set Radial viscosity file name = test-radial-visc.txt
>     set Reference viscosity = 1e22
>   end
> end
>
and the Linear solver tolerance is 1e-7.
Just to add: using the same viscosity profile that I am trying to test here
with Aspect, Glisovic et al. (GJI, 2012) suggested that "a fully resolved
solution of thermal convection dynamics in the upper mantle may require a
horizontal expansion of the field variables up to spherical harmonic degree
512 (corresponding to a horizontal resolution scale of less than half a
degree on the sphere)." So, maybe it is a resolution issue after all?
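To put that degree-512 figure in grid terms (my own sketch, using the usual half-wavelength rule of thumb for spherical harmonics):

```python
import math

# Rule-of-thumb resolution of spherical harmonic degree l (my own sketch):
# the half-wavelength on the sphere is about 180 degrees / l.

def half_wavelength_deg(l: int) -> float:
    return 180.0 / l

res_deg = half_wavelength_deg(512)
res_km = math.radians(res_deg) * 6371.0   # arc length on Earth's surface

print(round(res_deg, 2))  # ~0.35 degrees, i.e. "less than half a degree"
print(round(res_km))      # ~39 km horizontal scale
```

That ~39 km horizontal scale is quite a bit finer than the cells in the refinement-4 run, which would support the resolution explanation.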
Petar
On Tue, Aug 5, 2014 at 5:02 PM, Timo Heister <heister at clemson.edu> wrote:
> I wonder if this problem is somehow related to the bug
> https://github.com/geodynamics/aspect/issues/123 that we never solved.
>
> Petar, does it crash without adaptive refinement?
>
> On Tue, Aug 5, 2014 at 4:09 PM, Petar Glisovic <pglisovic at gmail.com>
> wrote:
> > Hi Jacky,
> >
> > Thank you for your reply.
> >
> > I have tried three different values for Linear solver tolerance: 1e-1,
> > 1e-4, and 1e-7. A tolerance of 1e-1 is the only one that can get past the
> > initial time step and produce solutions for the first 1 Myr:
> > *** Timestep 19: t=996804 years
> > Solving temperature system... 17 iterations.
> > Rebuilding Stokes preconditioner...
> > Solving Stokes system... 15 iterations.
> >
> > Postprocessing:
> > RMS, max velocity: 0.106 m/year, 0.869 m/year
> > Temperature min/avg/max: 300 K, 1892 K, 3565 K
> > Heat fluxes through boundary parts: -2.335e+13 W, 1.463e+13 W
> >
> > *** Timestep 20: t=1.0422e+06 years
> > Solving temperature system... 16 iterations.
> > Rebuilding Stokes preconditioner...
> > Solving Stokes system... 12 iterations.
> >
> > Considering the tutorial:
> >>
> >> In practice, you should choose the value of this parameter to be so that
> >> if you make it smaller the results of your simulation do not change any
> >> more (qualitatively) whereas if you make it larger, they do. For most
> >> cases, the default value should be sufficient. In fact, a tolerance of
> >> 1e-4 might be accurate enough.
> >
> > I am not sure that a tolerance of 1e-1 is able to produce accurate
> > solutions for 3-D models.
> >
> > Best regards,
> > Petar
> >
> >> Austermann, Jacqueline jaustermann at fas.harvard.edu
> >> Mon Jul 28 05:53:37 PDT 2014
> >>
> >> Hi Petar,
> >>
> >> I recently did the same run (Steinberger material model on a 3D spherical
> >> shell) and ran into similar problems as you. To all the helpful things
> >> Rene mentioned I want to add one (maybe obvious) point, which is the
> >> option to set the Linear solver tolerance up. In the end you probably
> >> want to set it down again, but for just doing test runs and checking
> >> whether e.g. your material model is input correctly, it helped me a lot.
> >> I also found that Steinberger & Calderwood has a complexity that went
> >> beyond what I wanted to use, and simplifying some of the parameters (not
> >> viscosity) sped up the calculation, too.
> >>
> >> Cheers,
> >> Jacky
> >>>
> >>> On 08/04/2014 10:16 PM, Petar Glisovic wrote:
> >>>
> >>> > @Wolfgang or @list moderator
> >>> > I didn't receive either Rene's or Jacky's message. Checking the
> >>> > aspect-devel archives, I found these two messages intended for me.
> >>> > However, I have now subscribed to the aspect-devel list, and I hope
> >>> > this will prevent communication problems in the future.
> >>> >
> >>> > @Rene
> >>> > Thank you for your detailed reply.
> >>> >
> >>> > 1. It seems the Adiabatic surface temperature should be 1600 K by
> >>> > default. However, this alone is not enough to fix the convergence of
> >>> > the iterative Stokes solver.
> >>> >
> >>> > 2. Convergence in 2D could not be achieved either, but
> >>> >
> >>> > 3. a simple material model with Thermal viscosity exponent=10 worked
> >>> > quite well; therefore, the resolution is adequate after all, is it?
> >>> >
> >>> > 4. However, I ran a test setting the global refinement in the 3D model
> >>> > to 5 and the adaptive refinement to 0, and it seems nothing happened
> >>> > (the job finished after ~30 sec with no error message); this is the
> >>> > log file:
> >>> >
> >>> >> Number of active cells: 3145728 (on 6 levels)
> >>> >> Number of degrees of freedom: 104645770 (76088070+3195010+25362690)
> >>> >>
> >>> >
> >>> > 5. If I set the temperature dependence to 0 as you explained and use a
> >>> > constant radial viscosity profile test-radial-visc.txt (inside
> >>> > aspect/data/material-model/steinberger/test-steinberger-compressible/),
> >>> > then the code works for a while (13 time steps, to be precise),
> >>> >
> >>> >> *** Timestep 13: t=343515 years
> >>> >> Solving temperature system... 19 iterations.
> >>> >> Rebuilding Stokes preconditioner...
> >>> >> Solving Stokes system... 30+15 iterations.
> >>> >>
> >>> >> Postprocessing:
> >>> >> RMS, max velocity: 0.583 m/year, 3.54 m/year
> >>> >> Temperature min/avg/max: 300 K, 1934 K, 3565 K
> >>> >>
> >>> >> Heat fluxes through boundary parts: -2.756e+13 W, 2.332e+13 W
> >>> >>
> >>> >
> >>> > and after that I received the following error message:
> >>> >
> >>> >> An error occurred in line <3369> of file
> >>> >> </RQusagers/glisovic/deal.II/include/deal.II/lac/trilinos_sparse_matrix.h>
> >>> >> in function
> >>> >>   void dealii::TrilinosWrappers::SparseMatrix::add(
> >>> >>     dealii::TrilinosWrappers::SparseMatrix::size_type,
> >>> >>     dealii::TrilinosWrappers::SparseMatrix::size_type,
> >>> >>     const size_type*, const double*, bool, bool)
> >>> >> The violated condition was:
> >>> >>   numbers::is_finite(values[j])
> >>> >> The name and call sequence of the exception was:
> >>> >>   ExcNumberNotFinite()
> >>> >> Additional Information:
> >>> >>   (none)
> >>> >
> >>> > So, help! No idea how to move forward.
> >>> >
> >>> > Many thanks,
> >>> > Petar
> >>> >
> >>> > Rene Gassmoeller rengas at gfz-potsdam.de
> >>> > Sun Jul 27 23:19:21 PDT 2014
> >>> >
> >>> > Hi Petar,
> >>> >> from a first glance the parameters you used for your model look good,
> >>> >> but there is one thing you should test. In your parameter file you
> >>> >> did not set the Adiabatic surface temperature, because it is not
> >>> >> needed for the simple material model, and it defaults to 0
> >>> >> (@Wolfgang: We should maybe change this to a reasonable value like
> >>> >> 1600? Many of the more complex plugins rely on a good adiabatic
> >>> >> profile). The Steinberger material model, and also your harmonic
> >>> >> perturbation initial condition, rely on the initially computed
> >>> >> adiabatic profile to compute their properties, therefore this might
> >>> >> mess up your calculation. I would guess this is already the problem.
> >>> >>
> >>> >> However, there are a few other things that you might try to find the
> >>> >> reason for the convergence problems if the above did not help. First,
> >>> >> you could test the convergence in 2D; in your model setup that is as
> >>> >> simple as setting Dimension = 2. And you could try a higher
> >>> >> resolution in 2D. Both options should show significantly better
> >>> >> convergence, so you have at least a starting point.
> >>> >>
> >>> >> Then you could test a temperature dependency in the simple material
> >>> >> model (e.g. set Thermal viscosity exponent to something like 10). If
> >>> >> this turns out to be a problem, then your resolution might be too low
> >>> >> to resolve the viscosity changes you are prescribing.
> >>> >>
> >>> >> If it turns out to be a resolution issue, try setting the global
> >>> >> refinement in the 3D model to 5 and the adaptive refinement to 0.
> >>> >> Aspect will still coarsen and adaptively refine again later on; just
> >>> >> the initial convergence should be much better.
> >>> >>
> >>> >> You mentioned that you tested a constant viscosity profile with the
> >>> >> Steinberger material model, but in the parameter file it looks like
> >>> >> you only modified the radial dependence? Did you also set the
> >>> >> temperature dependence to 0? You could do that by setting 'Lateral
> >>> >> viscosity file name' to our test file in
> >>> >> data/material-model/steinberger/test-steinberger-compressible/test-viscosity-prefactor.txt.
> >>> >> With a constant radial and lateral viscosity, the convergence should
> >>> >> be extremely fast.
> >>> >>
> >>> >> Let us know if you have any further questions.
> >>> >>
> >>> >> In fact I have a question for you. We are constantly looking to
> >>> >> improve
> >>> >> the available options in Aspect and include new models. Could you
> ask
> >>> >> Alessandro, whether he would be willing to share the viscosity
> profile
> >>> >> you used with the general Aspect users? In that case we could
> include
> >>> >> it
> >>> >> in the official version as an alternative to the existing one by
> >>> >> Steinberger & Calderwood.
> >>> >>
> >>> >> Best,
> >>> >> Rene
> >>> >>
> >>> >>
> >>> >
> >>> >
> >>> > On Fri, Jul 25, 2014 at 6:22 PM, Petar Glisovic
> >>> > <pglisovic at gmail.com> wrote:
> >>> >
> >>> >> Wolfgang,
> >>> >>
> >>> >> First of all, thank you for your quick reply.
> >>> >>
> >>> >> This is almost certainly an unrelated observation, but with as few
> >>> >>> unknowns as you have, you should first run small models using only a
> >>> >>> single processor when you try things out.
> >>> >>>
> >>> >>> My rule of thumb is that you can run models with a few 100,000
> >>> >>> unknowns on a single processor. For parallel computations, you
> >>> >>> typically don't gain anything if you have fewer than 50,000 unknowns
> >>> >>> per processor -- i.e., in your case, 4 or 5 processors is about the
> >>> >>> optimum.
> >>> >>>
> >>> >>
> >>> >> Thanks for this tip.
> >>> >>
> >>> >>
> >>> >>
> >>> >>>> subsection Material model
> >>> >>>>   set Model name = Steinberger
> >>> >>>>   subsection Steinberger model
> >>> >>>>     set Bilinear interpolation = true
> >>> >>>>     set Compressible = true
> >>> >>>
> >>> >>> What happens if you set this to false?
> >>> >>>
> >>> >>
> >>> >> The outcome of this action is:
> >>> >> --------------------------------------------------------
> >>> >> An error occurred in line <3369> of file
> >>> >> </RQusagers/glisovic/deal.II/include/deal.II/lac/trilinos_sparse_matrix.h>
> >>> >> in function
> >>> >>   void dealii::TrilinosWrappers::SparseMatrix::add(
> >>> >>     dealii::TrilinosWrappers::SparseMatrix::size_type,
> >>> >>     dealii::TrilinosWrappers::SparseMatrix::size_type,
> >>> >>     const size_type*, const double*, bool, bool)
> >>> >> The violated condition was:
> >>> >>   numbers::is_finite(values[j])
> >>> >> The name and call sequence of the exception was:
> >>> >>   ExcNumberNotFinite()
> >>> >> Additional Information:
> >>> >>   (none)
> >>> >> ...
> >>> >> The complete log file is attached below.
> >>> >>
> >>> >> Petar
> >>> >>
> >>> >>
> >>> >>
> >>> >> On Fri, Jul 25, 2014 at 4:14 PM, Wolfgang Bangerth
> >>> >> <bangerth at math.tamu.edu> wrote:
> >>> >>
> >>> >>>
> >>> >>> Petar,
> >>> >>> let me pull this discussion to the mailing list since the authors
> of
> >>> >>> the
> >>> >>> Steinberger model are there.
> >>> >>>
> >>> >>>
> >>> >>>> I have experienced a problem running ASPECT. It seems that the
> >>> >>>> iterative Stokes solver did not converge after three days of the
> >>> >>>> job running on 48 CPUs - please see the attached log files
> >>> >>>> (/aspect.e2477653, aspect.o2477653 & log.txt/).
> >>> >>>>
> >>> >>>
> >>> >>> This is almost certainly an unrelated observation, but with as few
> >>> >>> unknowns as you have, you should first run small models using only a
> >>> >>> single processor when you try things out.
> >>> >>>
> >>> >>> My rule of thumb is that you can run models with a few 100,000
> >>> >>> unknowns on a single processor. For parallel computations, you
> >>> >>> typically don't gain anything if you have fewer than 50,000 unknowns
> >>> >>> per processor -- i.e., in your case, 4 or 5 processors is about the
> >>> >>> optimum.
> >>> >>>
> >>> >>>> However, my goal is to use the depth-dependent (i.e., radial)
> >>> >>>> viscosity profile (Mitrovica & Forte, 2004; please see attached
> >>> >>>> file: /mitro-forte-2004-radial-visc-new.txt/) jointly with lateral
> >>> >>>> variations in viscosity (see below /shell_simple_3d_new.prm/). For
> >>> >>>> example, to use something similar to the Steinberger model:
> >>> >>>>
> >>> >>>> subsection Material model
> >>> >>>>   set Model name = Steinberger
> >>> >>>>   subsection Steinberger model
> >>> >>>>     set Bilinear interpolation = true
> >>> >>>>     set Compressible = true
> >>> >>>>
> >>> >>>
> >>> >>> What happens if you set this to false?
> >>> >>>
> >>> >>> I don't know the parameters to the Steinberger model, so can't
> >>> >>> comment on
> >>> >>> their validity. I hope someone else on the mailing list may be able
> >>> >>> to
> >>> >>> provide more feedback.
> >>> >>>
> >>> >>> Best
> >>> >>> Wolfgang
> >>> >>> --
> >>> >>>
> >>> >>>
> >>> >>> ------------------------------------------------------------------
> >>> >>> Wolfgang Bangerth    email: bangerth at math.tamu.edu
> >>> >>>                      www:   http://www.math.tamu.edu/~bangerth/
> >>> >>>
> >>> >>>
> >>> >>
> >>> >>
> >>> >> --
> >>> >> Petar Glisovic, Dr.
> >>> >> postdoctoral fellow
> >>> >>
> >>> >> address: GEOTOP, Université du Québec à Montréal, CP 8888, succursale
> >>> >> Centre-Ville, Montréal, Québec, Canada H3C 3P8 | email:
> >>> >> pglisovic at gmail.com & glisovic.petar at courrier.uqam.ca | phone:
> >>> >> 1-514-987-3000 #3592 & #4600 | fax: 1-514-987-3635
> >>> >>
> >>> >
> >>> >
> >>> >
> >>
> >>
> >>
> >>
> >> --
> >> Petar Glisovic, Dr.
> >> postdoctoral fellow
> >>
> >> address: GEOTOP, Université du Québec à Montréal, CP 8888, succursale
> >> Centre-Ville, Montréal, Québec, Canada H3C 3P8 | email:
> >> pglisovic at gmail.com & glisovic.petar at courrier.uqam.ca | phone:
> >> 1-514-987-3000 #3592 & #4600 | fax: 1-514-987-3635
> >
> >
> >
> >
> > --
> > Petar Glisovic, Dr.
> > postdoctoral fellow
> >
> > address: GEOTOP, Université du Québec à Montréal, CP 8888, succursale
> > Centre-Ville, Montréal, Québec, Canada H3C 3P8 | email:
> > pglisovic at gmail.com & glisovic.petar at courrier.uqam.ca | phone:
> > 1-514-987-3000 #3592 & #4600 | fax: 1-514-987-3635
>
>
>
> --
> Timo Heister
> http://www.math.clemson.edu/~heister/
>
--
Petar Glisovic, Dr.
postdoctoral fellow
address: GEOTOP, Université du Québec à Montréal, CP 8888, succursale
Centre-Ville, Montréal, Québec, Canada H3C 3P8 | email: pglisovic at gmail.com
& glisovic.petar at courrier.uqam.ca | phone: 1-514-987-3000 #3592 & #4600 |
fax: 1-514-987-3635