[aspect-devel] convergence failure

Mohamed Gouiza M.Gouiza at leeds.ac.uk
Sat Mar 12 16:38:09 PST 2016


Hi Jonathan,

Thank you for the recommendation.
I will try another material model. Do you have any advice on which one to use instead of Morency and Doin?

Regards,
Mohamed

On Mar 11, 2016, at 7:03 PM, Jonathan Perry-Houts <jperryh2 at uoregon.edu> wrote:

Hi Mohamed,

The Morency Doin material model has always been problematic. (I can
verify that your model also fails to converge on my computer, for what
it's worth.)

I'll look into it more later, but to be honest my recommendation is to
avoid using that material model. It's a good starting point for building
your own nonlinear material model, but for some reason it's always been
trouble in practice.
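
As a sanity check, it can help to swap in one of the simpler built-in
models first and confirm that the rest of the setup converges. A minimal
sketch, with placeholder values that are not tuned to your problem:

  subsection Material model
    set Model name = simple            # constant-viscosity baseline in place of "Morency and Doin"
    subsection Simple model
      set Reference density = 3300     # placeholder value
      set Viscosity         = 1e21     # placeholder value
    end
  end

If that runs cleanly, the trouble is isolated to the nonlinear rheology
rather than to the geometry or boundary conditions.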

Sorry not to be more help. Maybe after looking more closely at your
model I'll have more suggestions, but at first glance it doesn't look
like you've changed anything too significant from the cookbook.

-Jonathan

On 03/11/2016 12:34 AM, Mohamed Gouiza wrote:
Hi All,

I am trying to run a modified version of the Morency & Doin example.

The major modification I made is to the material model, which now
consists of a crust (30 km thick) and a mantle lithosphere with variable
thickness (170 km thick for 0 < x < 1000 km; 90 km thick for 1000 km < x < 2000 km).
I also modified the temperature model accordingly: the temperature
increases linearly down to the base of the lithosphere, where it reaches
1330 C; the temperature in the sublithospheric mantle is constant at
1330 C; and the temperature increases linearly again in the lower mantle.
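
In outline, the compositional layering looks like the sketch below (a
simplified sketch, not a verbatim copy of the attached m-cvx-003.prm; the
600 km model height in the expressions is an assumption, and subsection
names are as in the 1.x series):

  subsection Compositional fields
    set Number of fields = 2
    set Names of fields  = crust, lithosphere
  end

  subsection Compositional initial conditions
    set Model name = function
    subsection Function
      set Variable names      = x,y
      # crust: uppermost 30 km; mantle lithosphere below it,
      # 170 km thick for x < 1000 km and 90 km thick beyond
      set Function expression = if(y > 600e3 - 30e3, 1, 0) ; \
                                if(y <= 600e3 - 30e3 && y > 600e3 - 30e3 - if(x < 1000e3, 170e3, 90e3), 1, 0)
    end
  end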

The model doesn’t converge on timestep 0 (t=0 years).

I increased the Max nonlinear iterations to 100,000; it didn’t work.

I changed the solver scheme from iterated IMPES to iterated Stokes; it
didn’t work.
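
For reference, those two changes in .prm form (parameter names as in the
1.x series):

  set Max nonlinear iterations = 100000
  set Nonlinear solver scheme  = iterated Stokes   # previously: iterated IMPES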

Am I doing something wrong?

I’ve attached the parameter file as well (m-cvx-003.prm).

Thank you for your help.





## Hosts assigned to job 404757.1:

##

## h3s3b12.arc2.leeds.ac.uk 16 slots

## h3s3b14.arc2.leeds.ac.uk 16 slots

## h3s3b15.arc2.leeds.ac.uk 16 slots

## h6s2b4.arc2.leeds.ac.uk 16 slots

##

## Resources granted:

##

## h_vmem = 2G (per slot)

## h_rt   = 48:00:00



## nodes/ppn parallel request:

##

## nodes = 4     np  = 64

## ppn   = 16    tpp = 1



-----------------------------------------------------------------------------

-- This is ASPECT, the Advanced Solver for Problems in Earth's ConvecTion.

--     . version 1.4.0-pre

--     . running in DEBUG mode

--     . running with 64 MPI processes

--     . using Trilinos

-----------------------------------------------------------------------------





-----------------------------------------------------------------------------

The output directory <output/> provided in the input file appears not to
exist.

ASPECT will create it for you.

-----------------------------------------------------------------------------





Number of active cells: 4,096 (on 6 levels)

Number of degrees of freedom: 87,782 (33,410+4,257+16,705+16,705+16,705)



*** Timestep 0:  t=0 years

  Solving temperature system... 0 iterations.

  Solving crust system ... 0 iterations.

  Solving lithosphere system ... 0 iterations.

  Rebuilding Stokes preconditioner...

  Solving Stokes system... 0+82 iterations.

  Residual after nonlinear iteration 1: 1



  Rebuilding Stokes preconditioner...

  Solving Stokes system...





----------------------------------------------------

Exception on processing:



--------------------------------------------------------

An error occurred in line <760> of file
</home/ufaserv1_k/earmgo/safe/aspect/source/simulator/solver.cc> in function

   double aspect::Simulator<dim>::solve_stokes() [with int dim = 2]

The violated condition was:

   false

The name and call sequence of the exception was:

   ExcMessage (std::string("The iterative Stokes solver " "did not
converge. It reported the following error:\n\n") + exc.what() + "\n See
" + parameters.output_directory+"solver_history.txt" + " for convergence
history.")

Additional Information:

The iterative Stokes solver did not converge. It reported the following
error:





--------------------------------------------------------

An error occurred in line <1216> of file
</home/ufaserv1_k/earmgo/safe/dealii/include/deal.II/lac/solver_gmres.h>
in function

   void dealii::SolverFGMRES<VectorType>::solve(const MatrixType&,
VectorType&, const VectorType&, const PreconditionerType&) [with
MatrixType = aspect::internal::StokesBlock; PreconditionerType =
aspect::internal::BlockSchurPreconditioner<dealii::TrilinosWrappers::PreconditionAMG,
dealii::TrilinosWrappers::PreconditionILU>; VectorType =
dealii::TrilinosWrappers::MPI::BlockVector]

The violated condition was:

   false

The name and call sequence of the exception was:

   SolverControl::NoConvergence (accumulated_iterations, res)

Additional Information:

Iterative method reported convergence failure in step 37667. The
residual in the last step was 1.50021e+12.



This error message can indicate that you have simply not allowed a
sufficiently large number of iterations for your iterative solver to
converge. This often happens when you increase the size of your problem.
In such cases, the last residual will likely still be very small, and
you can make the error go away by increasing the allowed number of
iterations when setting up the SolverControl object that determines the
maximal number of iterations you allow.



The other situation where this error may occur is when your matrix is
not invertible (e.g., your matrix has a null-space), or if you try to
apply the wrong solver to a matrix (e.g., using CG for a matrix that is
not symmetric or not positive definite). In these cases, the residual in
the last iteration is likely going to be large.

--------------------------------------------------------



See output/solver_history.txt for convergence history.

--------------------------------------------------------



Aborting!

----------------------------------------------------

--------------------------------------------------------------------------

mpirun noticed that the job aborted, but has no info as to the process
that caused that situation.

--------------------------------------------------------------------------





-------------------------------------------------
Mohamed Gouiza, Research Fellow
Basin Structure Group, Institute of Applied Geosciences
University of Leeds, School of Earth and Environment

Leeds,  LS2 9JT, UK

M.Gouiza at leeds.ac.uk
+44 7985 782073
-------------------------------------------------





_______________________________________________
Aspect-devel mailing list
Aspect-devel at geodynamics.org
http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel

