[aspect-devel] ASPECT on Mac

Menno Fraters menno.fraters at outlook.com
Fri Oct 17 09:25:36 PDT 2014


Dear Timo,

Thanks for the link to the new package. Unfortunately, it did not work, producing the errors shown in the attached file. However, I did get a version of Trilinos (11.8.1) and deal.II working that can run the free-surface benchmark without problems for at least 30 timesteps.

There also seems to be a problem with the latent-heat cookbook, as shown below (in both versions).

Thanks!

Menno

Mennos-MacBook-Pro:aspect Menno$ mpirun -np 1 ./aspect cookbooks/latent-heat.prm 
-----------------------------------------------------------------------------
-- This is ASPECT, the Advanced Solver for Problems in Earth's ConvecTion.
--     . version 1.2.pre
--     . running in DEBUG mode
--     . running with 1 MPI process
--     . using Trilinos
-----------------------------------------------------------------------------

Number of active cells: 16384 (on 8 levels)
Number of degrees of freedom: 214788 (132098+16641+66049)

Message:  Unexpected token "1000" found at position 0.
Formula:  1000.0 
Token:    1000
Position: 0
Errc:     1


----------------------------------------------------
Exception on processing: 

--------------------------------------------------------
An error occurred in line <395> of file </Users/Menno/Documents/Phd/aspect/dealii/source/base/function_parser.cc> in function
    virtual double dealii::FunctionParser<2>::value(const Point<dim> &, const unsigned int) const [dim = 2]
The violated condition was: 
    false
The name and call sequence of the exception was:
    ExcParseError(e.GetCode(), e.GetMsg().c_str())
Additional Information: 
Parsing Error at Column 1. The parser said: Unexpected token "1000" found at position 0.
--------------------------------------------------------

Aborting!
----------------------------------------------------
--------------------------------------------------------------------------
mpirun noticed that the job aborted, but has no info as to the process
that caused that situation.
--------------------------------------------------------------------------
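
For reference, a minimal standalone sketch of the failing call (deal.II's FunctionParser evaluating the formula from the log above; a working installation should simply print 1000):

#include <deal.II/base/function_parser.h>
#include <deal.II/base/point.h>
#include <iostream>
#include <map>
#include <string>

int main()
{
  // One scalar component, variables "x,y", and the constant expression that
  // the latent-heat run fails to parse.
  dealii::FunctionParser<2>     parser(1);
  std::map<std::string, double> constants; // no user-defined constants
  parser.initialize("x,y", "1000.0", constants);

  // On a broken setup this evaluation should throw the same ExcParseError
  // ("Unexpected token \"1000\" found at position 0") as in the output above.
  std::cout << parser.value(dealii::Point<2>(0.5, 0.5)) << std::endl;
}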



> From: heister at clemson.edu
> Date: Fri, 17 Oct 2014 11:25:24 -0400
> To: aspect-devel at geodynamics.org
> Subject: Re: [aspect-devel] ASPECT on Mac
> 
> Dear Menno,
> 
> Luca just released a new deal.II mac os package:
> https://github.com/luca-heltai/dealii/releases/tag/v8.2pre_v3
> 
> Can you give it a try?
> 
> 
> 
> On Fri, Oct 17, 2014 at 10:32 AM, Menno Fraters
> <menno.fraters at outlook.com> wrote:
> > Dear all,
> >
> > Over the last few days I have also been working on getting ASPECT to run on
> > a Mac. Following Timo's comments on the bug (which I also encountered in the
> > free-surface cookbook when running on multiple MPI processes), I first
> > installed the newest version of deal.II and, when that didn't solve the
> > problem, also the newest version of Trilinos, separately from the package
> > installed by the dmg. All other dependencies, like p4est, OpenMPI and PETSc,
> > are still linked to that package. This didn't solve the problem either.
> > Furthermore, I have also tried older versions of Trilinos (stable releases
> > 11.10.1 and 11.8.1). With PETSc it does work.
> >
> > I must note that, to successfully install the newest git clone of Trilinos,
> > I had to disable the packages Claps, RBGen, Aristos and FEApp in
> > PackagesList.cmake, because they could not be found. This was not necessary
> > for the stable releases mentioned above.
> >
> > To test whether this is a Mac-specific problem or an issue with the most
> > recent versions of Trilinos, deal.II and ASPECT, I also installed everything
> > on a Linux machine. This gave me the same problems.
> >
> > So my question is whether the problem with Trilinos and deal.II has actually
> > been resolved, or whether I might be doing something wrong with the
> > installation (see settings below)?
> >
> > I hope someone can help with this!
> >
> > Cheers,
> >
> > Menno
> >
> > Parameters used for configuring Trilinos:
> > cmake \
> >   -D TrilinosFramework_ENABLE_MPI:BOOL=ON \
> >   -D CMAKE_INSTALL_PREFIX:PATH=/Users/Menno/Documents/Phd/aspect/trilinos/build/ \
> >   -D TPL_ENABLE_MPI:BOOL=ON \
> >   -D BUILD_SHARED_LIBS:BOOL=ON \
> >   -D CMAKE_BUILD_TYPE:STRING=RELEASE \
> >   -D Trilinos_ENABLE_Fortran:BOOL=OFF \
> >   -D Trilinos_WARNINGS_AS_ERRORS_FLAGS:STRING="" \
> >   -D CMAKE_VERBOSE_MAKEFILE:BOOL=TRUE \
> >   -D Trilinos_ENABLE_TESTS:BOOL=OFF \
> >   -D Trilinos_ENABLE_ALL_PACKAGES:BOOL=OFF \
> >   -D Trilinos_ENABLE_ALL_OPTIONAL_PACKAGES:BOOL=ON \
> >   -D Trilinos_ENABLE_Epetra:BOOL=ON \
> >   -D Trilinos_ENABLE_EpetraExt:BOOL=ON \
> >   -D Trilinos_ENABLE_Tpetra:BOOL=ON \
> >   -D Trilinos_ENABLE_Jpetra:BOOL=ON \
> >   -D Trilinos_ENABLE_Kokkos:BOOL=ON \
> >   -D Trilinos_ENABLE_Sacado:BOOL=ON \
> >   -D Trilinos_ENABLE_Amesos:BOOL=ON \
> >   -D Trilinos_ENABLE_AztecOO:BOOL=ON \
> >   -D Trilinos_ENABLE_Ifpack:BOOL=ON \
> >   -D Trilinos_ENABLE_Teuchos:BOOL=ON \
> >   -D Trilinos_ENABLE_Rythmos:BOOL=ON \
> >   -D Trilinos_ENABLE_Piro:BOOL=ON \
> >   -D Trilinos_ENABLE_MOOCHO:BOOL=ON \
> >   -D Trilinos_ENABLE_ML:BOOL=ON \
> >   -D Trilinos_ENABLE_Thyra:BOOL=ON \
> >   -D Trilinos_ENABLE_TrilinosCouplings:BOOL=ON \
> >   $EXTRA_ARGS ../
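> >
> > For completeness, a sketch of how deal.II can then be pointed at this
> > Trilinos build (using deal.II's standard DEAL_II_WITH_TRILINOS and
> > TRILINOS_DIR CMake variables; the install prefix is the one from the
> > configuration above, and the ../dealii source path is only an assumed
> > example):
> >
> > cmake \
> >   -D DEAL_II_WITH_TRILINOS:BOOL=ON \
> >   -D TRILINOS_DIR:PATH=/Users/Menno/Documents/Phd/aspect/trilinos/build \
> >   ../dealii
> > grep -i trilinos detailed.log   # check which Trilinos was actually found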
> >
> > The resulting output for two MPI processes (the more MPI processes are used,
> > the sooner the problem is encountered):
> > -----------------------------------------------------------------------------
> > -- This is ASPECT, the Advanced Solver for Problems in Earth's ConvecTion.
> > --     . version 1.2.pre
> > --     . running in DEBUG mode
> > --     . running with 2 MPI processes
> > --     . using Trilinos
> > -----------------------------------------------------------------------------
> >
> > Number of active cells: 10240 (on 6 levels)
> > Number of degrees of freedom: 134692 (82818+10465+41409)
> >
> > Number of free surface degrees of freedom: 20930
> > *** Timestep 0:  t=0 years
> >    Solving mesh velocity system... 0 iterations.
> >    Solving temperature system... 0 iterations.
> >    Rebuilding Stokes preconditioner...
> >    Solving Stokes system... 30+3 iterations.
> >
> >    Postprocessing:
> >      Writing graphical output: output/solution-00000
> >      Topography min/max:       0 m, 0 m,
> >      RMS, max velocity:        0.0109 m/year, 0.0297 m/year
> >
> > *** Timestep 1:  t=5255.3 years
> >    Solving mesh velocity system... 9 iterations.
> >    Solving temperature system... 11 iterations.
> >    Rebuilding Stokes preconditioner...
> >    Solving Stokes system... 30+5 iterations.
> >
> >    Postprocessing:
> >      Topography min/max: -53.33 m, 93.26 m,
> >      RMS, max velocity:  0.00578 m/year, 0.0176 m/year
> >
> > *** Timestep 2:  t=14136.7 years
> >    Solving mesh velocity system... 12 iterations.
> >    Solving temperature system... 11 iterations.
> >    Rebuilding Stokes preconditioner...
> >    Solving Stokes system... 30+5 iterations.
> >
> >    Postprocessing:
> >      Topography min/max: -15.43 m, 36.03 m,
> >      RMS, max velocity:  0.00694 m/year, 0.023 m/year
> >
> > *** Timestep 3:  t=20932.2 years
> >    Solving mesh velocity system... 25 iterations.
> >    Solving temperature system... 11 iterations.
> >    Rebuilding Stokes preconditioner...
> >    Solving Stokes system... 30+4 iterations.
> >
> >    Postprocessing:
> >      Topography min/max: -20.8 m, 59.68 m,
> >      RMS, max velocity:  0.00612 m/year, 0.0212 m/year
> >
> > *** Timestep 4:  t=28303.4 years
> >    Solving mesh velocity system... 90 iterations.
> >    Solving temperature system... 10 iterations.
> >    Rebuilding Stokes preconditioner...
> >    Solving Stokes system... 30+3 iterations.
> >
> >    Postprocessing:
> >      Topography min/max: -21.81 m, 61.13 m,
> >      RMS, max velocity:  0.00608 m/year, 0.0211 m/year
> >
> > *** Timestep 5:  t=35711.1 years
> >    Solving mesh velocity system... 27 iterations.
> >    Solving temperature system... 10 iterations.
> >    Rebuilding Stokes preconditioner...
> >    Solving Stokes system... 30+1 iterations.
> >
> >    Postprocessing:
> >      Topography min/max: -21.59 m, 61.11 m,
> >      RMS, max velocity:  0.00609 m/year, 0.0211 m/year
> >
> > *** Timestep 6:  t=43113.6 years
> >
> >
> > ----------------------------------------------------
> > Exception on processing:
> >
> > --------------------------------------------------------
> > An error occurred in line <486> of file
> > </Users/Menno/Documents/Phd/aspect/dealii/build/include/deal.II/lac/solver_cg.h>
> > in function
> >     void dealii::SolverCG<dealii::TrilinosWrappers::MPI::Vector>::solve(const
> > MATRIX &, VECTOR &, const VECTOR &, const PRECONDITIONER &) [VECTOR =
> > dealii::TrilinosWrappers::MPI::Vector, MATRIX =
> > dealii::TrilinosWrappers::SparseMatrix, PRECONDITIONER =
> > dealii::TrilinosWrappers::PreconditionAMG]
> > The violated condition was:
> >     false
> > The name and call sequence of the exception was:
> >     SolverControl::NoConvergence (this->control().last_step(),
> >     this->control().last_value())
> > Additional Information:
> > Iterative method reported convergence failure in step 104650. The residual
> > in the last step was 4.25519e-19.
> >
> > This error message can indicate that you have simply not allowed a
> > sufficiently large number of iterations for your iterative solver to
> > converge. This often happens when you increase the size of your problem. In
> > such cases, the last residual will likely still be very small, and you can
> > make the error go away by increasing the allowed number of iterations when
> > setting up the SolverControl object that determines the maximal number of
> > iterations you allow.
> >
> > The other situation where this error may occur is when your matrix is not
> > invertible (e.g., your matrix has a null-space), or if you try to apply the
> > wrong solver to a matrix (e.g., using CG for a matrix that is not symmetric
> > or not positive definite). In these cases, the residual in the last
> > iteration is likely going to be large.
> > --------------------------------------------------------
> >
> > Aborting!
> > ----------------------------------------------------
> > --------------------------------------------------------------------------
> > mpirun noticed that the job aborted, but has no info as to the process
> > that caused that situation.
> > --------------------------------------------------------------------------
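> >
> > For reference, a minimal, self-contained sketch (not ASPECT's actual solver
> > setup; just a small serial 1D Laplacian solved with CG) of where the
> > iteration limit mentioned in the error message lives, namely the
> > SolverControl object handed to the deal.II solver:
> >
> > #include <deal.II/lac/full_matrix.h>
> > #include <deal.II/lac/precondition.h>
> > #include <deal.II/lac/solver_cg.h>
> > #include <deal.II/lac/solver_control.h>
> > #include <deal.II/lac/vector.h>
> > #include <iostream>
> >
> > int main()
> > {
> >   const unsigned int n = 50;
> >
> >   // Small symmetric positive definite test matrix: a 1D Laplacian.
> >   dealii::FullMatrix<double> A(n, n);
> >   for (unsigned int i = 0; i < n; ++i)
> >     {
> >       A(i, i) = 2.0;
> >       if (i > 0)
> >         A(i, i - 1) = -1.0;
> >       if (i + 1 < n)
> >         A(i, i + 1) = -1.0;
> >     }
> >
> >   dealii::Vector<double> x(n), b(n);
> >   for (unsigned int i = 0; i < n; ++i)
> >     b(i) = 1.0;
> >
> >   // The first SolverControl argument is the maximal number of iterations;
> >   // exceeding it is what raises SolverControl::NoConvergence above.
> >   dealii::SolverControl                    control(1000, 1e-12);
> >   dealii::SolverCG<dealii::Vector<double>> cg(control);
> >   cg.solve(A, x, b, dealii::PreconditionIdentity());
> >
> >   std::cout << "CG converged in " << control.last_step() << " iterations"
> >             << std::endl;
> > }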
> >
> >
> >
> >
> >> Date: Sat, 11 Oct 2014 19:18:45 +0200
> >> From: sylvia.rockel at fu-berlin.de
> >> To: aspect-devel at geodynamics.org
> >> Subject: Re: [aspect-devel] ASPECT on Mac
> >
> >>
> >> Dear all,
> >>
> >> with PETSc instead of Trilinos, mpirun seems to work (at step 480 now and
> >> still running). Thanks for the hint.
> >>
> >> Concerning the Trilinos bug, I was aware that I'll have to wait until it's
> >> fixed in the Mac package, since unfortunately Trilinos is the one thing I
> >> just can't manage to compile without errors after resetting my whole
> >> computer. Wolfgang's reply was just to an older mail, so everything is
> >> fine. Until the Trilinos bug is fixed I can use the ASPECT-with-PETSc
> >> version in the meantime.
> >>
> >> But thanks anyway for your time and help.
> >>
> >> Best,
> >> Sylvia
> >>
> >> > Hey all,
> >> >
> >> > maybe I wasn't clear enough about this:
> >> >
> >> > The Mac package deal.ii-8.2pre-v2 is broken because it bundles a
> >> > Trilinos version that is not compatible. As a result, the code will
> >> > produce strange errors after a couple of timesteps in parallel
> >> > (basically, the matrix is assembled incorrectly). You _have_ to wait
> >> > until Luca releases a new version, or set up and compile deal.II
> >> > yourself.
> >> >
> >> >
> >> >> Since you have the deal.II package installed, you should also have
> >> >> PETSc.
> >> >> Can you try to compile ASPECT with PETSc support instead of using
> >> >> Trilinos?
> >> >> Maybe that gets you further.
> >> >
> >> > Yes, that might work.
> >> >
> >> >
> >> > --
> >> > Timo Heister
> >> > http://www.math.clemson.edu/~heister/
> >> > _______________________________________________
> >> > Aspect-devel mailing list
> >> > Aspect-devel at geodynamics.org
> >> > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
> >> >
> >>
> >> _______________________________________________
> >> Aspect-devel mailing list
> >> Aspect-devel at geodynamics.org
> >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
> >
> > _______________________________________________
> > Aspect-devel mailing list
> > Aspect-devel at geodynamics.org
> > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
> 
> -- 
> Timo Heister
> http://www.math.clemson.edu/~heister/
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
-------------- next part --------------
A non-text attachment was scrubbed...
Name: free-surface-with-mpi.pdf
Type: application/pdf
Size: 51355 bytes
Desc: not available
URL: <http://lists.geodynamics.org/pipermail/aspect-devel/attachments/20141017/ef1f4cc2/attachment-0001.pdf>

