[aspect-devel] PETSc support
Ian Rose
ian.rose at berkeley.edu
Fri Jan 17 10:10:07 PST 2014
On Fri, Jan 17, 2014 at 7:55 AM, Timo Heister <heister at clemson.edu> wrote:
> > Figured that I should give this a shot, since I was the one interested
> > in PETSc/Trilinos comparisons. I am using PETSc 3.4.2 optimized and
> > Trilinos 11.2.3. For the most part here I am following suit by testing
> > with composition-passive.prm.
>
> Thanks for looking into this!
>
> > 1) I don't see any assertions being hit in debug mode (but see later).
>
> good.
>
I take it back: I did find an assertion being tripped, but on the Trilinos
side in debug mode. In the temperature statistics postprocessor, calls to
local_range() are failing the assertion

  end - begin == vector->Map().NumMyElements()

in trilinos_vector_base.h. Seems like it's probably an easy fix somewhere?
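For anyone following along, here is a minimal sketch of what that check
amounts to, assuming a deal.II TrilinosWrappers::MPI::Vector; the helper
name and the exception message are made up, and only local_range() and
Map().NumMyElements() come from the assertion itself:

#include <deal.II/base/exceptions.h>
#include <deal.II/lac/trilinos_vector.h>

// Hypothetical helper mirroring the debug-mode consistency check.
void check_local_range(const dealii::TrilinosWrappers::MPI::Vector &v)
{
  // local_range() returns the half-open interval [begin, end) of global
  // indices owned by this rank; the assertion requires its length to equal
  // the element count of the vector's underlying Epetra map.
  const auto range = v.local_range();
  AssertThrow(static_cast<int>(range.second - range.first) ==
                v.trilinos_vector().Map().NumMyElements(),
              dealii::ExcMessage("local_range() disagrees with the map"));
}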
> > 2) Visually, the results seem to be pretty much the same for what I
> > have looked at. I am, however, seeing surprisingly large differences
> > in iteration counts. Specifically, for the temperature/composition
> > solves with PETSc, iteration counts are 30-50, as compared to 10-15
> > for Trilinos.
>
> This might be related to different handling of residuals (do we use
> Trilinos, PETSc, or deal.II inner solvers?) but most likely this is
> just an artifact of different preconditioners (Hypre vs ML). They were
> similar for me in the past, but only for scalar Poisson type problems.
>
Sure, especially for the AMG preconditioners. I'm a little surprised that
there is such a difference for the ILU preconditioners.
> > 3) The overall speed of the two versions is not wildly different for
> > what I have looked at. Certainly nothing like 50x difference. Perhaps
> > it comes down to optimized vs debug PETSc?
>
> This is aspect running in debug or optimized mode? My 50x was optimized.
>
Optimized mode, but only on one process.
> > Okay, I have run into a big problem, though. Something is up with
> > running on several processors with PETSc in optimized mode. Basically,
> > aspect never gets started. It looks like not all the processors are
> > returning from the MPI initialization step, but I'm still trying to
> > track it down. It runs fine in debug mode, or with one process in
> > optimized mode.
>
> PETSc optimized or Aspect optimized?
>
Both are optimized.
> Do you get any output if you do
> mpirun -n 2 ./aspect -log_summary
> or "-log_trace out" and look at the out.* files?
>
Okay, due to the hanging, -log_summary does not produce any output.
-log_trace, however, has useful information. I was wrong earlier: it does
get past the MPI_Init stage.
In general, there are several thousand PETSc calls per processor, and then
the processors appear to get out of sync and hang. That is to say, each
process reports the same sequence of calls for a while, but then they
start reporting different calls. Very strange. Here is some output of
`paste out.0 out.1 | awk '$5 != $10'` (printing only the lines where the
event names of the two ranks differ) for a two-process run:
[0] 1.06899 Event begin: MatMult [1] 1.06912 Event begin: VecCopy
[0] 1.069 Event begin: VecScatterBegin [1] 1.06913 Event end: VecCopy
[0] 1.06901 Event end: VecScatterBegin [1] 1.06914 Event begin: VecScale
[0] 1.06904 Event begin: VecScatterEnd [1] 1.06914 Event end: VecScale
[1] 1.06918 Event begin: PCApply
[1] 1.06919 Event begin: VecPointwiseMult
[1] 1.06919 Event end: VecPointwiseMult
[1] 1.0692 Event end: PCApply
[1] 1.0692 Event begin: VecCopy
[1] 1.06921 Event end: VecCopy
[1] 1.06921 Event begin: VecScale
[1] 1.06922 Event end: VecScale
[1] 1.06922 Event begin: VecDot
This is where it hangs, after ~15k matched calls. For more than two
processes, the same general thing seems to happen: after a few thousand
PETSc calls, one or more of the processes starts doing something different
and the whole thing hangs.
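For reference, the whole diagnostic recipe boils down to the following.
This is a sketch: the .prm argument and its position on the command line
are assumptions based on the test case mentioned earlier, while
-log_trace out writing one out.<rank> file per process is documented
PETSc behavior:

# Run two ranks, letting PETSc trace every call into out.0 and out.1:
mpirun -n 2 ./aspect -log_trace out composition-passive.prm

# Pair the two traces line by line and keep only lines where the event
# names differ ($5 is the event name from out.0 and $10 the one from
# out.1, given the "[rank] time Event begin: Name" trace format):
paste out.0 out.1 | awk '$5 != $10'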
> --
> Timo Heister
> http://www.math.clemson.edu/~heister/