[aspect-devel] PETSc support

Timo Heister heister at clemson.edu
Tue Jan 21 11:06:10 PST 2014


To take this back to the mailing list:
Ian and I figured out what the problem was, and the scalar systems now
take the same number of iterations with PETSc and Trilinos. The Stokes
system is even solved in fewer iterations, maybe because of
differences between Trilinos ML and the corresponding PETSc
preconditioner.
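
For reference, the two backends build the Stokes preconditioner through
different AMG implementations. A minimal sketch of the difference using
deal.II's wrappers, assuming the PETSc side uses an AMG such as hypre's
BoomerAMG via PETScWrappers::PreconditionBoomerAMG (the matrix names and
AdditionalData settings below are illustrative, not ASPECT's actual
parameters):

  // Trilinos backend: ML algebraic multigrid
  TrilinosWrappers::PreconditionAMG ml_preconditioner;
  TrilinosWrappers::PreconditionAMG::AdditionalData ml_data;
  ml_data.elliptic = true;                        // illustrative setting
  ml_preconditioner.initialize(trilinos_matrix, ml_data);

  // PETSc backend: hypre's BoomerAMG through the deal.II wrapper
  PETScWrappers::PreconditionBoomerAMG amg_preconditioner;
  PETScWrappers::PreconditionBoomerAMG::AdditionalData amg_data;
  amg_data.symmetric_operator = true;             // illustrative setting
  amg_preconditioner.initialize(petsc_matrix, amg_data);

Even with comparable settings, ML and BoomerAMG build different
multigrid hierarchies, so somewhat different Stokes iteration counts are
to be expected.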

On Tue, Jan 21, 2014 at 12:50 PM, Ian Rose <ian.rose at berkeley.edu> wrote:
> The fix to PETScWrappers::VectorBase::all_zero seems to have done the trick.
> However, I am still seeing large iteration counts on the advection systems.
> Overall, this is a small(ish) portion of the runtime, though it is curious.
> It seems to be the same whether I use one or several processors. Can you
> reproduce that?
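
(Regarding the all_zero fix above: deal.II's CG decides how to form the
initial residual based on whether the starting vector is all zero, so a
wrong all_zero() answer in parallel changes the iteration from the very
first step. A schematic of that pattern, not the exact deal.II source:

  template <class MatrixType, class VectorType>
  void initial_residual(const MatrixType &A,
                        const VectorType &x,
                        const VectorType &b,
                        VectorType       &r)
  {
    if (!x.all_zero())       // must give the same answer on all MPI ranks
      {
        A.vmult(r, x);       // r = A*x
        r.sadd(-1., 1., b);  // r = b - A*x
      }
    else
      r = b;                 // zero starting guess: residual is just b
  }

If all_zero() gives a wrong answer in parallel, the initial residual is
wrong and every subsequent iteration differs.)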
>
>
> On Tue, Jan 21, 2014 at 6:20 AM, Timo Heister <heister at clemson.edu> wrote:
>>
>> > SolverCG<LinearAlgebra::Vector> solver(solver_control);
>> > with
>> > PETScWrappers::SolverCG solver(solver_control);
>>
>> Can you please check if deal.II r32242 fixes this problem for you?
>>
>> Note that I still don't quite understand the differences in the number
>> of iterations, especially because the composition solver uses ILU(0)
>> and deal.II's GMRES in both cases, so the two backends should behave
>> very similarly. Could you check whether this changes when running on
>> 1 CPU?
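
To make the comparison concrete, the composition solve mentioned above
looks roughly like this in both backends. A minimal sketch assuming
ASPECT's LinearAlgebra namespace alias (which maps to either
TrilinosWrappers or PETScWrappers) and illustrative matrix and vector
names:

  SolverControl solver_control(1000, 1e-12 * rhs.l2_norm());
  SolverGMRES<LinearAlgebra::Vector> solver(solver_control);

  // ILU(0) from whichever backend is active:
  // TrilinosWrappers::PreconditionILU or PETScWrappers::PreconditionILU
  LinearAlgebra::PreconditionILU preconditioner;
  preconditioner.initialize(system_matrix);

  solver.solve(system_matrix, solution, rhs, preconditioner);

Since the outer GMRES is deal.II's in both builds and only the matrix,
vector, and ILU implementations change, near-identical iteration counts
are the expected behavior, which is why the earlier mismatch pointed at
a bug in the wrappers rather than in the solver.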



-- 
Timo Heister
http://www.math.clemson.edu/~heister/

