[aspect-devel] get_dof_indices() only works on active cells

Cox, Samuel P. spc29 at leicester.ac.uk
Tue Oct 11 02:23:07 PDT 2016


Hi Lev,

I’m not quite sure which line is causing this error. As John asks, could you clarify whether this only breaks with a periodic boundary condition?

I’ve tried running your parameter file and your code, but the problem is not exactly small; could you shrink it further? Running it on 6 cores, I gave up waiting for the Stokes solve in the 3rd time step after half an hour in debug mode. And since adaptivity only takes place every 10 time steps in your .prm, I can’t trace which call to get_dof_indices() is failing without running at least that far. A less computationally expensive example is needed!
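For instance (a sketch only: these are standard ASPECT parameter names, but the values are guesses and everything else in your model would stay as-is), lowering the resolution and refining every step should make the failure reproducible within a few cheap time steps:

set End time = 1e5  # a guess; just enough for a handful of time steps

subsection Mesh refinement
  set Initial global refinement          = 2
  set Initial adaptive refinement        = 1
  set Time steps between mesh refinement = 1
end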

> The problem seems to appear at different stages of the simulation
Do you mean that identical runs give an error at different time steps, or that varying some parameters changes the time of the error’s appearance?

Best,
Sam

On 11 Oct 2016, at 07:19, John Naliboff <jbnaliboff at ucdavis.edu> wrote:

Hi Lev,

Does the error only occur with the periodic boundary conditions and/or mesh refinement?

I’m honestly not sure what the specific issue is here as it relates to the DG method and the error message below.  Ying, Sam, et al. will hopefully have an idea.

Cheers,
John

*************************************************
John Naliboff
Assistant Project Scientist, CIG
Earth & Planetary Sciences Dept., UC Davis

On Oct 10, 2016, at 10:47 PM, Lev Karatun <lev.karatun at gmail.com> wrote:

Hi everyone,

I was trying to run a model with the discontinuous (DG) discretization feature and got the following error:

[titan:62377] *** An error occurred in MPI_Allreduce
[titan:62377] *** reported by process [140014581841921,0]
[titan:62377] *** on communicator MPI COMMUNICATOR 3 DUP FROM 0
[titan:62377] *** MPI_ERR_IN_STATUS: error code in status
[titan:62377] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
[titan:62377] ***    and potentially your MPI job)

After making the problem smaller and running it in debug mode, I got the following error:

--------------------------------------------------------
An error occurred in line <3334> of file </home/lev/aspect/dealiitest/include/deal.II/dofs/dof_accessor.templates.h> in function
    void dealii::DoFCellAccessor<DoFHandlerType, lda>::get_dof_indices(std::vector<unsigned int, std::allocator<unsigned int> >&) const [with DoFHandlerType = dealii::DoFHandler<3, 3>, bool level_dof_access = true]
The violated condition was:
    this->active()
Additional information:
    get_dof_indices() only works on active cells.

Stacktrace:
-----------
#0  /home/lev/aspect/dealiitest/lib/libdeal_II.g.so.8.5.0-pre: dealii::DoFCellAccessor<dealii::DoFHandler<3, 3>, true>::get_dof_indices(std::vector<unsigned int, std::allocator<unsigned int> >&) const
#1  /home/lev/aspect/dealiitest/lib/libdeal_II.g.so.8.5.0-pre:
#2  /home/lev/aspect/dealiitest/lib/libdeal_II.g.so.8.5.0-pre: void dealii::DoFTools::make_flux_sparsity_pattern<dealii::DoFHandler<3, 3>, dealii::TrilinosWrappers::BlockSparsityPattern>(dealii::DoFHandler<3, 3> const&, dealii::TrilinosWrappers::BlockSparsityPattern&, dealii::ConstraintMatrix const&, bool, dealii::Table<2, dealii::DoFTools::Coupling> const&, dealii::Table<2, dealii::DoFTools::Coupling> const&, unsigned int)
#3  ../aspect: aspect::Simulator<3>::setup_system_matrix(std::vector<dealii::IndexSet, std::allocator<dealii::IndexSet> > const&)
#4  ../aspect: aspect::Simulator<3>::setup_dofs()
#5  ../aspect: aspect::Simulator<3>::refine_mesh(unsigned int)
#6  ../aspect: aspect::Simulator<3>::maybe_refine_mesh(double, unsigned int&)
#7  ../aspect: aspect::Simulator<3>::run()
#8  ../aspect: main
--------------------------------------------------------

--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 60556 on node titan exited on signal 11 (Segmentation fault).
--------------------------------------------------------------------------
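For context, the assertion itself encodes a basic deal.II rule: degrees of freedom live only on active (i.e. unrefined) cells, so any loop that can reach a non-active parent cell has to guard the get_dof_indices() call. A minimal self-contained illustration of that rule (my own sketch, not code taken from ASPECT or from the stack trace above):

#include <deal.II/dofs/dof_handler.h>
#include <deal.II/fe/fe_dgq.h>
#include <deal.II/grid/grid_generator.h>
#include <deal.II/grid/tria.h>

#include <vector>

using namespace dealii;

int main()
{
  Triangulation<2> tria;
  GridGenerator::hyper_cube(tria);
  tria.refine_global(2);              // coarser parent cells are now non-active

  FE_DGQ<2>     fe(1);
  DoFHandler<2> dof_handler(tria);
  dof_handler.distribute_dofs(fe);

  std::vector<types::global_dof_index> dof_indices(fe.dofs_per_cell);

  // Fine: this range visits active cells only.
  for (const auto &cell : dof_handler.active_cell_iterators())
    cell->get_dof_indices(dof_indices);

  // cell_iterators() also visits refined parent cells; without the
  // is_active() guard, the call below would trip exactly the
  // "get_dof_indices() only works on active cells" assertion.
  for (const auto &cell : dof_handler.cell_iterators())
    if (cell->is_active())
      cell->get_dof_indices(dof_indices);
}

So the question is which code path inside make_flux_sparsity_pattern() ends up on a non-active cell.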


Both ASPECT and deal.II are updated to the latest version. The problem seems to appear at different stages of the simulation, but always after a mesh refinement cycle. I attached the .prm file used to reproduce the problem. The fork containing the plugins is here: https://github.com/lkaratun/aspect/tree/nz
If someone could give me an idea of what could be causing the problem, I would appreciate it.
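In case it helps, here is a minimal, self-contained sketch of the ingredients the stack trace names: a DG element, periodic faces, adaptive refinement, and DoFTools::make_flux_sparsity_pattern(). This is a guess at the relevant combination, not the actual ASPECT code path, and whether this small setup trips the same assertion is untested:

#include <deal.II/dofs/dof_handler.h>
#include <deal.II/dofs/dof_tools.h>
#include <deal.II/fe/fe_dgq.h>
#include <deal.II/grid/grid_generator.h>
#include <deal.II/grid/grid_tools.h>
#include <deal.II/grid/tria.h>
#include <deal.II/lac/dynamic_sparsity_pattern.h>

#include <vector>

using namespace dealii;

int main()
{
  const int          dim = 2;
  Triangulation<dim> tria;
  GridGenerator::hyper_cube(tria, 0., 1., /*colorize=*/true);

  // Declare the left (id 0) and right (id 1) faces periodic in x,
  // before any refinement happens.
  std::vector<GridTools::PeriodicFacePair<Triangulation<dim>::cell_iterator>>
    matched_pairs;
  GridTools::collect_periodic_faces(tria, 0, 1, /*direction=*/0, matched_pairs);
  tria.add_periodicity(matched_pairs);

  // Refine globally, then refine one more cell so that some active cell
  // sees a coarser neighbor across the periodic boundary.
  tria.refine_global(2);
  tria.begin_active()->set_refine_flag();
  tria.execute_coarsening_and_refinement();

  FE_DGQ<dim>     fe(1);
  DoFHandler<dim> dof_handler(tria);
  dof_handler.distribute_dofs(fe);

  // The flux sparsity pattern couples DoFs across faces, i.e. it is the
  // step that visits neighbor cells, possibly through the periodic map.
  DynamicSparsityPattern dsp(dof_handler.n_dofs());
  DoFTools::make_flux_sparsity_pattern(dof_handler, dsp);
}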

Thanks in advance!

Best regards,
Lev Karatun.
<268s_fail.prm>

_______________________________________________
Aspect-devel mailing list
Aspect-devel at geodynamics.org<mailto:Aspect-devel at geodynamics.org>
http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
