[aspect-devel] get_dof_indices() only works on active cells

Lev Karatun lev.karatun at gmail.com
Mon Oct 10 22:47:58 PDT 2016

Hi everyone,

I was trying to run a model with the discontinuous boundaries feature, and
got the following error:

[titan:62377] *** An error occurred in MPI_Allreduce
[titan:62377] *** reported by process [140014581841921,0]
[titan:62377] *** on communicator MPI COMMUNICATOR 3 DUP FROM 0
[titan:62377] *** MPI_ERR_IN_STATUS: error code in status
[titan:62377] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will
now abort,
[titan:62377] ***    and potentially your MPI job)

After making the problem smaller and running a debug build, I got the
following error:

An error occurred in line <3334> of file
in function
    void dealii::DoFCellAccessor<DoFHandlerType,
lda>::get_dof_indices(std::vector<unsigned int, std::allocator<unsigned
int> >&) const [with DoFHandlerType = dealii::DoFHandler<3, 3>, bool
level_dof_access = true]
The violated condition was:
Additional information:
    get_dof_indices() only works on active cells.

#0  /home/lev/aspect/dealiitest/lib/libdeal_II.g.so.8.5.0-pre:
dealii::DoFCellAccessor<dealii::DoFHandler<3, 3>,
true>::get_dof_indices(std::vector<unsigned int, std::allocator<unsigned
int> >&) const
#1  /home/lev/aspect/dealiitest/lib/libdeal_II.g.so.8.5.0-pre:
#2  /home/lev/aspect/dealiitest/lib/libdeal_II.g.so.8.5.0-pre: void
dealii::DoFTools::make_flux_sparsity_pattern<dealii::DoFHandler<3, 3>,
dealii::TrilinosWrappers::BlockSparsityPattern>(dealii::DoFHandler<3, 3>
const&, dealii::TrilinosWrappers::BlockSparsityPattern&,
dealii::ConstraintMatrix const&, bool, dealii::Table<2,
dealii::DoFTools::Coupling> const&, dealii::Table<2,
dealii::DoFTools::Coupling> const&, unsigned int)
#3  ../aspect:
std::allocator<dealii::IndexSet> > const&)
#4  ../aspect: aspect::Simulator<3>::setup_dofs()
#5  ../aspect: aspect::Simulator<3>::refine_mesh(unsigned int)
#6  ../aspect: aspect::Simulator<3>::maybe_refine_mesh(double, unsigned
#7  ../aspect: aspect::Simulator<3>::run()
#8  ../aspect: main

mpirun noticed that process rank 1 with PID 60556 on node titan exited on
signal 11 (Segmentation fault).

Both ASPECT and deal.II are updated to the latest version. The problem
seems to appear at different stages of the simulation, but always after a
mesh refinement cycle. I attached the .prm file used to reproduce the
problem. The fork containing the plugins is here.
If someone could give me an idea of what could cause the problem, I would
appreciate it.

Thanks in advance!

Best regards,
Lev Karatun.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: 268s_fail.prm
Type: application/octet-stream
Size: 9876 bytes
Desc: not available
URL: <http://lists.geodynamics.org/pipermail/aspect-devel/attachments/20161011/76a6c748/attachment.obj>
