[aspect-devel] get_dof_indices() only works on active cells

Lev Karatun lev.karatun at gmail.com
Thu Oct 13 10:57:47 PDT 2016


Thank you Sam!!

Unfortunately, I'm having trouble compiling the latest version of deal.II:
[ 86%] Building CXX object source/grid/CMakeFiles/obj_grid.release.dir/grid_reordering.cc.o
/home/lev/aspect/dealii/source/grid/grid_reordering.cc: In function ‘void dealii::internal::GridReordering2d::orient_one_set_of_parallel_edges(const std::vector<dealii::internal::GridReordering2d::Cell<2>, std::allocator<dealii::internal::GridReordering2d::Cell<2> > >&, std::vector<dealii::internal::GridReordering2d::Edge<2>, std::allocator<dealii::internal::GridReordering2d::Edge<2> > >&, unsigned int, unsigned int)’:
/home/lev/aspect/dealii/source/grid/grid_reordering.cc:558: error: using ‘typename’ outside of template
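
For context, before C++11 the 'typename' keyword was only allowed inside templates, which is exactly the rule this (presumably older) compiler is enforcing. A minimal illustration of the pattern, as a hypothetical snippet rather than the actual grid_reordering.cc code:

    template <int dim>
    struct Edge
    {
      typedef unsigned int value_type;
      value_type vertex;
    };

    // Inside a template, 'typename' is required for dependent names:
    template <int dim>
    typename Edge<dim>::value_type get_vertex(const Edge<dim> &e)
    {
      return e.vertex;
    }

    // Outside any template, a pre-C++11 compiler rejects the keyword:
    // typename Edge<2>::value_type v = 0; // error: using 'typename' outside of template
    Edge<2>::value_type v = 0;             // fine without 'typename'

Compilers in C++11 mode and later accept 'typename' in the second case too, which is why this tends to show up only with older toolchains.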

I understand it's not due to the change you made, though. Is it better to
post this error message to the deal.II Google group, or is this mailing
list fine?

Best regards,
Lev Karatun.

2016-10-13 9:58 GMT-04:00 Cox, Samuel P. <spc29 at leicester.ac.uk>:

> Hi Lev,
>
> I’ve posted a pull request at https://github.com/dealii/dealii/pull/3232
> to fix this subtle bug. It’s a single-line change. Thanks for your tips on
> making the example smaller; it certainly helped!
>
> Bests,
> Sam
>
>
> On 12 Oct 2016, at 05:51, Lev Karatun <lev.karatun at gmail.com> wrote:
>
> Hi John, Sam, Timo,
>
> Thank you for the replies. The bug does occur on the updated version of
> deal.II. If I turn off periodic boundaries, the solver fails to converge
> exactly at the step after the 1st refinement cycle (not counting the
> initial one). When I disable the adaptive refinement, the error occurs at
> timestep 30 instead of 10. When I disable both refinement and periodic
> b.c., the code runs without any problems.
>
>> Do you mean that identical runs give an error at different time steps, or
>> that varying some parameters changes the time of the error’s appearance?
>
> The latter. For example, if I change the frequency of mesh refinement from
> 10 to 5 or 7 time steps, the code just hangs at the timestep following the
> refinement instead. When I changed it to 3, the error occurred after time
> step 6 (2nd refinement cycle).
>
>> A smaller test case (since we're really interested in the mesh, not the
>> multiple iterative solves and matrix assemblies) would definitely help.
>
>
> I understand that, but unfortunately I wasn't able to reproduce the
> problem on a small setup. The only simplifications I was able to make were
> replacing the solver with IMPES and changing the refinement frequency to 3
> time steps (in this case the error occurs after the 6th time step).
>
> Best regards,
> Lev Karatun.
>
> 2016-10-11 13:05 GMT-04:00 Cox, Samuel P. <spc29 at leicester.ac.uk>:
>
>> I managed to get this running, but I'm not sure yet what is going on, as
>> I can't devote much time to it.
>>
>> I'm using the latest dealii as of yesterday, so it includes the fix Timo
>> mentioned.
>>
>> On 6 cores it crashed as Lev said, after timestep 10, when it adapts the
>> mesh.
>>
>> On one core, it ran until timestep 11, failing to converge in one of the
>> solves, so it didn't hit the bug. If these results are reproducible (I'm
>> not going to try, as the 1-core run took me over 2h), it looks like some
>> combination of periodicity and non-local ownership breaks the logic of
>> the sparsity pattern function. That's as far as I've got. A smaller test
>> case (since we're really interested in the mesh, not the multiple
>> iterative solves and matrix assemblies) would definitely help.
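>>
>> For anyone digging further, the failing code path is roughly of the
>> following shape. This is a hedged sketch of the loop structure inside
>> make_flux_sparsity_pattern, not the actual deal.II source; fe and
>> dof_handler stand for the usual finite element and DoFHandler objects:
>>
>>     std::vector<types::global_dof_index>
>>       neighbor_dof_indices(fe.dofs_per_cell);
>>
>>     // For flux couplings, every locally owned cell must also touch
>>     // the DoFs of its face neighbors:
>>     for (const auto &cell : dof_handler.active_cell_iterators())
>>       if (cell->is_locally_owned())
>>         for (unsigned int f = 0; f < GeometryInfo<3>::faces_per_cell; ++f)
>>           if (!cell->at_boundary(f) || cell->has_periodic_neighbor(f))
>>             {
>>               const auto neighbor = cell->neighbor_or_periodic_neighbor(f);
>>               // On an adaptively refined distributed mesh, a periodic
>>               // neighbor can be a coarser (non-active) cell owned by
>>               // another rank; asking it for its DoF indices then trips
>>               // the "only works on active cells" assertion quoted below.
>>               neighbor->get_dof_indices(neighbor_dof_indices);
>>             }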
>>
>> Sam
>>
>> On 11 Oct 2016 5:50 pm, Timo Heister <heister at clemson.edu> wrote:
>>
>> Lev,
>>
>> you are running into a bug that was fixed 5 days ago in the
>> development version of deal.II:
>> https://github.com/dealii/dealii/pull/3210
>>
>> It only happens if you use the development version of deal.II with DG
>> and periodic boundaries. So you can update deal.II or change your
>> setup.
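>>
>> For reference, the triggering combination looks schematically like the
>> following. The calls are standard deal.II, but the snippet is an
>> illustrative sketch, not Lev's actual model setup:
>>
>>     parallel::distributed::Triangulation<3> triangulation(MPI_COMM_WORLD);
>>     // ... build a coarse mesh, e.g. with GridGenerator::hyper_cube and
>>     //     colorized boundary ids, then match opposite faces:
>>     std::vector<GridTools::PeriodicFacePair<
>>       Triangulation<3>::cell_iterator> > periodic_faces;
>>     GridTools::collect_periodic_faces(triangulation,
>>                                       /*b_id1=*/0, /*b_id2=*/1,
>>                                       /*direction=*/0, periodic_faces);
>>     triangulation.add_periodicity(periodic_faces);
>>
>>     // A discontinuous element, so fluxes couple neighboring cells:
>>     FE_DGQ<3>     fe(2);
>>     DoFHandler<3> dof_handler(triangulation);
>>
>>     // ... refine adaptively, distribute DoFs, then:
>>     DynamicSparsityPattern dsp(dof_handler.n_dofs());
>>     DoFTools::make_flux_sparsity_pattern(dof_handler, dsp);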
>>
>> On Tue, Oct 11, 2016 at 2:19 AM, John Naliboff <jbnaliboff at ucdavis.edu>
>> wrote:
>> > Hi Lev,
>> >
>> > Does the error only occur with the periodic boundary conditions and/or
>> > mesh refinement?
>> >
>> > I’m honestly not sure what the specific issue is here as it relates to
>> > the DG method and the error message below. Ying, Sam, et al. will
>> > hopefully have an idea.
>> >
>> > Cheers,
>> > John
>> >
>> > *************************************************
>> > John Naliboff
>> > Assistant Project Scientist, CIG
>> > Earth & Planetary Sciences Dept., UC Davis
>> >
>> >
>> >
>> >
>> >
>> >
>> > On Oct 10, 2016, at 10:47 PM, Lev Karatun <lev.karatun at gmail.com> wrote:
>> >
>> > Hi everyone,
>> >
>> > I was trying to run a model with the discontinuous boundaries feature,
>> > and got the following error:
>> >
>> > [titan:62377] *** An error occurred in MPI_Allreduce
>> > [titan:62377] *** reported by process [140014581841921,0]
>> > [titan:62377] *** on communicator MPI COMMUNICATOR 3 DUP FROM 0
>> > [titan:62377] *** MPI_ERR_IN_STATUS: error code in status
>> > [titan:62377] *** MPI_ERRORS_ARE_FATAL (processes in this communicator will now abort,
>> > [titan:62377] ***    and potentially your MPI job)
>> >
>> > After making the problem smaller and running a debug build, I got the
>> > following error:
>> >
>> > --------------------------------------------------------
>> > An error occurred in line <3334> of file
>> > </home/lev/aspect/dealiitest/include/deal.II/dofs/dof_accessor.templates.h>
>> > in function
>> >     void dealii::DoFCellAccessor<DoFHandlerType, lda>::get_dof_indices(std::vector<unsigned int, std::allocator<unsigned int> >&) const [with DoFHandlerType = dealii::DoFHandler<3, 3>, bool level_dof_access = true]
>> > The violated condition was:
>> >     this->active()
>> > Additional information:
>> >     get_dof_indices() only works on active cells.
>> >
>> > Stacktrace:
>> > -----------
>> > #0  /home/lev/aspect/dealiitest/lib/libdeal_II.g.so.8.5.0-pre: dealii::DoFCellAccessor<dealii::DoFHandler<3, 3>, true>::get_dof_indices(std::vector<unsigned int, std::allocator<unsigned int> >&) const
>> > #1  /home/lev/aspect/dealiitest/lib/libdeal_II.g.so.8.5.0-pre:
>> > #2  /home/lev/aspect/dealiitest/lib/libdeal_II.g.so.8.5.0-pre: void dealii::DoFTools::make_flux_sparsity_pattern<dealii::DoFHandler<3, 3>, dealii::TrilinosWrappers::BlockSparsityPattern>(dealii::DoFHandler<3, 3> const&, dealii::TrilinosWrappers::BlockSparsityPattern&, dealii::ConstraintMatrix const&, bool, dealii::Table<2, dealii::DoFTools::Coupling> const&, dealii::Table<2, dealii::DoFTools::Coupling> const&, unsigned int)
>> > #3  ../aspect: aspect::Simulator<3>::setup_system_matrix(std::vector<dealii::IndexSet, std::allocator<dealii::IndexSet> > const&)
>> > #4  ../aspect: aspect::Simulator<3>::setup_dofs()
>> > #5  ../aspect: aspect::Simulator<3>::refine_mesh(unsigned int)
>> > #6  ../aspect: aspect::Simulator<3>::maybe_refine_mesh(double, unsigned int&)
>> > #7  ../aspect: aspect::Simulator<3>::run()
>> > #8  ../aspect: main
>> > --------------------------------------------------------
>> >
>> > --------------------------------------------------------------------------
>> > mpirun noticed that process rank 1 with PID 60556 on node titan exited
>> > on signal 11 (Segmentation fault).
>> > --------------------------------------------------------------------------
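>> >
>> > For what it's worth, the assertion itself enforces a simple rule: a
>> > cell that has been refined is no longer "active" and carries no DoFs
>> > of its own. A minimal illustration (hypothetical snippet, with fe and
>> > dof_handler assumed; not the ASPECT or deal.II source):
>> >
>> >     std::vector<types::global_dof_index> dofs(fe.dofs_per_cell);
>> >
>> >     // Iterating over ALL cells includes refined (non-active) ones;
>> >     // asking such a cell for its DoF indices trips the assertion:
>> >     for (const auto &cell : dof_handler.cell_iterators())
>> >       cell->get_dof_indices(dofs);   // may fail: cell might not be active
>> >
>> >     // Restricting the loop to active cells is safe:
>> >     for (const auto &cell : dof_handler.active_cell_iterators())
>> >       cell->get_dof_indices(dofs);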
>> >
>> >
>> > Both ASPECT and deal.II are updated to the latest version. The problem
>> > seems to appear at different stages of the simulation, but always after
>> > a mesh refinement cycle. I attached the prm file used to reproduce the
>> > problem. The fork containing the plugins is here:
>> > https://github.com/lkaratun/aspect/tree/nz
>> > If someone could give me an idea of what could cause the problem, I
>> > would appreciate it.
>> >
>> > Thanks in advance!
>> >
>> > Best regards,
>> > Lev Karatun.
>> > <268s_fail.prm>
>>
>> --
>> Timo Heister
>> http://www.math.clemson.edu/~heister/
>>
>>
>>
>>
>
> <268fail.prm>
>
>
>
>