[aspect-devel] Number of grouped files = 1 -> code doesn't exit after it's finished
Max Rudolph
maxwellr at gmail.com
Mon Mar 21 11:41:43 PDT 2016
Following on what Timo wrote, I thought it would be worth mentioning to
the list that 'mpicc' and 'mpic++' are just wrapper programs in Open MPI,
so even if your system provides the Intel compilers with Open MPI by
default, you can change this by setting environment variables (I do this
in my .bashrc):
export OMPI_CXX=/home/rmaxwell/sw/bin/g++-4.8.4
export OMPI_CC=/home/rmaxwell/sw/bin/gcc-4.8.4
Here, I have selected a copy of a more recent g++ than is generally
available on our university cluster. Some people probably know about
this, but I figured the information might help some readers of the
aspect-devel list. If your sysadmins are reluctant to upgrade gcc
system-wide (or even to provide it at all), this is one workaround.
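
To double-check that the wrappers actually picked up the override, Open
MPI's wrapper compilers can print the underlying command line (this has
worked in the Open MPI versions I have used):

  mpicc --showme
  mpic++ --showme

The first word of the output should be the compiler you exported above.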
Cheers
Max
On Sun, Mar 20, 2016 at 8:21 AM, Timo Heister <heister at clemson.edu> wrote:
> I don't know anything about that bug, but it sounds like a configuration
> issue. Does a simple "hello world MPI" example work? If not, that is
> something your cluster admins should look into.
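>
> For reference, a minimal sketch of such a test (compile with mpic++ and
> run with mpirun) would be something like:
>
>   #include <mpi.h>
>   #include <cstdio>
>
>   int main(int argc, char *argv[])
>   {
>     MPI_Init(&argc, &argv);
>     int rank = 0, size = 0;
>     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
>     MPI_Comm_size(MPI_COMM_WORLD, &size);
>     // every rank should report in; a hang or a PMI/orte error here
>     // points at the MPI installation rather than at ASPECT
>     std::printf("Hello from rank %d of %d\n", rank, size);
>     MPI_Finalize();
>     return 0;
>   }
>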
> Just a comment: I would advise against using the Intel compiler if you
> can avoid it, or at least if you are running into issues, because we
> mainly test with gcc/clang.
>
> On Sun, Mar 20, 2016 at 3:53 PM, Austermann, Jacqueline
> <jaustermann at fas.harvard.edu> wrote:
> > Hey Timo,
> >
> > Yeah, I thought it probably is a problem specific to my configuration
> > (I run it on the Harvard cluster). I use openmpi 1.8.3 and the intel
> > 15.0.0 compiler. I tried switching to openmpi 1.10.2 and recompiled all
> > the libraries and aspect, but when I then ran it I got the error message
> > below and couldn’t figure out why, so I switched back to openmpi 1.8.3.
> > Anyways, if you have any ideas let me know, but otherwise I’ll get in
> > touch with the Research Computing crew here that manages the cluster.
> >
> > Thanks so much!
> > Jacky
> >
> > --------------------------------------------------------------------------
> > It looks like orte_init failed for some reason; your parallel process is
> > likely to abort. There are many reasons that a parallel process can
> > fail during orte_init; some of which are due to configuration or
> > environment problems. This failure appears to be an internal failure;
> > here's some additional information (which may only be relevant to an
> > Open MPI developer):
> >
> > PMI2_Job_GetId failed failed
> > --> Returned value (null) (14) instead of ORTE_SUCCESS
> >
> > --------------------------------------------------------------------------
> >
> > On Mar 20, 2016, at 2:18 AM, Timo Heister <heister at clemson.edu> wrote:
> >
> > Hey Jacky,
> >
> > the example is working here.
> >
> > Grouping of files will use MPI I/O, so the problem could be related to
> > the filesystem you are writing to (is it NFS, a parallel file system,
> > or a normal local directory?). Also, which MPI library are you using?
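> >
> > If you want to test MPI I/O on that filesystem independently of ASPECT,
> > a minimal sketch (the file name mpiio_test.out is arbitrary; run it from
> > the directory you normally write output to) would be:
> >
> >   #include <mpi.h>
> >   #include <cstdio>
> >
> >   int main(int argc, char *argv[])
> >   {
> >     MPI_Init(&argc, &argv);
> >     int rank = 0;
> >     MPI_Comm_rank(MPI_COMM_WORLD, &rank);
> >     // collectively create one shared file via MPI I/O
> >     MPI_File fh;
> >     int err = MPI_File_open(MPI_COMM_WORLD, "mpiio_test.out",
> >                             MPI_MODE_CREATE | MPI_MODE_WRONLY,
> >                             MPI_INFO_NULL, &fh);
> >     if (err == MPI_SUCCESS)
> >       {
> >         // each rank writes its rank number into its own slot
> >         MPI_File_write_at(fh, (MPI_Offset)(rank * sizeof(int)),
> >                           &rank, 1, MPI_INT, MPI_STATUS_IGNORE);
> >         MPI_File_close(&fh);
> >       }
> >     if (rank == 0)
> >       std::printf(err == MPI_SUCCESS ? "MPI I/O ok\n"
> >                                      : "MPI_File_open failed\n");
> >     MPI_Finalize();
> >     return 0;
> >   }
> >
> > If this hangs or fails in the directory you normally write to but works
> > in a local directory, the filesystem is the likely culprit.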
> >
> > On Sat, Mar 19, 2016 at 5:06 PM, Austermann, Jacqueline
> > <jaustermann at fas.harvard.edu> wrote:
> >
> > Hi,
> >
> > After updating my master with the upstream master I’ve been running
> > into the following problem: When I set Number of grouped files (in
> > subsection Postprocess, subsection Visualization) to 1 (and probably to
> > anything other than 0), the code runs through without problems until
> > the end (after it prints the table with the different run times), but
> > then does not exit. It just hangs; I can cancel it manually and the
> > output is correct, but if I don’t do that it just sits there until the
> > runtime for the submitted job is exceeded. I tried this out with a few
> > cookbooks and always had the same problem (I attached a .prm that
> > produces this error, with the relevant section shown below).
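> >
> > For reference, the relevant part of the .prm (just the setting named
> > above) is:
> >
> >   subsection Postprocess
> >     subsection Visualization
> >       set Number of grouped files = 1
> >     end
> >   end
> >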
> > Let me know if you have any ideas why this could be.
> >
> > Thanks!
> > Jacky
> >
> > --
> > Timo Heister
> > http://www.math.clemson.edu/~heister/
>
>
>
> --
> Timo Heister
> http://www.math.clemson.edu/~heister/
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
>