[aspect-devel] Error in moving temporary files during output

Thomas Geenen geenen at gmail.com
Tue Jan 14 01:09:33 PST 2014


Hey Rene,

Nope, it's the Cray compiler's fault.
Expect it to take almost a day to compile deal.II (9.5 hours with version
8.1.4)...
I have been trying to get it to work on the Cray at HLRS with the help of
their Cray engineers.
Version 8.1.4 does not work for sure; I had hoped that the newer versions of
the compiler would be more successful.
I will have to dig up my notes and gain access to a Cray system again to
test it.
Is this part of a PRACE PA project? On what system are you running?

cheers
Thomas


On Mon, Jan 13, 2014 at 7:32 PM, Rene Gassmoeller <rengas at gfz-potsdam.de> wrote:

> Hi Thomas,
> actually I used gcc (4.8.1) after trying the Cray compiler (8.1.9) for a
> while without success. However, I am no expert in this, so it might have
> been my fault and not the compiler's. I just rechecked the part about
> static linking and you are right: static linking is simply the default
> option proposed by the admin. If you have instructions available for
> dynamic linking, I would be happy to see them.
>
> Cheers,
> Rene
>
> On 01/13/2014 06:30 PM, Thomas Geenen wrote:
> > Hey Rene,
> >
> > What is the status of getting deal.II to compile with the Cray compiler?
> > Last year we still needed a pre-release of the compiler to get it to work.
> > It is possible to build with shared libraries as well (with gcc); let me
> > know if you need help with setting that up. The instructions predate the
> > CMake migration, so I will have to do a quick check to adapt the hacks to
> > the new build system.
> >
> > cheers
> > Thomas
> >
> >
> > On Mon, Jan 13, 2014 at 6:05 PM, Rene Gassmoeller <rengas at gfz-potsdam.de> wrote:
> >
> >> Dear all,
> >> I have a follow-up question on the input/output issue Wolfgang had. I am
> >> currently doing scaling tests on a Cray XC30 with up to 4000 cores and am
> >> getting the same error messages. I am writing to a designated $WORK
> >> directory that is intended for data output; however, there is also a fast
> >> local $TMPDIR directory on each compute node.
> >> My question is: is the "Number of grouped files = 1" parameter always
> >> useful for a large parallel computation on a system with MPI I/O, or is
> >> this cluster-specific (in which case I will just contact the system
> >> administrators for help)?
> >>
> >> Another thing I would like to mention is that this system only allows
> >> static linking. With my limited knowledge I was only able to compile
> >> ASPECT by commenting out the option to dynamically load external
> >> libraries based on user input. Could somebody who introduced the
> >> possibility to dynamically load libraries at runtime comment on the work
> >> needed to make this a compile-time switch? I don't know much about this,
> >> otherwise I would search for a solution myself. In case this creates a
> >> longer discussion I will open a new thread on the mailing list.
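> >>
> >> (For illustration only, not ASPECT's actual code: a minimal sketch of
> >> hiding the runtime loading behind a hypothetical compile-time macro,
> >> here called ASPECT_HAVE_SHARED_LIBS, so that a static-only build simply
> >> skips the dlopen path instead of failing to link:)
> >>
> >>   #include <iostream>
> >>   #include <string>
> >>   #if defined(ASPECT_HAVE_SHARED_LIBS)  // hypothetical switch set by the build system
> >>   #  include <dlfcn.h>
> >>   #endif
> >>
> >>   // Load a user-requested plugin library at runtime, or explain why not.
> >>   void load_plugin (const std::string &filename)
> >>   {
> >>   #if defined(ASPECT_HAVE_SHARED_LIBS)
> >>     if (dlopen (filename.c_str(), RTLD_LAZY) == nullptr)
> >>       std::cerr << "Could not load <" << filename << ">: "
> >>                 << dlerror() << std::endl;
> >>   #else
> >>     std::cerr << "This build was configured without shared library "
> >>               << "support; ignoring <" << filename << ">." << std::endl;
> >>   #endif
> >>   }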
> >>
> >> Thanks for comments and suggestions,
> >> Rene
> >>
> >> On 10/14/2013 05:02 PM, Wolfgang Bangerth wrote:
> >>> On 10/11/2013 01:32 PM, Timo Heister wrote:
> >>>>> I'm running this big computation and I'm getting these errors:
> >>>>
> >>>> For a large parallel computation I would use MPI I/O using
> >>>>      set Number of grouped files       = 1
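> >>>>
> >>>> (If I remember ASPECT's parameter tree correctly, that flag lives in the
> >>>> visualization postprocessor section of the .prm file, roughly:
> >>>>
> >>>>   subsection Postprocess
> >>>>     subsection Visualization
> >>>>       set Number of grouped files = 1
> >>>>     end
> >>>>   end
> >>>>
> >>>> With a value of 1, all processes write into a single data file via
> >>>> MPI I/O instead of one file per process.)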
> >>>
> >>> Good point. I had forgotten about this flag.
> >>>
> >>>>
> >>>>> ***** ERROR: could not move /tmp/tmp.gDzuQS to
> >>>>> output/solution-00001.0007.vtu *****
> >>>>
> >>>> Is this on brazos? Are you writing/moving into your slow NFS ~/? I got
> >>>> several GB/s writing with MPI I/O to the parallel filesystem (was it
> >>>> called data or fdata?).
> >>>
> >>> Yes. I was just writing to something under $HOME. Maybe not very smart...
> >>>
> >>>
> >>>> Retry and otherwise abort the computation, I would say.
> >>>
> >>> OK, that's what I'm doing now. It shouldn't just fail any more.
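> >>>
> >>> (Not the actual patch, just a sketch of the retry-then-abort idea for
> >>> the temporary-file move; the function name and the retry count are
> >>> made up:)
> >>>
> >>>   #include <cstdio>      // std::rename
> >>>   #include <stdexcept>
> >>>   #include <string>
> >>>   #include <unistd.h>    // sleep()
> >>>
> >>>   // Try the move a few times before giving up, since the failure is
> >>>   // often a transient filesystem hiccup on a busy cluster.
> >>>   void move_file_or_abort (const std::string &tmp_name,
> >>>                            const std::string &final_name)
> >>>   {
> >>>     for (unsigned int attempt = 0; attempt < 10; ++attempt)
> >>>       {
> >>>         if (std::rename (tmp_name.c_str(), final_name.c_str()) == 0)
> >>>           return;        // success
> >>>         sleep (1);       // wait a moment and retry
> >>>       }
> >>>     throw std::runtime_error ("Could not move " + tmp_name
> >>>                               + " to " + final_name);
> >>>   }
> >>>
> >>> (One caveat: std::rename cannot move a file across filesystems, e.g.
> >>> from a node-local /tmp to a network-mounted output directory, so that
> >>> case would need a copy-and-remove fallback.)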
> >>>
> >>> Best
> >>>  W.
> >>>
> >>>
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
>

