[aspect-devel] [CIG-MC] Installing ASPECT on Cray XC30

Marine Lasbleis marine.lasbleis at gmail.com
Tue Jul 11 22:13:14 PDT 2017


Dear Timo and Wolfgang, 

I tried with the flag  -D ASPECT_USE_FP_EXCEPTIONS=OFF, and ASPECT is indeed running!
(By the way, which would be the best example from the cookbook to check that ASPECT is running correctly? I tried the convection-box one. I want to use ASPECT for the inner core, so I’ll also be running the inner-core cookbook soon.)
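For the record, this is roughly the sequence I used to reconfigure and then run the cookbook (the paths and the exact .prm file name below are just from my setup, so treat them as placeholders):

    cd ~/aspect-build                          # my ASPECT build directory (adjust the path)
    cmake -D ASPECT_USE_FP_EXCEPTIONS=OFF .    # the flag Timo suggested
    make -j 4
    aprun -n 4 ./aspect ~/aspect/cookbooks/convection-box.prm   # cookbook .prm location may differ in your checkout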

Amongst other things, we also tried:
- the same as before, but with aprun -n 1 (i.e. a single process): same error.
- a static installation (following subsection 3.3.5, "Compiling a static ASPECT executable"). It did not work on our cluster (not sure why, but the Trilinos build failed).


So, is there any reason why the -D ASPECT_USE_FP_EXCEPTIONS=OFF flag should not be used? 
"Do you get the same error if you run with one processor? If so, do you know how to generate a backtrace in a debugger to figure out where this is happening?” 
> Yes, same error. No, I don’t know how to generate a backtrace. If you think I should try, please let me know! (either with an explanation of how to do it, or I’ll have a look online)
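(From a quick look, I guess it would be something like running a single process under gdb, assuming gdb is available on the node — I'm not sure how that combines with aprun on the Cray:

    gdb --args ./aspect convection-box.prm
    (gdb) run
    ... gdb stops when the floating point exception signal is raised ...
    (gdb) backtrace

If that is roughly what you mean, I can switch the FP exceptions back on and report what the backtrace shows.)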

I will investigate a little more how well the runs behave, and try a couple of examples to make sure everything works. I’ll also install it on the other clusters I have access to.

Thank you! 
And thank you so much for the candi.sh script adapted to Cray :-) I tried installing ASPECT on the cluster (and on my Mac) a couple of years ago, with no success. Also, since a couple of us here at ELSI want to use ASPECT this summer, I installed it with Docker on another colleague’s Mac, and it seems to be working correctly (at first I gave her my old virtual machine from a 2014 summer workshop). You will likely also get questions from people in Nantes (France) who want to run ASPECT on their own cluster.

Best, 
Marine

> On 11 Jul 2017, at 22:00, Timo Heister <heister at clemson.edu> wrote:
> 
> Marine,
> 
> 1. can you post your detailed.log from your ASPECT directory?
> 
> 2. can you try to call
> cmake -D ASPECT_USE_FP_EXCEPTIONS=OFF .
> 
> inside your aspect build directory, and then "make" again? This should
> disable the floating point exceptions (we use this as a tool to find
> errors and bugs), but you will likely run into other problems then.
> But maybe this is enough to make things work.
> 
> 
> On Tue, Jul 11, 2017 at 2:09 AM, Marine Lasbleis
> <marine.lasbleis at elsi.jp> wrote:
>> Hi all,
>> 
>> Sorry, it appears I used the wrong mailing list!
>> Let me know if someone has experience building ASPECT (and required
>> libraries) on a CRAY XC30.
>> 
>> Best,
>> Marine
>> 
>> 
>> Marine Lasbleis
>> 
>> =====
>> ELSI Research Scientist
>> Earth Life Science Institute
>> Tokyo Institute of Technology
>> Tokyo, Japan
>> +81 70 1572 5070
>> marine.lasbleis at elsi.jp
>> 
>> https://members.elsi.jp/~marine.lasbleis/
>> http://elsi.jp/en/
>> 
>> 
>> Begin forwarded message:
>> 
>> From: Max Rudolph <maxrudolph at ucdavis.edu>
>> Subject: Re: [CIG-MC] Installing ASPECT on Cray XC30
>> Date: 11 Jul 2017 11:52:51 GMT+9
>> To: <cig-mc at geodynamics.org>
>> Reply-To: cig-mc at geodynamics.org
>> 
>> Marine,
>> You may want to post this on the aspect-devel mailing list if you haven't
>> done so already.
>> 
>> Max
>> 
>> On Mon, Jul 10, 2017 at 5:00 PM, Marine Lasbleis <marine.lasbleis at gmail.com>
>> wrote:
>>> 
>>> Hi all,
>>> 
>>> This is my first message here, I hope it’s OK.
>>> I’ve started working on ASPECT and have already installed it on a desktop
>>> computer (Debian, 8 cores), but I would like to install it on the available
>>> clusters. (I have access to 3 different clusters, and I’m not sure which
>>> one is best for this… There is also no real admin for the clusters; they
>>> are “self-organised”, which is not always for the best.)
>>> 
>>> I’m trying to install ASPECT on the ELSI cluster, which is a Cray XC30,
>>> and while running into problems I found that some of you may have done the
>>> same a couple of weeks ago (I saw this conversation:
>>> http://dealii.narkive.com/jCU1oGdB/deal-ii-get-errors-when-installing-dealii-on-opensuse-leap-42-1-by-using-candi
>>> )
>>> 
>>> For now, what we’ve done (before seeing the candi installation):
>>> - switched to PrgEnv-gnu
>>> - tried to install p4est. But it seems that we need to use “ftn” rather
>>> than gfortran or mpif77, so the configure script can’t do anything and
>>> stops very early. I tried to modify the configure file by hand (adding ftn
>>> wherever I could find the script looking for fortran or mpif77), but I
>>> guess that’s definitely not a good idea, and I am obviously still missing
>>> a couple of calls because I still get the same error.
>>> 
>>> So, from that conversation, I guessed that https://github.com/dealii/candi
>>> could actually install everything for me.
>>> Since I’m using a slightly different cluster (Cray XC30), I will try to
>>> give you updates on my progress.
>>> I’m not familiar with candi, but I decided to give it a try, so please
>>> excuse me if I am making obvious mistakes.
>>> 
>>> I changed the configuration as requested, loaded the required modules,
>>> and defined new environment variables with the compiler information.
>>> On this particular cluster we need to be careful about the installation
>>> path (the default one is on a drive that is very slow to access, and
>>> compilation takes forever), so I had to use the -p path option. Also, I
>>> think I initially used too many cores to compile and got a memory error
>>> (an internal compiler error was raised, which seems to be related to the
>>> available memory).
>>> 
>>> So, from my day trying to install:
>>> - I ran the candi.sh script to the end; apparently everything installed
>>> correctly.
>>> - I built ASPECT. (On this particular cluster, be careful with cmake: by
>>> default the available cmake is not up to date, and even after installation
>>> with candi.sh, the cmake found in the path is not the one that was just
>>> installed.)
>>> I got a couple of warnings, mostly about PETSc, which I assumed were only
>>> warnings and not real problems. Most of them were along the lines of this
>>> one, for either PETSc or Trilinos:
>>> warning: 'dealii::PETScWrappers::MPI::Vector::supports_distributed_data'
>>> is deprecated [-Wdeprecated-declarations]
>>> 
>>> - I’ve run a couple of examples from the cookbook. None are working.
>>> 
>>> I got this from running ASPECT using aprun -n4 ../aspect burnman.prm
>>> 
>>> -----------------------------------------------------------------------------
>>> -- This is ASPECT, the Advanced Solver for Problems in Earth's ConvecTion.
>>> --     . version 1.5.0
>>> --     . running in DEBUG mode
>>> --     . running with 4 MPI processes
>>> --     . using Trilinos
>>> 
>>> -----------------------------------------------------------------------------
>>> 
>>> [0]PETSC ERROR: [1]PETSC ERROR: [3]PETSC ERROR: [2]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [0]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> ------------------------------------------------------------------------
>>> [2]PETSC ERROR:
>>> ------------------------------------------------------------------------
>>> [1]PETSC ERROR: [3]PETSC ERROR: Caught signal number 8 FPE: Floating Point
>>> Exception,probably divide by zero
>>> [1]PETSC ERROR: Try option -start_in_debugger or -on_error_attach_debugger
>>> Caught signal number 8 FPE: Floating Point Exception,probably divide by
>>> zero
>>> [1]PETSC ERROR: [3]PETSC ERROR: or see
>>> http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>> Try option -start_in_debugger or -on_error_attach_debugger
>>> [1]PETSC ERROR: [3]PETSC ERROR: or try http://valgrind.org on GNU/linux
>>> and Apple Mac OS X to find memory corruption errors
>>> or see http://www.mcs.anl.gov/petsc/documentation/faq.html#valgrind
>>> [1]PETSC ERROR: [3]PETSC ERROR: configure using --with-debugging=yes,
>>> recompile, link, and run
>>> or try http://valgrind.org on GNU/linux and Apple Mac OS X to find memory
>>> corruption errors
>>> [1]PETSC ERROR: [3]PETSC ERROR: to get more information on the crash.
>>> configure using --with-debugging=yes, recompile, link, and run
>>> [3]PETSC ERROR: to get more information on the crash.
>>> [1]PETSC ERROR: --------------------- Error Message
>>> --------------------------------------------------------------
>>> Caught signal number 8 FPE: Floating Point Exception,probably divide by
>>> zero
>>> 
>>> 
>>> 
>>> 
>>> 
>>> Any idea where this could come from?
>>> (any additional files I should show you?)
>>> 
>>> 
>>> Thanks! (And many thanks to the person who wrote the candi.sh script for
>>> Cray XC40 :-) )
>>> Marine
>>> 
>>> 
>>> 
>>> 
>>> 
>>> _______________________________________________
>>> CIG-MC mailing list
>>> CIG-MC at geodynamics.org
>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-mc
>> 
>> 
>> 
>> 
>> --
>> ---------------------------------
>> Max Rudolph
>> Assistant Professor, Earth and Planetary Sciences, UC Davis
>> webpage
>> _______________________________________________
>> CIG-MC mailing list
>> CIG-MC at geodynamics.org
>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-mc
>> 
>> 
>> 
>> _______________________________________________
>> Aspect-devel mailing list
>> Aspect-devel at geodynamics.org
>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
> 
> -- 
> Timo Heister
> http://www.math.clemson.edu/~heister/
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel