[aspect-devel] Error when running subduction models

John Naliboff jbnaliboff at ucdavis.edu
Thu Sep 27 13:17:28 PDT 2018


Hi all,

I am familiar with the cluster in question, and setting "Number of 
grouped files = 0" should indeed fix the issue below. To write restart 
files, one also needs to disable MPI I/O in p4est.
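
For reference, a minimal parameter-file sketch of these suggestions (the 
output path below is only a placeholder; "Number of grouped files" sits in 
the Visualization postprocessor section):

    # Write one visualization file per MPI process and skip MPI I/O entirely.
    subsection Postprocess
      subsection Visualization
        set Number of grouped files = 0
      end
    end

    # Optionally, point the output at the cluster's parallel scratch
    # file system instead of the NFS home directory (placeholder path).
    set Output directory = /scratch/your-username/subduction-output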

Cheers,
John

On 09/27/2018 08:48 AM, Timo Heister wrote:
> Mengxue,
>
> this is an error that happened during parallel I/O in the graphical
> output. This can have several reasons:
> - you are running out of disk space
> - the file system you are writing to is not usable for parallel I/O
> (most clusters have a slow home directory on NFS and a fast parallel
> file system for I/O)
> - something else, for example an MPI installation that does not support MPI I/O.
>
> Things to try:
> - You can try to run one of the cookbooks in parallel and see if
> graphical output works.
> - You can change the output directory.
> - You can change the "Number of grouped files" to 0 to see if output
> works without using parallel I/O.
>
> On Thu, Sep 27, 2018 at 11:38 AM mxliu <mxliu at math.tsinghua.edu.cn> wrote:
>> Hi all,
>> I'm running a subduction model with ASPECT 2.0 (deal.II 9.0). With "Nonlinear solver tolerance = 1e-3" the solver seems to converge at timestep 0 (on 3 levels); however, the run fails during postprocessing. Here is part of the error message:
>>
>> .......
>>
>> Number of active cells: 124,000 (on 3 levels)
>> Number of degrees of freedom: 5,116,081 (998,182+125,171+499,091+499,091+499,091+499,091+499,091+499,091+499,091+499,091)
>>
>> Number of free surface degrees of freedom: 250342
>> *** Timestep 0:  t=0 years
>>     Solving mesh velocity system... 0 iterations.
>>     Solving temperature system... 0 iterations.
>>     Solving ContinentalUpperCrust system ... 0 iterations.
>>     Solving ContinentalLowerCrust system ... 0 iterations.
>>     Solving ContinentalMantle system ... 0 iterations.
>>     Solving Sediment system ... 0 iterations.
>>     Solving OceanicCrust system ... 0 iterations.
>>     Solving OceanicMantle system ... 0 iterations.
>>     Solving WeakZone system ... 0 iterations.
>>     Rebuilding Stokes preconditioner...
>>     Solving Stokes system... 200+9 iterations.
>>        Relative nonlinear residual (Stokes system) after nonlinear iteration 1: 1
>>
>>     Rebuilding Stokes preconditioner...
>>     Solving Stokes system... 200+12 iterations.
>>        Relative nonlinear residual (Stokes system) after nonlinear iteration 2: 0.00559706
>> ......
>>
>>     Rebuilding Stokes preconditioner...
>>     Solving Stokes system... 200+3 iterations.
>>        Relative nonlinear residual (Stokes system) after nonlinear iteration 28: 0.000800572
>>
>>
>>     Postprocessing:
>>
>>
>> ----------------------------------------------------
>> Exception on MPI process <16> while running postprocessor <N6aspect11Postprocess13VisualizationILi2EEE>:
>>
>>
>> ----------------------------------------------------
>> Exception on MPI process <19> while running postprocessor <N6aspect11Postprocess13VisualizationILi2EEE>:
>>
>> --------------------------------------------------------
>> An error occurred in line <6632> of file </home/mgxliu/software/dealii/dealii-9.0/dealii-9.0.0/source/base/data_out_base.cc> in function
>>      void dealii::DataOutInterface<dim, spacedim>::write_vtu_in_parallel(const char*, MPI_Comm) const [with int dim = 2; int spacedim = 2; MPI_Comm = ompi_communicator_t*]
>> The violated condition was:
>>      ierr == MPI_SUCCESS
>> Additional information:
>> deal.II encountered an error while calling an MPI function.
>> The description of the error provided by MPI is "MPI_ERR_OTHER: known error not in list".
>> The numerical value of the original error code is 16.
>> --------------------------------------------------------
>>
>> Aborting!
>> ----------------------------------------------------
>>
>> Any ideas on how this might be solved? Thanks in advance.
>>
>> Cheers,
>> Mengxue
>>
>> _______________________________________________
>> Aspect-devel mailing list
>> Aspect-devel at geodynamics.org
>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
