[CIG-SHORT] dynamic benchmark problem

Roby Douilly rdouilly at purdue.edu
Mon Apr 15 11:08:00 PDT 2013


OK, thank you. I will look at TPV24-25.

Regards, 

Roby
On Apr 15, 2013, at 2:03 PM, Brad Aagaard <baagaard at usgs.gov> wrote:

> Roby,
> 
> The SCEC benchmarking site http://scecdata.usc.edu/cvws/cgi-bin/cvws.cgi 
> shows the results from various codes. There are various implementations 
> and discretizations. Unfortunately, they don't show the effect of 
> discretization size very well. The benchmarks were designed to be run at 
> 100m, but some benchmarks are more robust than others. The branching 
> benchmarks tend to be more dependent upon discretization size than others 
> due to the effects of dynamic stresses.
> 
> See TPV24-25 for examples (I ran these at both 100m and 200m 
> resolutions). In TPV24-25 it is clear that there is a greater 
> sensitivity to discretization size when the rupture is just on the verge 
> of propagating. A finer discretization will tend to yield larger peaks 
> in dynamic stresses, thereby making it easier to trigger ruptures on 
> favorably oriented faults.
> 
> Regards,
> Brad
> 
> 
> On 4/15/13 10:52 AM, Roby Douilly wrote:
>> Thank you Brad, I will try changing the shear wave speed and wave period
>> to see. But I do have a question regarding this benchmark. I ran the
>> 200m resolution, and in that case the rupture did not propagate onto the
>> branch fault. The reason I was trying to run the 100m resolution is to
>> see if the rupture could propagate on the branch fault. Do you think the
>> rupture will propagate on the branch fault at 100m, and is 200m too
>> coarse for the rupture to jump onto the branch fault?
>> 
>> Roby
>> 
>> On Apr 15, 2013, at 1:46 PM, Brad Aagaard <baagaard at usgs.gov> wrote:
>> 
>>> Roby,
>>> 
>>> It has been several years since I ran this benchmark. I am not sure if I
>>> have actually used the 100m settings for this benchmark. I have used
>>> this mesh in TPV24-25. In those cases, when I ran the 100m resolution
>>> case, I used the following scales for nondimensionalizing the problem:
>>> 
>>> normalizer.shear_wave_speed = 3333*m/s
>>> normalizer.wave_period = 0.3*s
>>> 
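A quick sketch of the arithmetic behind these two settings (my reading is that PyLith's dynamic normalizer takes the length scale as the product of the shear wave speed and wave period; treat that as an assumption and check the PyLith manual):

```python
# The two normalizer settings from the .cfg snippet above.
shear_wave_speed = 3333.0  # m/s  (normalizer.shear_wave_speed)
wave_period = 0.3          # s    (normalizer.wave_period)

# Assumed length scale for nondimensionalization: v_s * T ~ 1000 m.
length_scale = shear_wave_speed * wave_period
print(length_scale)  # ~1000 m

# A 100 m cell then has a nondimensional edge length of about 0.1,
# which keeps the Jacobian determinants of well-shaped cells well
# above a small cutoff like 1e-6.
edge_nondim = 100.0 / length_scale
print(edge_nondim)  # ~0.1
```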
>>> NOTE: You will need a machine with lots of memory or a cluster to run
>>> the simulation at 100m. Remember that in 3-D, using a mesh with 1/2 the
>>> grid spacing increases the problem size by a factor of 8.
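The factor-of-8 note follows from the cell count scaling as (L/h)^3 in 3-D; a minimal sketch with made-up domain dimensions (illustrative only):

```python
def num_cells(domain_size, spacing, dim=3):
    """Approximate structured cell count for a cubic domain of the given size."""
    return int(domain_size / spacing) ** dim

# Hypothetical 48 km cubic domain, just to show the scaling.
coarse = num_cells(48000.0, 200.0)  # 200 m mesh
fine = num_cells(48000.0, 100.0)    # 100 m mesh

# Halving the spacing multiplies the cell count by 2**3 = 8.
print(fine // coarse)  # 8
```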
>>> 
>>> Regards,
>>> Brad
>>> 
>>> On 4/15/13 10:32 AM, Roby Douilly wrote:
>>>> Brad,
>>>> 
>>>> I ran it with this command:
>>>> 
>>>> pylith tet4.cfg tpv15.cfg tpv15_tet4_100.cfg
>>>> 
>>>> 
>>>> Roby
>>>> 
>>>> 
>>>> On Apr 15, 2013, at 1:28 PM, Brad Aagaard <baagaard at usgs.gov> wrote:
>>>> 
>>>>> Roby,
>>>>> 
>>>>> Please give the *exact* command line arguments you are using. You could
>>>>> be leaving out a .cfg file or using outdated files.
>>>>> 
>>>>> Brad
>>>>> 
>>>>> On 4/15/13 10:24 AM, Roby Douilly wrote:
>>>>>> Hello,
>>>>>> 
>>>>>> I am trying to run the SCEC dynamic benchmark TPV15 at 100m resolution.
>>>>>> Each time I try to run it, I get the error message below.
>>>>>> There is no error message when I run it at 200m. Is there a way to
>>>>>> avoid this error?
>>>>>> 
>>>>>> Thanks,
>>>>>> 
>>>>>> Roby
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> 
>>>>>> RuntimeError: Determinant of Jacobian (9.85786e-07) for cell 1668 is
>>>>>> smaller than minimum permissible value (1e-06)!
>>>>>> 
>>>>>> --------------------------------------------------------------------------
>>>>>> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
>>>>>> with errorcode -1.
>>>>>> 
>>>>>> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
>>>>>> You may or may not see output from other processes, depending on
>>>>>> exactly when Open MPI kills them.
>>>>>> --------------------------------------------------------------------------
>>>>>> --------------------------------------------------------------------------
>>>>>> mpirun has exited due to process rank 0 with PID 17553 on
>>>>>> node chilmark.eas.purdue.edu exiting
>>>>>> without calling "finalize". This may
>>>>>> have caused other processes in the application to be
>>>>>> terminated by signals sent by mpirun (as reported here).
>>>>>> --------------------------------------------------------------------------
>>>>>> /opt/pylith/bin/nemesis: mpirun: exit 255
>>>>>> /opt/pylith/bin/pylith: /opt/pylith/bin/nemesis: exit 1
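The RuntimeError above comes from a quality check on each cell's Jacobian determinant. A rough, hypothetical illustration (not PyLith's actual implementation) of why a badly shaped "sliver" tetrahedron trips such a check: for a linear tet the determinant is proportional to the cell volume, so a nearly flat cell drives it toward zero and can fall below a floor like 1e-6 once nondimensionalized.

```python
def jacobian_det(v0, v1, v2, v3):
    """Determinant of the edge-vector matrix of a tetrahedron (6x its volume)."""
    e = [[b[i] - v0[i] for i in range(3)] for b in (v1, v2, v3)]
    return (e[0][0] * (e[1][1] * e[2][2] - e[1][2] * e[2][1])
            - e[0][1] * (e[1][0] * e[2][2] - e[1][2] * e[2][0])
            + e[0][2] * (e[1][0] * e[2][1] - e[1][1] * e[2][0]))

# Well-shaped reference tet vs. a nearly flat sliver (made-up coordinates).
good = jacobian_det((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1))
sliver = jacobian_det((0, 0, 0), (1, 0, 0), (0, 1, 0), (0.5, 0.5, 1e-7))
print(good)    # 1.0
print(sliver)  # 1e-07
```

Refining a mesh to 100m can expose sliver cells like this near fault intersections, which is consistent with the 200m run working while the 100m run aborts.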
>>>>>> 
>>>>>> _______________________________________________
>>>>>> CIG-SHORT mailing list
>>>>>> CIG-SHORT at geodynamics.org
>>>>>> http://geodynamics.org/cgi-bin/mailman/listinfo/cig-short
>>>>>> 
>>>> 
>>> 
>> 
> 



Roby

---------------------------------------------------------------------------------
Roby Douilly
Graduate Student 
Department of Earth, Atmospheric, and Planetary Sciences
Purdue University
550 Stadium Mall Dr.
West Lafayette, IN 47907-2051
rdouilly at purdue.edu



