[CIG-SHORT] elapsed time concerns
alberto cominelli
alberto.cominelli at gmail.com
Thu Nov 24 08:16:50 PST 2016
Brad,
I removed all the SimpleDB spatial databases and now my runs are extremely
fast, say 8500 nodes in a few seconds.
I am now fighting to input the first material property using a SimpleGridDB.
I followed your Appendix C.3 example to prepare the material property input
and I built the attached file.
The result is this error:
Fatal error. Calling MPI_Abort() to abort PyLith application.
Traceback (most recent call last):
  File "/home/cominelli/Pylith2.1.3/lib64/python2.7/site-packages/pylith/apps/PetscApplication.py", line 64, in onComputeNodes
    self.main(*args, **kwds)
  File "/home/cominelli/Pylith2.1.3/lib64/python2.7/site-packages/pylith/apps/PyLithApp.py", line 125, in main
    self.problem.initialize()
  File "/home/cominelli/Pylith2.1.3/lib64/python2.7/site-packages/pylith/problems/TimeDependent.py", line 120, in initialize
    self.formulation.initialize(self.dimension, self.normalizer)
  File "/home/cominelli/Pylith2.1.3/lib64/python2.7/site-packages/pylith/problems/Implicit.py", line 121, in initialize
    self._initialize(dimension, normalizer)
  File "/home/cominelli/Pylith2.1.3/lib64/python2.7/site-packages/pylith/problems/Formulation.py", line 478, in _initialize
    integrator.initialize(totalTime, numTimeSteps, normalizer)
  File "/home/cominelli/Pylith2.1.3/lib64/python2.7/site-packages/pylith/feassemble/ElasticityImplicit.py", line 56, in initialize
    ModuleElasticityImplicit.initialize(self, self.mesh())
  File "/home/cominelli/Pylith2.1.3/lib64/python2.7/site-packages/pylith/feassemble/feassemble.py", line 359, in initialize
    def initialize(self, *args): return _feassemble.IntegratorElasticity_initialize(self, *args)
RuntimeError: Error occurred while reading spatial database file
'matprops.spatialdb'.
Error reading coordinates from buffer ''.
I do not understand what is wrong in my material database. Do the x, y, and
z coordinates live on single lines?
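
For reference, this is the file layout I am assuming from the Appendix C.3
example (the dimensions and values below are illustrative, not my actual
file):

#SPATIAL_GRID.ascii 1
SimpleGridDB {
  num-x = 3
  num-y = 2
  num-z = 2
  space-dim = 3
  num-values = 3
  value-names = density vs vp
  value-units = kg/m**3 m/s m/s
  cs-data = cartesian {
    to-meters = 1.0
    space-dim = 3
  }
}
// x coordinates (all on one line)
0.0  1000.0  2000.0
// y coordinates (all on one line)
0.0  2000.0
// z coordinates (all on one line)
-100.0  0.0
// one line per grid point: x y z density vs vp
0.0  0.0  -100.0  2500.0  3000.0  5200.0
...
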
Regards and thanks in advance,
Alberto.
2016-11-23 22:16 GMT+01:00 alberto cominelli <alberto.cominelli at gmail.com>:
> Thanks Brad,
> I am using SimpleDB for the boundary conditions (x+/-, y+/-, z+/-) and for
> the material properties (matprops.spatialdb).
> I will simplify those conditions and properties as well, and I will check
> how the elapsed time scales.
> I will let you know tomorrow.
> regards,
> Alberto.
>
>
> 2016-11-23 21:57 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
>
>> Alberto,
>>
>> My comments on scaling apply to BOTH setting the parameters of the
>> material properties AND initial stress/strain.
>>
>> If you want to understand the scaling, then I suggest starting by using
>> a UniformDB EVERYWHERE you use a SimpleDB. A UniformDB should scale with
>> the number of cells. Also look at the details of the log summary, including
>> the time for each event (not just the stages).
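>>
>> For example, something along these lines for each material, everywhere a
>> SimpleDB is currently used (the property names and values here are
>> placeholders, not your actual setup):
>>
>> [pylithapp.timedependent.materials.material]
>> db_properties = spatialdata.spatialdb.UniformDB
>> db_properties.label = Material properties
>> db_properties.values = [density, vs, vp]
>> db_properties.data = [2500.0*kg/m**3, 3000.0*m/s, 5200.0*m/s]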
>>
>> Regards,
>> Brad
>>
>>
>> On 11/23/2016 12:38 PM, alberto cominelli wrote:
>>
>>> Brad,
>>> In my table I was also comparing elapsed times for models where I do not
>>> use initial conditions at all.
>>> The section on initial conditions is commented out:
>>> # ----------------------------------------------------------------------
>>> # initial stresses
>>> # ----------------------------------------------------------------------
>>> # We must specify initial stresses for each material.
>>> # We provide a filename for the spatial database that gives the stresses,
>>> # and we change the query_type from the default 'nearest' to 'linear'.
>>> # alberto
>>> #[pylithapp.timedependent.materials.material]
>>> #db_initial_stress = spatialdata.spatialdb.SimpleDB
>>> #db_initial_stress.label = Initial stress
>>> #db_initial_stress.iohandler.filename = initial_stress.spatialdb
>>> #db_initial_stress.query_type = nearest
>>> #
>>> # ----------------------------------------------------------------------
>>> # boundary conditions
>>> # ----------------------------------------------------------------------
>>>
>>> Even in this case, set-up takes from 87% to 94% of the elapsed time.
>>> I am attaching my cfg cases. I am puzzled: these cases should be even
>>> better than the SimpleGridDB initialisation.
>>> What do you think? Am I doing anything wrong?
>>> Regards,
>>> Alberto.
>>>
>>>
>>> 2016-11-23 21:17 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
>>>
>>> Alberto,
>>>
>>>     If n is the number of cells, then setting the material parameters
>>>     scales as O(n).
>>>
>>> If you are using a SimpleDB with one or more points per cell, then
>>> each SimpleDB query scales as O(n), and setting the material
>>> parameters scales as O(n)*O(n). This is why we usually only use
>>> SimpleDB for simple 1-D or 2-D spatial variations with a small
>>> number of points.
>>>
>>> If you are using a SimpleGridDB, then each query scales with
>>> O(log(p)) where p is the number of points in that dimension, so
>>> setting the material parameters scales as O(n)*O(log(p)). With p <<
>>> n, setting the material parameters should scale close to O(n).
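>>>
>>>     As a rough back-of-envelope (a sketch with the numbers from your
>>>     earlier case - 48630 cells, 8 quadrature points per cell, 386880
>>>     database points - not a measurement):
>>>
>>>     import math
>>>     n = 48630 * 8   # query points: cells x quadrature points per cell
>>>     p = 386880      # points in the spatial database
>>>     print("SimpleDB     ~ %.1e ops" % (n * p))               # O(p) search per query
>>>     print("SimpleGridDB ~ %.1e ops" % (n * math.log(p, 2)))  # ~O(log p) per query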
>>>
>>> Regards,
>>> Brad
>>>
>>>
>>> On 11/23/2016 11:46 AM, alberto cominelli wrote:
>>>
>>>         Brad,
>>>         I made some tests with and without initial stress conditions.
>>>         My expectation was that the set-up phase would decrease relative
>>>         to the rest of the computations. Counterintuitively, this did not
>>>         happen, as the table below shows.
>>>         (Total time in seconds; the remaining columns are the PETSc log
>>>         stage percentages of the total.)
>>>
>>>         Model          #cells  #nodes  Case          Total(s)   Main    Meshing  Setup    Ref.Jac.  Ref.Res.  Solve   Prestep  Step    Poststep  Finalize
>>>         2000x2000x100    4800    5733  No In. Cond.    22.28    2.70%   1.10%   80.30%    1.30%     0.50%     2.80%   0.70%    0.70%    9.00%     1.00%
>>>                                        In. Cond.       34.03    0.80%   0.90%   89.00%    0.90%     0.30%     1.80%   0.20%    0.40%    5.30%     0.40%
>>>         1000x1000x100    7280    8505  No In. Cond.    48.73    0.50%   0.50%   88.60%    1.10%     0.40%     2.30%   0.30%    0.60%    5.40%     0.40%
>>>                                        In. Cond.       86.55    0.30%   0.30%   91.80%    0.60%     0.20%     1.50%   0.30%    1.20%    3.50%     0.20%
>>>         500x500x50      20520   22971  No In. Cond.   317.1     0.10%   0.10%   94.00%    0.40%     0.30%     1.90%   0.00%    0.10%    3.00%     0.10%
>>>                                        In. Cond.      662.1     0.10%   0.30%   97.40%    0.20%     0.10%     0.90%   0.00%    0.00%    1.00%     0.00%
>>>
>>>         The time saving without initial conditions is negligible, and it
>>>         does not decrease (relative to the total) when I increase the
>>>         number of nodes. What am I doing wrong? Is this what you expect?
>>>         Are consolidation-type problems set-up dominated in PyLith?
>>>
>>> Regards,
>>> Alberto.
>>>
>>>         P.S.: these numbers are for a case without an explicit fault.
>>>
>>>
>>>         2016-11-22 21:37 GMT+01:00 alberto cominelli <alberto.cominelli at gmail.com>:
>>>
>>>             Brad,
>>>             I definitely agree with you on the issue we have with the
>>>             coarsest mesh. The error in any norm will be gigantic. The
>>>             point is to run a model fine enough that I can assume it is
>>>             the "truth". My fault is 20 degrees from vertical - my grid
>>>             should be fairly smooth. Now I am running a very fine case,
>>>             25x25x25 m^3 cells, which should be the best I can probably
>>>             run, the most accurate solution possible. This means 4199000
>>>             elements and 4316895 nodes. Unfortunately the case uses
>>>             SimpleDB to initialise the stress (with fluid pressure
>>>             included), and it is still running:
>>>
>>>             54172 cominelli  20   0 7393m 6.7g  11m R 94.4 10.7   5989:47 mpinemesis
>>>
>>>             Not sure it will survive for long. If I scale linearly with
>>>             respect to the node number from a previous case that I ran
>>>             with 52000 nodes in 3000 sec, this should take 64 hours
>>>             serial, but it seems to scale worse than linearly.
>>>
>>>             I will check if I can use SimpleGridDB. Apologies if I bother
>>>             you again on this, but could you confirm that a grid like the
>>>             one in the pictures below can be filled with cellwise constant
>>>             initial stresses using SimpleGridDB?
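>>>
>>>             My understanding is that I would list the cell-center
>>>             coordinates along each axis and give one stress tuple per
>>>             cell center, relying on query_type = nearest to keep the
>>>             values constant within each cell - something like this
>>>             (spacing purely illustrative):
>>>
>>>             // x coordinates = cell centers (need not be uniform)
>>>             12.5  37.5  62.5
>>>             // y coordinates = cell centers
>>>             12.5  37.5
>>>             // z coordinates = cell centers
>>>             -12.5  -37.5
>>>             // one line per cell center:
>>>             // x y z stress-xx stress-yy stress-zz stress-xy stress-yz stress-xz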
>>>
>>>             Regards,
>>>             Alberto.
>>>
>>>
>>>             On 22/Nov/2016 19:19, "Brad Aagaard" <baagaard at usgs.gov> wrote:
>>> >
>>> > Alberto,
>>> >
>>> > Cells with dimensions 2km x 2km x 0.1km have a very poor aspect ratio.
>>> > They will certainly degrade the rate of convergence and affect the
>>> > accuracy of the results. Skewing cells so they conform to the fault
>>> > should be okay as long as the sides have about equal dimensions and the
>>> > angles are greater than 45 degrees. Aspect ratio and/or condition number
>>> > mesh quality metrics can be used in both cases to assess the quality of
>>> > your mesh.
>>> >
>>> > Regards,
>>> > Brad
>>> >
>>> >
>>> >
>>> > On 11/21/2016 02:44 PM, alberto cominelli wrote:
>>> >>
>>> >> Dear Brad,
>>> >>
>>> >> My convergence study begins with very coarse cells - say 2000 x 2000 x
>>> >> 100 m^3 - and then I refine the grid by a 2x2x2 ratio at each stage.
>>> >> The finest grid I can run should provide the reference solution, "the
>>> >> truth". Actually, simulation time is becoming prohibitive for
>>> >> 100x100x25 m^3 cells.
>>> >>
>>> >> This reflects the logic of subsurface models, where cells are usually
>>> >> thin (e.g. 100m x 100m x 5m). Do you think I should use "regular"
>>> >> cubic cells? As regards the elapsed time spent in the set-up phase,
>>> >> does your suggestion also apply in the case where I am working with
>>> >> skewed cells like the ones in my pdf, which ideally should mimic those
>>> >> in the picture?
>>> >>
>>> >> Regards,
>>> >>
>>> >> Alberto.
>>> >>
>>> >> [Embedded image]
>>> >>
>>> >> 2016-11-21 23:28 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
>>> >>
>>> >>
>>> >>     Alberto,
>>> >>
>>> >>     If your cells have aspect ratios as shown in the figure, then this
>>> >>     will certainly degrade the convergence. The aspect ratio and
>>> >>     condition number metrics should be close to 1.0. In CUBIT/Trelis we
>>> >>     try to get condition numbers down to less than 2.0.
>>> >>
>>> >>     Brad
>>> >>
>>> >>
>>> >> On 11/21/2016 02:20 PM, alberto cominelli wrote:
>>> >>
>>> >>         Thank you so much Brad.
>>> >>         I will try tomorrow.
>>> >>         I wonder if your suggestions also apply to a skewed Cartesian
>>> >>         grid. Actually, my grid is skewed to follow a sloping fault,
>>> >>         hence the cell cross section parallel to y is not a square. I
>>> >>         am attaching a pdf to show a (poor) view of the grid and some
>>> >>         vtk files to explain my geometry better.
>>> >>         Regards,
>>> >>         Alberto.
>>> >>
>>> >>
>>> >>         2016-11-21 21:58 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
>>> >>
>>> >> Alberto,
>>> >>
>>> >>             The log shows that the Setup Stage is mostly spent in
>>> >>             "ElIm init", which is ElasticityImplicit.initialize(). This
>>> >>             is most likely associated with setting the initial stresses
>>> >>             using a SimpleDB object.
>>> >>
>>> >>             The SimpleGridDB provides much faster interpolation than
>>> >>             SimpleDB for a logically Cartesian grid because it can find
>>> >>             the relevant points without a global search. The points
>>> >>             need to conform to a grid, but the x, y, and z coordinates
>>> >>             do not have to be spaced uniformly.
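>>> >>
>>> >>             For instance, an axis line in the file could look like
>>> >>             this (numbers purely illustrative):
>>> >>
>>> >>             // x coordinates
>>> >>             0.0  50.0  150.0  400.0  1000.0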
>>> >>
>>> >>             See Appendix C.3 of the manual for an example of the
>>> >>             SimpleGridDB.
>>> >>
>>> >> Regards,
>>> >> Brad
>>> >>
>>> >>
>>> >>             On 11/21/2016 12:34 PM, alberto cominelli wrote:
>>> >>
>>> >>             Brad,
>>> >>             I have also included my cfg files.
>>> >>             Regards,
>>> >>             Alberto.
>>> >>
>>> >>             2016-11-21 19:49 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
>>> >>
>>> >>             Alberto,
>>> >>
>>> >>             Please send the entire output of the PETSc log (everything
>>> >>             after "PETSc Performance Summary") for a representative
>>> >>             simulation. It is usually easiest to simply send the entire
>>> >>             output of stdout (gzip it if necessary to reduce size). The
>>> >>             individual event logging provides more specifics than the
>>> >>             summary of stages. We add custom events in the PETSc
>>> >>             logging for many of the PyLith routines.
>>> >>
>>> >>             If you need help understanding the format of the summary,
>>> >>             then see the Profiling chapter of the PETSc manual:
>>> >>             http://www.mcs.anl.gov/petsc/petsc-current/docs/manual.pdf
>>> >>
>>> >>
>>> >> Regards,
>>> >> Brad
>>> >>
>>> >>
>>> >>
>>> >>
>>> >>
>>> >>             On 11/19/2016 08:09 AM, alberto cominelli wrote:
>>> >>
>>> >>             Brad,
>>> >>             I followed your suggestion and I also modified the code a
>>> >>             bit to track the time spent in the integrator:
>>> >>             # in lib64/python2.7/site-packages/pylith/problems/Formulation.py,
>>> >>             # with "import time" added at the top of the file:
>>> >>             start_time = time.time()
>>> >>             integrator.initialize(totalTime, numTimeSteps, normalizer)
>>> >>             msg = ("--- %s seconds in integrator.initialize ---"
>>> >>                    % (time.time() - start_time))
>>> >>             self._info.log(msg)
>>> >>             Then I ran a simple case with 5733 nodes / 4800 elements;
>>> >>             PyLith took 37 seconds, with 26.5418641567 seconds in
>>> >>             integrator.initialize.
>>> >>             If I look at the PETSc log at the end, I get this:
>>> >>             Summary of Stages:  ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
>>> >>                                   Avg     %Total     Avg     %Total   counts   %Total     Avg        %Total    counts   %Total
>>> >>              0:      Main Stage: 1.3829e-01   0.4%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00     0.0%  0.000e+00   0.0%
>>> >>              1:         Meshing: 1.5950e-01   0.4%  1.7262e+04   0.0%  0.000e+00   0.0%  3.874e-02     0.0%  8.000e+00 100.0%
>>> >>              2:           Setup: 2.7486e+01  77.3%  2.7133e+07   0.2%  8.000e+00   1.9%  2.181e+01     0.0%  0.000e+00   0.0%
>>> >>              3: Reform Jacobian: 2.8208e-01   0.8%  4.1906e+08   3.5%  0.000e+00   0.0%  0.000e+00     0.0%  0.000e+00   0.0%
>>> >>              4: Reform Residual: 9.8572e-02   0.3%  6.1111e+07   0.5%  8.000e+00   1.9%  1.967e+03     3.1%  0.000e+00   0.0%
>>> >>              5:           Solve: 5.5077e+00  15.5%  1.1537e+10  95.1%  3.970e+02  96.1%  6.197e+04    96.9%  0.000e+00   0.0%
>>> >>              6:         Prestep: 5.7586e-02   0.2%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00     0.0%  0.000e+00   0.0%
>>> >>              7:            Step: 8.9577e-02   0.3%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00     0.0%  0.000e+00   0.0%
>>> >>              8:        Poststep: 1.6417e+00   4.6%  8.2252e+07   0.7%  0.000e+00   0.0%  0.000e+00     0.0%  0.000e+00   0.0%
>>> >>              9:        Finalize: 7.7139e-02   0.2%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00     0.0%  0.000e+00   0.0%
>>> >>
>>> >>             As far as I understand, 27 seconds are spent in setup,
>>> >>             which I suppose includes the integrators.
>>> >>             I simplified the problem using a linear interpolation
>>> >>             between two points to define the initial stress state, but
>>> >>             the setup phase still takes 80% of the time.
>>> >>             Is this timing reasonable?
>>> >>             I can send you my cfg files if you like.
>>> >>             Regards,
>>> >>             Alberto.
>>> >>
>>> >>             P.S.: I noticed that the PETSc log makes my little
>>> >>             modification to the python scripts useless... I will
>>> >>             remove it.
>>> >>
>>> >>
>>> >>             2016-11-19 0:04 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
>>> >>
>>> >>
>>> >>
>>> >>             Alberto,
>>> >>
>>> >>             The PETSc log summary provides important performance
>>> >>             information.
>>> >>
>>> >>             Use these settings to see what is happening in the solver
>>> >>             and the performance (as used in
>>> >>             examples/3d/hex8/pylithapp.cfg):
>>> >>
>>> >>             [pylithapp.petsc]
>>> >>             ksp_monitor = true
>>> >>             ksp_view = true
>>> >>             snes_monitor = true
>>> >>             snes_view = true
>>> >>             log_view = true
>>> >>
>>> >>             Regards,
>>> >>             Brad
>>> >>
>>> >>
>>> >>
>>> >>             On 11/18/16 2:24 PM, alberto cominelli wrote:
>>> >>
>>> >>             Dear All,
>>> >>
>>> >>             I am using PyLith to make a convergence study on a 12-core
>>> >>             Xeon box with Intel(R) Xeon(R) E5-2643 v2 CPUs running at
>>> >>             3.50GHz and 64 GB of memory.
>>> >>             The problem at hand is a 3D domain consisting of two
>>> >>             layers, the upper one dry, with 25000 kg/m3 density, and
>>> >>             the lower one water saturated, with 20% porosity. Apart
>>> >>             from the differences in saturation conditions, the rock is
>>> >>             characterised as an elastic, isotropic and homogeneous
>>> >>             material.
>>> >>             The domain is discretised by means of hexahedral elements
>>> >>             using a tartan-type grid developed around a fault, a 20%
>>> >>             sloping fault. The fault rheology is very simple, a
>>> >>             friction model with a 0.6 friction coefficient.
>>> >>
>>> >>             To simulate a consolidation problem, fluid pressure is
>>> >>             included in the model using initial stress on a cell
>>> >>             basis, assuming that pressure is constant inside each
>>> >>             cell. This means I input an initial_stress.spatialdb file
>>> >>             containing data for ncells * 8 quadrature points.
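>>> >>
>>> >>             The file follows the SimpleDB format; its header is
>>> >>             roughly like this (a sketch, units illustrative):
>>> >>
>>> >>             #SPATIAL.ascii 1
>>> >>             SimpleDB {
>>> >>               num-values = 6
>>> >>               value-names = stress-xx stress-yy stress-zz stress-xy stress-yz stress-xz
>>> >>               value-units = Pa Pa Pa Pa Pa Pa
>>> >>               // num-locs = ncells * 8
>>> >>               num-locs = 389040
>>> >>               data-dim = 3
>>> >>               space-dim = 3
>>> >>               cs-data = cartesian {
>>> >>                 to-meters = 1.0
>>> >>                 space-dim = 3
>>> >>               }
>>> >>             }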
>>> >>             I am a bit surprised by the elapsed time values I get
>>> >>             along my convergence study. For instance, one case
>>> >>             consists of 52731 nodes and 48630 elements. To properly
>>> >>             initialise the model I give initial stress values at
>>> >>             386880 points. I make two steps in 48 minutes, with most
>>> >>             of the time spent in integrators - as far as I understand.
>>> >>
>>> >>             With "integrators" I mean what is labelled by these lines
>>> >>             in the PyLith output:
>>> >>              -- Initializing integrators.
>>> >>             /home/comi/Pylith2.1.3/lib64/python2.7/site-packages/pylith/problems/Formulation.py:474:_initialize
>>> >>             I guess this step means building the residuals and
>>> >>             stiffness matrices, but I am not sure. Notably, in the
>>> >>             second step I do not change anything, and I then get very
>>> >>             few linear/nonlinear iterations in that latter step.
>>> >>
>>> >>             I wonder if this time is fine according to your
>>> >>             experience, and if it is worth going parallel to improve
>>> >>             computational efficiency. I am willing to build much more
>>> >>             complex cases, up to some millions of nodes, and I wonder
>>> >>             how far I can go using only one core.
>>> >>
>>> >>             I am attaching a snapshot of one simulation log (not for
>>> >>             the entire case) in case it may help.
>>> >>             Regards,
>>> >>             Alberto.
>>> >>
>>> >>
>>> >>
>>> >>
>>> >>
>>> >> _______________________________________________
>>> >> CIG-SHORT mailing list
>>> >> CIG-SHORT at geodynamics.org
>>
>>
>
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.geodynamics.org/pipermail/cig-short/attachments/20161124/063b272f/attachment-0001.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: matprops.spatialdb.gz
Type: application/x-gzip
Size: 196829 bytes
Desc: not available
URL: <http://lists.geodynamics.org/pipermail/cig-short/attachments/20161124/063b272f/attachment-0001.bin>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: out1.log
Type: application/octet-stream
Size: 10919 bytes
Desc: not available
URL: <http://lists.geodynamics.org/pipermail/cig-short/attachments/20161124/063b272f/attachment-0003.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: pylithapp.cfg
Type: application/octet-stream
Size: 4096 bytes
Desc: not available
URL: <http://lists.geodynamics.org/pipermail/cig-short/attachments/20161124/063b272f/attachment-0004.obj>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: TTG_1000x1000x100_DP100_LF_mu_06_cross.cfg
Type: application/octet-stream
Size: 9340 bytes
Desc: not available
URL: <http://lists.geodynamics.org/pipermail/cig-short/attachments/20161124/063b272f/attachment-0005.obj>