[CIG-SHORT] elapsed time concerns
alberto cominelli
alberto.cominelli at gmail.com
Mon Nov 21 14:44:08 PST 2016
Dear Brad,
my convergence study begins with very coarse cells - say 2000 x 2000 x
100 m^3 - and then I refine the grid by a 2x2x2 ratio at each stage.
The finest possible grid I can run should provide the reference
solution, "the truth". At the moment, simulation time is becoming
prohibitive for 100 x 100 x 25 m^3 cells.
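As a side note on cost: each 2x2x2 refinement multiplies the cell count by 8, so the solve cost grows at least that fast per stage. A back-of-the-envelope sketch (plain Python; the starting cell count below is a placeholder, not a number from the thread):

```python
def cells_after(n0, stages):
    """Cell count after `stages` rounds of uniform 2x2x2 refinement.

    Every cell is split into 8 children per stage, so the count grows
    by a factor of 8 per refinement level.
    """
    return n0 * 8 ** stages

n0 = 1000  # hypothetical coarse-grid cell count, for illustration only
for k in range(5):
    print("stage", k, "->", cells_after(n0, k), "cells")
```

Four stages already mean a ~4000x larger problem than the coarse grid, which is consistent with run times becoming prohibitive well before the reference resolution.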
This should reflect the logic of subsurface models, where cells are
usually thin (e.g. 100 m x 100 m x 5 m). Do you think I should use
"regular" cubic cells? As regards the elapsed time spent in the set-up
phase, does your suggestion also apply when I am working with skewed
cells like the ones in my pdf, which ideally should mimic those in the
picture?
Regards,
Alberto.
[image: embedded image 1]
2016-11-21 23:28 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
> Alberto,
>
> If your cells have aspect ratios as shown in the figure, then this will
> certainly degrade the convergence. The aspect ratios and condition number
> metrics should be close to 1.0. In CUBIT/Trelis we try to get condition
> numbers down to less than 2.0.
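A quick way to screen cells before worrying about the full CUBIT/Trelis metrics is a simplified aspect-ratio proxy: the ratio of a hex's longest edge to its shortest edge (a perfect cube gives 1.0). This is not the exact formula CUBIT/Trelis uses for its aspect-ratio or condition-number metrics; it is only a rough check, sketched here in plain Python:

```python
import math

# Edges of a hexahedron with corners ordered as: bottom face 0-1-2-3
# (counterclockwise), top face 4-5-6-7 directly above.
HEX_EDGES = [(0, 1), (1, 2), (2, 3), (3, 0),   # bottom face
             (4, 5), (5, 6), (6, 7), (7, 4),   # top face
             (0, 4), (1, 5), (2, 6), (3, 7)]   # vertical edges

def edge_aspect_ratio(corners):
    """Longest edge / shortest edge of a hex cell (1.0 for a cube)."""
    lengths = [math.dist(corners[i], corners[j]) for i, j in HEX_EDGES]
    return max(lengths) / min(lengths)

# A 100 m x 100 m x 5 m "thin" cell, as in typical subsurface grids:
thin = [(0, 0, 0), (100, 0, 0), (100, 100, 0), (0, 100, 0),
        (0, 0, 5), (100, 0, 5), (100, 100, 5), (0, 100, 5)]
print(edge_aspect_ratio(thin))  # 20.0 -- far from the desired ~1.0
```

The thin reservoir-style cell scores 20, illustrating why grids that mimic subsurface models can degrade convergence relative to near-cubic cells.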
>
> Brad
>
>
> On 11/21/2016 02:20 PM, alberto cominelli wrote:
>
>> Thank you so much, Brad.
>> I will try tomorrow.
>> I wonder if your suggestions also apply to a skewed Cartesian grid.
>> Actually my grid is skewed to follow a sloping fault, hence the cell
>> cross section parallel to y is not a square. I am attaching a pdf to
>> show a (poor) view of the grid and some vtk files to better explain my
>> geometry.
>> regards,
>> Alberto.
>>
>>
>> 2016-11-21 21:58 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
>>
>> Alberto,
>>
>> The log shows that the Setup Stage is mostly spent in "ElIm init",
>> which is ElasticityImplicit.initialize(). This is most likely
>> associated with setting the initial stresses using a SimpleDB object.
>>
>> The SimpleGridDB provides much faster interpolation than
>> SimpleDB for a logically Cartesian grid because it can find the
>> relevant points without a global search. The points need to conform
>> to a grid, but the x, y, and z coordinates do not have to be spaced
>> uniformly.
>>
>> See Appendix C.3 of the manual for an example of the SimpleGridDB.
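>>
The speedup comes from replacing a global point search with independent per-axis lookups. A minimal sketch of that idea in plain Python with `bisect` (this is an illustration of the concept, not PyLith's actual implementation):

```python
import bisect

# On a logically Cartesian grid, each coordinate axis can be searched
# independently by bisection, so locating the grid node nearest a query
# point is O(log n) per axis, instead of the global search a
# scattered-point database needs. Axis coordinates need not be
# uniformly spaced, only sorted.

def nearest_index(coords, x):
    """Index of the value in sorted `coords` closest to x."""
    i = bisect.bisect_left(coords, x)
    if i == 0:
        return 0
    if i == len(coords):
        return len(coords) - 1
    # Pick whichever neighbor is closer to x.
    return i - 1 if x - coords[i - 1] <= coords[i] - x else i

def locate(xs, ys, zs, point):
    """(i, j, k) indices of the grid node nearest to `point`."""
    return (nearest_index(xs, point[0]),
            nearest_index(ys, point[1]),
            nearest_index(zs, point[2]))

xs = [0.0, 50.0, 200.0, 1000.0]   # non-uniform spacing is fine
ys = [0.0, 100.0, 300.0]
zs = [0.0, 5.0, 25.0]
print(locate(xs, ys, zs, (60.0, 90.0, 20.0)))  # (1, 1, 2)
```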
>>
>> Regards,
>> Brad
>>
>>
>> On 11/21/2016 12:34 PM, alberto cominelli wrote:
>>
>> Brad,
>> I have also included my cfg files.
>> regards,
>> Alberto.
>>
>> 2016-11-21 19:49 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
>>
>> Alberto,
>>
>> Please send the entire output of the PETSc log (everything after
>> "PETSc Performance Summary") for a representative simulation. It is
>> usually easiest to simply send the entire output of stdout (gzip it
>> if necessary to reduce size). The individual event logging provides
>> more specifics than the summary of stages. We add custom events in
>> the PETSc logging for many of the PyLith routines.
>>
>> If you need help understanding the format of the summary, then see
>> the Profiling chapter of the PETSc manual:
>> http://www.mcs.anl.gov/petsc/petsc-current/docs/manual.pdf
>>
>>
>> Regards,
>> Brad
>>
>>
>>
>>
>>
>> On 11/19/2016 08:09 AM, alberto cominelli wrote:
>>
>> Brad,
>> I followed your suggestion and I also modified the code a bit to
>> track the time spent in the integrator:
>>
>>     start_time = time.time()
>>     integrator.initialize(totalTime, numTimeSteps, normalizer)
>>     str = "--- %s seconds in integrator.initialize ---" % (time.time() - start_time)
>>     self._info.log(str)
>>
>> (with "import time" added at the beginning of
>> lib64/python2.7/site-packages/pylith/problems/Formulation.py)
>> Then I ran a simple case with 5733 nodes / 4800 elements; pylith
>> took 37 seconds to run, with 26.5418641567 seconds spent in
>> integrator.initialize.
>> If I look at the PETSc log at the end, I get this:
>> Summary of Stages:  ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
>>                       Avg     %Total    Avg      %Total   counts   %Total     Avg        %Total    counts   %Total
>>  0:      Main Stage: 1.3829e-01   0.4%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>  1:         Meshing: 1.5950e-01   0.4%  1.7262e+04   0.0%  0.000e+00   0.0%  3.874e-02      0.0%  8.000e+00 100.0%
>>  2:           Setup: 2.7486e+01  77.3%  2.7133e+07   0.2%  8.000e+00   1.9%  2.181e+01      0.0%  0.000e+00   0.0%
>>  3: Reform Jacobian: 2.8208e-01   0.8%  4.1906e+08   3.5%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>  4: Reform Residual: 9.8572e-02   0.3%  6.1111e+07   0.5%  8.000e+00   1.9%  1.967e+03      3.1%  0.000e+00   0.0%
>>  5:           Solve: 5.5077e+00  15.5%  1.1537e+10  95.1%  3.970e+02  96.1%  6.197e+04     96.9%  0.000e+00   0.0%
>>  6:         Prestep: 5.7586e-02   0.2%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>  7:            Step: 8.9577e-02   0.3%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>  8:        Poststep: 1.6417e+00   4.6%  8.2252e+07   0.7%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>  9:        Finalize: 7.7139e-02   0.2%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
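As a sanity check, the Setup fraction can be recomputed from the stage times in the summary above (plain Python, values copied from the log):

```python
# Stage times (seconds), copied from the PETSc "Summary of Stages".
stages = {
    "Main Stage": 1.3829e-01, "Meshing": 1.5950e-01, "Setup": 2.7486e+01,
    "Reform Jacobian": 2.8208e-01, "Reform Residual": 9.8572e-02,
    "Solve": 5.5077e+00, "Prestep": 5.7586e-02, "Step": 8.9577e-02,
    "Poststep": 1.6417e+00, "Finalize": 7.7139e-02,
}
total = sum(stages.values())
frac = stages["Setup"] / total
print(f"Setup: {frac:.1%} of {total:.1f} s total")
```

The recomputed fraction agrees with the 77.3% printed in the log, confirming that Setup dominates the run.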
>>
>> As far as I understand, 27 seconds are spent in Setup, which I
>> suppose includes the integrators.
>> I simplified the problem, using a linear interpolation between two
>> points to define the initial stress state, but the setup phase still
>> takes 80% of the time.
>> Is this timing reasonable?
>> I can send you my cfg files if you like.
>> Regards,
>> Alberto.
>>
>> P.S.: I noticed that the PETSc log makes my little modification to
>> the python scripts useless; I will remove it.
>>
>>
>> 2016-11-19 0:04 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
>>
>>
>>
>> Alberto,
>>
>> The PETSc log summary provides important performance information.
>>
>> Use these settings to see what is happening in the solver and the
>> performance (as used in examples/3d/hex8/pylithapp.cfg):
>>
>> [pylithapp.petsc]
>> ksp_monitor = true
>> ksp_view = true
>> snes_monitor = true
>> snes_view = true
>> log_view = true
>>
>> Regards,
>> Brad
>>
>>
>>
>> On 11/18/16 2:24 PM, alberto cominelli wrote:
>>
>> Dear All,
>>
>> I am using pylith to make a convergence study on a 12-core Xeon box
>> with Intel(R) Xeon(R) E5-2643 v2 CPUs running at 3.50GHz and 64 GB
>> of memory.
>> The problem at hand is a 3D domain consisting of two layers: the
>> upper one dry, with 25000 kg/m3 density, and the lower one water
>> saturated with 20% porosity. Apart from the differences in
>> saturation conditions, the rock is characterised as an elastic,
>> isotropic and homogeneous material.
>> The domain is discretised by means of hexahedral elements using a
>> tartan-type grid developed around a fault, a 20% sloping fault.
>> Fault rheology is very simple, a friction model with a 0.6 friction
>> coefficient.
>>
>> To simulate a consolidation problem, fluid pressure is included in
>> the model using initial stress on a cell basis, assuming that
>> pressure is constant inside each cell.
>> This means I input an initial_stress.spatialdb file containing data
>> for ncells * 8 quadrature points.
>> I am a bit surprised by the elapsed time values I get along my
>> convergence study.
>> For instance, one case consists of 52731 nodes and 48630 elements.
>> To properly initialise the model I give initial stress values at
>> 386880 points. I make two steps in 48 minutes, with most of the
>> time spent in the integrators - as far as I understand.
>>
>> With "integrators" I mean what is labelled by these lines in the
>> pylith output:
>> -- Initializing integrators.
>> /home/comi/Pylith2.1.3/lib64/python2.7/site-packages/pylith/problems/Formulation.py:474:_initialize
>> I guess this step means building the residuals and stiffness
>> matrices, but I am not sure about it. Notably, in the second step I
>> do not change anything, and then I get very few linear/nonlinear
>> iterations in the latter step.
>>
>> I wonder if this time is fine according to your experience, and if
>> it is worth going parallel to improve computational efficiency. I
>> am willing to make much more complex cases, up to some millions of
>> nodes, and I wonder how far I can go using only one core.
>>
>> I am attaching a snapshot of one simulation log (not for the entire
>> case) in case it may help.
>> Regards,
>> Alberto.
>>
>>
>>
>>
>> _______________________________________________
>> CIG-SHORT mailing list
>> CIG-SHORT at geodynamics.org
>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-short
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.geodynamics.org/pipermail/cig-short/attachments/20161121/a85ebfbc/attachment-0001.html>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: image.png
Type: image/png
Size: 5272 bytes
Desc: not available
URL: <http://lists.geodynamics.org/pipermail/cig-short/attachments/20161121/a85ebfbc/attachment-0001.png>
More information about the CIG-SHORT mailing list