[CIG-SHORT] elapsed time concerns
alberto cominelli
alberto.cominelli at gmail.com
Mon Nov 21 12:34:24 PST 2016
Brad,
I have also included my cfg files.
regards,
Alberto.
2016-11-21 19:49 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
> Alberto,
>
> Please send the entire output of the PETSc log (everything after "PETSc
> Performance Summary") for a representative simulation. It is usually
> easiest to simply send the entire output of stdout (gzip it if necessary to
> reduce size). The individual event logging provides more specifics than the
> summary of stages. We add custom events in the PETSc logging for many of
> the PyLith routines.
>
> If you need help understanding the format of the summary, see the
> Profiling chapter of the PETSc manual:
> http://www.mcs.anl.gov/petsc/petsc-current/docs/manual.pdf
>
> Regards,
> Brad
>
>
>
>
>
> On 11/19/2016 08:09 AM, alberto cominelli wrote:
>
>> Brad,
>> I followed your suggestion, and I also modified the code a bit to
>> track the time spent in the integrators:
>>
>> import time  # added at the top of Formulation.py
>>
>> start_time = time.time()
>> integrator.initialize(totalTime, numTimeSteps, normalizer)
>> # 'msg' instead of 'str', so the builtin str() is not shadowed.
>> msg = "--- %s seconds in integrator.initialize ---" % (time.time() - start_time)
>> self._info.log(msg)
>>
>> (in lib64/python2.7/site-packages/pylith/problems/Formulation.py)
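>>
>> (In case it is useful to others, the same one-off timing can be
>> wrapped in a small context manager. This is only a sketch of my own,
>> not part of PyLith, and "timed" is a hypothetical helper name:
>>
>> import time
>> from contextlib import contextmanager
>>
>> @contextmanager
>> def timed(label, log):
>>     # Log the wall-clock time spent inside the with-block.
>>     start = time.time()
>>     try:
>>         yield
>>     finally:
>>         log("--- %.3f seconds in %s ---" % (time.time() - start, label))
>>
>> # Hypothetical usage inside Formulation._initialize:
>> # with timed("integrator.initialize", self._info.log):
>> #     integrator.initialize(totalTime, numTimeSteps, normalizer)
>> )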
>> Then I ran a simple case with 5733 nodes / 4800 elements; PyLith took
>> 37 seconds to run, with 26.5418641567 seconds spent in
>> integrator.initialize.
>> If I look at the PETSc log at the end, I get this:
>> Summary of Stages:   ----- Time ------  ----- Flops -----  --- Messages ---  -- Message Lengths --  -- Reductions --
>>                         Avg     %Total     Avg     %Total    counts  %Total     Avg        %Total    counts  %Total
>>  0:      Main Stage: 1.3829e-01   0.4%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>  1:         Meshing: 1.5950e-01   0.4%  1.7262e+04   0.0%  0.000e+00   0.0%  3.874e-02      0.0%  8.000e+00 100.0%
>>  2:           Setup: 2.7486e+01  77.3%  2.7133e+07   0.2%  8.000e+00   1.9%  2.181e+01      0.0%  0.000e+00   0.0%
>>  3: Reform Jacobian: 2.8208e-01   0.8%  4.1906e+08   3.5%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>  4: Reform Residual: 9.8572e-02   0.3%  6.1111e+07   0.5%  8.000e+00   1.9%  1.967e+03      3.1%  0.000e+00   0.0%
>>  5:           Solve: 5.5077e+00  15.5%  1.1537e+10  95.1%  3.970e+02  96.1%  6.197e+04     96.9%  0.000e+00   0.0%
>>  6:         Prestep: 5.7586e-02   0.2%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>  7:            Step: 8.9577e-02   0.3%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>  8:        Poststep: 1.6417e+00   4.6%  8.2252e+07   0.7%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>  9:        Finalize: 7.7139e-02   0.2%  0.0000e+00   0.0%  0.000e+00   0.0%  0.000e+00      0.0%  0.000e+00   0.0%
>>
>> As far as I understand, 27 seconds are spent in Setup, which I
>> suppose includes initializing the integrators.
>> I simplified the problem by using a linear interpolation between two
>> points to define the initial stress state, but the Setup stage still
>> takes almost 80% of the time.
>> Is this timing reasonable?
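>>
>> (For reference, the kind of two-point database I mean can be written
>> with a few lines of Python. This is only a sketch: the script name,
>> depths, and stress values are placeholders, not my actual model,
>> although the header fields follow the spatialdata SimpleDB format as
>> far as I understand it:
>>
>> # write_stressdb.py: write a two-point SimpleDB so the initial
>> # stresses vary linearly with depth (data-dim = 1).
>> HEADER = """#SPATIAL.ascii 1
>> SimpleDB {
>>   num-values = 6
>>   value-names = stress-xx stress-yy stress-zz stress-xy stress-yz stress-xz
>>   value-units = Pa Pa Pa Pa Pa Pa
>>   num-locs = 2
>>   data-dim = 1
>>   space-dim = 3
>>   cs-data = cartesian {
>>     to-meters = 1.0
>>     space-dim = 3
>>   }
>> }
>> """
>> with open("initial_stress.spatialdb", "w") as fout:
>>     fout.write(HEADER)
>>     # (x, y, z) followed by the six stress components (Pa).
>>     fout.write("0.0 0.0     0.0  -1.0e+06 -1.0e+06 -1.0e+06 0.0 0.0 0.0\n")
>>     fout.write("0.0 0.0 -2000.0  -5.0e+07 -5.0e+07 -5.0e+07 0.0 0.0 0.0\n")
>>
>> and in the cfg the database is attached with query_type = linear,
>> following the usual PyLith SimpleDB pattern:
>>
>> db_initial_stress = spatialdata.spatialdb.SimpleDB
>> db_initial_stress.label = Initial stress
>> db_initial_stress.iohandler.filename = initial_stress.spatialdb
>> db_initial_stress.query_type = linear
>> )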
>> I can send you my cfg files if you like.
>> Regards,
>> Alberto.
>>
>> P.S. I noticed that the PETSc log makes my little modification to the
>> Python scripts redundant, so I will remove it.
>>
>>
>> 2016-11-19 0:04 GMT+01:00 Brad Aagaard <baagaard at usgs.gov>:
>>
>>
>> Alberto,
>>
>> The PETSc log summary provides important performance information.
>>
>> Use these settings to see what is happening in the solver and the
>> performance (as used in examples/3d/hex8/pylithapp.cfg):
>>
>>
>> [pylithapp.petsc]
>> ksp_monitor = true
>> ksp_view = true
>> snes_monitor = true
>> snes_view = true
>> log_view = true
>>
>> Regards,
>> Brad
>>
>>
>>
>> On 11/18/16 2:24 PM, alberto cominelli wrote:
>>
>> Dear All,
>>
>> I am using PyLith for a convergence study on a 12-core Xeon box
>> with Intel(R) Xeon(R) E5-2643 v2 CPUs running at 3.50 GHz and
>> 64 GB of memory.
>> The problem at hand is a 3D domain consisting of two layers: the
>> upper one dry, with a density of 25000 kg/m3, and the lower one
>> water-saturated with 20% porosity. Aside from the difference in
>> saturation conditions, the rock is characterised as an elastic,
>> isotropic, homogeneous material.
>> The domain is discretised with hexahedral elements on a tartan-type
>> grid developed around a fault with a 20% slope. The fault rheology
>> is very simple: a friction model with a friction coefficient of 0.6.
>>
>> To simulate a consolidation problem, fluid pressure is included in
>> the model via initial stresses on a cell basis, assuming that the
>> pressure is constant inside each cell.
>> This means I input an initial_stress.spatialdb file containing data
>> for ncells * 8 quadrature points.
>> I am a bit surprised by the elapsed times I get along my convergence
>> study.
>> For instance, one case consists of 52731 nodes and 48630 elements.
>> To properly initialise the model I give initial stress values at
>> 386880 points. I make two steps in 48 minutes, with most of the time
>> spent in the integrators, as far as I understand.
>>
>> With "integrators" I mean what is labelled by these lines in the
>> PyLith output:
>>
>> -- Initializing integrators.
>> >> /home/comi/Pylith2.1.3/lib64/python2.7/site-packages/pylith/problems/Formulation.py:474:_initialize
>> I guess this step means building the residuals and stiffness
>> matrices, but I am not sure. Notably, in the second step I do not
>> change anything, and so I get very few linear/nonlinear iterations
>> in that step.
>>
>> I wonder if this time is reasonable in your experience, and whether
>> it is worth going parallel to improve computational efficiency. I
>> intend to build much more complex cases, up to some millions of
>> nodes, and I wonder how far I can go using only one core.
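>> (If parallel runs do turn out to be worthwhile: as far as I
>> understand from the manual, PyLith can launch MPI jobs itself, e.g.
>> "pylith mysim.cfg --nodes=4" to run on 4 processes, where mysim.cfg
>> is a placeholder for the actual cfg file.)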
>> Regards,
>> Alberto.
>>
>> I am attaching an excerpt of one simulation log (not the entire
>> case) in case it may help.
>> Regards,
>> Alberto.
>>
>>
>>
>>
>>
>>
>>
>>
>>
> _______________________________________________
> CIG-SHORT mailing list
> CIG-SHORT at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-short
>
-------------- next part --------------
A non-text attachment was scrubbed...
Name: out-in.tar.gz
Type: application/x-gzip
Size: 99371 bytes
Desc: not available
URL: <http://lists.geodynamics.org/pipermail/cig-short/attachments/20161121/da571e85/attachment-0001.bin>