[CIG-SHORT] Change in dt means new Jacobian

Birendra jha bjha7333 at yahoo.com
Sun Sep 16 11:36:29 PDT 2012


Hi Matt

Attached is the gdb backtrace of the crash when I use the LU preconditioner.

Thanks 
Birendra

--- On Fri, 9/14/12, Matthew Knepley <knepley at mcs.anl.gov> wrote:

From: Matthew Knepley <knepley at mcs.anl.gov>
Subject: Re: [CIG-SHORT] Change in dt means new Jacobian
To: "Birendra jha" <bjha7333 at yahoo.com>
Cc: "Brad Aagaard" <baagaard at usgs.gov>, cig-short at geodynamics.org
Date: Friday, September 14, 2012, 6:40 PM

On Fri, Sep 14, 2012 at 1:53 AM, Birendra jha <bjha7333 at yahoo.com> wrote:

Dear developers



I simplified my problem setup to see if the number of KSP iterations goes down. There are 46,552 nodes and 42,525 hex elements in a box. In this very simple setup, both the ASM and ML preconditioners take ~90 iterations to converge.


Attached are the cfg files and the stdout from the LU and ML preconditioners. The ASM stdout is not attached.

Something in your setup is causing the problem to be nearly or actually singular. It's either bad elements in
the mesh or an inconsistent boundary condition, I think.
Why does the run with LU crash? I just used the following two lines to use LU:

[pylithapp.petsc]
pc_type = lu

Is it incorrect?
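For comparison, a slightly fuller [pylithapp.petsc] block with the usual KSP diagnostics switched on might look like the sketch below; the monitoring lines are additions beyond the two lines above and are not required for LU itself:

[pylithapp.petsc]
pc_type = lu
ksp_monitor = true
ksp_converged_reason = true
ksp_view = true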

This is interesting. Can you run in the debugger with --petsc.start_in_debugger and get a stack trace?
  Thanks,
     Matt
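A debugger run along those lines might look like the following, where mysim.cfg is only a placeholder for the actual parameter file:

  pylith mysim.cfg --petsc.start_in_debugger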
 
For both ASM and ML, the solution looks correct in Paraview. It's just the time taken in integrating the residual that I would like to reduce.



Another question: when PyLith is running on a single multicore machine, can it use OpenMPI instead of MPI (I am assuming it's using MPI by default)?



Thanks and regards

Birendra



--- On Tue, 9/4/12, Brad Aagaard <baagaard at usgs.gov> wrote:

> From: Brad Aagaard <baagaard at usgs.gov>
> Subject: Re: Change in dt means new Jacobian
> To: "Birendra jha" <bjha7333 at yahoo.com>
> Cc: cig-short at geodynamics.org
> Date: Tuesday, September 4, 2012, 7:59 PM
>
> On 09/03/2012 08:54 PM, Birendra jha wrote:
> > 3. How do I utilize multiple cores on a single machine? Sorry if this
> > question is already answered somewhere. I found instructions on
> > running in parallel using a batch system but don't know how to run on
> > 1 node and use multiple cores. Does this require re-building PyLith?
>
> To run in parallel all you have to do is add the command line argument
> --nodes=NUM_CORES. You do not need to rebuild.
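For instance, a single-machine run on four cores with that flag would look roughly like the line below; step01.cfg is only a placeholder file name:

  pylith step01.cfg --nodes=4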

>
> > 4. For HDF5 output, is it correct that I have to wait until the
> > simulation finishes to get the xmf file which is needed to load h5
> > files in Paraview? The manual says "This file is written when PyLith
> > closes the HDF5 file at the end of the simulation".
>
> The Xdmf file is written when PyLith closes the HDF5 file at the end of
> a simulation. There is a Python script included with PyLith called
> pylith_genxdmf that will generate the Xdmf file given an HDF5 file.
> This is handy if a simulation did not finish or you add additional
> fields to an existing HDF5 file. Generating the Xdmf file in the middle
> of a simulation can work, but it is risky because the HDF5 file can get
> corrupted if multiple processes try to access the file simultaneously.
>
>   pylith_genxdmf --file=HDF5_FILE
>
> Brad
>
_______________________________________________
CIG-SHORT mailing list
CIG-SHORT at geodynamics.org
http://geodynamics.org/cgi-bin/mailman/listinfo/cig-short





-- 
What most experimenters take for granted before they begin their experiments is infinitely more interesting than any results to which their experiments lead.

-- Norbert Wiener

-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: LU_error_backtrace.txt
Url: http://geodynamics.org/pipermail/cig-short/attachments/20120916/191fa784/attachment-0001.txt 

