[CIG-ALL] CIG News - December 2006

Sue Kientz sue at geodynamics.org
Mon Dec 11 16:01:20 PST 2006


*************************************
             CIG News
           December 2006
*************************************

Also available at
http://geodynamics.org/cig/proposalsndocs/newsdocs/newsletter-dec06

IN THIS ISSUE: New versions of CitcomS.py and Gale released; bug fixes 
for CitcomCU and SPECFEM3D (Globe and Basin); CitcomS.py now installed 
on TeraGrid; Petascale Facility Town Hall Meeting and CIG Business 
Meeting during AGU in San Francisco; Petascale computing report and 
results of computing needs survey; Computational Science Roundtable 
discussion summary; CPU time available on NCAR blueice cluster.

CIG Software Releases

    * CitcomS.py release 2.1 - CitcomS is a finite element code designed 
to solve thermal convection problems relevant to Earth's mantle. This 
release has a simplified installation procedure and supports binary and 
parallel output using HDF5. This package comes with a clearer and 
easier-to-use process for setting up and running jobs, incorporates 
geoid calculations, and includes scripts for visualizing results with 
OpenDX, GMT, and MayaVi (a minimal sketch of reading the HDF5 output 
appears below). See 
http://geodynamics.org/cig/software/packages/mc/citcoms/ for source code 
and manual. Plus, now you can run CitcomS.py on the TeraGrid; see 
http://geodynamics.org/cig/software/csa/ .
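
As a quick illustration of working with the new HDF5 output, here is a 
minimal Python sketch using the h5py library. The file name and the 
"temperature" dataset name are assumptions for illustration only; 
consult the CitcomS manual for the layout your run actually writes.

    # Minimal sketch: inspect a CitcomS HDF5 output file with h5py.
    # The file name and dataset name below are hypothetical; see the
    # CitcomS manual for the layout your run actually produces.
    import h5py
    import numpy as np

    with h5py.File("citcoms-output.h5", "r") as f:  # hypothetical name
        # Walk the file and print every dataset's path, shape, and type.
        def describe(name, obj):
            if isinstance(obj, h5py.Dataset):
                print(name, obj.shape, obj.dtype)
        f.visititems(describe)

        # If the run stored a temperature field (name assumed here),
        # load it into a NumPy array for analysis or plotting.
        if "temperature" in f:
            temp = np.asarray(f["temperature"])
            print("mean temperature:", temp.mean())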

    * Gale release 1.0 - Gale is an Arbitrary Lagrangian-Eulerian (ALE) 
code for the long-term tectonics community. The code solves problems 
related to orogenesis, rifting, and subduction, with coupling to surface 
erosion models. The current release can simulate shortening, extension, 
and subduction models with a variety of boundary conditions, in 2D and 
3D, in serial or parallel. Gale is a joint effort among CIG, the 
Victorian Partnership for Advanced Computing (VPAC), and Monash 
University. See 
http://geodynamics.org/cig/software/packages/long/gale/ for binaries, 
source code, and manual.

Software Bug Fixes

    * CitcomCU 1.0.2 - CitcomCU is a parallel finite element code 
designed to solve thermochemical convection problems relevant to Earth's 
mantle on a three-dimensional regional domain. This release contains 
important bug fixes, including a fix for the kinematic velocity boundary 
condition under spherical geometry. Users of previous releases are urged 
to upgrade. See 
http://geodynamics.org/cig/software/packages/mc/citcomcu/ for source 
code and documentation.
    * SPECFEM3D GLOBE 3.6.1 - SPECFEM3D_GLOBE simulates global and 
regional (continental-scale) seismic wave propagation. See 
http://geodynamics.org/cig/software/packages/seismo/specfem3d-globe/ for 
source code and manual.
    * SPECFEM3D BASIN 1.4.1 - SPECFEM3D_BASIN simulates seismic wave 
propagation in sedimentary basins. The mesh generator is specifically 
written for the simulation of wave propagation in southern California 
but can be modified for use in other geographical areas. See 
http://geodynamics.org/cig/software/packages/seismo/specfem3d-basin/ for 
source code and manual.

CIG Software Now Preinstalled on the TeraGrid

    * Run the new CitcomS.py on the TeraGrid! CitcomS.py is now installed 
on TeraGrid's TACC Lonestar, SDSC, and NCSA sites, with other codes soon 
to follow. See the Community Software Area on the TeraGrid web page at 
http://geodynamics.org/cig/software/csa/ for instructions on using the 
community codes and for an application to get a small amount of CPU time 
from CIG to get you started. Since many of CIG's codes require a fairly 
extensive installation procedure, owing to their dependence on complex 
libraries (such as PETSc) and a Python modeling framework, we hope that 
availability on the TeraGrid will expand our codes' user base.
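
For those new to running parallel jobs, the following Python sketch 
shows one generic way to launch a parallel run from a script. The 
executable name, configuration file, and processor count are 
assumptions for illustration; the Community Software Area page 
documents the actual invocation on each TeraGrid site.

    # Hypothetical sketch of launching a parallel CitcomS.py run.
    # The executable name ("citcoms"), config file, and processor
    # count are placeholders; consult the Community Software Area
    # instructions for the real invocation on your TeraGrid site.
    import subprocess

    cmd = ["mpirun", "-np", "12", "citcoms", "cookbook1.cfg"]
    result = subprocess.run(cmd, capture_output=True, text=True)
    print(result.stdout)
    if result.returncode != 0:
        print("run failed:", result.stderr)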

Meetings During AGU

    * Monday, December 11: Town Hall Meeting on the GEO Petascale 
Cybercollaboratory. Plan to attend this meeting during AGU, from 6:30 to 
7:30 p.m. in the Nob Hill room at the San Francisco Marriott, to further 
discuss NSF's plans for a petascale facility. This is your opportunity 
to learn more and have your voice heard.
    * Tuesday, December 12: CIG Business Meeting. Find out what's 
happening in the Geodynamics community or voice your opinion to your 
colleagues at this public forum being held at the Argent Hotel starting 
at 5:30 p.m., with a reception including hors d'oeuvres and a cash bar 
following at 7:30 p.m. See the CIG Business Meeting Announcement at
http://geodynamics.org/cig/events/business-mtg-06 .

Petascale Computing Report and Computing Needs Survey Results

    * The CIG Science Steering Committee has developed a brief report on 
its vision of the future of petascale computing, available at 
http://geodynamics.org/cig/priorities/Path2PetascaleComputing.pdf, which 
includes the results of the recent computing needs survey, further 
analysis of selected answers, and additional comments submitted by 
respondents. We encourage you to read this report and discuss its 
findings with your colleagues, and if you attend AGU, come share your 
opinion at the December 11 Town Hall Meeting on the GEO Petascale 
Cybercollaboratory (see the notice under "Meetings During AGU" above).

Computational Science Roundtable Discussion Summarized

    * On October 18, 2006, the CIG Science Steering Committee (SSC) 
hosted a computational science roundtable at the University of Texas at 
Austin to focus the preceding workshop's discussions on CIG's software 
development roadmap. A summary of the results is available at 
http://geodynamics.org/cig/workinggroups/cs/workshops/austin06-workshop/roundtable-final.pdf .

Substantial Computational Resource Available

    * If you are an NSF-funded investigator who could use a large chunk 
of CPU time for groundbreaking science, a new parallel machine at NCAR 
(blueice) is available for a three-month period in early 2007. Blueice 
is a supercomputer cluster based on IBM's System p5 575 Symmetric 
Multi-Processor (SMP) nodes, using the POWER5+ processor, with an 
expected peak performance of 12 teraflops. If you can use such a 
resource, contact Ariel Shoresh ASAP at ariel at geodynamics.org or call 
(626) 395-1699. For more on blueice, see the "NCAR Installs 12-Teraflop 
IBM Supercomputer" press release at 
http://www.ucar.edu/news/releases/2006/ibm.shtml .

*************************************

Committees, Staff, Etc.

CIG Administration, contracts, travel, etc.:
Ariel Shoresh, (626) 395-1699
ariel at geodynamics.org

Equation solvers (PETSc) & PyLith development:
Matt Knepley, knepley at mcs.anl.gov

SVN software repository & GALE development:
Walter Landry, (626) 395-4621
walter at geodynamics.org

Benchmark problems, visualization & Citcom:
Luis Armendariz, (626) 395-1695
luis at geodynamics.org

Build procedure & computational seismology:
Leif Strand, (626) 395-1697
leif at geodynamics.org

Mantle convection codes and benchmarks:
Eh Tan, (626) 395-1693
tan2 at geodynamics.org

Web site and user manuals:
Sue Kientz, (626) 395-1694
sue at geodynamics.org

Geodynamo and systems administration:
Wei Mi, (626) 395-1692
wei at geodynamics.org

Software architecture & Pyre framework:
Michael Aivazis, (626) 395-1696
aivazis at caltech.edu

Administration:
Mike Gurnis, (626) 395-1698
gurnis at caltech.edu

Science Steering Committee, contact Chairman
Peter Olson (Johns Hopkins),
olson at jhu.edu

Executive Committee, contact Chairman
Mark Richards (Berkeley)
markr at seismo.berkeley.edu

-- 
Sue Kientz

Technical Writer/Web Manager
Computational Infrastructure for Geodynamics (CIG)
   http://www.geodynamics.org/
sue at geodynamics.org
ofc: (626) 395-1694  
   ~Metaphors Be With You~


