CIG Software Releases
- CitcomS.py release 2.1 - CitcomS is a finite element code designed to solve thermal convection problems relevant to the Earth's mantle. This release has a simplified installation procedure and supports binary and parallel output using HDF5 (see the short sketch after this list for one way to read that output). The package also offers a clearer, easier-to-use process for setting up and running jobs, incorporates geoid calculations, and includes scripts for visualizing results with OpenDX, GMT, and MayaVi. See CitcomS.py for source code and manual. Plus, now you can run CitcomS.py on the TeraGrid!
- Gale release 1.0 - Gale is an Arbitrary Lagrangian-Eulerian code for the long-term tectonics community. The code solves problems related to orogenesis, rifting, and subduction, with coupling to surface erosion models. The current release can simulate shortening, extension, and subduction models with a variety of boundary conditions, in 2D and 3D, in serial or parallel. Gale is a joint effort between CIG, the Victorian Partnership for Advanced Computing (VPAC), and Monash University. See Gale for binaries, source code, and manual.
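For those curious about the new HDF5 output, files written by CitcomS.py can be explored with any HDF5-aware tool. Below is a minimal Python sketch using the h5py library; the file name and dataset path are placeholders of our own, not the documented CitcomS.py layout, so consult the manual for the actual structure.

    import h5py

    # Open a CitcomS.py output file (the file name here is hypothetical).
    with h5py.File("example1.h5", "r") as f:
        # Walk the file and print every group/dataset name to discover the layout.
        f.visit(print)
        # Reading a field into a NumPy array would look like the line below;
        # the path "/cap00/temperature" is a placeholder, not a documented dataset.
        # temperature = f["/cap00/temperature"][...]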
Software Bug Fixes
- CitcomCU 1.0.2 - CitcomCU is a parallel finite element code designed to solve thermochemical convection problems relevant to the Earth's mantle on a three-dimensional regional domain. This release contains important bug fixes, including a fix for a bug in the kinematic velocity boundary condition under spherical geometry. Users of previous releases are urged to upgrade. See CitcomCU for source code and documentation.
- SPECFEM3D GLOBE 3.6.1 - SPECFEM3D_GLOBE simulates global and regional (continental-scale) seismic wave propagation. See SPECFEM3D GLOBE for source code and manual.
- SPECFEM3D BASIN 1.4.1 - SPECFEM3D_BASIN simulates seismic wave propagation in sedimentary basins. The mesh generator is specifically written for the simulation of wave propagation in southern California but can be modified for use in other geographical areas. See SPECFEM3D BASIN for source code and manual.
CIG Software Now Preinstalled on the TeraGrid
- Run the new CitcomS.py on the TeraGrid! CitcomS.py is now installed at TeraGrid's TACC Lonestar, SDSC, and NCSA sites, with other codes soon to follow. See Community Software Area on the TeraGrid for instructions on using the community codes and for an application to get a small amount of CPU time from CIG to get you started. Since many of CIG's codes have a fairly involved installation procedure, owing to their dependence on complex libraries (such as PETSc) and a Python modeling framework, we hope availability on the TeraGrid will expand our codes' user base.
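To give a flavor of the Pyre-based workflow once the code is in place, the fragment below sketches what a run configuration might look like. Every section, parameter, and file name shown is illustrative only; the real settings and launch procedure are described in the Community Software Area instructions and the CitcomS.py manual.

    # hypothetical.cfg -- all section and parameter names below are
    # placeholders, not documented CitcomS.py settings
    [CitcomS]
    steps = 100

    [CitcomS.solver]
    datafile = example1

A job would then be launched with something along the lines of citcoms hypothetical.cfg under the site's batch system; again, the exact executable name and submission commands are covered in the TeraGrid instructions.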
Meetings During AGU
- Monday, December 11: Town Hall Meeting on GEO Petascale Cybercollaboratory. Plan to attend this meeting during AGU from 6:30 to 7:30 p.m. at the San Francisco Marriott in Nob Hill to further discuss NSF's plans for a petascale facility. This is your opportunity to learn more and have your voice heard.
- Tuesday, December 12: CIG Business Meeting. Find out what's happening in the geodynamics community, or voice your opinion to your colleagues, at this public forum held at the Argent Hotel starting at 5:30 p.m.; a reception with hors d'oeuvres and a cash bar follows at 7:30 p.m. See the CIG Business Meeting Announcement.
Petascale Computing Report and Computing Needs Survey Results
- The CIG Science Steering Committee has developed a brief report on its vision of the future of petascale computing, The Path to Petascale Computing in Geodynamics, which includes results of the recent computing needs survey, further analysis of selected answers, and additional comments submitted by respondents. We encourage you to read this report and discuss its findings with your colleagues, and, if you attend AGU, come share your opinion at the December 11 Town Hall Meeting on GEO Petascale Cybercollaboratory.
Computational Science Roundtable Discussion Summarized
- On October 18, 2006, the CIG Science Steering Committee (SSC) hosted a computational science roundtable at the University of Texas at Austin to focus the preceding workshop discussions on CIG's software development roadmap. See a summary of the results at Computational Science Roundtable Discussion.
Substantial Computational Resource Available
- If you are an NSF-funded investigator who could use a large block of CPU time for groundbreaking science, a new parallel machine at NCAR (blueice) is available for a three-month period in early 2007. Blueice is a supercomputer cluster based on IBM's System p5 575 Symmetric Multi-Processor (SMP) nodes, using the POWER5+ processor, with an expected peak performance of 12 teraflops. If you can use such a resource, contact Ariel Shoresh ASAP at (626) 395-1699. For more on blueice, see the NCAR Installs 12-Teraflop IBM Supercomputer press release.
Committees, Staff, Etc.
CIG Administration, contracts, travel, etc.: Ariel Shoresh, (626) 395-1699
Equation solvers (PETSc) and PyLith development: Matt Knepley
SVN software repository and Gale development: Walter Landry, (626) 395-4621
Benchmark problems, visualization, and CitCom: Luis Armendariz, (626) 395-1695
Build procedure and computational seismology: Leif Strand, (626) 395-1697
Mantle convection codes and benchmarks: Eh Tan, (626) 395-1693
Website and user manuals: Sue Kientz, (626) 395-1694
Geodynamo and systems administration: Wei Mi, (626) 395-1692
Software architecture and Pyre framework: Michael Aivazis, (626) 395-1696
Administration: Mike Gurnis, (626) 395-1698
Science Steering Committee: contact Chairman Peter Olson (Johns Hopkins)
Executive Committee: contact Chairman Mark Richards (Berkeley)