[CIG-MC] CitcomS working examples?
Eh Tan
tan2 at geodynamics.org
Fri Jul 24 15:45:53 PDT 2009
John McCorquodale wrote:
> Aha! I suspected I was just confused in some such fundamental way. :) I'll
> try getting the Pyre version going on my box, but am still very interested
> in the C/MPI-only version for ease of deployment in other places (Amazon
> EC2, Teragrid).
Hi John,
Installing CitcomS on Teragrid is easy; we have installed it on several
Teragrid sites. Although the page below only talks about using CitcomS on
Teragrid, its notes on setting up softenv and environment variables might
be useful to you:
http://www.geodynamics.org/cig/software/csa/instructions/
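For reference, the softenv setup usually amounts to a couple of lines.
A minimal sketch (the "+citcoms" key below is hypothetical; run softenv
on your site to find the actual CitcomS entry):

$ softenv | grep -i citcom     # list available keys; look for a CitcomS entry
$ soft add +citcoms            # hypothetical key; adds it to the current session
$ echo "+citcoms" >> ~/.soft   # or make it permanent for future logins
$ resoft                       # re-read ~/.soft in the current shell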
> If I get the Pyre version going, is there a way to get the Python
> front-end to emit the input file for the MPI code so that I can start to
> understand the mapping from the simpler cookbook examples to the full-blown
> input file? Where is the code that does that translation? Is it centralized
> and clean enough that I can just go read it?
>
Each time you run the Pyre version, it generates a pid file with a name
like pid12345.cfg. This file contains almost all parameters, taken either
from the input file or from the default values, and, barring a few
exceptions, can be used as the input file for the non-Pyre version. The
exceptions are: maxstep, storage_spacing, mgunitx, mgunity, and mgunitz.
(The first two parameters are named differently in the Pyre version, and
the last three are not used in the Pyre version.)
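A minimal sketch of the round trip, assuming the Pyre launcher is invoked
as "citcoms" (as in the cookbook examples) and a single-processor regional
run; the pid number will differ on your machine:

$ citcoms cookbook1.cfg        # Pyre run; writes pid12345.cfg at startup
$ cp pid12345.cfg input.new    # copy the pid file so you can edit it
  (edit input.new: add maxstep and storage_spacing by hand, plus
  mgunitx/mgunity/mgunitz if you use the multigrid solver)
$ mpirun -np 1 CitcomSRegional input.new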
> On the C/MPI side I now get:
>
> $ mkdir /scratch
> $ cd examples/Full
> $ mpirun CitcomSFull input.sample
> ...lots of happy-sounding values get printed out...
> !!!! # of requested CPU is incorrect
>
> Presumably there is some restriction on the MPI rank that the code is
> prepared to deal with and I'm just not giving it what it expects? Can you
> shed some light on this? I'm guessing that openmpi launches a rank-1 job
> when given no args...
>
CitcomSFull requires at least 12 processors to run. Also, some mpirun
implementations change the working directory of the compute job, so it's
safer to specify full paths to both the executable and the input file,
like this:
mpirun -np 12 /path1/to/CitcomSFull /path2/to/input.sample
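(The 12-processor minimum reflects the mesh decomposition: the full
spherical version splits the shell into 12 caps, so the total rank count
should be 12 * nprocx * nprocy * nprocz from the input file; with
nprocx = nprocy = nprocz = 1, that works out to 12 ranks.)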
Cheers,
--
Eh Tan
Staff Scientist
Computational Infrastructure for Geodynamics
California Institute of Technology, 158-79
Pasadena, CA 91125
(626) 395-1693
http://www.geodynamics.org