Hi, all. Sorry, the added code should be:

----------------------------------------------------------------------------
    for(j = 0; j < E->parallel.nproc; j++)
        for(i = 0; i <= E->parallel.nproc; i++)
            {
                E->parallel.mst1[j][i] = 1;
                E->parallel.mst2[j][i] = 2;
                E->parallel.mst3[j][i] = 3;
            }
----------------------------------------------------------------------------
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" lang=EN-US>a "2" 
should be "3".</SPAN></P>
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" 
lang=EN-US></SPAN>&nbsp;</P>
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" lang=EN-US>Good 
luck!</SPAN></P>
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" 
lang=EN-US></SPAN>&nbsp;</P>
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" 
lang=EN-US>Jinshui</SPAN></P>
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" 
lang=EN-US></SPAN>&nbsp;</P></DIV></FONT></DIV>
<DIV style="FONT: 10pt arial">----- Original Message ----- 
<DIV style="BACKGROUND: #e4e4e4; font-color: black"><B>From:</B> <A 
title=jshhuang@ustc.edu.cn href="mailto:jshhuang@ustc.edu.cn">jshhuang</A> 
</DIV>
<DIV><B>To:</B> <A title=mibillen@ucdavis.edu 
href="mailto:mibillen@ucdavis.edu">Magali Billen</A> </DIV>
<DIV><B>Cc:</B> <A title=tan2@geodynamics.org 
href="mailto:tan2@geodynamics.org">tan2@geodynamics.org</A> ; <A 
title=Shijie.Zhong@Colorado.Edu href="mailto:Shijie.Zhong@Colorado.Edu">Shijie 
Zhong</A> ; <A title=cig-mc@geodynamics.org 
href="mailto:cig-mc@geodynamics.org">cig-mc@geodynamics.org</A> </DIV>
<DIV><B>Sent:</B> Wednesday, November 18, 2009 6:01 PM</DIV>
<DIV><B>Subject:</B> Re: [CIG-MC] Fwd: MPI_Isend error</DIV></DIV>

Hi, Magali,

You can try to add the following to the subroutine void parallel_domain_decomp1(struct All_variables *E) in Parallel_related.c:

----------------------------------------------------------------------------
    for(j = 0; j < E->parallel.nproc; j++)
        for(i = 0; i <= E->parallel.nproc; i++)
            {
                E->parallel.mst1[j][i] = 1;
                E->parallel.mst2[j][i] = 2;
                E->parallel.mst2[j][i] = 3;
            }
----------------------------------------------------------------------------
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" lang=EN-US><FONT 
face=Arial></FONT></SPAN>&nbsp;</P>
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" lang=EN-US><FONT 
face=Arial>I'm not sure if it works, but I thought it deserve a try. This is a 
machine-dependent issue. </FONT></SPAN></P>
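
If you want to confirm whether a garbage value really reaches MPI before trying the fix, one option is a small guard placed just before the MPI_Isend in the exchange routine. This is only a sketch under assumptions: "dest" and "tag" below are placeholder names for whatever arguments CitcomCU actually passes there, not real CitcomCU variables.

#include <mpi.h>
#include <stdio.h>

/* Hedged debugging sketch, not CitcomCU code: "dest" and "tag" are
 * placeholders for whatever the exchange routine really passes to MPI_Isend. */
static void check_isend_args(int dest, int tag, MPI_Comm comm)
{
    int nproc, flag, *tag_ub;

    MPI_Comm_size(comm, &nproc);
    /* MPI_TAG_UB is a predefined attribute holding the largest legal tag. */
    MPI_Comm_get_attr(comm, MPI_TAG_UB, &tag_ub, &flag);

    if(dest < 0 || dest >= nproc)
        fprintf(stderr, "bad destination rank %d (nproc = %d)\n", dest, nproc);
    if(tag < 0 || (flag && tag > *tag_ub))
        fprintf(stderr, "bad message tag %d\n", tag);
}

If either warning prints right before the abort (or the hang), the uninitialized arrays are the likely culprit.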
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" lang=EN-US><FONT 
face=Arial></FONT></SPAN>&nbsp;</P>
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" lang=EN-US><FONT 
face=Arial>Good luck!</FONT></SPAN></P>
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" lang=EN-US><FONT 
face=Arial></FONT></SPAN>&nbsp;</P>
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" lang=EN-US><FONT 
face=Arial></FONT></SPAN>&nbsp;</P>
<P style="MARGIN: 0cm 0cm 0pt" class=MsoPlainText><SPAN 
style="mso-hansi-font-family: &#23435;&#20307;; mso-bidi-font-family: &#23435;&#20307;" lang=EN-US><FONT 
face=Arial>Jinshui Huang<BR>---------------------------------------<BR>School of 
Earth and Space Sciences<BR>University of Science and Technology of 
China<BR>Hefei, Anhui 230026, 
China<BR>0551-3606781<BR>---------------------------------------</FONT></SPAN></P></DIV>

----- Original Message -----
From: Magali Billen <mibillen@ucdavis.edu>
To: Eh Tan <tan2@geodynamics.org>
Cc: cig-mc@geodynamics.org
Sent: Wednesday, November 18, 2009 10:23 AM
Subject: Re: [CIG-MC] Fwd: MPI_Isend error

Hello Eh,

This is a run on 8 processors. If I print the stack I get:

(gdb) bt
#0  0x00002b943e3c208a in opal_progress () from /share/apps/openmpisb-1.3/gcc-4.4/lib/libopen-pal.so.0
#1  0x00002b943def5c85 in ompi_request_default_wait_all () from /share/apps/openmpisb-1.3/gcc-4.4/lib/libmpi.so.0
#2  0x00002b943df229d3 in PMPI_Waitall () from /share/apps/openmpisb-1.3/gcc-4.4/lib/libmpi.so.0
#3  0x0000000000427ef5 in exchange_id_d20 ()
#4  0x00000000004166f3 in gauss_seidel ()
#5  0x000000000041884b in multi_grid ()
#6  0x0000000000418c44 in solve_del2_u ()
#7  0x000000000041b151 in solve_Ahat_p_fhat ()
#8  0x000000000041b9a1 in solve_constrained_flow_iterative ()
#9  0x0000000000411ca6 in general_stokes_solver ()
#10 0x0000000000409c21 in main ()

I've attached the version of Parallel_related.c that is used... I have not modified this in any way from the CIG release of CitcomCU.
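
As an aside, a generic way to see which exchange never completes when the run just hangs is to temporarily swap the blocking MPI_Waitall that the backtrace is stuck in for a polling loop that reports progress. This is only a sketch: "nreq" and "req" are stand-ins for whatever request array the exchange routine builds, not actual CitcomCU names.

#include <mpi.h>
#include <stdio.h>

/* Debugging sketch only -- not CitcomCU code.  "nreq" and "req" stand for
 * whatever request array exchange_id_d20 builds before its MPI_Waitall. */
static void waitall_with_progress(int nreq, MPI_Request *req, int rank)
{
    int done = 0;
    long spins = 0;

    while(!done)
    {
        /* MPI_Testall completes the same requests as MPI_Waitall,
         * but returns immediately instead of blocking. */
        MPI_Testall(nreq, req, &done, MPI_STATUSES_IGNORE);
        if(!done && ++spins % 100000000 == 0)
            fprintf(stderr, "rank %d: still waiting on %d requests\n",
                    rank, nreq);
    }
}

The ranks that keep printing are the ones whose neighbor never posted the matching send or receive.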

Luckily, there are commented fprintf statements in just that part of the code... we'll continue to dig...

Oh, and just to eliminate the new cluster from suspicion, we downloaded, compiled, and ran CitcomS example1.cfg on the same cluster with the same compilers, and there was no problem.

Maybe this is the sign that I'm supposed to finally switch from CitcomCU to CitcomS... :-(
Magali

On Nov 17, 2009, at 5:02 PM, Eh Tan wrote:
  <BLOCKQUOTE type="cite">
    <DIV>Hi Magali,<BR><BR>How many processors are you using? If more than 100 
    processors are used,<BR>you are seeing this bug:<BR><A 
    href="http://www.geodynamics.org/pipermail/cig-mc/2008-March/000080.html">http://www.geodynamics.org/pipermail/cig-mc/2008-March/000080.html</A><BR><BR><BR>Eh<BR><BR><BR><BR>Magali 
    Billen wrote:<BR>
    <BLOCKQUOTE type="cite">One correction to the e-mail below, we've been 
      compiling CitcomCU<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">using openmpi on our old<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">cluster, so the compiler on the new cluster is the 
      same. The big<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">difference is that the cluster<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">is about twice as fast as the 5-year old cluster. 
      This suggests that<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">this change to a much faster<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">clsuter may have exposed an existing race 
      condition in CitcomCU??<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">Magali<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">Begin forwarded message:<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">*From: *Magali Billen 
        &lt;mibillen@ucdavis.edu<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE 
      type="cite">&lt;mailto:mibillen@ucdavis.edu&gt;&gt;<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">*Date: *November 17, 2009 4:23:45 PM 
      PST<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">*To: *cig-mc@geodynamics.org 
        &lt;mailto:cig-mc@geodynamics.org&gt;<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">*Subject: **[CIG-MC] MPI_Isend 
      error*<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">Hello,<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">I'm using CitcomCU and am having a strange 
        problem with problem<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">either hanging (no error, just doesn't 
      <BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">go anywhere) or it dies with an MPI_Isend error 
        (see below). &nbsp;I seem<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">to recall having problems with the MPI_Isend 
      <BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">command and the lam-mpi version of mpi, but I've 
        not had any problems<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">with mpich-2.<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">On the new cluster we are compling with openmpi 
        instead of MPICH-2.<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">The MPI_Isend error seems to occur during 
        Initialization in the call<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">to the function mass_matrix, which then 
      <BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">calls exchange_node_f20, which is where the call 
        to MPI_Isend is.<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">--snip--<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">ok14: parallel shuffle element and id 
      arrays<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">ok15: construct shape 
    functions<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">[farm.caes.ucdavis.edu:27041] *** An error 
        occurred in MPI_Isend<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">[farm.caes.ucdavis.edu:27041] *** on 
        communicator MPI_COMM_WORLD<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">[farm.caes.ucdavis.edu:27041] *** MPI_ERR_RANK: 
        invalid rank<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">[farm.caes.ucdavis.edu:27041] *** 
        MPI_ERRORS_ARE_FATAL (your MPI job<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">will now abort)<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">Has this (or these) types of error occurred for 
        other versions of<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">Citcom using MPI_Isend (it seems that CitcomS 
        uses<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">this command also). &nbsp;&nbsp;I'm not sure how 
        to debug this error,<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">especially since sometimes it just hangs with no 
        error.<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">Any advice you have would be 
      hepful,<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">Magali<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE 
    type="cite">-----------------------------<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">Associate Professor, U.C. 
    Davis<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">Department of 
    Geology/KeckCAVEs<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">Physical &amp; Earth Sciences Bldg, rm 
      2129<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">Davis, CA 95616<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">-----------------<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">mibillen@ucdavis.edu 
        &lt;mailto:mibillen@ucdavis.edu&gt;<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">(530) 754-5696<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE 
    type="cite">*-----------------------------*<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">*** Note new e-mail, building, 
      office*<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">* &nbsp;&nbsp;&nbsp;information as of Sept. 2009 
        ***<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE 
    type="cite">-----------------------------<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE 
        type="cite">_______________________________________________<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">CIG-MC mailing list<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE type="cite">CIG-MC@geodynamics.org 
        &lt;mailto:CIG-MC@geodynamics.org&gt;<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">
      <BLOCKQUOTE 
        type="cite">http://geodynamics.org/cgi-bin/mailman/listinfo/cig-mc<BR></BLOCKQUOTE></BLOCKQUOTE>
    <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">-----------------------------<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">Associate Professor, U.C. Davis<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">Department of Geology/KeckCAVEs<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">Physical &amp; Earth Sciences Bldg, rm 
    2129<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">Davis, CA 95616<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">-----------------<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">mibillen@ucdavis.edu 
      &lt;mailto:mibillen@ucdavis.edu&gt;<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">(530) 754-5696<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">*-----------------------------*<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">*** Note new e-mail, building, 
    office*<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">* &nbsp;&nbsp;&nbsp;information as of Sept. 2009 
      ***<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">-----------------------------<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE>
    <BLOCKQUOTE 
      type="cite">------------------------------------------------------------------------<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE>
    <BLOCKQUOTE 
    type="cite">_______________________________________________<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">CIG-MC mailing list<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite">CIG-MC@geodynamics.org<BR></BLOCKQUOTE>
    <BLOCKQUOTE 
      type="cite">http://geodynamics.org/cgi-bin/mailman/listinfo/cig-mc<BR></BLOCKQUOTE>
    <BLOCKQUOTE type="cite"><BR></BLOCKQUOTE><BR>-- <BR>Eh Tan<BR>Staff 
    Scientist<BR>Computational Infrastructure for Geodynamics<BR>California 
    Institute of Technology, 158-79<BR>Pasadena, CA 91125<BR>(626) 
    395-1693<BR>http://www.geodynamics.org<BR><BR>_______________________________________________<BR>CIG-MC 
    mailing 
    list<BR>CIG-MC@geodynamics.org<BR>http://geodynamics.org/cgi-bin/mailman/listinfo/cig-mc<BR></DIV></BLOCKQUOTE></DIV><BR>
  <DIV apple-content-edited="true">
  <DIV 
  style="WORD-WRAP: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space"><SPAN 
  style="WIDOWS: 2; TEXT-TRANSFORM: none; TEXT-INDENT: 0px; BORDER-COLLAPSE: separate; FONT: medium 'Lucida Grande'; WHITE-SPACE: normal; ORPHANS: 2; LETTER-SPACING: normal; COLOR: rgb(0,0,0); WORD-SPACING: 0px; -webkit-border-horizontal-spacing: 0px; -webkit-border-vertical-spacing: 0px; -webkit-text-decorations-in-effect: none; -webkit-text-size-adjust: auto; -webkit-text-stroke-width: 0px" 
  class=Apple-style-span>
  <DIV 
  style="WORD-WRAP: break-word; -webkit-nbsp-mode: space; -webkit-line-break: after-white-space">
  <DIV>
  <DIV>-----------------------------</DIV>
  <DIV>Associate Professor,&nbsp;U.C. Davis</DIV>
  <DIV>Department of Geology/KeckCAVEs</DIV>
  <DIV>Physical &amp; Earth Sciences Bldg, rm 2129</DIV>
  <DIV>Davis, CA 95616</DIV>
  <DIV>-----------------</DIV>
  <DIV><A href="mailto:mibillen@ucdavis.edu">mibillen@ucdavis.edu</A></DIV>
  <DIV>(530) 754-5696</DIV>
  <DIV><B>-----------------------------</B></DIV>
  <DIV><B>** Note new e-mail, building, office</B></DIV>
  <DIV><B>&nbsp;&nbsp; &nbsp;information as of Sept. 2009 **</B></DIV>
  <DIV>-----------------------------</DIV></DIV></DIV></SPAN></DIV></DIV><BR></DIV>

_______________________________________________
CIG-MC mailing list
CIG-MC@geodynamics.org
http://geodynamics.org/cgi-bin/mailman/listinfo/cig-mc