[aspect-devel] How to grab the radius of every interior sphere made up from cell faces in the domain and some MPI questions

Shangxin Liu sxliu at vt.edu
Wed Oct 21 19:41:23 PDT 2015


Hi Wolfgang,

Let me briefly describe my workflow, because computing the geoid in
spherical harmonics is a bit complicated in its details. Three parts
contribute to the geoid: surface dynamic topography, CMB dynamic
topography, and the density anomaly throughout the spherical domain.
The surface and CMB dynamic topography are the easier parts to
program: I reuse the surface dynamic topography computation and change
the if condition so that it searches for the cells along the
core-mantle boundary to compute the CMB topography. After getting the
topography value of each cell along the boundary, another template in
my code converts these values into spherical harmonic coefficients.
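(For the conversion itself I use the usual surface expansion: for a
topography field t on a boundary sphere the coefficients are, in LaTeX
notation,

  t_{lm} = \int_{\Omega} t(\theta,\varphi) \, Y_{lm}^{*}(\theta,\varphi) \, d\Omega,

evaluated numerically from the per-cell values.)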

As for the density anomaly, the formula first needs the density
anomaly value per cell in each depth layer (in my code, the number of
spherical layers is determined by the initial refinement level: at
refinement level 0 there are 2 layers, and each layer holds 48 density
values, one per cell). I then transform the density anomaly values of
each layer into spherical harmonic coefficients, rho_lm(r), and
integrate the coefficients from r_CMB to r_surface (at refinement
level 0, the numerical integral spans two layers, 3480 to 4501 km and
4501 to 6371 km). For this integral, I need the radius of each
internal sphere, i.e., the radius of each cell's upper face, globally
(in the simplest case of refinement level 0 there are three radii:
3480, 4501, and 6371 km). My current approach is to loop over all the
cells, get the location of each cell's upper face midpoint, compute
the distance between the earth's center and these midpoints, remove
the duplicate distances produced by cells at the same depth, and
finally push_back the remaining values into an "R_sphere" vector
((3480, 4501, 6371) at refinement level 0). But I think this approach
is not elegant, and it will also not work in parallel, since each
processor only holds a subdomain. So is there a fast way to directly
grab the radius of each internal sphere determined by the cell faces
in deal.II/ASPECT?
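To put this in formulas, as you asked: schematically, and modulo the
normalization constants I use in the code, the density contribution to
the geoid coefficients is

  N_{lm} \propto \frac{1}{2l+1}
          \int_{r_{CMB}}^{r_{surf}} \delta\rho_{lm}(r)
          \left(\frac{r}{r_{surf}}\right)^{l+1} r \, dr,

which is why I need the layer radii as quadrature points in the radial
direction.

For concreteness, here is a sketch of how I currently imagine
collecting the radii in parallel. Instead of face midpoints it uses
the cell vertices (which lie on the same concentric spheres, assuming
a uniformly refined spherical-shell mesh), and it exchanges the
per-processor sets with plain MPI_Allgatherv, since I am not aware of
a deal.II wrapper that computes a set union across processors:

#include <deal.II/base/geometry_info.h>
#include <deal.II/distributed/tria.h>
#include <mpi.h>

#include <cmath>
#include <set>
#include <vector>

using namespace dealii;

// Hypothetical helper: return the radii of all concentric cell-face
// spheres, identical on every processor.
std::set<double>
collect_layer_radii (const parallel::distributed::Triangulation<3> &triangulation,
                     const MPI_Comm mpi_communicator)
{
  // Radii seen by the cells this processor owns, rounded (here to the
  // nearest unit of length) so that vertices on the same sphere
  // compare equal despite floating-point noise.
  std::set<double> local_radii;
  for (parallel::distributed::Triangulation<3>::active_cell_iterator
         cell = triangulation.begin_active();
       cell != triangulation.end(); ++cell)
    if (cell->is_locally_owned())
      for (unsigned int v = 0; v < GeometryInfo<3>::vertices_per_cell; ++v)
        local_radii.insert (std::round (cell->vertex(v).norm()));

  // Exchange the per-processor values: first how many each rank has,
  // then the values themselves.
  std::vector<double> send_buffer (local_radii.begin(), local_radii.end());
  int n_procs;
  MPI_Comm_size (mpi_communicator, &n_procs);

  std::vector<int> counts (n_procs);
  const int my_count = static_cast<int>(send_buffer.size());
  MPI_Allgather (&my_count, 1, MPI_INT,
                 counts.data(), 1, MPI_INT, mpi_communicator);

  std::vector<int> offsets (n_procs, 0);
  for (int p = 1; p < n_procs; ++p)
    offsets[p] = offsets[p-1] + counts[p-1];

  std::vector<double> recv_buffer (offsets.back() + counts.back());
  MPI_Allgatherv (send_buffer.data(), my_count, MPI_DOUBLE,
                  recv_buffer.data(), counts.data(), offsets.data(),
                  MPI_DOUBLE, mpi_communicator);

  // Inserting everything into a std::set drops the duplicates that
  // different processors saw.
  return std::set<double> (recv_buffer.begin(), recv_buffer.end());
}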

My second problem is comparing results across different processors. To
be concise: there is a loop over all the cells that finds the cell
whose upper face midpoint is closest to a fixed point. When running in
parallel, each processor returns a local_min_distance value and the
location (x,y,z) of the cell face midpoint corresponding to it. I need
to find the minimum distance among these, and also the location of the
cell face midpoint that produced this minimum. I know that in ASPECT
Utilities::MPI::min can return the minimum value over all active
processors, but I also need the location of the corresponding cell
face midpoint, which is held by only one of the processors, and I
don't know how to get it. Is there a way to query all the MPI
processes to find both the minimum distance and the corresponding cell
face midpoint? I wonder whether deal.II/ASPECT has a command like
"variable_name(MPI_id)" to return the result of each processor.

Best,
Shangxin

On Wed, Oct 21, 2015 at 5:34 PM, Wolfgang Bangerth <bangerth at tamu.edu>
wrote:

>
> Shangxin,
> I have difficulty following your questions, and I believe it is because
> you have already picked a specific algorithm and you are asking how to
> implement it. But we know neither the algorithm, nor what the problem is
> you are trying to solve.
>
> Can you give us an overview *in formulas* of what you are actually
> trying to compute, along with an overview of *how* you try to compute
> it? Maybe that already helps us give you suggestions for how to avoid
> some of the problems you see.
>
> Best
>  W.
>
>
> On 10/21/2015 03:23 PM, Shangxin Liu wrote:
>
>> Hi;
>>
>> I'm now writing a new plugin in the postprocess directory to compute
>> the geoid anomaly. The code is finished, but it needs some further
>> modification to run in parallel. In doing so, I ran into two
>> problems:
>>
>> 1. The geoid computation needs the integral of the density anomaly
>> from the CMB to the surface. For now, I do the integral cell by
>> cell. To get all the radii of the internal spheres, the code loops
>> over all of the cells, gets the radius of each cell's upper face,
>> and push_backs the radius into a vector after removing duplicate
>> values from cell faces at the same depth. But I suppose this
>> solution is not very elegant, and it will also not work with
>> multiple MPI processes, since each processor only covers part of
>> the domain. So is there a fast way in deal.II/ASPECT to directly
>> grab all the radii of the spheres determined by the cell faces
>> within the domain?
>>
>> 2. For the spherical harmonic coefficient computation, I need to
>> evaluate values at the grid points I created for the sphere
>> integral, from the output values at the center of each cell face.
>> My current solution is to loop over all the sphere cells and find
>> the closest cell face midpoint to each grid point (nearest-point
>> interpolation). But since each processor only covers part of the
>> domain, I also need to compare across all the processors to find
>> the closest point and the corresponding density anomaly/topography
>> values at that point. I notice that there is a
>> Utilities::MPI::max/min command that can compare among different
>> processors. But my purpose is not only to get the closest distance;
>> I also need to know which processor provides this closest cell face
>> midpoint, and the location of that midpoint, and hence the density
>> anomaly/topography there. Are there any MPI commands in
>> deal.II/ASPECT to do this?
>>
>> Best,
>> Shangxin
>>
>
>
> --
> ------------------------------------------------------------------------
> Wolfgang Bangerth               email:            bangerth at math.tamu.edu
>                                 www: http://www.math.tamu.edu/~bangerth/
>
>

