From 155001023 at csu.edu.cn  Sun Jul  2 07:58:45 2017
From: 155001023 at csu.edu.cn (张明财)
Date: Sun, 2 Jul 2017 22:58:45 +0800 (GMT+08:00)
Subject: [CIG-SEISMO] compile problem about SPECFEM2D
Message-ID: <3f833eff.530e.15d03cfa447.Coremail.155001023@csu.edu.cn>

Hello experts,

I get some errors when compiling SPECFEM2D, as listed below:

error #6580: Name in only-list does not exist. [NELEM_ELASTIC_FIXED_SURFACE] E:\SEM\elastic_fixed_boundary.f90 40
error #6580: Name in only-list does not exist. [ELASTIC_FIXED_EDGES] E:\SEM\elastic_fixed_boundary.f90 40
error #6580: Name in only-list does not exist. [ELASTIC_FIXED_SURFACE] E:\SEM\elastic_fixed_boundary.f90 40

How can I fix them? Thank you.

Yours sincerely,
Micky Jhon
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From komeaz at gmail.com  Sun Jul  2 04:57:28 2017
From: komeaz at gmail.com (abolfazl komeazi)
Date: Sun, 2 Jul 2017 16:27:28 +0430
Subject: [CIG-SEISMO] SPECFEM3D-problems
Message-ID: 

Dear Dr. Komatitsch,

I'm working with SPECFEM3D version 2.0.1. I created my mesh file using CUBIT. The problem is that when I run the program, my stations do not seem to be in the correct location. I get this error:

need at least one receiver
Error detected, aborting MPI... proc 0
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
with errorcode 30.

My target region spans latitude (25, 34) and longitude (52, 61) [part of Iran], and the corresponding UTM zone is 40. Attached you can find my STATIONS file.
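[A station list like the attached one can be sanity-checked against the bounds the solver prints before running. A minimal Python sketch, assuming the standard SPECFEM STATIONS column order (station name, network, latitude, longitude, elevation, burial); the example station and the bounds are taken from the solver output quoted in this thread:]

```python
# Sketch: check that each station in a SPECFEM STATIONS file falls inside
# the lat/lon bounds the solver reports. Assumed column order:
# name  network  latitude  longitude  elevation  burial
def stations_outside(lines, lat_min, lat_max, lon_min, lon_max):
    """Return the names of stations whose coordinates fall outside the model."""
    outside = []
    for line in lines:
        fields = line.split()
        if len(fields) < 4:
            continue  # skip blank or malformed lines
        name, lat, lon = fields[0], float(fields[2]), float(fields[3])
        if not (lat_min <= lat <= lat_max and lon_min <= lon <= lon_max):
            outside.append(name)
    return outside

# Bounds from the output_solver.txt excerpt in this thread; a station at
# latitude 28.1 is rejected because the mesh only spans latitudes 0-9.
stations = ["ST1 IR 28.10 56.88 0.0 0.0"]
print(stations_outside(stations, 0.0, 9.009, 52.511, 61.535))  # ['ST1']
```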
Looking at the "output_solver.txt" file, I found that the program did not recognize my target region, since it says:

check that stations in file ./DATA/STATIONS are within
longitude min/max: 52.511256102833592 61.535490745169042
latitude min/max : 0.0000000000000000 9.0094311157611440
UTM x min/max: 0.0000000000000000 999000.00000000000
UTM y min/max : 0.0000000000000000 999000.00000000000

Any help would be appreciated.

Sincerely Yours,
Abolfazl Komeazi

--
Ph.D. Student of Geophysics - Seismology,
Seismological Research Center,
International Institute of Earthquake Engineering and Seismology (IIEES),
Seismology Department
P.O. Box 19395-3913, Tehran, Iran,
komeaz at gmail.com
a.komeazi at iiees.ac.ir
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------

 **********************************************
 **** Specfem 3-D Solver - MPI version f90 ****
 **********************************************

 Version: v2.0.2-2178-g6b93bc9

 Fixing slow underflow trapping problem using small initial field

 There are 1 MPI processes
 Processes are numbered from 0 to 0

 There is a total of 1 slices

 NDIM = 3
 NGLLX = 5
 NGLLY = 5
 NGLLZ = 5

 using single precision for the calculations
 smallest and largest possible floating-point numbers are: 1.17549435E-38 3.40282347E+38

 velocity model: default

 total acoustic elements    : 0
 total elastic elements     : 142560
 total poroelastic elements : 0

 ********
 minimum and maximum number of elements and points in the CUBIT + SCOTCH mesh:

 NSPEC_global_min = 142560
 NSPEC_global_max = 142560
 NSPEC_global_max / NSPEC_global_min imbalance = 1.00000000 = 0.00000000 %
 NSPEC_global_sum = 142560

 NGLOB_global_min = 9367159
 NGLOB_global_max = 9367159
 NGLOB_global_max / NGLOB_global_min imbalance = 1.00000000 = 0.00000000 %
 NGLOB_global_sum = 9367159

 If you have elements of a single type (all acoustic, all elastic, all poroelastic, and without CPML) in the whole mesh, then there should be no significant imbalance in
the above numbers. Otherwise, it is normal to have imbalance in elements and points because the domain decomposer compensates for the different cost of different elements by partitioning them unevenly among processes.
 ********

 ********
 Model: P velocity min,max = 5000.00000 8230.00000
 Model: S velocity min,max = 2800.00000 4560.00000
 Model: Poisson's ratio min,max = 0.239989176 0.301242232
 ********

 *********************************************
 *** Verification of simulation parameters ***
 *********************************************

 *** Xmin and Xmax of the model = 0.00000000 999000.000
 *** Ymin and Ymax of the model = 0.00000000 999000.000
 *** Zmin and Zmax of the model = -100000.000 2590.92261
 *** Max GLL point distance = 4862.95752
 *** Min GLL point distance = 27.3437538
 *** Max/min ratio = 177.845276
 *** Max element size = 14916.4326
 *** Min element size = 35.9687500
 *** Max/min ratio = 414.705353
 *** Minimum period resolved = 5.73660755
 *** Maximum suggested time step = 1.69836986E-03
 *** for DT : 1.2000000000000000E-002
 *** Max stability for wave velocities = 3.53279948

 Elapsed time for checking mesh resolution in seconds = 0.15254497528076172
 saving VTK files for Courant number and minimum period

 ******************************************
 There is a total of 1 slices
 ******************************************

 sources:
 UTM projection:
 UTM zone: 40

 *************************************
 locating source 1
 *************************************

 source located in slice 0 in element 1 in elastic domain

 using moment tensor source:
 xi coordinate of source in that element: -1.0000000000000000
 eta coordinate of source in that element: 1.0000000000000000
 gamma coordinate of source in that element: 1.0000000000000000

 source time function:
 using Gaussian source time function
 Source time function is a Heaviside, convolve later
 half duration: 5.9999999999999998E-002 seconds
 time shift: 0.0000000000000000 seconds

 magnitude of the source:
 scalar moment M0 = 1.3781354517608221E+025 dyne-cm
 moment magnitude Mw = 6.0595281272975043

 original (requested) position of the source:
 latitude: 28.123999999999999
 longitude: 56.887999999999998
 UTM x: 489000.50498456747
 UTM y: 3110928.1325351126
 depth: 8.0000000000000000 km
 topo elevation: 2386.0551757812500

 position of the source that will be used:
 UTM x: 989750.00000000000
 UTM y: 8534.0234375000000
 depth: 33.364680175781253 km
 z: -30978.625000000000

 error in location of the source: 3142649.00 m

 *****************************************************
 *****************************************************
 ***** WARNING: source location estimate is poor *****
 *****************************************************
 *****************************************************

 maximum error in location of the sources: 3142649.00 m

 Elapsed time for detection of sources in seconds = 5.9499740600585938E-003

 End of source detection - done
 printing the source-time function

 receivers:
 there are 5 stations in file ./DATA/STATIONS
 saving 0 stations inside the model in file ./DATA/STATIONS_FILTERED
 excluding 5 stations located outside the model

 error filtered stations:
 simulation needs at least 1 station but got 0

 check that stations in file ./DATA/STATIONS are within
 longitude min/max: 52.511256102833592 61.535490745169042
 latitude min/max : 0.0000000000000000 9.0094311157611440
 UTM x min/max: 0.0000000000000000 999000.00000000000
 UTM y min/max : 0.0000000000000000 999000.00000000000
-------------- next part --------------
A non-text attachment was scrubbed...
Name: STATIONS
Type: application/octet-stream
Size: 155 bytes
Desc: not available
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: Par_file
Type: application/octet-stream
Size: 14478 bytes
Desc: not available
URL: 
-------------- next part --------------

 ******************************************
 *** Specfem3D MPI Mesher - f90 version ***
 ******************************************

 Version: v2.0.2-2178-g6b93bc9

 This is process 0
 There are 1 MPI processes
 Processes are numbered from 0 to 0

 There is a total of 1 slices

 NGLLX = 5
 NGLLY = 5
 NGLLZ = 5

 Shape functions defined by NGNOD = 8 control nodes
 Surface shape functions defined by NGNOD2D = 4 control nodes
 Beware! Curvature (i.e. HEX27 elements) is not handled by our internal mesher

 velocity model: default

 using UTM projection in region 40

 no attenuation
 no anisotropy
 no oceans

 incorporating Stacey absorbing conditions

 using a CMTSOLUTION source
 using a Gaussian source time function

 **************************
 creating mesh in the model
 **************************

 external mesh points : 158103
 defined materials    : 486
 undefined materials  : 0
 total number of spectral elements: 142560
 absorbing boundaries:
 xmin,xmax : 1320 1320
 ymin,ymax : 1296 1296
 bottom,top: 11880 300
 total number of C-PML elements in the global mesh: 0
 number of MPI partition interfaces: 0

 minimum memory used so far : 485.252899 MB per process
 minimum total memory requested : 2277.1708908081055 MB per process

 create regions:
 ...allocating arrays
 File DATA/Par_file_faults not found: assume no faults
 ...setting up jacobian
 ...indexing global points
 ...preparing MPI interfaces
 total MPI interface points: 0
 total assembled MPI interface points: 0
 ...setting up absorbing boundaries
 absorbing boundary:
 total number of free faces = 300
 total number of faces = 17112
 ...determining velocity model
 10 % time remaining: 7.9413225403842909E-009 s
 20 % time remaining: 7.0450781984627176E-009 s
 30 % time remaining: 6.1550471839816207E-009 s
 40 % time remaining: 5.3170136669342226E-009 s
 50 % time remaining: 4.4344464865790577E-009 s
 60 % time remaining: 3.5407512532723424E-009 s
 70 % time remaining: 2.6508547788530566E-009 s
 80 % time remaining: 1.7630248397408886E-009 s
 90 % time remaining: 8.8131681736461364E-010 s
 100 % time remaining: 0.0000000000000000 s
 ...detecting acoustic-elastic-poroelastic surfaces
 total acoustic elements   : 0
 total elastic elements    : 142560
 total poroelastic elements: 0
 ...element inner/outer separation for overlapping of communications with calculations:
 percentage of edge elements 0.00000000 %
 percentage of volume elements 100.000000 %
 ...element mesh coloring
 use coloring = F
 ...external binary models
 no external binary model used
 ...creating mass matrix
 ...saving databases
 saving mesh files for AVS, OpenDX, Paraview
 saving additional mesh files with surface/coupling points
 ...checking mesh resolution

 ********
 minimum and maximum number of elements and points in the CUBIT + SCOTCH mesh:

 NSPEC_global_min = 142560
 NSPEC_global_max = 142560
 NSPEC_global_max / NSPEC_global_min imbalance = 1.00000000 = 0.00000000 %
 NSPEC_global_sum = 142560

 NGLOB_global_min = 9367159
 NGLOB_global_max = 9367159
 NGLOB_global_max / NGLOB_global_min imbalance = 1.00000000 = 0.00000000 %
 NGLOB_global_sum = 9367159

 If you have elements of a single type (all acoustic, all elastic, all poroelastic, and without CPML) in the whole mesh, then there should be no significant imbalance in the above numbers. Otherwise, it is normal to have imbalance in elements and points because the domain decomposer compensates for the different cost of different elements by partitioning them unevenly among processes.
 ********

 ********
 Model: P velocity min,max = 5000.00000 8230.00000
 Model: S velocity min,max = 2800.00000 4560.00000
 Model: Poisson's ratio min,max = 0.239989176 0.301242232
 ********

 *********************************************
 *** Verification of simulation parameters ***
 *********************************************

 *** Xmin and Xmax of the model = 0.00000000 999000.000
 *** Ymin and Ymax of the model = 0.00000000 999000.000
 *** Zmin and Zmax of the model = -100000.000 2590.92261
 *** Max GLL point distance = 4862.95752
 *** Min GLL point distance = 27.3437538
 *** Max/min ratio = 177.845276
 *** Max element size = 14916.4326
 *** Min element size = 35.9687500
 *** Max/min ratio = 414.705353
 *** Minimum period resolved = 5.73660755
 *** Maximum suggested time step = 1.69836986E-03

 Elapsed time for checking mesh resolution in seconds = 0.15224289894104004
 saving VTK files for Courant number and minimum period

 min and max of topography included in mesh in m is 1240.0000000000000 2590.9226859999999

 Repartition of elements:
 -----------------------
 total number of elements in mesh slice 0: 142560
 total number of points in mesh slice 0: 9367159
 total number of elements in entire mesh: 142560
 approximate total number of points in entire mesh (with duplicates on MPI edges): 9367159.0000000000
 approximate total number of DOFs in entire mesh (with duplicates on MPI edges): 28101477.000000000

 total number of time steps in the solver will be: 40000

 using single precision for the calculations
 smallest and largest possible floating-point numbers are: 1.17549435E-38 3.40282347E+38

 Elapsed time for mesh generation and buffer creation in seconds = 166.58818387985229
 End of mesh generation

 done

From komatitsch at lma.cnrs-mrs.fr  Sun Jul  2 14:51:04 2017
From: komatitsch at lma.cnrs-mrs.fr (Dimitri Komatitsch)
Date: Sun, 2 Jul 2017 23:51:04 +0200
Subject: [CIG-SEISMO] SPECFEM3D-problems
In-Reply-To: 
References: 
Message-ID: 

Dear Abolfazl Komeazi,

It seems that the mesh you have created
starts on the equator and does not include Iran, since you have this in the output of the code:

latitude min/max : 0.0000000000000000 9.0094311157611440

Thus you need to check your mesh and center it on Iran.

Best regards,
Dimitri.

On 07/02/2017 01:57 PM, abolfazl komeazi wrote:
> Dear Dr. Komatitsch,
>
> I'm working with specfem3d version 2.0.1. I've created my Mesh file
> using cubit. the problem is when I run the program it seems my stations
> aren't in a correct location. I've get this error:
>
> need at least one receiver
> Error detected, aborting MPI... proc 0
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 0 in communicator MPI_COMM_WORLD
> with errorcode 30.
>
> My target region is in latitude(25,34) and longitude(52,61) [which is
> part of IRAN] and the corresponding UTM part is 40. attached you can
> find my STATION file. looking at "output_solver.txt" file, I found that
> the program didn't recognize my target region since it says:
>
> check that stations in file ./DATA/STATIONS are within
> longitude min/max: 52.511256102833592 61.535490745169042
> latitude min/max : 0.0000000000000000 9.0094311157611440
> UTM x min/max: 0.0000000000000000 999000.00000000000
> UTM y min/max : 0.0000000000000000 999000.00000000000
>
> Any help would be appreciated.
>
> Sincerely Yours,
> Abolfazl Komeazi
>
> --
> Ph.D Student of Geophysics - Seismology,
> Seismological Research Center,
> International Institute of Earthquake Engineering and Seismology (IIEES),
> Seismology Department
> P.O. Box 19395-3913,
> Tehran, Iran,
> komeaz at gmail.com
> a.komeazi at iiees.ac.ir
>
>
> _______________________________________________
> CIG-SEISMO mailing list
> CIG-SEISMO at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo
>

--
Dimitri Komatitsch, CNRS Research Director (DR CNRS)
Laboratory of Mechanics and Acoustics, Marseille, France
http://komatitsch.free.fr

From Jean.Lecoulant at univ-brest.fr  Wed Jul  5 07:28:04 2017
From: Jean.Lecoulant at univ-brest.fr (Jean Lecoulant)
Date: Wed, 05 Jul 2017 16:28:04 +0200
Subject: [CIG-SEISMO] Modelling seafloor roughness with SPECFEM3D
Message-ID: <20170705162804.Horde.X9VFomCURtHqVRiJ1txfeQ1@webmailperso.univ-brest.fr>

To whom it may concern,

I seek to model the emission of T-waves by a rough seafloor with SPECFEM3D.

To do this, I need to mesh a random topography on the crust/ocean interface, with a topography wavelength an order of magnitude smaller than the typical wavelength of T-waves. The wavelength of the T-waves emitted by my seismic source is about 1 km, so I need a topography with a 100 m wavelength. The discretization should be done with a space step dx ~ 10 m to obtain a smooth topography.

The calculation domain for the simulations is a parallelepiped, 10x10 km wide and 6 km thick. It is horizontally divided into a 3 km thick fluid medium (the ocean) underlain by a 3 km thick solid medium (the Earth's crust). All sides, except the top (sea) surface, are absorbing layers. I use Gmsh to build meshes and I run the appropriate Python and Fortran routines to convert these meshes. I have been using different discretizations, varying the number of finite elements both along the horizontal axes and along the vertical axis. In the horizontal plane, I have tested 1001x1001 (dx = 10 m), 501x501 (dx = 25 m), 251x251 (dx = 40 m) and 201x201 (dx = 50 m) meshes.
For each of these discretizations, I have tested different numbers of finite elements along the vertical axis in the crust and in the ocean: 31 (dz = 100 m), 51 (dz = 60 m) and 91 (dz = 33.33 m).

Running the SPECFEM3D routine xgenerate_databases with those meshes, I encounter two different errors. When there are too few elements along the vertical: 'there is an error in separation of beta_x, alpha_y, alpha_z'. When there are too many elements along the vertical: 'Access to an undefined portion of a memory object'. Hence I cannot find a satisfactory mesh.

Does anyone know a way to generate a mesh able to model a rough seafloor?

Best regards,

Jean Lecoulant
Ph.D. student in geophysics
Laboratoire Géosciences Océan
Institut Universitaire Européen de la Mer
Université de Bretagne Occidentale, France
Tel: +33 (0)2 98 49 88 94
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From komatitsch at lma.cnrs-mrs.fr  Thu Jul  6 09:53:00 2017
From: komatitsch at lma.cnrs-mrs.fr (Dimitri Komatitsch)
Date: Thu, 6 Jul 2017 18:53:00 +0200
Subject: [CIG-SEISMO] Modelling seafloor roughness with SPECFEM3D
In-Reply-To: <20170705162804.Horde.X9VFomCURtHqVRiJ1txfeQ1@webmailperso.univ-brest.fr>
References: <20170705162804.Horde.X9VFomCURtHqVRiJ1txfeQ1@webmailperso.univ-brest.fr>
Message-ID: <8d8012fd-d1ae-6b40-c494-c0f9e89b7980@lma.cnrs-mrs.fr>

Hi Jean,

Thanks for your message. Alexis Bottero is currently doing this as well in his PhD thesis here, let me put you in contact with him and with my colleague Paul Cristini.

Best regards,
Dimitri.

On 07/05/2017 04:28 PM, Jean Lecoulant wrote:
> To whom it may concern,
>
> I seek to model the emission of T-waves by a rough seafloor with SPECFEM3D.
>
> To do this, I need to mesh a random topography on the crust/ocean
> interface, with a topography wave-length an order of magnitude smaller
> than the typical wave-length of T-waves.
> The wave-length of the T-waves
> emitted by my seismic source is about 1 km, therefore I need a
> topography with a 100 m wave-length. The discretization shall be done
> with a space-step dx ~ 10 m to obtain a smooth topography.
>
> The calculation domain for the simulations is a parallelepiped, 10x10 km
> wide and 6 km thick. It is horizontally divided in a 3 km thick fluid
> medium (the ocean) underlain by a 3 km thick solid medium (the Earth
> crust). All sides, except the top (sea) surface, are absorbing layers. I
> use Gmsh to build meshes and I run the appropriate python and fortran
> routines to convert these meshes. I have been using different
> discretization, varying the number of finite elements both along the
> horizontal axes and along the vertical axis. In the horizontal plan, I
> have tested 1001x1001 (dx = 10 m), 501x501 (dx = 25 m), 251x251 (dx = 40 m)
> and 201x201 (dx = 50 m) meshes. For each of these discretizations, I
> have tested different numbers of finite elements along the vertical axis
> in the crust and in the ocean: 31 (dz = 100m), 51 (dz = 60m) and 91 (dz = 33.33m).
>
> Running the SPECFEM3D routine xgenerate_databases with those meshes, I
> encounter two different errors. When there are too few elements along
> the vertical: 'there is an error in separation of beta_x, alpha_y,
> alpha_z'. When there are too many elements along the vertical: 'Access
> to an undefined portion of a memory object'. Hence I cannot find a
> satisfying mesh.
>
> Does anyone know a way to generate a mesh able to model a rough seafloor?
>
> Best regards,
>
> Jean Lecoulant
> Ph.D. student in geophysics
> Laboratoire Géosciences Océan
> Institut Universitaire Européen de la Mer
> Université de Bretagne Occidentale, France
> Tel: +33 (0)2 98 49 88 94
>
> _______________________________________________
> CIG-SEISMO mailing list
> CIG-SEISMO at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo

--
Dimitri Komatitsch, CNRS Research Director (DR CNRS)
Laboratory of Mechanics and Acoustics, Marseille, France
http://komatitsch.free.fr

From elodie.kendall.14 at ucl.ac.uk  Thu Jul  6 01:32:42 2017
From: elodie.kendall.14 at ucl.ac.uk (Kendall, Elodie)
Date: Thu, 6 Jul 2017 08:32:42 +0000
Subject: [CIG-SEISMO] Specfem with an altered crustal model
In-Reply-To: 
References: 
Message-ID: 

Hi there,

I was hoping to ask you a question about Specfem3d_globe, please? I have been using it for a few months now, and I am currently trying to introduce crustal thickness perturbations into crust2.0 and analyse the effect on the synthetics.

I have set up a new crust2.0 which has 16,200 keys (one for each 2 by 2 degree grid cell) but the same layers for each cell. I have altered the corresponding subroutines for this new set of keys; there are no errors with the mesher and solver, and I get the same synthetics as before.

Now I add the perturbations to each layer of each cell in my crust2.0 model. I have changed the maximum Moho depth from 90 to 115 (the new maximum); the mesher runs fine but the solver fails with the error "forward simulation became unstable in fluid and blew up". I have tried changing the time step DT to DT*0.95 and DT*0.8 in the script shared/get_timestep_and_layers.f90, however the same error appears.

I was hoping you could help me please?

Thanks a lot,
Elodie Kendall
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From tjesser at ucdavis.edu  Thu Jul  6 12:02:39 2017
From: tjesser at ucdavis.edu (Tyler Esser)
Date: Thu, 6 Jul 2017 12:02:39 -0700
Subject: [CIG-SEISMO] SPECFEM3D Geotech 1.2.0 Released
Message-ID: 

We are happy to announce that SPECFEM3D Geotech has released a new version! Congratulations to the devs!

SPECFEM3D Geotech 1.2.0 introduces these changes:
- The file format of the displacement boundary conditions has changed (see Section 3.2.5 of the manual).
- The program can now be run from general locations.
- Main routines are modularized in preparation for adding other features in coming versions.
- Removed all instances of runtime warnings for temporary arrays.
- Added step-by-step tutorials for both serial and parallel runs.
- Added a GiD mesh converter.
- Improved the EXODUS mesh converter.
- Fixed several minor bugs.
- Structural change and overall code cleanup.

For download links and additional information, please visit https://geodynamics.org/cig/software/specfem3d_geotech/
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From bozdag at mines.edu  Thu Jul  6 17:31:25 2017
From: bozdag at mines.edu (Ebru Bozdag)
Date: Fri, 7 Jul 2017 00:31:25 +0000
Subject: [CIG-SEISMO] Specfem with an altered crustal model
In-Reply-To: 
References: 
Message-ID: <3435B571-38D1-401C-8453-DA3A025A8076@mines.edu>

Hi Elodie,

Are you honouring the crustal thickness in your simulations, i.e., are you using

! to suppress element stretching for 3D moho surface
logical, parameter :: SUPPRESS_MOHO_STRETCHING = .false.

in your constants.h file? If this is the case, perhaps after you add perturbations you may also be distorting the mesh aspect ratio at some locations. Just try to run the mesher and the solver by setting SUPPRESS_MOHO_STRETCHING = .true. and see if you still have the same problem, if you have not tried it yet.
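[An aside on the time-step question raised in this thread: when a run blows up after a model change, the stable time step can be re-estimated from the new mesh rather than scaled by trial and error. A minimal Python sketch of a Courant-type estimate; the Courant constant here is illustrative, not the value the SPECFEM source uses, and the spacing/velocity values are taken from the solver output quoted earlier in this digest:]

```python
# Sketch: Courant-type estimate of a stable time step,
#   dt <= C * (minimum GLL point spacing) / (maximum wave speed).
# The constant C below is illustrative; SPECFEM prints its own
# "Maximum suggested time step" in the mesher/solver output.
def suggested_dt(min_gll_spacing_m, v_max_m_per_s, courant=0.5):
    return courant * min_gll_spacing_m / v_max_m_per_s

# Values from the solver output quoted earlier in this digest:
# min GLL point distance = 27.34 m, max P velocity = 8230 m/s.
dt = suggested_dt(27.3437538, 8230.0)
print(dt)  # roughly the same order as the 1.69836986E-03 s the code reports
```

If the mesh's minimum GLL spacing shrinks after a perturbation (e.g. thin stretched elements near the Moho), the stable DT shrinks with it, which is why a fixed 5-20 % reduction of DT may not be enough.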
Best regards,
Ebru

-----------------
Ebru Bozdag
Assistant Professor
Colorado School of Mines, Department of Geophysics
bozdag at mines.edu | tel: +1-303-273-3578 | fax: +1-303-273-3478

On 06 Jul 2017, at 02:32, Kendall, Elodie wrote:

Hi there,

I was hoping to ask you a question about Specfem3d_globe please? I have been using it a few months now and I am currently trying to introduce crustal thickness perturbations into crust2.0 and analyse the effect on the synthetics. I have set up a new crust2.0 which has 16,200 keys (one for each 2 by 2 degree grid) but the same layers for each grid. I have altered the corresponding subroutines for this new set of keys, no errors with the mesher and solver - same synthetics as before. Now, I add the perturbations to each layer of each grid in my crust2.0 model. I have changed the maximum moho depth from 90 to 115 (the new max.), the mesher runs fine but the solver fails with error "forward simulation became unstable in fluid and blew up". I have tried changing the time step DT to DT*0.95 and DT*0.8 in the script shared/get_timestep_and_layers.f90 however the same error appears.

I was hoping you could help me please?

Thanks a lot,
Elodie Kendall

_______________________________________________
CIG-SEISMO mailing list
CIG-SEISMO at geodynamics.org
http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From tuxiang2016 at outlook.com  Thu Jul  6 03:39:01 2017
From: tuxiang2016 at outlook.com (tu xiang)
Date: Thu, 6 Jul 2017 10:39:01 +0000
Subject: [CIG-SEISMO] Problems with running SPECFEM3D
Message-ID: 

Hello everyone,

I want to model spontaneous fault rupture with the SPECFEM3D software, so I am starting by running the fault examples provided with SPECFEM3D.

First, I installed the software required for the simulation: I purchased and installed the Trelis/CUBIT 16.0 software, and I downloaded and installed Open MPI.
1. Problems with the installation of SPECFEM3D

Then I needed to install the SPECFEM3D software. I downloaded it from GitHub and ran the configure shell script as follows:

./configure FC=gfortran CC=gcc MPIFC=mpif90 --with-mpi

I successfully compiled the xcheck_mesh_quality, xconvolve_source_timefunction, xdecompose_mesh, xgenerate_databases, xmeshfem3D and xspecfem3D codes in SPECFEM3D.

To construct the simulation mesh with Trelis/CUBIT and output it in the SPECFEM3D format, I need to install the GEOCUBIT Python script library. The SPECFEM3D software provides a CUBIT_GEOCUBIT directory. According to the README.md file in the CUBIT_GEOCUBIT directory, GEOCUBIT can be installed with the command `sudo python setup.py install`. I ran this command and restarted my computer. Then I ran `import cubit2specfem3d` in the Trelis/CUBIT command line panel, but Trelis could not find this script.

So I tried the second way of installing the GEOCUBIT Python script library, that is, adding the paths to my .bashrc file:

export PYTHONPATH=$PYTHONPATH:/home/soft/specfem3d/CUBIT_GEOCUBIT/geocubitlib
export PATH=$PATH:/home/soft/specfem3d/CUBIT_GEOCUBIT/geocubitlib

I restarted my computer. Then I ran `import cubit2specfem3d` in the Trelis/CUBIT command line panel, and it did not output any warnings or errors.

2. Problems with running the mesh-generating script

So I think I have successfully installed all the required software. Next, I want to run the fault examples provided in the specfem3d/EXAMPLES/fault_examples/ directory, starting from the tpv5 example. I ran the mesh-constructing Python script 'TPV5.py' with the Trelis/CUBIT software, but it outputs errors:

Trelis>
Traceback (most recent call last):
  File "", line 1, in 
  File "/home/bei/specfem3d-master/CUBIT_GEOCUBIT/geocubitlib/cubit2specfem3d.py", line 1464, in export2SPECFEM3D
    sem_mesh = mesh(hex27, cpml, cpml_size, top_absorbing)
  File "/home/bei/specfem3d-master/CUBIT_GEOCUBIT/geocubitlib/cubit2specfem3d.py", line 467, in __init__
    self.block_definition()
  File "/home/bei/specfem3d-master/CUBIT_GEOCUBIT/geocubitlib/cubit2specfem3d.py", line 554, in block_definition
    if qk < 0 or qmu < 0:
UnboundLocalError: local variable 'qk' referenced before assignment
Trelis>

I have tried again and again.

3. Problems with mpirun

Luckily, SPECFEM3D provides the tpv5 mesh model with the example, so I can continue the simulation. First, I ran the tpv5 example with only one processor, and it ran successfully. But when I run this example with 6 processors, it outputs errors. My procedure is as follows:

- I make the OUTPUT_FILES directory: mkdir -p OUTPUT_FILES
- I make the bin directory: mkdir -p bin
- I link the compiled code into the bin directory:
  ln -s ../../../../bin/xdecompose_mesh
  ln -s ../../../../bin/xgenerate_databases
  ln -s ../../../../bin/xspecfem3D
- I change the NPROC parameter to 6 in the Par_file.
- I copy the Par_file, CMTSOLUTION and STATIONS files to OUTPUT_FILES.
- I run the command './bin/xdecompose_mesh 6 ./MESH ./OUTPUT_FILES/DATABASES_MPI' to decompose the mesh model. It runs successfully.
- I run the command 'mpirun -np 6 ./bin/xgenerate_databases' to generate the databases. It outputs errors:

$ mpirun -np 6 ./bin/xgenerate_databases
--------------------------------------------------------------------------
[[2895,1],1]: A high-performance Open MPI point-to-point messaging module
was unable to find any relevant network interfaces:

Module: OpenFabrics (openib)
  Host: ubuntu

Another transport will be used instead, although this may result in lower performance.
--------------------------------------------------------------------------
[ubuntu:02332] 5 more processes have sent help message help-mpi-btl-base.txt / btl:no-nics
[ubuntu:02332] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages

I do not know why.
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From Jean.Lecoulant at univ-brest.fr  Fri Jul  7 05:36:56 2017
From: Jean.Lecoulant at univ-brest.fr (Jean Lecoulant)
Date: Fri, 07 Jul 2017 14:36:56 +0200
Subject: [CIG-SEISMO] Modelling seafloor roughness with SPECFEM3D
In-Reply-To: <8d8012fd-d1ae-6b40-c494-c0f9e89b7980@lma.cnrs-mrs.fr>
References: <20170705162804.Horde.X9VFomCURtHqVRiJ1txfeQ1@webmailperso.univ-brest.fr> <8d8012fd-d1ae-6b40-c494-c0f9e89b7980@lma.cnrs-mrs.fr>
Message-ID: <20170707143656.Horde.ykT0ZB3S-qxFx1gax1GF7g1@webmailperso.univ-brest.fr>

Hello,

Thank you very much for your reply and for giving me the addresses of Alexis Bottero and Paul Cristini. They will surely be able to solve this problem.

Best regards,
Jean

Dimitri Komatitsch wrote:

> Hi Jean,
>
> Thanks for your message. Alexis Bottero is currently doing this as well
> in his PhD thesis here, let me put you in contact with him and with my
> colleague Paul Cristini.
>
> Best regards,
> Dimitri.
>
> On 07/05/2017 04:28 PM, Jean Lecoulant wrote:
>> To whom it may concern,
>>
>> I seek to model the emission of T-waves by a rough seafloor with
>> SPECFEM3D.
>>
>> To do this, I need to mesh a random topography on the crust/ocean
>> interface, with a topography wave-length an order of magnitude smaller
>> than the typical wave-length of T-waves. The wave-length of the T-waves
>> emitted by my seismic source is about 1 km, therefore I need a
>> topography with a 100 m wave-length. The discretization shall be done
>> with a space-step dx ~ 10 m to obtain a smooth topography.
>>
>> The calculation domain for the simulations is a parallelepiped, 10x10 km wide and 6 km thick.
>> It is horizontally divided in a 3 km thick
>> fluid medium (the ocean) underlain by a 3 km thick solid medium (the
>> Earth crust). All sides, except the top (sea) surface, are absorbing
>> layers. I use Gmsh to build meshes and I run the appropriate python and
>> fortran routines to convert these meshes. I have been using different
>> discretization, varying the number of finite elements both along the
>> horizontal axes and along the vertical axis. In the horizontal plan, I
>> have tested 1001x1001 (dx = 10 m), 501x501 (dx = 25 m), 251x251 (dx =
>> 40 m) and 201x201 (dx = 50 m) meshes. For each of these
>> discretizations, I have tested different numbers of finite elements
>> along the vertical axis in the crust and in the ocean: 31 (dz = 100m),
>> 51 (dz = 60m) and 91 (dz = 33.33m).
>>
>> Running the SPECFEM3D routine xgenerate_databases with those meshes, I
>> encounter two different errors. When there are too few elements along
>> the vertical: 'there is an error in separation of beta_x, alpha_y,
>> alpha_z'. When there are too many elements along the vertical: 'Access
>> to an undefined portion of a memory object'. Hence I cannot find a
>> satisfying mesh.
>>
>> Does anyone know a way to generate a mesh able to model a rough seafloor?
>>
>> Best regards,
>>
>> Jean Lecoulant
>> Ph.D. student in geophysics
>> Laboratoire Géosciences Océan
>> Institut Universitaire Européen de la Mer
>> Université de Bretagne Occidentale, France
>> Tel: +33 (0)2 98 49 88 94
>>
>> _______________________________________________
>> CIG-SEISMO mailing list
>> CIG-SEISMO at geodynamics.org
>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo
>
> --
> Dimitri Komatitsch, CNRS Research Director (DR CNRS)
> Laboratory of Mechanics and Acoustics, Marseille, France
> http://komatitsch.free.fr
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From komatitsch at lma.cnrs-mrs.fr Fri Jul 7 11:56:04 2017
From: komatitsch at lma.cnrs-mrs.fr (Dimitri Komatitsch)
Date: Fri, 7 Jul 2017 20:56:04 +0200
Subject: [CIG-SEISMO] Problems with running SPECFEM3D
In-Reply-To: References: Message-ID:

Hi XiangTu,

Thanks for your message. I do not know the answer to the CUBIT/TRELIS question (have you tried the last one that Emanuele Casarotti committed to Git / devel branch last week?), but regarding MPI it is an MPI installation problem on your machine, unrelated to SPECFEM; your system administrator should be able to fix it.

Best regards,
Dimitri.

On 07/06/2017 12:39 PM, tu xiang wrote:
> Hello, everyone
>
> I want to do spontaneous fault rupture with the SPECFEM3D software, so I start my learning from running the fault examples provided by SPECFEM3D.
> First, I installed the software required by the simulation.
> I purchased the Trelis/CUBIT 16.0 software and installed it. I downloaded the openMPI software and installed it.
> 1. Problems with Installation of SPECFEM3D
> Then, I need to install the SPECFEM3D software. I downloaded the software from GitHub.
> I ran the configure shell script as follows: ./configure FC=gfortran CC=gcc MPIFC=mpif90 --with-mpi
> I have successfully compiled the xcheck_mesh_quality, xconvolve_source_timefunction, xdecompose_mesh, xgenerate_databases, xmeshfem3D, xspecfem3D code in SPECFEM3D.
> If I want to construct the mesh model of the simulation with Trelis/CUBIT and output it in the SPECFEM3D format, I need to install the GEOCUBIT python script library.
> The SPECFEM3D software provides a CUBIT_GEOCUBIT directory.
> According to the README.md file in the CUBIT_GEOCUBIT directory, GEOCUBIT can be installed with the command `sudo python setup.py install`.
> Thus, I ran this command and restarted my computer. Then I ran `import cubit2specfem3d` in the Trelis/CUBIT command line panel. But it reveals that Trelis cannot find this script.
> In this case, I tried the second way of installing the GEOCUBIT python script library, that is, I added the path to my .bashrc file:
> export PYTHONPATH=$PYTHONPATH:/home/soft/specfem3d/CUBIT_GEOCUBIT/geocubitlib
> export PATH=$PATH:/home/soft/specfem3d/CUBIT_GEOCUBIT/geocubitlib
> I restarted my computer. Then I ran `import cubit2specfem3d` in the Trelis/CUBIT command line panel. It does not output any warnings or errors.
> 2. Problems with Running the Mesh Generating Script
> So I think I have successfully installed all the required software.
> Next, I want to run the fault examples provided in the specfem3d/EXAMPLES/fault_examples/ directory.
> I start from the tpv5 example.
> I ran the mesh constructing python script ‘TPV5.py’ with the Trelis/CUBIT software. But it outputs errors:
> Trelis>
> Traceback (most recent call last):
> File "", line 1, in
> File "/home/bei/specfem3d-master/CUBIT_GEOCUBIT/geocubitlib/cubit2specfem3d.py", line 1464, in export2SPECFEM3D
> sem_mesh = mesh(hex27, cpml, cpml_size, top_absorbing)
> File "/home/bei/specfem3d-master/CUBIT_GEOCUBIT/geocubitlib/cubit2specfem3d.py", line 467, in __init__
> self.block_definition()
> File "/home/bei/specfem3d-master/CUBIT_GEOCUBIT/geocubitlib/cubit2specfem3d.py", line 554, in block_definition
> if qk < 0 or qmu < 0:
> UnboundLocalError: local variable 'qk' referenced before assignment
> Trelis>
> I have tried again and again.
> 3. Problems with mpirun
> Luckily, SPECFEM3D provided the tpv5 mesh model with the example. Thus, I can continue the simulation.
> First, I ran the tpv5 example with only one processor, and it ran successfully. But when I run this example with 6 processors, it outputs errors.
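As a side note on the UnboundLocalError in the traceback above: Python raises it when a variable is read on a code path where it was never assigned, which usually means none of the CUBIT blocks matched the naming the script expected. A minimal sketch of the failure mode and a defensive-default guard (hypothetical names and defaults, not the actual GEOCUBIT code or the upstream fix):

```python
def read_q_factors(blocks):
    # If no block name matches, qk/qmu would never be bound before the
    # comparison below -- the same failure mode as in block_definition.
    # Defensive defaults (a hypothetical guard, not the upstream patch):
    qk = qmu = 9999.0
    for name, value in blocks:
        if name == "Q_kappa":
            qk = value
        elif name == "Q_mu":
            qmu = value
    if qk < 0 or qmu < 0:
        raise ValueError("negative quality factor")
    return qk, qmu
```

In practice the error suggests checking that the material blocks in the CUBIT journal carry the attribute names the exporter looks for, rather than patching the exporter itself.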
> My operation is as follows:
> I make the OUTPUT_FILES directory: mkdir -p OUTPUT_FILES
> I make the bin directory: mkdir -p bin
> I link the compiled code to the bin directory:
> ln -s ../../../../bin/xdecompose_mesh
> ln -s ../../../../bin/xgenerate_databases
> ln -s ../../../../bin/xspecfem3D
> I change the NPROC parameter to 6 in the Par_file file.
> I copy the Par_file, CMTSOLUTION, and STATIONS files to the OUTPUT_FILES directory.
> I run the command ‘./bin/xdecompose_mesh 6 ./MESH ./OUTPUT_FILES/DATABASES_MPI’ to decompose the mesh model.
> It runs successfully.
> I run the command ‘mpirun -np 6 ./bin/xgenerate_databases’ to generate databases.
> It outputs errors:
> $ mpirun -np 6 ./bin/xgenerate_databases
> --------------------------------------------------------------------------
> [[2895,1],1]: A high-performance Open MPI point-to-point messaging module
> was unable to find any relevant network interfaces:
>
> Module: OpenFabrics (openib)
> Host: ubuntu
>
> Another transport will be used instead, although this may result in
> lower performance.
> --------------------------------------------------------------------------
> [ubuntu:02332] 5 more processes have sent help message help-mpi-btl-base.txt / btl:no-nics
> [ubuntu:02332] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
> I do not know why.
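For what it is worth, the OpenFabrics message quoted above is a warning rather than an error: Open MPI found no InfiniBand hardware and fell back to TCP/shared memory, so the run itself proceeds. A common way to silence it is to restrict the transports explicitly (standard Open MPI MCA options; the executable path is taken from the example above and assumes you run from the example directory):

```shell
# Restrict Open MPI to the loopback and TCP transports so it stops
# probing for InfiniBand hardware; this removes the "btl:no-nics"
# help message on machines without an InfiniBand NIC.
mpirun --mca btl self,tcp -np 6 ./bin/xgenerate_databases
```

Alternatively, `--mca btl_base_warn_component_unused 0` keeps automatic transport selection and only suppresses the warning.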
> > > > _______________________________________________ > CIG-SEISMO mailing list > CIG-SEISMO at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo > -- Dimitri Komatitsch, CNRS Research Director (DR CNRS) Laboratory of Mechanics and Acoustics, Marseille, France http://komatitsch.free.fr From olboyd at usgs.gov Thu Jul 13 07:57:07 2017 From: olboyd at usgs.gov (Oliver Boyd) Date: Thu, 13 Jul 2017 08:57:07 -0600 Subject: [CIG-SEISMO] AGU Session on Upper Crustal Geophysical Parameter Estimation Message-ID: <150C78EA-25E7-4B39-847A-A1F336D8EAFA@usgs.gov> Dear Colleagues, Please join us for exciting presentations and enlightening discussion and consider contributing to AGU session S034 Geophysical Parameter Estimation of the Very Upper Crust: Perspectives from the Field, Lab, and Mind to be held at the upcoming Fall meeting in New Orleans, LA 11–15 December. We plan to bring together researchers having distinct perspectives and interests to share common concerns regarding geophysical parameter estimation of the very upper crust. Please note that abstracts are due Wednesday August 2nd, midnight Eastern time. Session description: Geophysical parameter estimation of the very upper crust plays a critical role in, for example, geologic hazards, construction, archeology, and natural resource assessment and recovery. Methods of characterization and estimation vary widely from direct and indirect measurement in the field and lab to empirical and theoretical interpolation and extrapolation. In this session, we aim to bring together researchers from various disciplines and settings to present their work on geophysical parameter estimation of the top ~1 kilometer of the crust, which includes, for example, the influence of partial saturation, mixed phases, the water table, unconsolidated sediments, rock fractures, and weathering layers. 
Invited Speakers:
Kenneth Stokoe, Georgia Tech
Kristina Keating, Rutgers

Kind regards,
Oliver Boyd, USGS, Golden
Alan Yong, USGS, Pasadena
Manika Prasad, Colorado School of Mines
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From l.y.vandewiel at uu.nl Wed Jul 19 06:42:27 2017
From: l.y.vandewiel at uu.nl (Wiel, L.Y. van de (Lukas))
Date: Wed, 19 Jul 2017 13:42:27 +0000
Subject: [CIG-SEISMO] SpecFem3D Globe GPU System requirements
Message-ID: <63EBBEAFE91F4246B1AF7F06076D1DFF1B2334FD@WP0046.soliscom.uu.nl>

Dear sir, madam,

at the Utrecht University Seismology group we run SpecFem3D Globe, and we are looking at adding GPUs to our new cluster, just for this purpose (and to teach (PhD) students to write GPU code).

To make a thoroughly motivated decision about this, we aim to compile and run Specfem3D Globe both on a CUDA card and on an OpenCL card. We will measure the performance and see if there are significant differences between the two. Has this perhaps already been done by you?

Section 2.2 of the manual is clear enough on how to do this. What would be the system requirements for the GPU? Is 4 GB enough? Do we perhaps need 8, for running the benchmarks that are included in the tarball?

Thank you very much and best wishes,

Lukas van de Wiel
(scientific programmer of the Utrecht University Earth Sciences department)
-------------- next part --------------
An HTML attachment was scrubbed...
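A crude lower bound on the GPU memory question can be read off a mesh's NGLOB count. Using the single-slice solver log quoted earlier in this digest (NGLOB = 9 367 159) and counting only the three single-precision vector wavefield arrays — the true footprint, with mesh coordinates, derivative matrices, boundary buffers and attenuation memory variables, is several times larger:

```python
# Back-of-the-envelope GPU memory estimate for one SPECFEM slice.
# Counts only the displacement/velocity/acceleration arrays; the real
# footprint (mesh arrays, MPI buffers, attenuation variables) is a few
# times larger, so treat this only as a lower bound.
nglob = 9_367_159        # global GLL points, from the solver log above
ncomp = 3                # vector components per field
nfields = 3              # displ, veloc, accel
bytes_single = 4         # single-precision float
wavefield_bytes = nglob * ncomp * nfields * bytes_single
print(f"wavefields alone: {wavefield_bytes / 2**30:.2f} GiB")
```

For this example mesh the wavefields alone are about 0.3 GiB, so whether 4 GB suffices depends mostly on how large a slice you assign to each card.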
URL: From komatitsch at lma.cnrs-mrs.fr Wed Jul 19 15:04:47 2017 From: komatitsch at lma.cnrs-mrs.fr (Dimitri Komatitsch) Date: Thu, 20 Jul 2017 00:04:47 +0200 Subject: [CIG-SEISMO] SpecFem3D Globe GPU System requirements In-Reply-To: <63EBBEAFE91F4246B1AF7F06076D1DFF1B2334FD@WP0046.soliscom.uu.nl> References: <63EBBEAFE91F4246B1AF7F06076D1DFF1B2334FD@WP0046.soliscom.uu.nl> Message-ID: <41819ede-2c08-113f-2f6c-30b8f90925a9@lma.cnrs-mrs.fr> Dear Lukas, You can find some recent benchmarks in https://www.overleaf.com/9136559nqfcpxtqrwzx#/33432931/ Best regards, Dimitri. On 07/19/2017 03:42 PM, Wiel, L.Y. van de (Lukas) wrote: > Dear sir, madam, > > at Utrecht University Seismology group we run SpecFem3D Globe and we are > looking at adding GPUs to our new cluster, just for this purpose (and to > teach (PhD-)students to write GPU code.) > > To make a thoroughly motivated decision about this, we aim to compile > and run Specfem3D Globe both on a CUDA card and on an openCL-card. We > will measure the performance and see if there are significant > differences between the two. Has this perhaps already been done by you? > > Section 2.2 of the manual is clean enough on how to do this. > What would be the system requirements for the GPU? Is 4 GB enough? Do we > perhaps need 8, for running the benchmarks that are included in the tarball? 
> > Thank you very much and best wishes, > > Lukas van de Wiel > > (scientific programmer of the Utrecht University Earth Sciences department) > > > _______________________________________________ > CIG-SEISMO mailing list > CIG-SEISMO at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo > -- Dimitri Komatitsch, CNRS Research Director (DR CNRS) Laboratory of Mechanics and Acoustics, Marseille, France http://komatitsch.free.fr From ctape at alaska.edu Fri Jul 21 10:02:03 2017 From: ctape at alaska.edu (Carl Tape) Date: Fri, 21 Jul 2017 09:02:03 -0800 Subject: [CIG-SEISMO] AGU 2017 session "Theoretical and Computational Frontiers in Seismic Tomography" Message-ID: Dear colleagues, Following the tradition of previous years, we invite you to submit abstracts to our *seismic tomography session *at the AGU Fall Meeting. This year's focus is on advances in theoretical and computational developments. The submission deadline is *August 2nd*. Our invited speakers are two bright PhD students, *Julien Thurin* (Grenoble) and *Wenjie Lei* (Princeton). More detailed information on the session can be found below. Hoping to see you in New Orleans! Carene Larmat Monica Maceira Carl Tape Andreas Fichtner *Theoretical and Computational Frontiers in Seismic Tomography* *Session ID#: *22396 Session Description: This session offers a platform to present advances in seismic tomography, with focus on developments that contribute to the improved resolution of 3D Earth structure at all scales. Of special interest are theoretical and computational developments in tomography, as well as methods that enable the exploitation of new massive data volumes from dense networks around the globe. 
Possible contributions include novel forward modeling techniques, advances in seismic interferometry for static and time-variable Earth structure, inversion and nonlinear optimization techniques, innovative approaches to uncertainty analysis, machine learning, and large-scale data analysis.
---------------------------------------------------------------
Carl Tape
Associate Professor
Geophysical Institute (office 413D)
University of Alaska Fairbanks
Phone: 907-474-5456
Email: ctape at alaska.edu
Web: http://www.giseis.alaska.edu/input/carl/
---------------------------------------------------------------
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From carene at lanl.gov Fri Jul 21 10:37:38 2017
From: carene at lanl.gov (Larmat, Carene)
Date: Fri, 21 Jul 2017 17:37:38 +0000
Subject: [CIG-SEISMO] Theoretical and Computational Frontiers in Seismic Tomography - AGU Session: 22396
Message-ID: <1500658659193.28240@lanl.gov>

Dear colleagues of the CIG-seismology community,

We would like to draw your attention to the following AGU Fall Meeting session, "Theoretical and Computational Frontiers in Seismic Tomography", which should be of interest to this community. This session has recurred for several years, and this year's focus is on advances in theoretical and computational developments. The submission deadline is August 2nd. In a special effort to promote young researchers, our invited speakers are two bright PhD students, Julien Thurin (Grenoble) and Wenjie Lei (Princeton). More detailed information on the session can be found below.

Hoping to see you in New Orleans!

Carene Larmat
Monica Maceira
Carl Tape
Andreas Fichtner

Theoretical and Computational Frontiers in Seismic Tomography
Session ID#: 22396
Session Description: This session offers a platform to present advances in seismic tomography, with a focus on developments that contribute to the improved resolution of 3D Earth structure at all scales.
Of special interest are theoretical and computational developments in tomography, as well as methods that enable the exploitation of new massive data volumes from dense networks around the globe. Possible contributions include novel forward modeling techniques, advances in seismic interferometry for static and time-variable Earth structure, inversion and nonlinear optimization techniques, innovative approaches to uncertainty analysis, machine learning and large-scale data analysis. Carene Larmat, EES-17 LANL, carene at lanl.gov, 505 667 2074 -------------- next part -------------- An HTML attachment was scrubbed... URL: From nicholas_mancinelli at brown.edu Fri Jul 21 11:42:29 2017 From: nicholas_mancinelli at brown.edu (Nicholas Mancinelli) Date: Fri, 21 Jul 2017 14:42:29 -0400 Subject: [CIG-SEISMO] modeling multiple isotropic sources with SPECFEM2D Message-ID: <5D6676AF-06BE-4869-B3CA-AEE2049655B4@brown.edu> Hi SPECFEM2D developers, Is it possible to run elastic simulations with multiple isotropic sources nucleating simultaneously? The attached Par_file documents my failed attempt to run such a model. My concern is that the isotropic sources (Mxx=1, Mzz=1, Mxz=0) have P-wavefronts with nodes, and they also are radiating some S-wave energy (model snapshot attached). These problems do go away when I run the simulation for only one of the sources. Do you know if this is a problem with my implementation, or rather a bug within the guts of the source code? Best, Nick Nicholas J Mancinelli Postdoctoral Research Associate Department of Earth, Environmental, and Planetary Sciences Brown University -------------- next part -------------- A non-text attachment was scrubbed... Name: Par_file Type: application/octet-stream Size: 16945 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... 
Name: SOURCE Type: application/octet-stream Size: 5574 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: interfaces.dat Type: application/octet-stream Size: 382 bytes Desc: not available URL: -------------- next part -------------- -------------- next part -------------- A non-text attachment was scrubbed... Name: bad_iso_multisrc.pdf Type: application/pdf Size: 139932 bytes Desc: not available URL: From komatitsch at lma.cnrs-mrs.fr Fri Jul 21 12:41:31 2017 From: komatitsch at lma.cnrs-mrs.fr (Dimitri Komatitsch) Date: Fri, 21 Jul 2017 21:41:31 +0200 Subject: [CIG-SEISMO] modeling multiple isotropic sources with SPECFEM2D In-Reply-To: <5D6676AF-06BE-4869-B3CA-AEE2049655B4@brown.edu> References: <5D6676AF-06BE-4869-B3CA-AEE2049655B4@brown.edu> Message-ID: <58e04a64-0a7a-fd67-02da-6bb4d3a1c261@lma.cnrs-mrs.fr> Hi Nicholas, Thanks! You are using an old version of the code, with the current one everything works fine (see the attached picture). Just type this: git clone --recursive --branch devel https://github.com/geodynamics/specfem2d.git and use the attached Par_file, upgraded to the current format. Best regards, Dimitri. On 07/21/2017 08:42 PM, Nicholas Mancinelli wrote: > Hi SPECFEM2D developers, > > Is it possible to run elastic simulations with multiple isotropic sources nucleating simultaneously? The attached Par_file documents my failed attempt to run such a model. My concern is that the isotropic sources (Mxx=1, Mzz=1, Mxz=0) have P-wavefronts with nodes, and they also are radiating some S-wave energy (model snapshot attached). These problems do go away when I run the simulation for only one of the sources. > > Do you know if this is a problem with my implementation, or rather a bug within the guts of the source code? 
> > Best, > > Nick > > Nicholas J Mancinelli > Postdoctoral Research Associate > Department of Earth, Environmental, > and Planetary Sciences > Brown University > > > > > > > > _______________________________________________ > CIG-SEISMO mailing list > CIG-SEISMO at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo > -- Dimitri Komatitsch, CNRS Research Director (DR CNRS) Laboratory of Mechanics and Acoustics, Marseille, France http://komatitsch.free.fr -------------- next part -------------- A non-text attachment was scrubbed... Name: image0001000.jpg Type: image/jpeg Size: 92139 bytes Desc: not available URL: -------------- next part -------------- #----------------------------------------------------------------------------- # # simulation input parameters # #----------------------------------------------------------------------------- # title of job title = Slave Craton # forward or adjoint simulation # 1 = forward, 2 = adjoint, 3 = both simultaneously # note: 2 is purposely UNUSED (for compatibility with the numbering of our 3D codes) SIMULATION_TYPE = 1 # 0 = regular wave propagation simulation, 1/2/3 = noise simulation NOISE_TOMOGRAPHY = 0 # save the last frame, needed for adjoint simulation SAVE_FORWARD = .false. # parameters concerning partitioning NPROC = 1 # number of processes partitioning_method = 3 # SCOTCH = 3, ascending order (very bad idea) = 1 # number of control nodes per element (4 or 9) ngnod = 9 # time step parameters # total number of time steps NSTEP = 20000 # duration of a time step (see section "How to choose the time step" of the manual for how to do this) DT = 4.000000E-02 # time stepping # 1 = Newmark (2nd order), 2 = LDDRK4-6 (4th-order 6-stage low storage Runge-Kutta), 3 = classical RK4 4th-order 4-stage Runge-Kutta time_stepping_scheme = 1 # axisymmetric (2.5D) or Cartesian planar (2D) simulation AXISYM = .false. # set the type of calculation (P-SV or SH/membrane waves) P_SV = .true. 
# set to true to use GPUs
GPU_MODE = .false.

# creates/reads a binary database that allows skipping all time consuming setup steps in initialization
# 0 = does not read/create database
# 1 = creates database
# 2 = reads database
setup_with_binary_database = 0

# available models
# default - define model using nbmodels below
# ascii - read model from ascii database file
# binary - read model from binary database file
# binary_voigt - read Voigt model from binary database file
# external - define model using define_external_model subroutine
# gll - read GLL model from binary database file
# legacy - read model from model_velocity.dat_input
MODEL = default

# Output the model with the requested type, does not save if set to default or .false.
# (available output formats: ascii,binary,gll,legacy)
SAVE_MODEL = default

#-----------------------------------------------------------------------------
#
# attenuation
#
#-----------------------------------------------------------------------------

# attenuation parameters
ATTENUATION_VISCOELASTIC = .false. # turn attenuation (viscoelasticity) on or off for non-poroelastic solid parts of the model
ATTENUATION_VISCOACOUSTIC = .false. # turn attenuation (viscoacousticity) on or off for non-poroelastic fluid parts of the model
ATTENUATION_PORO_FLUID_PART = .false. # turn viscous attenuation on or off for the fluid part of poroelastic parts of the model
Q0 = 1 # quality factor for viscous attenuation
freq0 = 10 # frequency for viscous attenuation

# for viscoelastic attenuation
N_SLS = 3 # number of standard linear solids for attenuation (3 is usually the minimum)
f0_attenuation = 5.196 # (Hz) relevant only if source is a Dirac or a Heaviside, otherwise it is f0 the dominant frequency of the source in the DATA/SOURCE file
READ_VELOCITIES_AT_f0 = .false.
# shift velocities to account for physical dispersion (see user manual for more information) # to undo attenuation for sensitivity kernel calculations or forward runs with SAVE_FORWARD # use the flag below. It performs undoing of attenuation in an exact way for sensitivity kernel calculations # but requires disk space for temporary storage, and uses a significant amount of memory used as buffers for temporary storage. # When that option is on the second parameter indicates how often the code dumps restart files to disk (if in doubt, use something between 100 and 1000). UNDO_ATTENUATION = .false. NT_DUMP_ATTENUATION = 500 #----------------------------------------------------------------------------- # # sources # #----------------------------------------------------------------------------- # source parameters NSOURCES = 3 # number of sources (source information is then read from the DATA/SOURCE file) force_normal_to_surface = .false. # angleforce normal to surface (external mesh and curve file needed) # use an existing initial wave field as source or start from zero (medium initially at rest) initialfield = .false. add_Bielak_conditions_bottom = .false. # add Bielak conditions or not if initial plane wave add_Bielak_conditions_right = .false. add_Bielak_conditions_top = .false. add_Bielak_conditions_left = .false. # acoustic forcing ACOUSTIC_FORCING = .false. # acoustic forcing of an acoustic medium with a rigid interface #----------------------------------------------------------------------------- # # receivers # #----------------------------------------------------------------------------- # receiver set parameters for recording stations (i.e. 
recording points) seismotype = 1 # record 1=displ 2=veloc 3=accel 4=pressure 5=curl of displ 6=the fluid potential # subsampling of the seismograms to create smaller files (but less accurately sampled in time) subsamp_seismos = 1 # so far, this option can only be used if all the receivers are in acoustic elements USE_TRICK_FOR_BETTER_PRESSURE = .false. # use this t0 as earliest starting time rather than the automatically calculated one USER_T0 = 0.0d0 # seismogram formats save_ASCII_seismograms = .true. # save seismograms in ASCII format or not save_binary_seismograms_single = .true. # save seismograms in single precision binary format or not (can be used jointly with ASCII above to save both) save_binary_seismograms_double = .false. # save seismograms in double precision binary format or not (can be used jointly with both flags above to save all) SU_FORMAT = .false. # output single precision binary seismograms in Seismic Unix format (adjoint traces will be read in the same format) # use an existing STATION file found in ./DATA or create a new one from the receiver positions below in this Par_file use_existing_STATIONS = .false. # number of receiver sets (i.e. number of receiver lines to create below) nreceiversets = 1 # orientation anglerec = 0.d0 # angle to rotate components at receivers rec_normal_to_surface = .false. # base anglerec normal to surface (external mesh and curve file needed) # first receiver set (repeat these 6 lines and adjust nreceiversets accordingly) nrec = 150 # number of receivers xdeb = 1080000 # first receiver x in meters zdeb = 1500000 # first receiver z in meters xfin = 2100000 # last receiver x in meters (ignored if only one receiver) zfin = 1500000 # last receiver z in meters (ignored if only one receiver) record_at_surface_same_vertical = .false. 
# receivers inside the medium or at the surface #----------------------------------------------------------------------------- # # adjoint kernel outputs # #----------------------------------------------------------------------------- # save sensitivity kernels in ASCII format (much bigger files, but compatible with current GMT scripts) or in binary format save_ASCII_kernels = .true. #----------------------------------------------------------------------------- # # boundary conditions # #----------------------------------------------------------------------------- # Perfectly Matched Layer (PML) boundaries # absorbing boundary active or not PML_BOUNDARY_CONDITIONS = .true. NELEM_PML_THICKNESS = 3 ROTATE_PML_ACTIVATE = .false. ROTATE_PML_ANGLE = 30. # set to .false. unless you know what you are doing; this implements automatic adjustment of the PML parameters for elongated models. # The goal is to improve the absorbing efficiency of PML for waves with large incidence angles, but this can lead to artefacts. # In particular, this option is efficient only when the number of sources NSOURCES is equal to one. PML_PARAMETER_ADJUSTMENT = .false. # Stacey ABC STACEY_ABSORBING_CONDITIONS = .false. # periodic boundaries ADD_PERIODIC_CONDITIONS = .false. PERIODIC_HORIZ_DIST = 4000.d0 #----------------------------------------------------------------------------- # # velocity and density models # #----------------------------------------------------------------------------- nbmodels = 2 # available material types (see user manual for more information) # acoustic: model_number 1 rho Vp 0 0 0 QKappa 9999 0 0 0 0 0 0 (for QKappa use 9999 to ignore it) # elastic: model_number 1 rho Vp Vs 0 0 QKappa Qmu 0 0 0 0 0 0 (for QKappa and Qmu use 9999 to ignore them) # when viscoelasticity or viscoacousticity is turned on, # the Vp and Vs values that are read here are the UNRELAXED ones i.e. 
the values at infinite frequency # unless the READ_VELOCITIES_AT_f0 parameter above is set to true, in which case they are the values at frequency f0. # Please also note that Qmu is always equal to Qs, but Qkappa is in general not equal to Qp. # To convert one to the other see doc/Qkappa_Qmu_versus_Qp_Qs_relationship_in_2D_plane_strain.pdf and # utils/attenuation/conversion_from_Qkappa_Qmu_to_Qp_Qs_from_Dahlen_Tromp_959_960.f90. # anisotropic: model_number 2 rho c11 c13 c15 c33 c35 c55 c12 c23 c25 0 0 0 # anisotropic in AXISYM: model_number 2 rho c11 c13 c15 c33 c35 c55 c12 c23 c25 c22 0 0 # poroelastic: model_number 3 rhos rhof phi c kxx kxz kzz Ks Kf Kfr etaf mufr Qmu # tomo: model_number -1 0 0 A 0 0 0 0 0 0 0 0 0 0 2 1 3300.0 7920.0 4400.0 0 0 9999 9999 0 0 0 0 0 0 1 1 3300.0 7920.0 4400.0 0 0 9999 9999 0 0 0 0 0 0 # external tomography file TOMOGRAPHY_FILE = ./DATA/tomo.dummy # use an external mesh created by an external meshing tool or use the internal mesher read_external_mesh = .false. 
#----------------------------------------------------------------------------- # # PARAMETERS FOR EXTERNAL MESHING # #----------------------------------------------------------------------------- # data concerning mesh, when generated using third-party app (more info in README) # (see also absorbing_conditions above) mesh_file = ./DATA/mesh_file # file containing the mesh nodes_coords_file = ./DATA/nodes_coords_file # file containing the nodes coordinates materials_file = ./DATA/materials_file # file containing the material number for each element free_surface_file = ./DATA/free_surface_file # file containing the free surface axial_elements_file = ./DATA/axial_elements_file # file containing the axial elements if AXISYM is true absorbing_surface_file = ./DATA/absorbing_surface_file # file containing the absorbing surface acoustic_forcing_surface_file = ./DATA/MSH/Surf_acforcing_Bottom_enforcing_mesh # file containing the acoustic forcing surface absorbing_cpml_file = ./DATA/absorbing_cpml_file # file containing the CPML element numbers tangential_detection_curve_file = ./DATA/courbe_eros_nodes # file containing the curve delimiting the velocity model #----------------------------------------------------------------------------- # # PARAMETERS FOR INTERNAL MESHING # #----------------------------------------------------------------------------- # file containing interfaces for internal mesh interfacesfile = ../interfaces.dat # geometry of the model (origin lower-left corner = 0,0) and mesh description xmin = 0.d0 # abscissa of left side of the model xmax = 3.000000E+06 # abscissa of right side of the model nx = 301 # number of elements along X # absorbing boundary parameters (see absorbing_conditions above) absorbbottom = .true. absorbright = .true. absorbtop = .false. absorbleft = .true. 
# define the different regions of the model in the (nx,nz) spectral-element mesh
nbregions = 1 # then set below the different regions and model number for each region
# format of each line: nxmin nxmax nzmin nzmax material_number
1 301 1 151 1

#-----------------------------------------------------------------------------
#
# display parameters
#
#-----------------------------------------------------------------------------

# every how many time steps we display information about the simulation (costly, do not use a very small value)
NSTEP_BETWEEN_OUTPUT_INFO = 1000

# meshing output
output_grid_Gnuplot = .false. # generate a GNUPLOT file containing the grid, and a script to plot it
output_grid_ASCII = .false. # dump the grid in an ASCII text file consisting of a set of X,Y,Z points or not

# to plot total energy curves, for instance to monitor how CPML absorbing layers behave;
# should be turned OFF in most cases because a bit expensive
OUTPUT_ENERGY = .false.

# every how many time steps we compute energy (which is a bit expensive to compute)
NTSTEP_BETWEEN_OUTPUT_ENERGY = 10

# Compute the field int_0^t v^2 dt for a set of GLL points and write it to file. Use
# the script utils/visualisation/plotIntegratedEnergyFile.py to watch. It is refreshed at the same time as the seismograms
COMPUTE_INTEGRATED_ENERGY_FIELD = .false.

#-----------------------------------------------------------------------------
#
# movies/images/snapshots
#
#-----------------------------------------------------------------------------

# every how many time steps we draw JPEG or PostScript pictures of the simulation
# and/or we dump results of the simulation as ASCII or binary files (costly, do not use a very small value)
NSTEP_BETWEEN_OUTPUT_IMAGES = 100

# minimum amplitude kept in % for the JPEG and PostScript snapshots; amplitudes below that are muted
cutsnaps = 1.

#### for JPEG color images ####
output_color_image = .true.
# output JPEG color image of the results every NSTEP_BETWEEN_OUTPUT_IMAGES time steps or not imagetype_JPEG = 3 # display 1=displ_Ux 2=displ_Uz 3=displ_norm 4=veloc_Vx 5=veloc_Vz 6=veloc_norm 7=accel_Ax 8=accel_Az 9=accel_norm 10=pressure factor_subsample_image = 1.0d0 # (double precision) factor to subsample color images output by the code (useful for very large models) USE_CONSTANT_MAX_AMPLITUDE = .false. # by default the code normalizes each image independently to its maximum; use this option to use the global maximum below instead CONSTANT_MAX_AMPLITUDE_TO_USE = 1.17d4 # constant maximum amplitude to use for all color images if the above USE_CONSTANT_MAX_AMPLITUDE option is true POWER_DISPLAY_COLOR = 0.30d0 # non linear display to enhance small amplitudes in JPEG color images DRAW_SOURCES_AND_RECEIVERS = .true. # display sources as orange crosses and receivers as green squares in JPEG images or not DRAW_WATER_IN_BLUE = .true. # display acoustic layers as constant blue in JPEG images, because they likely correspond to water in the case of ocean acoustics or in the case of offshore oil industry experiments (if off, display them as greyscale, as for elastic or poroelastic elements, for instance for acoustic-only oil industry models of solid media) USE_SNAPSHOT_NUMBER_IN_FILENAME = .false. # use snapshot number in the file name of JPEG color snapshots instead of the time step (for instance to create movies in an easier way later) #### for PostScript snapshots #### output_postscript_snapshot = .false. # output Postscript snapshot of the results every NSTEP_BETWEEN_OUTPUT_IMAGES time steps or not imagetype_postscript = 1 # display 1=displ vector 2=veloc vector 3=accel vector; small arrows are displayed for the vectors meshvect = .false. # display mesh on PostScript plots or not modelvect = .true. # display velocity model on PostScript plots or not boundvect = .false. # display boundary conditions on PostScript plots or not interpol = .false. 
# interpolation of the PostScript display on a regular grid inside each spectral element, or use the non-evenly spaced GLL points
pointsdisp = 6                            # number of points in each direction for interpolation of PostScript snapshots (set to 1 for lower-left corner only)
subsamp_postscript = 1                    # subsampling of background velocity model in PostScript snapshots
sizemax_arrows = 1.d0                     # maximum size of arrows on PostScript plots in centimeters
US_LETTER = .true.                        # use US letter or European A4 paper for PostScript plots

#### for wavefield dumps ####
output_wavefield_dumps = .false.          # output wave field to a text file (creates very big files)
imagetype_wavefield_dumps = 1             # display 1=displ vector 2=veloc vector 3=accel vector 4=pressure
use_binary_for_wavefield_dumps = .false.  # use ASCII or single-precision binary format for the wave field dumps

#-----------------------------------------------------------
# Ability to run several calculations (several earthquakes)
# in an embarrassingly-parallel fashion from within the same run;
# this can be useful when using a very large supercomputer to compute
# many earthquakes in a catalog, in which case it can be better from
# a batch job submission point of view to start fewer and much larger jobs,
# each of them computing several earthquakes in parallel.
# To turn that option on, set parameter NUMBER_OF_SIMULTANEOUS_RUNS to a value greater than 1.
# To implement that, we create NUMBER_OF_SIMULTANEOUS_RUNS MPI sub-communicators,
# each of them being labeled "my_local_mpi_comm_world", and we use them
# in all the routines in "src/shared/parallel.f90", except in MPI_ABORT() because in that case
# we need to kill the entire run.
# When that option is on, of course the number of processor cores used to start
# the code in the batch system must be a multiple of NUMBER_OF_SIMULTANEOUS_RUNS,
# all the individual runs must use the same number of processor cores,
# which as usual is NPROC in the Par_file,
# and thus the total number of processor cores to request from the batch system
# should be NUMBER_OF_SIMULTANEOUS_RUNS * NPROC.
# All the runs to perform must be placed in directories called run0001, run0002, run0003 and so on
# (with exactly four digits).
#
# Imagine you have 10 independent calculations to do, each of them on 100 cores; you have three options:
#
# 1/ submit 10 jobs to the batch system
#
# 2/ submit a single job on 1000 cores to the batch, and in that script create a sub-array of jobs to start 10 jobs,
# each running on 100 cores (see e.g. http://www.schedmd.com/slurmdocs/job_array.html )
#
# 3/ submit a single job on 1000 cores to the batch, start SPECFEM2D on 1000 cores, create 10 sub-communicators,
# cd into one of 10 subdirectories (called e.g. run0001, run0002, ... run0010) depending on the sub-communicator
# your MPI rank belongs to, and run normally on 100 cores using that sub-communicator.
#
# The option below implements 3/.
#
NUMBER_OF_SIMULTANEOUS_RUNS = 1

# if we perform simultaneous runs in parallel, if only the source and receivers vary between these runs
# but not the mesh nor the model (velocity and density) then we can also read the mesh and model files
# from a single run in the beginning and broadcast them to all the others; for a large number of simultaneous
# runs, for instance when solving inverse problems iteratively, this can DRASTICALLY reduce I/Os to disk in the solver
# (by a factor equal to NUMBER_OF_SIMULTANEOUS_RUNS), and reducing I/Os is crucial in the case of huge runs.
# Thus, always set this option to .true. if the mesh and the model are the same for all simultaneous runs.
# In that case there is no need to duplicate the mesh and model file database (the content of the DATABASES_MPI
# directories) in each of the run0001, run0002, ... directories; it is sufficient to have one in run0001
# and the code will broadcast it to the others.
BROADCAST_SAME_MESH_AND_MODEL = .true.

From jwhuang1982 at gmail.com  Mon Jul 24 09:11:15 2017
From: jwhuang1982 at gmail.com (Junwei Huang)
Date: Mon, 24 Jul 2017 12:11:15 -0400
Subject: [CIG-SEISMO] Potential bug in SPECFEM3D_Cartesian
Message-ID: 

Hi developers,

Would like to report that this line of code may cause issues:

[image: Inline image 1]

As iglob_is_surface_external_mesh is true for the external meshes only
and the loop is iterating within internal nodes,
iglob_is_surface_external_mesh is always false.

It can cause issues when "SOURCES_CAN_BE_BURIED" or
"RECEIVERS_CAN_BE_BURIED" are false. Hope it makes sense.

Best regards,
Junwei
-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: image.png
Type: image/png
Size: 10467 bytes
Desc: not available
URL: 

From komatitsch at lma.cnrs-mrs.fr  Mon Jul 24 09:54:22 2017
From: komatitsch at lma.cnrs-mrs.fr (Dimitri Komatitsch)
Date: Mon, 24 Jul 2017 18:54:22 +0200
Subject: [CIG-SEISMO] Potential bug in SPECFEM3D_Cartesian
In-Reply-To: 
References: 
Message-ID: <18e5bc66-01e3-bd66-3b5a-979fbf0243b0@lma.cnrs-mrs.fr>

Hi Junwei,

Thank you very much. Very useful. Do you know how this should be fixed?
If so, could you please send me the fix, or submit it directly as a pull
request, following https://github.com/geodynamics/specfem3d/wiki/Using-Git-for-SPECFEM ?

Thank you,
Best regards,
Dimitri.
On 07/24/2017 06:11 PM, Junwei Huang wrote:
> Hi developers,
> Would like to report that this line of code may cause issues:
>
> Inline image 1
>
> As iglob_is_surface_external_mesh is true for the external meshes only
> and the loop is iterating within internal nodes,
> iglob_is_surface_external_mesh is always false.
>
> It can cause issues when "SOURCES_CAN_BE_BURIED" or
> "RECEIVERS_CAN_BE_BURIED" are false. Hope it makes sense.
>
> Best regards,
> Junwei
>
> _______________________________________________
> CIG-SEISMO mailing list
> CIG-SEISMO at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo

-- 
Dimitri Komatitsch, CNRS Research Director (DR CNRS)
Laboratory of Mechanics and Acoustics, Marseille, France
http://komatitsch.free.fr

From jwhuang1982 at gmail.com  Mon Jul 24 11:12:49 2017
From: jwhuang1982 at gmail.com (Junwei Huang)
Date: Mon, 24 Jul 2017 14:12:49 -0400
Subject: [CIG-SEISMO] Potential bug in SPECFEM3D_Cartesian
In-Reply-To: <18e5bc66-01e3-bd66-3b5a-979fbf0243b0@lma.cnrs-mrs.fr>
References: <18e5bc66-01e3-bd66-3b5a-979fbf0243b0@lma.cnrs-mrs.fr>
Message-ID: 

Hi Dimitri,

I am afraid I don't know enough about the software to modify it without
introducing extra issues. In my case, I may work around it by hardwiring
the loop bounds to

  imin = 1
  imax = NGLLX
  jmin = 1
  jmax = NGLLY
  kmin = 1
  kmax = NGLLZ

for both sources and receivers. I wonder what was the reason to exclude
the edge nodes.

Junwei

On Mon, Jul 24, 2017 at 12:54 PM, Dimitri Komatitsch
<komatitsch at lma.cnrs-mrs.fr> wrote:

> Hi Junwei,
>
> Thank you very much. Very useful. Do you know how this should be fixed? If
> so, could you please send me the fix, or submit it directly as a pull
> request, following https://github.com/geodynamics/specfem3d/wiki/Using-Git-for-SPECFEM ?
>
> Thank you,
> Best regards,
>
> Dimitri.
>
> On 07/24/2017 06:11 PM, Junwei Huang wrote:
>
>> Hi developers,
>> Would like to report that this line of code may cause issues:
>>
>> Inline image 1
>>
>> As iglob_is_surface_external_mesh is true for the external meshes only
>> and the loop is iterating within internal nodes,
>> iglob_is_surface_external_mesh is always false.
>>
>> It can cause issues when "SOURCES_CAN_BE_BURIED" or
>> "RECEIVERS_CAN_BE_BURIED" are false. Hope it makes sense.
>>
>> Best regards,
>> Junwei
>>
>> _______________________________________________
>> CIG-SEISMO mailing list
>> CIG-SEISMO at geodynamics.org
>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo
>
> --
> Dimitri Komatitsch, CNRS Research Director (DR CNRS)
> Laboratory of Mechanics and Acoustics, Marseille, France
> http://komatitsch.free.fr

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From komatitsch at lma.cnrs-mrs.fr  Mon Jul 24 12:51:52 2017
From: komatitsch at lma.cnrs-mrs.fr (Dimitri Komatitsch)
Date: Mon, 24 Jul 2017 21:51:52 +0200
Subject: [CIG-SEISMO] Potential bug in SPECFEM3D_Cartesian
In-Reply-To: 
References: <18e5bc66-01e3-bd66-3b5a-979fbf0243b0@lma.cnrs-mrs.fr>
Message-ID: <32ca513e-78bd-2582-ece9-9e5181a2b00e@lma.cnrs-mrs.fr>

Hi Junwei,

Thanks a lot. I have thus opened a new Git issue:
https://github.com/geodynamics/specfem3d/issues/1087

Thank you,
Best regards,
Dimitri.

On 07/24/2017 08:12 PM, Junwei Huang wrote:
> Hi Dimitri,
> I am afraid I don't know enough about the software to modify it without
> introducing extra issues. In my case, I may work around it by hardwiring
> the loop to
>   imin = 1
>   imax = NGLLX
>   jmin = 1
>   jmax = NGLLY
>   kmin = 1
>   kmax = NGLLZ
> for both source and receivers. I wonder what was the reason to exclude
> the edge nodes.
>
> Junwei
>
> On Mon, Jul 24, 2017 at 12:54 PM, Dimitri Komatitsch
> <komatitsch at lma.cnrs-mrs.fr> wrote:
>
>     Hi Junwei,
>
>     Thank you very much. Very useful. Do you know how this should be
>     fixed?
    If so, could you please send me the fix, or submit it
>     directly as a pull request, following
>     https://github.com/geodynamics/specfem3d/wiki/Using-Git-for-SPECFEM ?
>
>     Thank you,
>     Best regards,
>
>     Dimitri.
>
>     On 07/24/2017 06:11 PM, Junwei Huang wrote:
>
>         Hi developers,
>         Would like to report that this line of code may cause issues:
>
>         Inline image 1
>
>         As iglob_is_surface_external_mesh is true for the external
>         meshes only and the loop is iterating within internal nodes,
>         iglob_is_surface_external_mesh is always false.
>
>         It can cause issues when "SOURCES_CAN_BE_BURIED" or
>         "RECEIVERS_CAN_BE_BURIED" are false. Hope it makes sense.
>
>         Best regards,
>         Junwei
>
>         _______________________________________________
>         CIG-SEISMO mailing list
>         CIG-SEISMO at geodynamics.org
>         http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo
>
>     --
>     Dimitri Komatitsch, CNRS Research Director (DR CNRS)
>     Laboratory of Mechanics and Acoustics, Marseille, France
>     http://komatitsch.free.fr

-- 
Dimitri Komatitsch, CNRS Research Director (DR CNRS)
Laboratory of Mechanics and Acoustics, Marseille, France
http://komatitsch.free.fr

From michael.gineste at ntnu.no  Tue Jul 25 12:11:34 2017
From: michael.gineste at ntnu.no (Michael Gineste)
Date: Tue, 25 Jul 2017 21:11:34 +0200
Subject: [CIG-SEISMO] external model file question
Message-ID: <0b1b2473-fb66-beef-a8d0-a3c8b0de2177@ntnu.no>

Hi Specfem group,

I have a question regarding the external model file input, which depends on
the parameter MODEL. I cannot find much on this in the documentation, so
what I understand is from the source code...

I need to supply the external model through an external file, so using a
subroutine is not an option, ruling out parameter option 'external'. For
the remaining options, it seems to me that only the options 'default' and
'tomo' allow you to supply a model without knowing the mesh partition in
advance. Is this correctly understood?
The 'default' option, which I have used so far, maps a model
specification to an entire element, whereas I wish to pass a model at
finer resolution, and with 'tomo' it gets interpolated onto the GLL
points. Which is fine, but I wonder how one is supposed to handle the
PML elements with this model format.

Must the values given at (x,z) points within the PML elements be kept
constant and equal to those at the (x,z) points just adjacent to the PML
elements, in the same manner as when one uses the combination of
nbmodels, materials_file and absorbing_cpml_file? Or is it handled
within the code?

Thank you for your help.

Best regards,
Michael Gineste

From michael.gineste at ntnu.no  Wed Jul 26 13:14:11 2017
From: michael.gineste at ntnu.no (Michael Gineste)
Date: Wed, 26 Jul 2017 22:14:11 +0200
Subject: [CIG-SEISMO] external model file question
In-Reply-To: <0b1b2473-fb66-beef-a8d0-a3c8b0de2177@ntnu.no>
References: <0b1b2473-fb66-beef-a8d0-a3c8b0de2177@ntnu.no>
Message-ID: <8c96ac2b-5348-1f54-fdbd-317fc03852ef@ntnu.no>

Hi all,

I think I found out how it is supposed to be handled, by reading the
Specfem3D manual. Sorry about the spam.

Best,
Michael Gineste

On 2017-07-25 21:11, Michael Gineste wrote:
> Hi Specfem group,
>
> I have a question regarding the external model file input, dependent on
> the parameter MODEL. I can not find much on this in the documentation so
> what I understand is from the source code...
>
> I need to supply the external model through a external file, so using a
> subroutine is not an option, ruling out parameter option 'external'. For
> the remaining options, it seems to me that that only option 'default'
> and 'tomo' allows you supply a model not to know the mesh partition in
> advance. Is this correctly understood?
>
> The 'default' option, which I have used so far, maps a model
> specification to an entire element, whereas I wish to pass a model at
> finer resolution, and with 'tomo' it gets interpolated onto the GLL
> points.
> Which is fine, but I wonder how one is supposed to handle the
> PML elements with this model format.
>
> Must values given at (x,z) points within the PML elements be ensured
> constant and equal to (x,z)-points just adjacent to the PML elements, in
> the same manner as when one uses the combination nbmodels,
> materials_file and absorbing_cpml_file? Or is it handled within the code?
>
> Thank you for your help.
>
> Best regards,
> Michael Gineste

From elodie.kendall.14 at ucl.ac.uk  Mon Jul 31 03:37:11 2017
From: elodie.kendall.14 at ucl.ac.uk (Kendall, Elodie)
Date: Mon, 31 Jul 2017 10:37:11 +0000
Subject: [CIG-SEISMO] Specfem3d globe: with an altered crustal model
In-Reply-To: 
References: <3435B571-38D1-401C-8453-DA3A025A8076@mines.edu>,
 <8D5C33D1-33B1-47B6-A50B-A1DADCCA2C34@mines.edu>,
 <5E8C94E0-62D2-4DEF-B52D-E0F33E780CDC@mines.edu>
Message-ID: 

Hi there,

I have created a new crust 2.0 (the same format but with crustal thickness
perturbations added to each grid) and I want to analyse the effect on the
synthetics. I have changed the maximum moho depth from 90 to 115 (the new
maximum); the mesher runs fine but the solver fails with the error
"forward simulation became unstable in fluid and blew up". I have
suppressed the moho stretching and the solver runs without errors;
however, I then lose accuracy as the moho is not being honoured.

Do you know how I could check how many GLL points sample each crustal
point, please? I assume that if it is really undersampled (1 rather than 5
GLL points for the oceans) I should use something like this to create a
new mesh?
https://github.com/geodynamics/specfem3d/wiki/03_mesh_generation

Thanks a lot,
Elodie

________________________________
From: Ebru Bozdag
Sent: 19 July 2017 16:47
To: Kendall, Elodie
Subject: Re: [CIG-SEISMO] Specfem with an altered crustal model

Hi Elodie,

This was to diagnose where the problem was coming from. Just note that
when you set SUPPRESS_MOHO_STRETCHING = .true.
you are losing accuracy in surface-wave propagation, particularly
underneath oceans, since in your model the oceanic crust is now probably
sampled by only one GLL point. That was the main motivation to start
honouring Moho in global simulations, where the oceanic crust is sampled
by 1 spectral element (< 15 km), so that there are 5 GLL points to sample
it in the vertical direction, and the continental crust by 2 spectral
elements (> 35 km). See Tromp et al. 2010 (Near real-time simulations of
global CMT earthquakes, GJI) for the details.

So you may prefer or need to honour Moho in your perturbed model (in
which case you will need to adjust the mesh according to your Moho
perturbations), depending on what you would like to address in your
experiments.

You can also send this to the CIG list to close the issue.

Good luck!
Ebru

-----------------
Ebru Bozdag
Assistant Professor
COLORADO SCHOOL OF MINES
Department of Geophysics
bozdag at mines.edu | tel: +1-303-273-3578 | fax: +1-303-273-3478

On 19 Jul 2017, at 08:46, Kendall, Elodie wrote:

Hi Ebru,

I set SUPPRESS_MOHO_STRETCHING = .true. in constant.h.in as you
recommended and both the mesher and solver run perfectly now with no
errors. Thanks a lot!

Best wishes,
Elodie

________________________________
From: Ebru Bozdag
Sent: 07 July 2017 01:33
To: Kendall, Elodie
Subject: Fwd: [CIG-SEISMO] Specfem with an altered crustal model

I also forward my response to you directly as there has been quite some
confusion recently with my email addresses.

Ebru

-----------------
Ebru Bozdag
Assistant Professor
COLORADO SCHOOL OF MINES
Department of Geophysics
bozdag at mines.edu | tel: +1-303-273-3578 | fax: +1-303-273-3478

Begin forwarded message:

From: Ebru Bozdag
Subject: Re: [CIG-SEISMO] Specfem with an altered crustal model
Date: 6 Jul 2017 18:31:26 MDT
To: 

Hi Elodie,

Are you honouring the crustal thickness in your simulations, i.e., are
you using

! to suppress element stretching for 3D moho surface
logical, parameter :: SUPPRESS_MOHO_STRETCHING = .false.

in your constant.h file? If this is the case, perhaps after you add
perturbations you may also be distorting the mesh aspect ratio at some
locations. Just try to run the mesher and the solver by setting
SUPPRESS_MOHO_STRETCHING = .true. and see if you still have the same
problem, if you have not tried it yet.

Best regards,
Ebru

-----------------
Ebru Bozdag
Assistant Professor
COLORADO SCHOOL OF MINES
Department of Geophysics
bozdag at mines.edu | tel: +1-303-273-3578 | fax: +1-303-273-3478

On 06 Jul 2017, at 02:32, Kendall, Elodie wrote:

Hi there,

I was hoping to ask you a question about Specfem3d_globe, please? I have
been using it for a few months now and I am currently trying to introduce
crustal thickness perturbations into crust2.0 and analyse the effect on
the synthetics.

I have set up a new crust2.0 which has 16,200 keys (one for each 2 by 2
degree grid) but the same layers for each grid. I have altered the
corresponding subroutines for this new set of keys; no errors with the
mesher and solver, and the same synthetics as before.

Now, I add the perturbations to each layer of each grid in my crust2.0
model. I have changed the maximum moho depth from 90 to 115 (the new
maximum); the mesher runs fine but the solver fails with the error
"forward simulation became unstable in fluid and blew up". I have tried
changing the time step DT to DT*0.95 and DT*0.8 in the script
shared/get_timestep_and_layers.f90, however the same error appears.

I was hoping you could help me, please?

Thanks a lot,
Elodie Kendall

_______________________________________________
CIG-SEISMO mailing list
CIG-SEISMO at geodynamics.org
http://lists.geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
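The GLL-point counts discussed in the thread above (5 points through oceanic crust meshed with 1 spectral element, more through continental crust meshed with 2) follow from simple arithmetic, since vertically adjacent conforming elements share one interface point. The sketch below is illustrative only: it is not SPECFEM code, the `unique_gll_points` helper is hypothetical, and it assumes NGLLZ = 5 as in the solver output quoted earlier in this digest.

```python
# Illustrative arithmetic (not SPECFEM code): count the unique GLL points
# that sample a crustal column vertically, given how many spectral elements
# the mesher places across it.
NGLL = 5  # GLL points per element edge (NGLLZ = 5 in the solver output above)

def unique_gll_points(n_elements: int, ngll: int = NGLL) -> int:
    """Unique GLL points across a vertical stack of conforming elements.

    Each pair of adjacent elements shares one interface point, so the
    total is n_elements * ngll - (n_elements - 1).
    """
    if n_elements < 1:
        return 0
    return n_elements * ngll - (n_elements - 1)

# Moho honoured: oceanic crust (< 15 km) meshed with 1 element,
# continental crust (> 35 km) meshed with 2 elements.
print(unique_gll_points(1))  # oceanic: 5 points
print(unique_gll_points(2))  # continental: 9 points
```

This also illustrates why suppressing the Moho stretching loses accuracy: if a thin oceanic crust is no longer honoured by the mesh, the crustal layer may span only a fraction of one element and thus be sampled by far fewer points than the 5 assumed here.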