From rene.gassmoeller at mailbox.org Tue Sep 4 16:41:35 2018 From: rene.gassmoeller at mailbox.org (Rene Gassmoeller) Date: Tue, 4 Sep 2018 16:41:35 -0700 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> Message-ID: Dear all, following our discussion last week I sat together with Max Rudolph here at UC Davis and discussed a few possibilities for modifying the current stabilization scheme in ASPECT to overcome the difficulties that were discussed, i.e. (apparent) imbalance of steady-state heat flux across the boundaries due to strong artificial diffusion through the boundary layers and thus a modified average temperature (all for models with a low to medium resolution of 3-5 global refinements in the boundary layer). I would think there are two possible solutions and went ahead with a few tests. 1. Reimplementing SUPG into ASPECT (as originally proposed in https://github.com/geodynamics/aspect/pull/412 and discarded). I have a rebased version of the original pull request open, but it is not tested rigorously. Everyone is welcome to contribute. 2. Modifying the entropy viscosity method, e.g. by applying the artificial diffusion in streamline direction only. The second should not require significant changes to the code and so I implemented a experimental version (see https://github.com/geodynamics/aspect/pull/2649). In preliminary tests by myself and Max it seems to perform pretty well, essentially behaving like the existing EV method, except for drastically reducing the diffusion perpendicular to the flow (not surprising), and therefore having much less influence on boundary layer heat flow. I have written a summary in the discussion of the pull request, but in essence I feel like this might be a step forward and would appreciate if those of you who had problems with particular benchmarks could give this version a try. All tests I have run so far indicate more accurate global properties (average temperature, vrms), but I also see slightly bigger temperature oscillations, and would be interested if the method is stable in all cases. So if you have a bit of time, please let me know about the results of your tests. If the method performs well enough, I could see this becoming the new default for the stabilization scheme, but for that we need a lot of testing. You can get my branch by running the following in your aspect git repo: git checkout -b gassmoeller-suev master git pull https://github.com/gassmoeller/aspect.git suev Let me know how that works, Best, Rene -- Rene Gassmoeller https://gassmoeller.github.io/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bangerth at colostate.edu Tue Sep 4 16:48:26 2018 From: bangerth at colostate.edu (Wolfgang Bangerth) Date: Tue, 4 Sep 2018 17:48:26 -0600 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> Message-ID: <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> On 09/04/2018 05:41 PM, Rene Gassmoeller wrote: > following our discussion last week I sat together with Max Rudolph here at UC > Davis and discussed a few possibilities for modifying the current > stabilization scheme in ASPECT to overcome the difficulties that were > discussed, i.e. 
(apparent) imbalance of steady-state heat flux across the > boundaries due to strong artificial diffusion through the boundary layers and > thus a modified average temperature (all for models with a low to medium > resolution of 3-5 global refinements in the boundary layer). First, thanks for tracking these things down! I have a completely tangential question: Right now, when we compute the heat flux across a boundary, we only consider the physical component of it, but not the part of that results from the artificial diffusion. Even if we had a discretization that is energy conservative, this would mean that the net heat flux through the boundaries (as currently computed) is not balanced. Should we modify the postprocessor to also take into account the artificial diffusion factor? Best W. -- ------------------------------------------------------------------------ Wolfgang Bangerth email: bangerth at colostate.edu www: http://www.math.colostate.edu/~bangerth/ From maxrudolph at ucdavis.edu Wed Sep 5 06:12:23 2018 From: maxrudolph at ucdavis.edu (Max Rudolph) Date: Wed, 5 Sep 2018 06:12:23 -0700 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> Message-ID: On Tue, Sep 4, 2018 at 5:03 PM Wolfgang Bangerth wrote: > On 09/04/2018 05:41 PM, Rene Gassmoeller wrote: > > following our discussion last week I sat together with Max Rudolph here > at UC > > Davis and discussed a few possibilities for modifying the current > > stabilization scheme in ASPECT to overcome the difficulties that were > > discussed, i.e. (apparent) imbalance of steady-state heat flux across > the > > boundaries due to strong artificial diffusion through the boundary > layers and > > thus a modified average temperature (all for models with a low to medium > > resolution of 3-5 global refinements in the boundary layer). > > First, thanks for tracking these things down! > > I have a completely tangential question: Right now, when we compute the > heat > flux across a boundary, we only consider the physical component of it, but > not > the part of that results from the artificial diffusion. Even if we had a > discretization that is energy conservative, this would mean that the net > heat > flux through the boundaries (as currently computed) is not balanced. > Should we > modify the postprocessor to also take into account the artificial > diffusion > factor? > > Rene and I discussed this idea on Monday and I don't think that this is the right thing to do. It would lead to an unexpected relationship between the temperature gradient (and hence temperature structure of the lithosphere) and the physical thermal conductivity. Maybe more helpful would be a separate output of the non-physical contribution to the heat flux through each boundary, or within the entire domain as the ratio of the norm of the artificial heat flux divided by the norm of the total heat flux. I still think that a warning message when this quantity exceeds, say, 1% would help users understand that they should expect unphysical results. Best > W. 
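(For concreteness, the diagnostic described above could be written out as follows -- assuming, as in the current scheme, that the entropy viscosity nu_h acts as an extra isotropic diffusivity on top of the physical conductivity k; the exact prefactor on nu_h depends on how the stabilization term is scaled in the code, so this is a sketch of the idea rather than of the implementation:

    q_artificial = - \int_Gamma  rho c_p nu_h        (grad T . n) dS
    q_total      = - \int_Gamma (k + rho c_p nu_h)   (grad T . n) dS

and the warning would trigger whenever |q_artificial| / |q_total| exceeds a threshold such as 0.01 on any boundary Gamma.)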
> > > -- > ------------------------------------------------------------------------ > Wolfgang Bangerth email: bangerth at colostate.edu > www: http://www.math.colostate.edu/~bangerth/ > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From bangerth at colostate.edu Wed Sep 5 10:41:11 2018 From: bangerth at colostate.edu (Wolfgang Bangerth) Date: Wed, 5 Sep 2018 11:41:11 -0600 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> Message-ID: <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> On 09/05/2018 07:12 AM, Max Rudolph wrote: > > Rene and I discussed this idea on Monday and I don't think that this is > the right thing to do. It would lead to an unexpected relationship > between the temperature gradient (and hence temperature structure of the > lithosphere) and the physical thermal conductivity. Maybe more helpful > would be a separate output of the non-physical contribution to the heat > flux through each boundary, or within the entire domain as the ratio of > the norm of the artificial heat flux divided by the norm of the total > heat flux. I still think that a warning message when this quantity > exceeds, say, 1% would help users understand that they should expect > unphysical results. But this warning message would be printed on pretty much every single simulation in which the mesh does not completely resolve boundary and internal layers -- which is essentially every simulation ever done in the field of mantle convection. If it was a rare occasion where artificial viscosity is needed to make a simulation stable, then we wouldn't use it. But the reality is that all realistic global-scale simulations must necessarily have some kind of artificial diffusion (SUPG, EV, dG schemes, ...) that is larger than the physical diffusion at least in parts of the domain because resolving the boundary layers is not possible on a global scale and will not be possible for a long time to come. The idea of artificial diffusion schemes is to make boundary layers as large as the cells of the mesh so that they are resolved, rather than leading to over/undershoots. It is *needed* to avoid Gibb's phenomenon if you can't make the mesh small enough. That does not mean that (i) the scheme we currently use is the best idea, (ii) we can't improve the situation. But I do not think that printing a warning for essentially every single simulation is useful. (I'll note that we also use artificial diffusion schemes for the compositional fields for which the physical diffusion is zero -- so the artificial diffusion is *always* larger than the physical one.) Best W. 
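(For readers following along, the entropy viscosity under discussion is, roughly, of the form

    nu_h|_K = min( c_max h_K ||u||_K ,  c_E h_K^2 ||entropy residual||_K / (normalization) )

on every cell K -- see the 2012 ASPECT paper by Kronbichler, Heister and Bangerth for the exact definition. In unresolved boundary and internal layers it is the first-order bound c_max h_K ||u||_K that is active, and on typical global-resolution meshes this is much larger than the physical diffusivity, which is why a warning tied to "artificial diffusion exceeds physical diffusion" would fire in essentially every model.)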
-- ------------------------------------------------------------------------ Wolfgang Bangerth email: bangerth at colostate.edu www: http://www.math.colostate.edu/~bangerth/ From maxrudolph at ucdavis.edu Wed Sep 5 11:56:05 2018 From: maxrudolph at ucdavis.edu (Max Rudolph) Date: Wed, 5 Sep 2018 11:56:05 -0700 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> Message-ID: OK, you are right that there will always be some region with more artificial than physical heat transport. What about instead looking at the ratio of physical to artificial heat flux through each boundary? For Rene's anisotropic "SUEV" implementation, even in the presence of large entropy viscosity, the artificial heat transport can be very small as long as u.gradT is small. In particular, even though entropy viscosity is fairly large at the boundaries, the velocities are tangential to the boundary, so there is very little artificial diffusion. Max On Wed, Sep 5, 2018 at 10:41 AM Wolfgang Bangerth wrote: > On 09/05/2018 07:12 AM, Max Rudolph wrote: > > > > Rene and I discussed this idea on Monday and I don't think that this is > > the right thing to do. It would lead to an unexpected relationship > > between the temperature gradient (and hence temperature structure of the > > lithosphere) and the physical thermal conductivity. Maybe more helpful > > would be a separate output of the non-physical contribution to the heat > > flux through each boundary, or within the entire domain as the ratio of > > the norm of the artificial heat flux divided by the norm of the total > > heat flux. I still think that a warning message when this quantity > > exceeds, say, 1% would help users understand that they should expect > > unphysical results. > > But this warning message would be printed on pretty much every single > simulation in which the mesh does not completely resolve boundary and > internal layers -- which is essentially every simulation ever done in > the field of mantle convection. > > If it was a rare occasion where artificial viscosity is needed to make a > simulation stable, then we wouldn't use it. But the reality is that all > realistic global-scale simulations must necessarily have some kind of > artificial diffusion (SUPG, EV, dG schemes, ...) that is larger than the > physical diffusion at least in parts of the domain because resolving the > boundary layers is not possible on a global scale and will not be > possible for a long time to come. The idea of artificial diffusion > schemes is to make boundary layers as large as the cells of the mesh so > that they are resolved, rather than leading to over/undershoots. It is > *needed* to avoid Gibb's phenomenon if you can't make the mesh small > enough. > > That does not mean that (i) the scheme we currently use is the best > idea, (ii) we can't improve the situation. But I do not think that > printing a warning for essentially every single simulation is useful. > > (I'll note that we also use artificial diffusion schemes for the > compositional fields for which the physical diffusion is zero -- so the > artificial diffusion is *always* larger than the physical one.) > > Best > W. 
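(To make the difference concrete: the current scheme adds an isotropic term of the form div( nu_h grad T ) to the temperature equation, while a streamline-only variant applies the same nu_h only along the flow direction, i.e. something like

    div( nu_h (u u^T)/|u|^2 grad T ),

so that the artificial flux through a boundary scales with nu_h (u.n)(u.grad T)/|u|^2 and essentially vanishes wherever the velocity is tangential to that boundary -- which is exactly the situation at free-slip top and bottom boundaries. The precise tensor used in Rene's pull request may differ from this textbook streamline-diffusion form, so take it as an illustration of the idea only.)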
> > -- > ------------------------------------------------------------------------ > Wolfgang Bangerth email: bangerth at colostate.edu > www: http://www.math.colostate.edu/~bangerth/ > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From sdk at vt.edu Wed Sep 5 16:47:36 2018 From: sdk at vt.edu (Scott King) Date: Wed, 5 Sep 2018 19:47:36 -0400 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> Message-ID: As for calculating fluxes at the boundaries, I looked at the heat flux code a bit and I’m wondering... I will share this paper with you all. https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1365-246X.1987.tb01375.x It might be less relevant to second-order elements than linear elements but, a lot of the same arguments I’m seeing in the posts the last few days are bringing back memories. This is what is done by default in CitcomS (unless you explicitly call the CFB method). I suspect it should be fairly trivial for someone who is good with deal.ii to implement because it is pretty standard finite element stuff, i.e., calculate fluxes at the internal integration points and project the values to the nodes. Yes, the artificial diffusivity is an issue but I think this explains why even when we turn it off we get relatively poor Nusselt numbers while getting excellent agreement with depth-averaged properties and mean values. Scott > On Sep 5, 2018, at 2:56 PM, Max Rudolph wrote: > > OK, you are right that there will always be some region with more artificial than physical heat transport. > What about instead looking at the ratio of physical to artificial heat flux through each boundary? > For Rene's anisotropic "SUEV" implementation, even in the presence of large entropy viscosity, the artificial heat transport can be very small as long as u.gradT is small. In particular, even though entropy viscosity is fairly large at the boundaries, the velocities are tangential to the boundary, so there is very little artificial diffusion. > > Max > > On Wed, Sep 5, 2018 at 10:41 AM Wolfgang Bangerth > wrote: > On 09/05/2018 07:12 AM, Max Rudolph wrote: > > > > Rene and I discussed this idea on Monday and I don't think that this is > > the right thing to do. It would lead to an unexpected relationship > > between the temperature gradient (and hence temperature structure of the > > lithosphere) and the physical thermal conductivity. Maybe more helpful > > would be a separate output of the non-physical contribution to the heat > > flux through each boundary, or within the entire domain as the ratio of > > the norm of the artificial heat flux divided by the norm of the total > > heat flux. I still think that a warning message when this quantity > > exceeds, say, 1% would help users understand that they should expect > > unphysical results. > > But this warning message would be printed on pretty much every single > simulation in which the mesh does not completely resolve boundary and > internal layers -- which is essentially every simulation ever done in > the field of mantle convection. 
> > If it was a rare occasion where artificial viscosity is needed to make a > simulation stable, then we wouldn't use it. But the reality is that all > realistic global-scale simulations must necessarily have some kind of > artificial diffusion (SUPG, EV, dG schemes, ...) that is larger than the > physical diffusion at least in parts of the domain because resolving the > boundary layers is not possible on a global scale and will not be > possible for a long time to come. The idea of artificial diffusion > schemes is to make boundary layers as large as the cells of the mesh so > that they are resolved, rather than leading to over/undershoots. It is > *needed* to avoid Gibb's phenomenon if you can't make the mesh small enough. > > That does not mean that (i) the scheme we currently use is the best > idea, (ii) we can't improve the situation. But I do not think that > printing a warning for essentially every single simulation is useful. > > (I'll note that we also use artificial diffusion schemes for the > compositional fields for which the physical diffusion is zero -- so the > artificial diffusion is *always* larger than the physical one.) > > Best > W. > > -- > ------------------------------------------------------------------------ > Wolfgang Bangerth email: bangerth at colostate.edu > www: http://www.math.colostate.edu/~bangerth/ > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From bangerth at colostate.edu Thu Sep 6 10:27:55 2018 From: bangerth at colostate.edu (Wolfgang Bangerth) Date: Thu, 6 Sep 2018 11:27:55 -0600 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> Message-ID: <972fe330-5452-3125-59f8-6dded2d8d184@colostate.edu> On 09/05/2018 12:56 PM, Max Rudolph wrote: > What about instead looking at the ratio of physical to artificial heat > flux through each boundary? I think it would be interesting to try this out for a case you are interested in. I don't have intuition how that would come out though: with larger diffusion comes a gentler slope in the boundary layer, and consequently less heat flux. But because there is more heat flux overall due to the artificial diffusion, it may also be the other way around. Best W. -- ------------------------------------------------------------------------ Wolfgang Bangerth email: bangerth at colostate.edu www: http://www.math.colostate.edu/~bangerth/ From adamholt at mit.edu Thu Sep 6 12:25:01 2018 From: adamholt at mit.edu (Adam Holt) Date: Thu, 6 Sep 2018 19:25:01 +0000 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure Message-ID: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> Hi all! I am a relatively new user of ASPECT, and have ran into some tricky things related to visualizing/post-processing ASPECT output using Paraview. I wondered if anyone had experience with the following: First, plotting evenly spaced velocity vectors for models with mesh refinement. 
As the element sizes vary dramatically, plotting vectors using "Every Nth Point" (Paraview Glyph option) produces very uneven vector coverage. I thought I had solved this by interpolating the data onto a plane (using Paraview function "Resample with dataset"), but the result of this function (an evenly spaced velocity field) does not time evolve with the simulation (at least for Paraview 5.2.0). Has anybody ran into a similar issue? Second, I am interested in the dynamic pressure field and wondered how best to retrieve it from my (incompressible) models. For such models, I assume it can be computed by subtracting the horizontally-constant static pressure from the pressure outputted (the 'nonadiabatic pressure' output variable). Is this something that should be done in Paraview, or by writing a new post-processor plugin? Thanks in advance for any input! Adam Holt From jperryh2 at uoregon.edu Thu Sep 6 16:41:48 2018 From: jperryh2 at uoregon.edu (Jonathan Perry-Houts) Date: Thu, 6 Sep 2018 16:41:48 -0700 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> Message-ID: <7333fb72-c18c-de54-cb29-741dcba2a8c1@uoregon.edu> On 09/06/18 12:25, Adam Holt wrote: > Hi all! > > I am a relatively new user of ASPECT, and have ran into some tricky things related to visualizing/post-processing ASPECT output using Paraview. I wondered if anyone had experience with the following: > > First, plotting evenly spaced velocity vectors for models with mesh refinement. As the element sizes vary dramatically, plotting vectors using "Every Nth Point" (Paraview Glyph option) produces very uneven vector coverage. I thought I had solved this by interpolating the data onto a plane (using Paraview function "Resample with dataset"), but the result of this function (an evenly spaced velocity field) does not time evolve with the simulation (at least for Paraview 5.2.0). Has anybody ran into a similar issue? That seems to work in Paraview 5.5.0 (it does evolve with the simulation for me). Not sure if it matters, but I used a "Plane" source as the uniform grid to Resample on. I attached a custom filter that works for me (import it with Tools>Manage Custom Filters>Import). It's ridiculous that the glyph filter's "uniform spatial distribution" option doesn't do this automatically. That's exactly what I would expect it to do. Apparently it just selects uniformly spaced points, and if there happens to be a node there, it will plot a vector, otherwise it skips it. That's why it almost never works for "small" data sets like a 2d mesh. > Second, I am interested in the dynamic pressure field and wondered how best to retrieve it from my (incompressible) models. For such models, I assume it can be computed by subtracting the horizontally-constant static pressure from the pressure outputted (the 'nonadiabatic pressure' output variable). Is this something that should be done in Paraview, or by writing a new post-processor plugin?> > Thanks in advance for any input! 
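In case a script version of that workflow is useful, below is roughly what it looks like in pvpython. The property names are the ones the 5.5-era paraview.simple API used and have changed in later ParaView versions, and the file name, domain corners, resolutions and scale factor are placeholders, so please treat this as a sketch rather than something guaranteed to run unmodified:

    from paraview.simple import *

    # Load the ASPECT output (placeholder path)
    solution = PVDReader(FileName='output/solution.pvd')

    # Uniform grid to resample onto; for a 2d box model a Plane source
    # with its corners set to the domain corners works well
    plane = Plane()
    plane.Origin = [0.0, 0.0, 0.0]
    plane.Point1 = [3.0e6, 0.0, 0.0]
    plane.Point2 = [0.0, 1.0e6, 0.0]
    plane.XResolution = 60
    plane.YResolution = 20

    # Interpolate the adaptive-mesh solution onto the uniform grid
    resampled = ResampleWithDataset(Input=solution, Source=plane)

    # One arrow per resampled point, oriented and scaled by velocity
    glyphs = Glyph(Input=resampled, GlyphType='Arrow')
    glyphs.GlyphMode = 'All Points'   # the resampled grid is already uniform
    glyphs.Vectors = ['POINTS', 'velocity']
    glyphs.ScaleFactor = 5.0e4

    Show(glyphs)
    Render()

The 'velocity' array name is what ASPECT writes into its visualization output, so that part should carry over unchanged.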
> Adam Holt > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > -------------- next part -------------- From adamholt at mit.edu Thu Sep 6 19:30:26 2018 From: adamholt at mit.edu (Adam Holt) Date: Fri, 7 Sep 2018 02:30:26 +0000 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <7333fb72-c18c-de54-cb29-741dcba2a8c1@uoregon.edu> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu>, <7333fb72-c18c-de54-cb29-741dcba2a8c1@uoregon.edu> Message-ID: <8107E67F0A485F42A0545655001D992D97A6BA93@OC11EXPO26.exchange.mit.edu> Thanks Jonathan! After upgrading to the newer Paraview (5.5), the filter works nicely. Again, it didn't do the time-stepping for version 5.2, so I guess the issue is with the older version. Good to know. And totally agree about the "uniform spatial distribution" - It took me a while to figure out why this was not showing glyphs for my 2-D models... many thanks, Adam ________________________________________ From: Aspect-devel [aspect-devel-bounces at geodynamics.org] on behalf of Jonathan Perry-Houts [jperryh2 at uoregon.edu] Sent: Thursday, September 06, 2018 7:41 PM To: aspect-devel at geodynamics.org Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure On 09/06/18 12:25, Adam Holt wrote: > Hi all! > > I am a relatively new user of ASPECT, and have ran into some tricky things related to visualizing/post-processing ASPECT output using Paraview. I wondered if anyone had experience with the following: > > First, plotting evenly spaced velocity vectors for models with mesh refinement. As the element sizes vary dramatically, plotting vectors using "Every Nth Point" (Paraview Glyph option) produces very uneven vector coverage. I thought I had solved this by interpolating the data onto a plane (using Paraview function "Resample with dataset"), but the result of this function (an evenly spaced velocity field) does not time evolve with the simulation (at least for Paraview 5.2.0). Has anybody ran into a similar issue? That seems to work in Paraview 5.5.0 (it does evolve with the simulation for me). Not sure if it matters, but I used a "Plane" source as the uniform grid to Resample on. I attached a custom filter that works for me (import it with Tools>Manage Custom Filters>Import). It's ridiculous that the glyph filter's "uniform spatial distribution" option doesn't do this automatically. That's exactly what I would expect it to do. Apparently it just selects uniformly spaced points, and if there happens to be a node there, it will plot a vector, otherwise it skips it. That's why it almost never works for "small" data sets like a 2d mesh. > Second, I am interested in the dynamic pressure field and wondered how best to retrieve it from my (incompressible) models. For such models, I assume it can be computed by subtracting the horizontally-constant static pressure from the pressure outputted (the 'nonadiabatic pressure' output variable). Is this something that should be done in Paraview, or by writing a new post-processor plugin?> > Thanks in advance for any input! 
> Adam Holt > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > From sking07 at vt.edu Thu Sep 6 19:18:46 2018 From: sking07 at vt.edu (Scott King) Date: Thu, 6 Sep 2018 22:18:46 -0400 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: <3be46824-9777-6b5a-881c-005fea029d3f@mailbox.org> References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> <3be46824-9777-6b5a-881c-005fea029d3f@mailbox.org> Message-ID: <5FCBA80B-5D66-44DD-8489-185694449DE3@vt.edu> Great! Glad I could contribute instead of being my usual grumpy PITA. We’re all set up to run Zhong cases as quickly as we can get them through the queues. Best Scott Sent from my iPhone > On Sep 6, 2018, at 10:08 PM, Rene Gassmoeller wrote: > > Hi Scott, > > very interesting, thanks for sharing that thought! That looks like a significant improvement for the heat flux postprocessors. You were right, the changes to the postprocessor were not very complicated, and I will open a pull request for them tomorrow, when I have cleaned up a few things. I just wanted to share some first results with you: > > These are the original convergence studies for the Blankenbach 1a case with ASPECT: > > # Nu Vrms name (refinement level): > 4.78661864e+00 4.34590432e+01 case1a_ref4.stat > 4.87927972e+00 4.29377468e+01 case1a_ref5.stat > 4.88993106e+00 4.28733838e+01 case1a_ref6.stat > 4.88680525e+00 4.28659548e+01 case1a_ref7.stat > 4.88440900e+00 4.28649470e+01 case1a_reference.stat > > Both Nu and Vrms converge, but rather slowly for the very low Rayleigh number (10^4). Below are the values with Wolfgang's improvements in pull request 2650 (taking the max of artificial diffusion and physical diffusion instead of the sum): > > # Nu Vrms name (refinement level): > 5.30885322e+00 4.28499932e+01 case1a_ref3.stat > 5.06735289e+00 4.28656773e+01 case1a_ref4.stat > 4.93712396e+00 4.28650353e+01 case1a_ref5.stat > 4.88440900e+00 4.28649470e+01 case1a_reference.stat > > As you can see the Vrms is now much closer to the reference value already at low resolutions (even at refinement level 3, which is only 8x8 cells). But the Nusselt number is now worse, and converging from above the reference value instead of from below. With your suggested improvements to the postprocessors (taking the volume averaged total heat flux in the boundary cell, instead of the conductive heat flux at the surface): > > # Nu Vrms name (refinement level): > 4.89728221e+00 4.28499932e+01 case1a_ref3.stat > 4.88535143e+00 4.28656773e+01 case1a_ref4.stat > 4.88443365e+00 4.28650353e+01 case1a_ref5.stat > 4.88440900e+00 4.28649470e+01 case1a_reference.stat > The Vrms is not affected, because it is only a change in the postprocessor, but now the Nu number is significantly closer to the reference value even at low resolutions. All in all, we now get a better accuracy with a 16x16 grid, than with a 128x128 grid before the changes. I would say that is progress :-). > The other Blankenbach cases show similar improvements (still running though), and I have not yet tested the behavior for other geometries, but I do not think there is a conceptual problem. 
I will not have time to do much more benchmarking, because I am traveling from the end of next week on, but do you think you or Grant would have some time to give a few of the cases of the Zhong 2008 paper another try once the changes are in the main version? > > Thanks again for the reference! > > Best, > Rene > >> On 09/05/2018 04:47 PM, Scott King wrote: >> >> As for calculating fluxes at the boundaries, I looked at the heat flux code a bit and I’m wondering... I will share this paper with you all. >> >> https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1365-246X.1987.tb01375.x >> >> It might be less relevant to second-order elements than linear elements but, a lot of the same arguments I’m seeing in the posts the last few days are bringing back memories. This is what is done by default in CitcomS (unless you explicitly call the CFB method). I suspect it should be fairly trivial for someone who is good with deal.ii to implement because it is pretty standard finite element stuff, i.e., calculate fluxes at the internal integration points and project the values to the nodes. >> >> Yes, the artificial diffusivity is an issue but I think this explains why even when we turn it off we get relatively poor Nusselt numbers while getting excellent agreement with depth-averaged properties and mean values. >> >> Scott >> >>> On Sep 5, 2018, at 2:56 PM, Max Rudolph wrote: >>> >>> OK, you are right that there will always be some region with more artificial than physical heat transport. >>> What about instead looking at the ratio of physical to artificial heat flux through each boundary? >>> For Rene's anisotropic "SUEV" implementation, even in the presence of large entropy viscosity, the artificial heat transport can be very small as long as u.gradT is small. In particular, even though entropy viscosity is fairly large at the boundaries, the velocities are tangential to the boundary, so there is very little artificial diffusion. >>> >>> Max >>> >>>> On Wed, Sep 5, 2018 at 10:41 AM Wolfgang Bangerth wrote: >>>> On 09/05/2018 07:12 AM, Max Rudolph wrote: >>>> > >>>> > Rene and I discussed this idea on Monday and I don't think that this is >>>> > the right thing to do. It would lead to an unexpected relationship >>>> > between the temperature gradient (and hence temperature structure of the >>>> > lithosphere) and the physical thermal conductivity. Maybe more helpful >>>> > would be a separate output of the non-physical contribution to the heat >>>> > flux through each boundary, or within the entire domain as the ratio of >>>> > the norm of the artificial heat flux divided by the norm of the total >>>> > heat flux. I still think that a warning message when this quantity >>>> > exceeds, say, 1% would help users understand that they should expect >>>> > unphysical results. >>>> >>>> But this warning message would be printed on pretty much every single >>>> simulation in which the mesh does not completely resolve boundary and >>>> internal layers -- which is essentially every simulation ever done in >>>> the field of mantle convection. >>>> >>>> If it was a rare occasion where artificial viscosity is needed to make a >>>> simulation stable, then we wouldn't use it. But the reality is that all >>>> realistic global-scale simulations must necessarily have some kind of >>>> artificial diffusion (SUPG, EV, dG schemes, ...) 
that is larger than the >>>> physical diffusion at least in parts of the domain because resolving the >>>> boundary layers is not possible on a global scale and will not be >>>> possible for a long time to come. The idea of artificial diffusion >>>> schemes is to make boundary layers as large as the cells of the mesh so >>>> that they are resolved, rather than leading to over/undershoots. It is >>>> *needed* to avoid Gibb's phenomenon if you can't make the mesh small enough. >>>> >>>> That does not mean that (i) the scheme we currently use is the best >>>> idea, (ii) we can't improve the situation. But I do not think that >>>> printing a warning for essentially every single simulation is useful. >>>> >>>> (I'll note that we also use artificial diffusion schemes for the >>>> compositional fields for which the physical diffusion is zero -- so the >>>> artificial diffusion is *always* larger than the physical one.) >>>> >>>> Best >>>> W. >>>> >>>> -- >>>> ------------------------------------------------------------------------ >>>> Wolfgang Bangerth email: bangerth at colostate.edu >>>> www: http://www.math.colostate.edu/~bangerth/ >>>> _______________________________________________ >>>> Aspect-devel mailing list >>>> Aspect-devel at geodynamics.org >>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>> _______________________________________________ >>> Aspect-devel mailing list >>> Aspect-devel at geodynamics.org >>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >> >> >> >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > -- > Rene Gassmoeller > https://gassmoeller.github.io/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From bangerth at colostate.edu Thu Sep 6 20:20:50 2018 From: bangerth at colostate.edu (Wolfgang Bangerth) Date: Thu, 6 Sep 2018 21:20:50 -0600 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> Message-ID: <379937f1-71f5-d65a-80e8-de0c702cba95@colostate.edu> On 09/06/2018 01:25 PM, Adam Holt wrote: > Second, I am interested in the dynamic pressure field and wondered how best > to retrieve it from my (incompressible) models. For such models, I assume > it can be computed by subtracting the horizontally-constant static pressure > from the pressure outputted (the 'nonadiabatic pressure' output variable). I think that the "nonadiabatic pressure" is exactly the "dynamic pressure" you want: it's the total pressure minus the adiabatic pressure. So there isn't a need to write anything new, you just need to specify "nonadiabatic pressure" in the list of visualization variables. Though it is true that the adiabatic pressure is only computed once at the beginning of a simulation, and is not equal to the horizontally averaged pressure. There are, in other words, multiple ways to define a "dynamic pressure", and you need to specify which one exactly you want in order to figure out whether there is already an existing visualization postprocessor. Cheers W. 
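(Written out, what the postprocessor gives you is

    p_nonadiabatic(x,t) = p(x,t) - p_adiabatic(depth),

where p_adiabatic is the reference profile ASPECT computes when the model is initialized, whereas the other common choice of a "dynamic pressure" would be

    p_dyn(x,t) = p(x,t) - <p>(depth,t),

with <p> the laterally averaged pressure at each depth and time step. For an incompressible model with a consistent reference profile the two are close but not identical, which is the distinction being made here.)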
-- ------------------------------------------------------------------------ Wolfgang Bangerth email: bangerth at colostate.edu www: http://www.math.colostate.edu/~bangerth/ From adamholt at mit.edu Fri Sep 7 10:42:41 2018 From: adamholt at mit.edu (Adam Holt) Date: Fri, 7 Sep 2018 17:42:41 +0000 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <379937f1-71f5-d65a-80e8-de0c702cba95@colostate.edu> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu>, <379937f1-71f5-d65a-80e8-de0c702cba95@colostate.edu> Message-ID: <8107E67F0A485F42A0545655001D992D97A6BB4E@OC11EXPO26.exchange.mit.edu> Thanks much for the explanation! I'm interested in the component of the pressure that drives flow, so yes it sounds like this "nonadiabatic pressure". (For my incompressible models, I'm assuming this is just total minus hydrostatic pressure.) I was confused because the "nonadiabatic pressure" output is dominated by a depth-dependent 1-D trend. I'm assuming this is a numerical/normalization thing, and I just need to remove the horizontal average at all depths. many thanks, Adam ________________________________________ From: Aspect-devel [aspect-devel-bounces at geodynamics.org] on behalf of Wolfgang Bangerth [bangerth at colostate.edu] Sent: Thursday, September 06, 2018 11:20 PM To: aspect-devel at geodynamics.org Subject: Re: [aspect-devel] Evenly spaced vectors, dynamic pressure On 09/06/2018 01:25 PM, Adam Holt wrote: > Second, I am interested in the dynamic pressure field and wondered how best > to retrieve it from my (incompressible) models. For such models, I assume > it can be computed by subtracting the horizontally-constant static pressure > from the pressure outputted (the 'nonadiabatic pressure' output variable). I think that the "nonadiabatic pressure" is exactly the "dynamic pressure" you want: it's the total pressure minus the adiabatic pressure. So there isn't a need to write anything new, you just need to specify "nonadiabatic pressure" in the list of visualization variables. Though it is true that the adiabatic pressure is only computed once at the beginning of a simulation, and is not equal to the horizontally averaged pressure. There are, in other words, multiple ways to define a "dynamic pressure", and you need to specify which one exactly you want in order to figure out whether there is already an existing visualization postprocessor. Cheers W. 
-- ------------------------------------------------------------------------ Wolfgang Bangerth email: bangerth at colostate.edu www: http://www.math.colostate.edu/~bangerth/ _______________________________________________ Aspect-devel mailing list Aspect-devel at geodynamics.org http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel From rene.gassmoeller at mailbox.org Fri Sep 7 15:50:15 2018 From: rene.gassmoeller at mailbox.org (Rene Gassmoeller) Date: Fri, 7 Sep 2018 15:50:15 -0700 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <8107E67F0A485F42A0545655001D992D97A6BB4E@OC11EXPO26.exchange.mit.edu> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> <379937f1-71f5-d65a-80e8-de0c702cba95@colostate.edu> <8107E67F0A485F42A0545655001D992D97A6BB4E@OC11EXPO26.exchange.mit.edu> Message-ID: <58cbf659-4ad5-b426-31c1-0ce26a013ff7@mailbox.org> Hi Adam, If your nonadiabatic pressure is dominated by a 1D trend that likely means the reference profile that is computed by ASPECT in the initialization is not quite consistent with the actual pressure solution in your model. You can control the reference profile that is computed by setting the following parameters in the parameter file: set Surface pressure = 0 set Adiabatic surface temperature = 0 (change to whatever temperature you use as reference temperature in the material model) That might not totally eliminate the deviation between reference profile and hydrostatic pressure, but it should help a great deal. Best, Rene On 09/07/2018 10:42 AM, Adam Holt wrote: > Thanks much for the explanation! I'm interested in the component of the pressure that drives flow, so yes it sounds like this "nonadiabatic pressure". (For my incompressible models, I'm assuming this is just total minus hydrostatic pressure.) > > I was confused because the "nonadiabatic pressure" output is dominated by a depth-dependent 1-D trend. I'm assuming this is a numerical/normalization thing, and I just need to remove the horizontal average at all depths. > > many thanks, > Adam > > > ________________________________________ > From: Aspect-devel [aspect-devel-bounces at geodynamics.org] on behalf of Wolfgang Bangerth [bangerth at colostate.edu] > Sent: Thursday, September 06, 2018 11:20 PM > To: aspect-devel at geodynamics.org > Subject: Re: [aspect-devel] Evenly spaced vectors, dynamic pressure > > On 09/06/2018 01:25 PM, Adam Holt wrote: >> Second, I am interested in the dynamic pressure field and wondered how best >> to retrieve it from my (incompressible) models. For such models, I assume >> it can be computed by subtracting the horizontally-constant static pressure >> from the pressure outputted (the 'nonadiabatic pressure' output variable). > I think that the "nonadiabatic pressure" is exactly the "dynamic pressure" you > want: it's the total pressure minus the adiabatic pressure. So there isn't a > need to write anything new, you just need to specify "nonadiabatic pressure" > in the list of visualization variables. > > Though it is true that the adiabatic pressure is only computed once at the > beginning of a simulation, and is not equal to the horizontally averaged > pressure. There are, in other words, multiple ways to define a "dynamic > pressure", and you need to specify which one exactly you want in order to > figure out whether there is already an existing visualization postprocessor. > > Cheers > W. 
> > -- > ------------------------------------------------------------------------ > Wolfgang Bangerth email: bangerth at colostate.edu > www: http://www.math.colostate.edu/~bangerth/ > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -- Rene Gassmoeller https://gassmoeller.github.io/ From bangerth at colostate.edu Fri Sep 7 16:11:05 2018 From: bangerth at colostate.edu (Wolfgang Bangerth) Date: Fri, 7 Sep 2018 17:11:05 -0600 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <8107E67F0A485F42A0545655001D992D97A6BB4E@OC11EXPO26.exchange.mit.edu> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> <379937f1-71f5-d65a-80e8-de0c702cba95@colostate.edu> <8107E67F0A485F42A0545655001D992D97A6BB4E@OC11EXPO26.exchange.mit.edu> Message-ID: On 09/07/2018 11:42 AM, Adam Holt wrote: > Thanks much for the explanation! I'm interested in the component of > the pressure that drives flow, so yes it sounds like this > "nonadiabatic pressure". (For my incompressible models, I'm assuming > this is just total minus hydrostatic pressure.) The problem is that there is no one "pressure that drives flow", or "dynamic pressure". You can define it in many ways -- basically, you can think of the dynamic pressure as the total pressure minus a "static" pressure p_s where p_s is chosen so that it does not result in any flow or mass movement. (Mathematically, p_s equals a gravity potential -- any gravity potential --, then it does not result in any flow.) The problem is that you can choose p_s in many different ways. For example, if you choose p_s to be the static, 1d (depth dependent) pressure for *any* stable stratification where the density increases with depth, then that satisfies the conditions. For this density profile, you can choose the adiabatic profile, or the laterally averaged density at any given time. The point is that since p_d = p_total - p_s and because p_s can be chosen in different ways, there are different ways to define p_d all of which can be considered to have a component that "drives the flow". The postprocessor mentioned above defines one particular way of choosing it. Best W. -- ------------------------------------------------------------------------ Wolfgang Bangerth email: bangerth at colostate.edu www: http://www.math.colostate.edu/~bangerth/ From adamholt at mit.edu Sat Sep 8 16:37:07 2018 From: adamholt at mit.edu (Adam Holt) Date: Sat, 8 Sep 2018 23:37:07 +0000 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <58cbf659-4ad5-b426-31c1-0ce26a013ff7@mailbox.org> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> <379937f1-71f5-d65a-80e8-de0c702cba95@colostate.edu> <8107E67F0A485F42A0545655001D992D97A6BB4E@OC11EXPO26.exchange.mit.edu>, <58cbf659-4ad5-b426-31c1-0ce26a013ff7@mailbox.org> Message-ID: <8107E67F0A485F42A0545655001D992D97A6BC01@OC11EXPO26.exchange.mit.edu> Great! This indeed gives a reference profile closer to hydrostatic (resulting in lateral pressure gradients being clear in the "nonadiabatic pressure"). 
Thanks for the tip Rene, and thanks Wolfgang for the clarification (previous email) -Adam ________________________________________ From: Aspect-devel [aspect-devel-bounces at geodynamics.org] on behalf of Rene Gassmoeller [rene.gassmoeller at mailbox.org] Sent: Friday, September 07, 2018 6:50 PM To: aspect-devel at geodynamics.org Subject: Re: [aspect-devel] Evenly spaced vectors, dynamic pressure Hi Adam, If your nonadiabatic pressure is dominated by a 1D trend that likely means the reference profile that is computed by ASPECT in the initialization is not quite consistent with the actual pressure solution in your model. You can control the reference profile that is computed by setting the following parameters in the parameter file: set Surface pressure = 0 set Adiabatic surface temperature = 0 (change to whatever temperature you use as reference temperature in the material model) That might not totally eliminate the deviation between reference profile and hydrostatic pressure, but it should help a great deal. Best, Rene On 09/07/2018 10:42 AM, Adam Holt wrote: > Thanks much for the explanation! I'm interested in the component of the pressure that drives flow, so yes it sounds like this "nonadiabatic pressure". (For my incompressible models, I'm assuming this is just total minus hydrostatic pressure.) > > I was confused because the "nonadiabatic pressure" output is dominated by a depth-dependent 1-D trend. I'm assuming this is a numerical/normalization thing, and I just need to remove the horizontal average at all depths. > > many thanks, > Adam > > > ________________________________________ > From: Aspect-devel [aspect-devel-bounces at geodynamics.org] on behalf of Wolfgang Bangerth [bangerth at colostate.edu] > Sent: Thursday, September 06, 2018 11:20 PM > To: aspect-devel at geodynamics.org > Subject: Re: [aspect-devel] Evenly spaced vectors, dynamic pressure > > On 09/06/2018 01:25 PM, Adam Holt wrote: >> Second, I am interested in the dynamic pressure field and wondered how best >> to retrieve it from my (incompressible) models. For such models, I assume >> it can be computed by subtracting the horizontally-constant static pressure >> from the pressure outputted (the 'nonadiabatic pressure' output variable). > I think that the "nonadiabatic pressure" is exactly the "dynamic pressure" you > want: it's the total pressure minus the adiabatic pressure. So there isn't a > need to write anything new, you just need to specify "nonadiabatic pressure" > in the list of visualization variables. > > Though it is true that the adiabatic pressure is only computed once at the > beginning of a simulation, and is not equal to the horizontally averaged > pressure. There are, in other words, multiple ways to define a "dynamic > pressure", and you need to specify which one exactly you want in order to > figure out whether there is already an existing visualization postprocessor. > > Cheers > W. 
> > -- > ------------------------------------------------------------------------ > Wolfgang Bangerth email: bangerth at colostate.edu > www: http://www.math.colostate.edu/~bangerth/ > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -- Rene Gassmoeller https://gassmoeller.github.io/ _______________________________________________ Aspect-devel mailing list Aspect-devel at geodynamics.org http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel From mark.brandon at yale.edu Sun Sep 9 09:04:11 2018 From: mark.brandon at yale.edu (Mark Brandon) Date: Sun, 9 Sep 2018 12:04:11 -0400 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <8107E67F0A485F42A0545655001D992D97A6BC01@OC11EXPO26.exchange.mit.edu> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> <379937f1-71f5-d65a-80e8-de0c702cba95@colostate.edu> <8107E67F0A485F42A0545655001D992D97A6BB4E@OC11EXPO26.exchange.mit.edu> <58cbf659-4ad5-b426-31c1-0ce26a013ff7@mailbox.org> <8107E67F0A485F42A0545655001D992D97A6BC01@OC11EXPO26.exchange.mit.edu> Message-ID: <2D8C5651-DC92-445F-A2E2-06357222D591@yale.edu> I am a reader of this exchange (and learning a lot in the process). I have one small comment: diabatic means with loss of heat, adiabatic means no loss of heat. As a result, nonadiabatic is a double negative. Maybe that is why it typically shown in quotes in this discussion. Back to reading mode.... Best, Mark > On Sep 8, 2018, at 7:37 PM, Adam Holt wrote: > > Great! This indeed gives a reference profile closer to hydrostatic (resulting in lateral pressure gradients being clear in the "nonadiabatic pressure"). > > Thanks for the tip Rene, and thanks Wolfgang for the clarification (previous email) > > -Adam > > ________________________________________ > From: Aspect-devel [aspect-devel-bounces at geodynamics.org] on behalf of Rene Gassmoeller [rene.gassmoeller at mailbox.org] > Sent: Friday, September 07, 2018 6:50 PM > To: aspect-devel at geodynamics.org > Subject: Re: [aspect-devel] Evenly spaced vectors, dynamic pressure > > Hi Adam, > > If your nonadiabatic pressure is dominated by a 1D trend that likely > means the reference profile that is computed by ASPECT in the > initialization is not quite consistent with the actual pressure solution > in your model. You can control the reference profile that is computed by > setting the following parameters in the parameter file: > > set Surface pressure = 0 > set Adiabatic surface temperature = 0 (change to whatever temperature > you use as reference temperature in the material model) > > That might not totally eliminate the deviation between reference profile > and hydrostatic pressure, but it should help a great deal. > > Best, > > Rene > > > On 09/07/2018 10:42 AM, Adam Holt wrote: >> Thanks much for the explanation! I'm interested in the component of the pressure that drives flow, so yes it sounds like this "nonadiabatic pressure". (For my incompressible models, I'm assuming this is just total minus hydrostatic pressure.) >> >> I was confused because the "nonadiabatic pressure" output is dominated by a depth-dependent 1-D trend. 
I'm assuming this is a numerical/normalization thing, and I just need to remove the horizontal average at all depths. >> >> many thanks, >> Adam >> >> >> ________________________________________ >> From: Aspect-devel [aspect-devel-bounces at geodynamics.org] on behalf of Wolfgang Bangerth [bangerth at colostate.edu] >> Sent: Thursday, September 06, 2018 11:20 PM >> To: aspect-devel at geodynamics.org >> Subject: Re: [aspect-devel] Evenly spaced vectors, dynamic pressure >> >> On 09/06/2018 01:25 PM, Adam Holt wrote: >>> Second, I am interested in the dynamic pressure field and wondered how best >>> to retrieve it from my (incompressible) models. For such models, I assume >>> it can be computed by subtracting the horizontally-constant static pressure >>> from the pressure outputted (the 'nonadiabatic pressure' output variable). >> I think that the "nonadiabatic pressure" is exactly the "dynamic pressure" you >> want: it's the total pressure minus the adiabatic pressure. So there isn't a >> need to write anything new, you just need to specify "nonadiabatic pressure" >> in the list of visualization variables. >> >> Though it is true that the adiabatic pressure is only computed once at the >> beginning of a simulation, and is not equal to the horizontally averaged >> pressure. There are, in other words, multiple ways to define a "dynamic >> pressure", and you need to specify which one exactly you want in order to >> figure out whether there is already an existing visualization postprocessor. >> >> Cheers >> W. >> >> -- >> ------------------------------------------------------------------------ >> Wolfgang Bangerth email: bangerth at colostate.edu >> www: http://www.math.colostate.edu/~bangerth/ >> >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > -- > Rene Gassmoeller > https://gassmoeller.github.io/ > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel From bangerth at colostate.edu Mon Sep 10 13:21:09 2018 From: bangerth at colostate.edu (Wolfgang Bangerth) Date: Mon, 10 Sep 2018 14:21:09 -0600 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <2D8C5651-DC92-445F-A2E2-06357222D591@yale.edu> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> <379937f1-71f5-d65a-80e8-de0c702cba95@colostate.edu> <8107E67F0A485F42A0545655001D992D97A6BB4E@OC11EXPO26.exchange.mit.edu> <58cbf659-4ad5-b426-31c1-0ce26a013ff7@mailbox.org> <8107E67F0A485F42A0545655001D992D97A6BC01@OC11EXPO26.exchange.mit.edu> <2D8C5651-DC92-445F-A2E2-06357222D591@yale.edu> Message-ID: <3b895550-b0f5-3180-6d3e-ae8a19a60bae@colostate.edu> On 09/09/2018 10:04 AM, Mark Brandon wrote: > I am a reader of this exchange (and learning a lot in the process). I have one small comment: > diabatic means with loss of heat, adiabatic means no loss of heat. As a result, nonadiabatic is a double negative. > Maybe that is why it typically shown in quotes in this discussion. 
Not being much of a thermodynamicist, would it be worth adding the word "diabatic" to the documentation of that postprocessor? Would that help readers? I have to admit that I never thought about this word. I can't say I've ever come across the term "diabatic" -- it seems like it is not often used, and Wikipedia only lists it in the context of the some quantum mechanics things. (A different thought is that we do not want to output the "diabatic" part, however one would define it, but specifically that part that does not results from the adiabatic profile. As such, non-adiabatic is different from diabatic.) Cheers W. -- ------------------------------------------------------------------------ Wolfgang Bangerth email: bangerth at colostate.edu www: http://www.math.colostate.edu/~bangerth/ From rene.gassmoeller at mailbox.org Tue Sep 11 10:03:10 2018 From: rene.gassmoeller at mailbox.org (Rene Gassmoeller) Date: Tue, 11 Sep 2018 10:03:10 -0700 Subject: [aspect-devel] ASPECT online user meeting, Aug 28, 9am PT In-Reply-To: <608819df-ae5d-7ed5-7b76-94866b762a35@mailbox.org> References: <608819df-ae5d-7ed5-7b76-94866b762a35@mailbox.org> Message-ID: <50068250-4d4b-5a34-d88a-2a6c2b91df2f@mailbox.org> Hi all, this is a reminder that we will have our next ASPECT user meeting tomorrow, Sep 12th at 9 am PT. As always the meeting will be available using the Zoom software, at: https://zoom.us/j/875133126 Since we had little time for a discussion of your questions last time, lets set aside some time for that. Also if you are interested in a certain feature of ASPECT, or want to present a particular work related to ASPECT yourself then let us know tomorrow so that we can schedule that for a future meeting. Looking forward to seeing you tomorrow. Best, Rene -- Rene Gassmoeller https://gassmoeller.github.io/ From rene.gassmoeller at mailbox.org Tue Sep 11 11:44:27 2018 From: rene.gassmoeller at mailbox.org (Rene Gassmoeller) Date: Tue, 11 Sep 2018 11:44:27 -0700 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: <5FCBA80B-5D66-44DD-8489-185694449DE3@vt.edu> References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> <3be46824-9777-6b5a-881c-005fea029d3f@mailbox.org> <5FCBA80B-5D66-44DD-8489-185694449DE3@vt.edu> Message-ID: <34339a20-973a-aadc-721b-14a7f92e4521@mailbox.org> Scott, I have opened the pull request with the improved heat flux computation, and it is available here: https://github.com/geodynamics/aspect/pull/2660 . It already includes the improvements Wolfgang made to the advection stabilization term, so this should be a good version to start the testing. I have run into some issues trying to reproduce compressible benchmark results with the new method though (I explain them in the pull request), I do not suppose you have another magic fix for them at hand? That would be much appreciated :-). Best, Rene On 09/06/2018 07:18 PM, Scott King wrote: > Great!   Glad I could contribute instead of being my usual grumpy PITA. > > We’re all set up to run Zhong cases as quickly as we can get them > through the queues. > > Best > > Scott > > Sent from my iPhone > > On Sep 6, 2018, at 10:08 PM, Rene Gassmoeller > > > wrote: > >> Hi Scott, >> >> very interesting, thanks for sharing that thought! That looks like a >> significant improvement for the heat flux postprocessors. 
You were >> right, the changes to the postprocessor were not very complicated, >> and I will open a pull request for them tomorrow, when I have cleaned >> up a few things. I just wanted to share some first results with you: >> >> These are the original convergence studies for the Blankenbach 1a >> case with ASPECT: >> >> # Nu                     Vrms                    name (refinement level): >> 4.78661864e+00 4.34590432e+01 case1a_ref4.stat >> 4.87927972e+00 4.29377468e+01 case1a_ref5.stat >> 4.88993106e+00 4.28733838e+01 case1a_ref6.stat >> 4.88680525e+00 4.28659548e+01 case1a_ref7.stat >> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >> >> Both Nu and Vrms converge, but rather slowly for the very low >> Rayleigh number (10^4). Below are the values with Wolfgang's >> improvements in pull request 2650 (taking the max of artificial >> diffusion and physical diffusion instead of the sum): >> >> # Nu                     Vrms                    name (refinement level): >> 5.30885322e+00 4.28499932e+01 case1a_ref3.stat >> 5.06735289e+00 4.28656773e+01 case1a_ref4.stat >> 4.93712396e+00 4.28650353e+01 case1a_ref5.stat >> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >> >> As you can see the Vrms is now much closer to the reference value >> already at low resolutions (even at refinement level 3, which is only >> 8x8 cells). But the Nusselt number is now worse, and converging from >> above the reference value instead of from below. With your suggested >> improvements to the postprocessors (taking the volume averaged total >> heat flux in the boundary cell, instead of the conductive heat flux >> at the surface): >> >> # Nu                     Vrms                    name (refinement level): >> 4.89728221e+00 4.28499932e+01 case1a_ref3.stat >> 4.88535143e+00 4.28656773e+01 case1a_ref4.stat >> 4.88443365e+00 4.28650353e+01 case1a_ref5.stat >> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >> >> The Vrms is not affected, because it is only a change in the >> postprocessor, but now the Nu number is significantly closer to the >> reference value even at low resolutions. All in all, we now get a >> better accuracy with a 16x16 grid, than with a 128x128 grid before >> the changes. I would say that is progress :-). >> The other Blankenbach cases show similar improvements (still running >> though), and I have not yet tested the behavior for other geometries, >> but I do not think there is a conceptual problem. I will not have >> time to do much more benchmarking, because I am traveling from the >> end of next week on, but do you think you or Grant would have some >> time to give a few of the cases of the Zhong 2008 paper another try >> once the changes are in the main version? >> >> Thanks again for the reference! >> >> Best, >> Rene >> >> On 09/05/2018 04:47 PM, Scott King wrote: >>> >>> As for calculating fluxes at the boundaries, I looked at the heat >>> flux code a bit and I’m wondering...  I will share this paper with >>> you all. >>> >>> https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1365-246X.1987.tb01375.x >>> >>> It might be less relevant to second-order elements than linear >>> elements but, a lot of the same arguments I’m seeing in the posts >>> the last few days are bringing back memories.   This is what is done >>> by default in CitcomS (unless you explicitly call the CFB method).   
>>> I suspect it should be fairly trivial for someone who is good with >>> deal.ii to implement because it is pretty standard finite element >>> stuff, i.e., calculate fluxes at the internal integration points and >>> project the values to the nodes. >>> >>> Yes, the artificial diffusivity is an issue but I think this >>> explains why even when we turn it off we get relatively poor Nusselt >>> numbers while getting excellent agreement with depth-averaged >>> properties and mean values. >>> >>> Scott >>> >>>> On Sep 5, 2018, at 2:56 PM, Max Rudolph >>> > wrote: >>>> >>>> OK, you are right that there will always be some region with more >>>> artificial than physical heat transport. >>>> What about instead looking at the ratio of physical to artificial >>>> heat flux through each boundary? >>>> For Rene's anisotropic "SUEV" implementation, even in the presence >>>> of large entropy viscosity, the artificial heat transport can be >>>> very small as long as u.gradT is small. In particular, even though >>>> entropy viscosity is fairly large at the boundaries, the velocities >>>> are tangential to the boundary, so there is very little artificial >>>> diffusion. >>>> >>>> Max >>>> >>>> On Wed, Sep 5, 2018 at 10:41 AM Wolfgang Bangerth >>>> > wrote: >>>> >>>> On 09/05/2018 07:12 AM, Max Rudolph wrote: >>>> > >>>> > Rene and I discussed this idea on Monday and I don't think >>>> that this is >>>> > the right thing to do. It would lead to an unexpected >>>> relationship >>>> > between the temperature gradient (and hence temperature >>>> structure of the >>>> > lithosphere) and the physical thermal conductivity. Maybe >>>> more helpful >>>> > would be a separate output of the non-physical contribution >>>> to the heat >>>> > flux through each boundary, or within the entire domain as >>>> the ratio of >>>> > the norm of the artificial heat flux divided by the norm of >>>> the total >>>> > heat flux. I still think that a warning message when this >>>> quantity >>>> > exceeds, say, 1% would help users understand that they should >>>> expect >>>> > unphysical results. >>>> >>>> But this warning message would be printed on pretty much every >>>> single >>>> simulation in which the mesh does not completely resolve >>>> boundary and >>>> internal layers -- which is essentially every simulation ever >>>> done in >>>> the field of mantle convection. >>>> >>>> If it was a rare occasion where artificial viscosity is needed >>>> to make a >>>> simulation stable, then we wouldn't use it. But the reality is >>>> that all >>>> realistic global-scale simulations must necessarily have some >>>> kind of >>>> artificial diffusion (SUPG, EV, dG schemes, ...) that is larger >>>> than the >>>> physical diffusion at least in parts of the domain because >>>> resolving the >>>> boundary layers is not possible on a global scale and will not be >>>> possible for a long time to come. The idea of artificial diffusion >>>> schemes is to make boundary layers as large as the cells of the >>>> mesh so >>>> that they are resolved, rather than leading to >>>> over/undershoots. It is >>>> *needed* to avoid Gibb's phenomenon if you can't make the mesh >>>> small enough. >>>> >>>> That does not mean that (i) the scheme we currently use is the >>>> best >>>> idea, (ii) we can't improve the situation. But I do not think that >>>> printing a warning for essentially every single simulation is >>>> useful. 
>>>> >>>> (I'll note that we also use artificial diffusion schemes for the >>>> compositional fields for which the physical diffusion is zero >>>> -- so the >>>> artificial diffusion is *always* larger than the physical one.) >>>> >>>> Best >>>>   W. >>>> >>>> -- >>>> ------------------------------------------------------------------------ >>>> Wolfgang Bangerth          email: bangerth at colostate.edu >>>> >>>>                             www: >>>> http://www.math.colostate.edu/~bangerth/ >>>> >>>> _______________________________________________ >>>> Aspect-devel mailing list >>>> Aspect-devel at geodynamics.org >>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>>> >>>> _______________________________________________ >>>> Aspect-devel mailing list >>>> Aspect-devel at geodynamics.org >>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>> >>> >>> >>> _______________________________________________ >>> Aspect-devel mailing list >>> Aspect-devel at geodynamics.org >>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >> >> -- >> Rene Gassmoeller >> https://gassmoeller.github.io/ -- Rene Gassmoeller https://gassmoeller.github.io/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From rene.gassmoeller at mailbox.org Tue Sep 11 12:00:17 2018 From: rene.gassmoeller at mailbox.org (Rene Gassmoeller) Date: Tue, 11 Sep 2018 19:00:17 -0000 Subject: [aspect-devel] ASPECT Newsletter #63 Message-ID: <20180911185142.19B6EAC1ED9@geodynamics.org> Hello everyone! This is ASPECT newsletter #63. It automatically reports recently merged features and discussions about the ASPECT mantle convection code. ## Below you find a list of recently proposed or merged features: #2660: Average heat flux in boundary cells for more accurate heat flux computation (proposed by gassmoeller) https://github.com/geodynamics/aspect/pull/2660 #2659: Unify heat flux computation (proposed by gassmoeller) https://github.com/geodynamics/aspect/pull/2659 #2656: [WIP] Add phase changes and compressibility to visco plastic material model (proposed by naliboff) https://github.com/geodynamics/aspect/pull/2656 #2654: Clarify what the 'nonadiabatic pressure' viz postprocessor does. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/2654 #2653: Mark up the 'dx' in integrals. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/2653 #2652: Provide formulas for the material-statistics postprocessor. (proposed by bangerth) https://github.com/geodynamics/aspect/pull/2652 #2651: Better explain what kinds of checks run with --validate. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/2651 #2650: Use the max of physical and artificial viscosity. (proposed by bangerth) https://github.com/geodynamics/aspect/pull/2650 #2649: Only apply entropy viscosity in streamline direction (proposed by gassmoeller) https://github.com/geodynamics/aspect/pull/2649 #2648: Update a comment. 
(proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/2648 #2647: update parameters (proposed by tjhei; merged) https://github.com/geodynamics/aspect/pull/2647 #2646: Fix plugin graph generation (proposed by tjhei; merged) https://github.com/geodynamics/aspect/pull/2646 #2645: tester: correctly enable warnings as errors: -Werror (proposed by tjhei; merged) https://github.com/geodynamics/aspect/pull/2645 #2644: jenkins: clean workspace (proposed by tjhei; merged) https://github.com/geodynamics/aspect/pull/2644 #2643: Some code cleanup in entropy viscosity (proposed by gassmoeller; merged) https://github.com/geodynamics/aspect/pull/2643 #2642: Fix parallel viscosity smoothing (proposed by gassmoeller; merged) https://github.com/geodynamics/aspect/pull/2642 #2641: Clarify a comment and a formula. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/2641 #2640: Be consistent with punctuation in a list in the manual. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/2640 #2639: Allow variable stabilization parameters for each field (proposed by gassmoeller; merged) https://github.com/geodynamics/aspect/pull/2639 #2638: [WIP] Reimplement supg #412 (proposed by gassmoeller) https://github.com/geodynamics/aspect/pull/2638 #2634: Add parse input table function (proposed by gassmoeller; merged) https://github.com/geodynamics/aspect/pull/2634 #2630: Add material statistics postprocessor (proposed by gassmoeller; merged) https://github.com/geodynamics/aspect/pull/2630 ## And this is a list of recently opened or closed discussions: #2658: How could I get 'composition' in the prescribed_velocity.cc? (opened) https://github.com/geodynamics/aspect/issues/2658 #2657: how do I get stress tensor of some mesh point? (opened) https://github.com/geodynamics/aspect/issues/2657 #2655: 'quick_mpi' fails when testing ('make test') on macOS (opened) https://github.com/geodynamics/aspect/issues/2655 #2637: unstable Q1-DGP0 element? (opened) https://github.com/geodynamics/aspect/issues/2637 #2631: Strain rate tensor visualization postprocessor not registered (closed) https://github.com/geodynamics/aspect/issues/2631 A list of all major changes since the last release can be found at https://aspect.geodynamics.org/doc/doxygen/changes_current.html. Thanks for being part of the community! Let us know about questions, problems, bugs or just share your experience by writing to aspect-devel at geodynamics.org, or by opening issues or pull requests at https://www.github.com/geodynamics/aspect. Additional information can be found at https://aspect.geodynamics.org/, and https://geodynamics.org/cig/software/aspect/. -------------- next part -------------- An HTML attachment was scrubbed... URL: From kneumiller at opendap.org Tue Sep 11 09:18:31 2018 From: kneumiller at opendap.org (Kodi Neumiller) Date: Tue, 11 Sep 2018 10:18:31 -0600 Subject: [aspect-devel] Aspect Plugin Message-ID: Hello, I am writing a plugin for Aspect and I was trying to get a better feeling for how Aspect handles the information in the prm files. What files in Aspect handle most of the data given by the prm files? 
Thank you, Kodi From sdk at vt.edu Tue Sep 11 16:27:40 2018 From: sdk at vt.edu (Scott King) Date: Tue, 11 Sep 2018 19:27:40 -0400 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <3b895550-b0f5-3180-6d3e-ae8a19a60bae@colostate.edu> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> <379937f1-71f5-d65a-80e8-de0c702cba95@colostate.edu> <8107E67F0A485F42A0545655001D992D97A6BB4E@OC11EXPO26.exchange.mit.edu> <58cbf659-4ad5-b426-31c1-0ce26a013ff7@mailbox.org> <8107E67F0A485F42A0545655001D992D97A6BC01@OC11EXPO26.exchange.mit.edu> <2D8C5651-DC92-445F-A2E2-06357222D591@yale.edu> <3b895550-b0f5-3180-6d3e-ae8a19a60bae@colostate.edu> Message-ID: <9CD0E8F8-4F45-4201-B8AB-05C075F0D81B@vt.edu> I’ve never seen the term diabatic used in solid earth, neither have my colleagues who work in high pressure thermodynamics. (I asked around for fun.) I think introducing it in the documentation would cause confusion. I suspect the reason it isn’t used is that the adiabatic profile is (most often) taken to be a reference state and describing departures from the adiabatic reference state as non-adiabatic, while it linguistically might be a double-negative, is a better description of the process (i.e., non-adiabatic = that part of the field that is not adiabatic). A lot of meteorology also uses this terminology for much the same reason. My $0.02 > On Sep 10, 2018, at 4:21 PM, Wolfgang Bangerth wrote: > > On 09/09/2018 10:04 AM, Mark Brandon wrote: >> I am a reader of this exchange (and learning a lot in the process). I have one small comment: >> diabatic means with loss of heat, adiabatic means no loss of heat. As a result, nonadiabatic is a double negative. >> Maybe that is why it typically shown in quotes in this discussion. > > Not being much of a thermodynamicist, would it be worth adding the word "diabatic" to the documentation of that postprocessor? Would that help readers? > > I have to admit that I never thought about this word. I can't say I've ever come across the term "diabatic" -- it seems like it is not often used, and Wikipedia only lists it in the context of the some quantum mechanics things. > > (A different thought is that we do not want to output the "diabatic" part, however one would define it, but specifically that part that does not results from the adiabatic profile. As such, non-adiabatic is different from diabatic.) > > Cheers > W. > > -- > ------------------------------------------------------------------------ > Wolfgang Bangerth email: bangerth at colostate.edu > www: http://www.math.colostate.edu/~bangerth/ > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel From kneumiller at opendap.org Tue Sep 11 15:10:58 2018 From: kneumiller at opendap.org (Kodi Neumiller) Date: Tue, 11 Sep 2018 16:10:58 -0600 Subject: [aspect-devel] Using NetCDF in ASPECT Message-ID: Hi, I was curious as to whether or not Aspect is compatible with NetCDF? I would like to use NetCDF inside of Aspect to access data outside of Aspect and pass it into the aspect executable. 
Thank you, Kodi From bangerth at colostate.edu Tue Sep 11 16:35:27 2018 From: bangerth at colostate.edu (Wolfgang Bangerth) Date: Tue, 11 Sep 2018 17:35:27 -0600 Subject: [aspect-devel] Aspect Plugin In-Reply-To: References: Message-ID: <19e68e39-cec5-7fd7-1c54-e7e31a82517e@colostate.edu> On 09/11/2018 10:18 AM, Kodi Neumiller wrote: > I am writing a plugin for Aspect and I was trying to get a better > feeling for how Aspect handles the information in the prm files. What > files in Aspect handle most of the data given by the prm files? There is a global set of parameters that are handled largely in the file source/simulator/parameters.cc. But then most plugins also define their own parameters, and they declare and read them individually -- take any plugin you like (or that's close to what you want to emulate) and look for the declare_parameters() and parse_parameters() functions in the file that implements the plugin. Let us know if you have concrete questions about what a particular piece of code there does! Best W. -- ------------------------------------------------------------------------ Wolfgang Bangerth email: bangerth at colostate.edu www: http://www.math.colostate.edu/~bangerth/ From sdk at vt.edu Tue Sep 11 17:11:09 2018 From: sdk at vt.edu (Scott King) Date: Tue, 11 Sep 2018 20:11:09 -0400 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: <34339a20-973a-aadc-721b-14a7f92e4521@mailbox.org> References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> <3be46824-9777-6b5a-881c-005fea029d3f@mailbox.org> <5FCBA80B-5D66-44DD-8489-185694449DE3@vt.edu> <34339a20-973a-aadc-721b-14a7f92e4521@mailbox.org> Message-ID: <682519D9-86F2-4C92-8089-3E9E392F1001@vt.edu> Hi Rene; I don’t have any magic but, I can tell you that I use the same Nu calculator for the Blankenbach et al. cases and the EBA, TALA, and ALA cases in King et al., 2010. In fact I use the same code for all four. That makes me think the issue is not how you are calculating flux per se. One thing I wonder about because of how ASPECT deals with density: For the ALA, density should be \rho_{ref} everywhere, except for the buoyancy term. That’s part of the ALA because the ALA assumes you only keep terms to first order in \mu=\alpha\Delta T. I know ASPECT had/has the full density in the energy equation, not \rho_{ref}. I don’t know what you actually use for density in the ALA benchmark problems. Using the full density everywhere would be fully compressible, which would probably be preferable to the ALA for the mantle but, you also need to include the time derivative of \rho in the continuity equation. I went through this carefully with Gary Glatzmaier and Stephane Labrosse at a workshop on compressible convection last year. I always thought that time derivative of density went away with the low-mach number assumption otherwise you would have pressure waves but, the pressure wave problem goes away as soon as you invoke infinite Prandtl number and drop the momentum terms in the Navier-Stokes to become Stokes (according to Gary G.) and you can (and need to) leave the time-varying density term in the continuity equation. Makes sense if you have problems with decaying heat sources, etc. Of course the “benchmark” problems in King et al. 2010 are steady-state problems so the time-varying density term doesn’t matter for them. 
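For reference, the two mass-conservation statements being contrasted here can be written compactly (a sketch in the notation used above, with \rho_{ref}(z) the reference profile; these are generic textbook forms, not a quote from any particular code or paper):

  % ALA: reference density everywhere except in the buoyancy term, and
  % no time derivative of density in the continuity equation
  \nabla \cdot \left( \rho_{ref}(z)\, \mathbf{u} \right) = 0

  % Fully compressible: the time derivative of the full density stays
  % in the continuity equation
  \frac{\partial \rho}{\partial t} + \nabla \cdot \left( \rho\, \mathbf{u} \right) = 0

In the ALA the buoyancy term then keeps only the first-order density perturbation, e.g. \rho' \approx -\rho_{ref}\,\alpha\,(T - T_{ref}), which is where the "first order in \mu=\alpha\Delta T" statement above comes from.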
Glatzmaier 1988 (I think?) has a good description/derivation of the fully compressible problem. I find the derivation in Jarvis and McKenzie is confusing because the also introduce the stream-function vorticity changes while they are non-dimensionalizing the equations and introducing the ALA. It requires pretty much going through it yourself. In my case 2-3 times to purge the mistakes. I think I’m dyslexic. Maybe just old. Cheers, Scott > On Sep 11, 2018, at 2:44 PM, Rene Gassmoeller wrote: > > Scott, > > I have opened the pull request with the improved heat flux computation, and it is available here: https://github.com/geodynamics/aspect/pull/2660 . It already includes the improvements Wolfgang made to the advection stabilization term, so this should be a good version to start the testing. > > I have run into some issues trying to reproduce compressible benchmark results with the new method though (I explain them in the pull request), I do not suppose you have another magic fix for them at hand? That would be much appreciated :-). > > Best, > > Rene > > On 09/06/2018 07:18 PM, Scott King wrote: >> Great! Glad I could contribute instead of being my usual grumpy PITA. >> >> We’re all set up to run Zhong cases as quickly as we can get them through the queues. >> >> Best >> >> Scott >> >> Sent from my iPhone >> >> On Sep 6, 2018, at 10:08 PM, Rene Gassmoeller > wrote: >> >>> Hi Scott, >>> >>> very interesting, thanks for sharing that thought! That looks like a significant improvement for the heat flux postprocessors. You were right, the changes to the postprocessor were not very complicated, and I will open a pull request for them tomorrow, when I have cleaned up a few things. I just wanted to share some first results with you: >>> >>> These are the original convergence studies for the Blankenbach 1a case with ASPECT: >>> >>> # Nu Vrms name (refinement level): >>> 4.78661864e+00 4.34590432e+01 case1a_ref4.stat >>> 4.87927972e+00 4.29377468e+01 case1a_ref5.stat >>> 4.88993106e+00 4.28733838e+01 case1a_ref6.stat >>> 4.88680525e+00 4.28659548e+01 case1a_ref7.stat >>> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >>> >>> Both Nu and Vrms converge, but rather slowly for the very low Rayleigh number (10^4). Below are the values with Wolfgang's improvements in pull request 2650 (taking the max of artificial diffusion and physical diffusion instead of the sum): >>> >>> # Nu Vrms name (refinement level): >>> 5.30885322e+00 4.28499932e+01 case1a_ref3.stat >>> 5.06735289e+00 4.28656773e+01 case1a_ref4.stat >>> 4.93712396e+00 4.28650353e+01 case1a_ref5.stat >>> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >>> >>> As you can see the Vrms is now much closer to the reference value already at low resolutions (even at refinement level 3, which is only 8x8 cells). But the Nusselt number is now worse, and converging from above the reference value instead of from below. With your suggested improvements to the postprocessors (taking the volume averaged total heat flux in the boundary cell, instead of the conductive heat flux at the surface): >>> >>> # Nu Vrms name (refinement level): >>> 4.89728221e+00 4.28499932e+01 case1a_ref3.stat >>> 4.88535143e+00 4.28656773e+01 case1a_ref4.stat >>> 4.88443365e+00 4.28650353e+01 case1a_ref5.stat >>> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >>> The Vrms is not affected, because it is only a change in the postprocessor, but now the Nu number is significantly closer to the reference value even at low resolutions. 
All in all, we now get a better accuracy with a 16x16 grid, than with a 128x128 grid before the changes. I would say that is progress :-). >>> The other Blankenbach cases show similar improvements (still running though), and I have not yet tested the behavior for other geometries, but I do not think there is a conceptual problem. I will not have time to do much more benchmarking, because I am traveling from the end of next week on, but do you think you or Grant would have some time to give a few of the cases of the Zhong 2008 paper another try once the changes are in the main version? >>> >>> Thanks again for the reference! >>> >>> Best, >>> Rene >>> >>> On 09/05/2018 04:47 PM, Scott King wrote: >>>> >>>> As for calculating fluxes at the boundaries, I looked at the heat flux code a bit and I’m wondering... I will share this paper with you all. >>>> >>>> https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1365-246X.1987.tb01375.x >>>> >>>> It might be less relevant to second-order elements than linear elements but, a lot of the same arguments I’m seeing in the posts the last few days are bringing back memories. This is what is done by default in CitcomS (unless you explicitly call the CFB method). I suspect it should be fairly trivial for someone who is good with deal.ii to implement because it is pretty standard finite element stuff, i.e., calculate fluxes at the internal integration points and project the values to the nodes. >>>> >>>> Yes, the artificial diffusivity is an issue but I think this explains why even when we turn it off we get relatively poor Nusselt numbers while getting excellent agreement with depth-averaged properties and mean values. >>>> >>>> Scott >>>> >>>>> On Sep 5, 2018, at 2:56 PM, Max Rudolph > wrote: >>>>> >>>>> OK, you are right that there will always be some region with more artificial than physical heat transport. >>>>> What about instead looking at the ratio of physical to artificial heat flux through each boundary? >>>>> For Rene's anisotropic "SUEV" implementation, even in the presence of large entropy viscosity, the artificial heat transport can be very small as long as u.gradT is small. In particular, even though entropy viscosity is fairly large at the boundaries, the velocities are tangential to the boundary, so there is very little artificial diffusion. >>>>> >>>>> Max >>>>> >>>>> On Wed, Sep 5, 2018 at 10:41 AM Wolfgang Bangerth > wrote: >>>>> On 09/05/2018 07:12 AM, Max Rudolph wrote: >>>>> > >>>>> > Rene and I discussed this idea on Monday and I don't think that this is >>>>> > the right thing to do. It would lead to an unexpected relationship >>>>> > between the temperature gradient (and hence temperature structure of the >>>>> > lithosphere) and the physical thermal conductivity. Maybe more helpful >>>>> > would be a separate output of the non-physical contribution to the heat >>>>> > flux through each boundary, or within the entire domain as the ratio of >>>>> > the norm of the artificial heat flux divided by the norm of the total >>>>> > heat flux. I still think that a warning message when this quantity >>>>> > exceeds, say, 1% would help users understand that they should expect >>>>> > unphysical results. >>>>> >>>>> But this warning message would be printed on pretty much every single >>>>> simulation in which the mesh does not completely resolve boundary and >>>>> internal layers -- which is essentially every simulation ever done in >>>>> the field of mantle convection. 
>>>>> >>>>> If it was a rare occasion where artificial viscosity is needed to make a >>>>> simulation stable, then we wouldn't use it. But the reality is that all >>>>> realistic global-scale simulations must necessarily have some kind of >>>>> artificial diffusion (SUPG, EV, dG schemes, ...) that is larger than the >>>>> physical diffusion at least in parts of the domain because resolving the >>>>> boundary layers is not possible on a global scale and will not be >>>>> possible for a long time to come. The idea of artificial diffusion >>>>> schemes is to make boundary layers as large as the cells of the mesh so >>>>> that they are resolved, rather than leading to over/undershoots. It is >>>>> *needed* to avoid Gibb's phenomenon if you can't make the mesh small enough. >>>>> >>>>> That does not mean that (i) the scheme we currently use is the best >>>>> idea, (ii) we can't improve the situation. But I do not think that >>>>> printing a warning for essentially every single simulation is useful. >>>>> >>>>> (I'll note that we also use artificial diffusion schemes for the >>>>> compositional fields for which the physical diffusion is zero -- so the >>>>> artificial diffusion is *always* larger than the physical one.) >>>>> >>>>> Best >>>>> W. >>>>> >>>>> -- >>>>> ------------------------------------------------------------------------ >>>>> Wolfgang Bangerth email: bangerth at colostate.edu >>>>> www: http://www.math.colostate.edu/~bangerth/ >>>>> _______________________________________________ >>>>> Aspect-devel mailing list >>>>> Aspect-devel at geodynamics.org >>>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel _______________________________________________ >>>>> Aspect-devel mailing list >>>>> Aspect-devel at geodynamics.org >>>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>>> >>>> >>>> _______________________________________________ >>>> Aspect-devel mailing list >>>> Aspect-devel at geodynamics.org >>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>> -- >>> Rene Gassmoeller >>> https://gassmoeller.github.io/ > -- > Rene Gassmoeller > https://gassmoeller.github.io/ _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From bangerth at colostate.edu Tue Sep 11 16:37:10 2018 From: bangerth at colostate.edu (Wolfgang Bangerth) Date: Tue, 11 Sep 2018 17:37:10 -0600 Subject: [aspect-devel] Using NetCDF in ASPECT In-Reply-To: References: Message-ID: On 09/11/2018 04:10 PM, Kodi Neumiller wrote: > I was curious as to whether or not Aspect is compatible with NetCDF? > I would like to use NetCDF inside of Aspect to access data outside of > Aspect and pass it into the aspect executable. There is no generic infrastructure in ASPECT (or deal.II) to read data via NetCDF -- but there is also no reason why any particular plugin couldn't read data via the NetCDF library in the same way as we currently read ASCII files ourselves. How one would do that of course depends on what exactly you want to do! Best W. 
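To make "read it via the NetCDF library in the same way as the ASCII files" concrete, a minimal sketch of a helper that such a plugin could call is below. The file name, variable name, and the assumption of a one-dimensional variable are illustrative only; this is not existing ASPECT or deal.II functionality, just the plain NetCDF C API:

  #include <netcdf.h>
  #include <stdexcept>
  #include <string>
  #include <vector>

  // Read a one-dimensional 'double' variable from a NetCDF file. A plugin
  // could then build its lookup table from the returned values in the same
  // way the existing ASCII data plugins do.
  std::vector<double>
  read_netcdf_column (const std::string &filename,
                      const std::string &variable_name)
  {
    int ncid, varid;
    if (nc_open (filename.c_str(), NC_NOWRITE, &ncid) != NC_NOERR)
      throw std::runtime_error ("Could not open " + filename);
    if (nc_inq_varid (ncid, variable_name.c_str(), &varid) != NC_NOERR)
      throw std::runtime_error ("Variable " + variable_name + " not found.");

    // This sketch assumes the variable has exactly one dimension; real
    // code would query and check the number of dimensions first.
    int dimids[NC_MAX_VAR_DIMS];
    nc_inq_vardimid (ncid, varid, dimids);
    size_t length = 0;
    nc_inq_dimlen (ncid, dimids[0], &length);

    std::vector<double> values (length);
    nc_get_var_double (ncid, varid, values.data());
    nc_close (ncid);
    return values;
  }

Whether to hook in at this level, or instead to teach the shared ascii-data infrastructure about a second file format, is exactly the "depends on what you want to do" part.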
-- ------------------------------------------------------------------------ Wolfgang Bangerth email: bangerth at colostate.edu www: http://www.math.colostate.edu/~bangerth/ From heister at clemson.edu Wed Sep 12 05:50:58 2018 From: heister at clemson.edu (Timo Heister) Date: Wed, 12 Sep 2018 08:50:58 -0400 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: <682519D9-86F2-4C92-8089-3E9E392F1001@vt.edu> References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> <3be46824-9777-6b5a-881c-005fea029d3f@mailbox.org> <5FCBA80B-5D66-44DD-8489-185694449DE3@vt.edu> <34339a20-973a-aadc-721b-14a7f92e4521@mailbox.org> <682519D9-86F2-4C92-8089-3E9E392F1001@vt.edu> Message-ID: Scott, > For the ALA, density should be \rho_{ref} everywhere, except for the buoyancy term. That’s part of the ALA because the ALA assumes you only keep terms to first order in \mu=\alpha\Delta T. I know ASPECT had/has the full density in the energy equation, not \rho_{ref}. I don’t know what you actually use for density in the ALA benchmark problems. We have implemented various formulations in ASPECT including ALA to mimic the same setup that is used in benchmarks like this. This was required to reproduce the results: See manual section 2.11 and 2.11.5 http://www.math.clemson.edu/~heister/aspect-pdf-manual/web/viewer.html#subsection.2.11 http://www.math.clemson.edu/~heister/aspect-pdf-manual/web/viewer.html#sec:combined_formulations Best, Timo -- Timo Heister http://www.math.clemson.edu/~heister/ From sdk at vt.edu Wed Sep 12 06:04:33 2018 From: sdk at vt.edu (Scott King) Date: Wed, 12 Sep 2018 09:04:33 -0400 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> <3be46824-9777-6b5a-881c-005fea029d3f@mailbox.org> <5FCBA80B-5D66-44DD-8489-185694449DE3@vt.edu> <34339a20-973a-aadc-721b-14a7f92e4521@mailbox.org> <682519D9-86F2-4C92-8089-3E9E392F1001@vt.edu> Message-ID: Yes. I’m aware of the manual but manuals and codes don’t always agree, even when they intend to. Just sayin. If you read Rene’s comment it sounds like the agreement for the ALA cases is not as good now that the flux is calculated more accurately and he asked if I had any magic. It is the only thing I could think of... > On Sep 12, 2018, at 8:50 AM, Timo Heister wrote: > > Scott, > >> For the ALA, density should be \rho_{ref} everywhere, except for the buoyancy term. That’s part of the ALA because the ALA assumes you only keep terms to first order in \mu=\alpha\Delta T. I know ASPECT had/has the full density in the energy equation, not \rho_{ref}. I don’t know what you actually use for density in the ALA benchmark problems. > > We have implemented various formulations in ASPECT including ALA to > mimic the same setup that is used in benchmarks like this. 
This was > required to reproduce the results: > > See manual section 2.11 and 2.11.5 > > http://www.math.clemson.edu/~heister/aspect-pdf-manual/web/viewer.html#subsection.2.11 > > http://www.math.clemson.edu/~heister/aspect-pdf-manual/web/viewer.html#sec:combined_formulations > > Best, > Timo > > -- > Timo Heister > http://www.math.clemson.edu/~heister/ > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel From sdk at vt.edu Wed Sep 12 11:55:20 2018 From: sdk at vt.edu (Scott King) Date: Wed, 12 Sep 2018 14:55:20 -0400 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: <34339a20-973a-aadc-721b-14a7f92e4521@mailbox.org> References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> <3be46824-9777-6b5a-881c-005fea029d3f@mailbox.org> <5FCBA80B-5D66-44DD-8489-185694449DE3@vt.edu> <34339a20-973a-aadc-721b-14a7f92e4521@mailbox.org> Message-ID: <255D0F5E-0125-4AB8-AB0A-E9B6D9577CCD@vt.edu> Hi Rene; I checked the flux calculation code itself. For the flux calculations we use the thermal diffusion term (-N,z * T), the advective term (rho_ref * N *T * N*V_z) and the viscous dissipation term, which is really nasty to write out, so I will assume it’s o.k. (It is included in the tables in the paper, I think…) The dissipation term drops out in the Boussinesq cases, of course. So it seems like, if BA is good, there are two things to check: the dissipation term, and whether you use rho_ref in the advection term. Scott > On Sep 11, 2018, at 2:44 PM, Rene Gassmoeller wrote: > > Scott, > > I have opened the pull request with the improved heat flux computation, and it is available here: https://github.com/geodynamics/aspect/pull/2660 . It already includes the improvements Wolfgang made to the advection stabilization term, so this should be a good version to start the testing. > > I have run into some issues trying to reproduce compressible benchmark results with the new method though (I explain them in the pull request), I do not suppose you have another magic fix for them at hand? That would be much appreciated :-). > > Best, > > Rene > > On 09/06/2018 07:18 PM, Scott King wrote: >> Great! Glad I could contribute instead of being my usual grumpy PITA. >> >> We’re all set up to run Zhong cases as quickly as we can get them through the queues. >> >> Best >> >> Scott >> >> Sent from my iPhone >> >> On Sep 6, 2018, at 10:08 PM, Rene Gassmoeller > wrote: >> >>> Hi Scott, >>> >>> very interesting, thanks for sharing that thought! That looks like a significant improvement for the heat flux postprocessors. You were right, the changes to the postprocessor were not very complicated, and I will open a pull request for them tomorrow, when I have cleaned up a few things. I just wanted to share some first results with you: >>> >>> These are the original convergence studies for the Blankenbach 1a case with ASPECT: >>> >>> # Nu Vrms name (refinement level): >>> 4.78661864e+00 4.34590432e+01 case1a_ref4.stat >>> 4.87927972e+00 4.29377468e+01 case1a_ref5.stat >>> 4.88993106e+00 4.28733838e+01 case1a_ref6.stat >>> 4.88680525e+00 4.28659548e+01 case1a_ref7.stat >>> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >>> >>> Both Nu and Vrms converge, but rather slowly for the very low Rayleigh number (10^4).
Below are the values with Wolfgang's improvements in pull request 2650 (taking the max of artificial diffusion and physical diffusion instead of the sum): >>> >>> # Nu Vrms name (refinement level): >>> 5.30885322e+00 4.28499932e+01 case1a_ref3.stat >>> 5.06735289e+00 4.28656773e+01 case1a_ref4.stat >>> 4.93712396e+00 4.28650353e+01 case1a_ref5.stat >>> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >>> >>> As you can see the Vrms is now much closer to the reference value already at low resolutions (even at refinement level 3, which is only 8x8 cells). But the Nusselt number is now worse, and converging from above the reference value instead of from below. With your suggested improvements to the postprocessors (taking the volume averaged total heat flux in the boundary cell, instead of the conductive heat flux at the surface): >>> >>> # Nu Vrms name (refinement level): >>> 4.89728221e+00 4.28499932e+01 case1a_ref3.stat >>> 4.88535143e+00 4.28656773e+01 case1a_ref4.stat >>> 4.88443365e+00 4.28650353e+01 case1a_ref5.stat >>> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >>> The Vrms is not affected, because it is only a change in the postprocessor, but now the Nu number is significantly closer to the reference value even at low resolutions. All in all, we now get a better accuracy with a 16x16 grid, than with a 128x128 grid before the changes. I would say that is progress :-). >>> The other Blankenbach cases show similar improvements (still running though), and I have not yet tested the behavior for other geometries, but I do not think there is a conceptual problem. I will not have time to do much more benchmarking, because I am traveling from the end of next week on, but do you think you or Grant would have some time to give a few of the cases of the Zhong 2008 paper another try once the changes are in the main version? >>> >>> Thanks again for the reference! >>> >>> Best, >>> Rene >>> >>> On 09/05/2018 04:47 PM, Scott King wrote: >>>> >>>> As for calculating fluxes at the boundaries, I looked at the heat flux code a bit and I’m wondering... I will share this paper with you all. >>>> >>>> https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1365-246X.1987.tb01375.x >>>> >>>> It might be less relevant to second-order elements than linear elements but, a lot of the same arguments I’m seeing in the posts the last few days are bringing back memories. This is what is done by default in CitcomS (unless you explicitly call the CFB method). I suspect it should be fairly trivial for someone who is good with deal.ii to implement because it is pretty standard finite element stuff, i.e., calculate fluxes at the internal integration points and project the values to the nodes. >>>> >>>> Yes, the artificial diffusivity is an issue but I think this explains why even when we turn it off we get relatively poor Nusselt numbers while getting excellent agreement with depth-averaged properties and mean values. >>>> >>>> Scott >>>> >>>>> On Sep 5, 2018, at 2:56 PM, Max Rudolph > wrote: >>>>> >>>>> OK, you are right that there will always be some region with more artificial than physical heat transport. >>>>> What about instead looking at the ratio of physical to artificial heat flux through each boundary? >>>>> For Rene's anisotropic "SUEV" implementation, even in the presence of large entropy viscosity, the artificial heat transport can be very small as long as u.gradT is small. 
In particular, even though entropy viscosity is fairly large at the boundaries, the velocities are tangential to the boundary, so there is very little artificial diffusion. >>>>> >>>>> Max >>>>> >>>>> On Wed, Sep 5, 2018 at 10:41 AM Wolfgang Bangerth > wrote: >>>>> On 09/05/2018 07:12 AM, Max Rudolph wrote: >>>>> > >>>>> > Rene and I discussed this idea on Monday and I don't think that this is >>>>> > the right thing to do. It would lead to an unexpected relationship >>>>> > between the temperature gradient (and hence temperature structure of the >>>>> > lithosphere) and the physical thermal conductivity. Maybe more helpful >>>>> > would be a separate output of the non-physical contribution to the heat >>>>> > flux through each boundary, or within the entire domain as the ratio of >>>>> > the norm of the artificial heat flux divided by the norm of the total >>>>> > heat flux. I still think that a warning message when this quantity >>>>> > exceeds, say, 1% would help users understand that they should expect >>>>> > unphysical results. >>>>> >>>>> But this warning message would be printed on pretty much every single >>>>> simulation in which the mesh does not completely resolve boundary and >>>>> internal layers -- which is essentially every simulation ever done in >>>>> the field of mantle convection. >>>>> >>>>> If it was a rare occasion where artificial viscosity is needed to make a >>>>> simulation stable, then we wouldn't use it. But the reality is that all >>>>> realistic global-scale simulations must necessarily have some kind of >>>>> artificial diffusion (SUPG, EV, dG schemes, ...) that is larger than the >>>>> physical diffusion at least in parts of the domain because resolving the >>>>> boundary layers is not possible on a global scale and will not be >>>>> possible for a long time to come. The idea of artificial diffusion >>>>> schemes is to make boundary layers as large as the cells of the mesh so >>>>> that they are resolved, rather than leading to over/undershoots. It is >>>>> *needed* to avoid Gibb's phenomenon if you can't make the mesh small enough. >>>>> >>>>> That does not mean that (i) the scheme we currently use is the best >>>>> idea, (ii) we can't improve the situation. But I do not think that >>>>> printing a warning for essentially every single simulation is useful. >>>>> >>>>> (I'll note that we also use artificial diffusion schemes for the >>>>> compositional fields for which the physical diffusion is zero -- so the >>>>> artificial diffusion is *always* larger than the physical one.) >>>>> >>>>> Best >>>>> W. 
>>>>> >>>>> -- >>>>> ------------------------------------------------------------------------ >>>>> Wolfgang Bangerth email: bangerth at colostate.edu >>>>> www: http://www.math.colostate.edu/~bangerth/ >>>>> _______________________________________________ >>>>> Aspect-devel mailing list >>>>> Aspect-devel at geodynamics.org >>>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel _______________________________________________ >>>>> Aspect-devel mailing list >>>>> Aspect-devel at geodynamics.org >>>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>>> >>>> >>>> _______________________________________________ >>>> Aspect-devel mailing list >>>> Aspect-devel at geodynamics.org >>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>> -- >>> Rene Gassmoeller >>> https://gassmoeller.github.io/ > -- > Rene Gassmoeller > https://gassmoeller.github.io/ _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From rene.gassmoeller at mailbox.org Wed Sep 12 15:13:34 2018 From: rene.gassmoeller at mailbox.org (Rene Gassmoeller) Date: Wed, 12 Sep 2018 15:13:34 -0700 Subject: [aspect-devel] Internal heating in aspect (Ludovic Jeanniot) In-Reply-To: <255D0F5E-0125-4AB8-AB0A-E9B6D9577CCD@vt.edu> References: <2408dd12-9db7-1a37-6087-cf8aaafac68d@colostate.edu> <24bd7a82-93bf-d81f-e1ce-9498a4ca171d@colostate.edu> <0e7b34e6-d2d8-e2ba-a072-0972660bdfd4@colostate.edu> <972cd5d9-7495-0aa3-55a7-808094dd6c63@colostate.edu> <3be46824-9777-6b5a-881c-005fea029d3f@mailbox.org> <5FCBA80B-5D66-44DD-8489-185694449DE3@vt.edu> <34339a20-973a-aadc-721b-14a7f92e4521@mailbox.org> <255D0F5E-0125-4AB8-AB0A-E9B6D9577CCD@vt.edu> Message-ID: Hi Scott, thanks, I think that might be something. The density we use in the equation and postprocessing is correct, we use a reference profile for all equations but the buoyancy term just like the ALA assumes. But I had not tried only including the shear heating term when computing the heat flux. It makes sense of course, in theory all heating terms contribute (I had tried that), except for the adiabatic term, which only converts heat into work. You mentioned it is nasty to write out, would you mind pointing me to some place to look it up anyway? Because the conventional heating term would have units W/m^3, while the term in the heat flux would need to be W/m^2. Thanks! Rene On 09/12/2018 11:55 AM, Scott King wrote: > Hi Rene; > > I checked the flux calculation code itself.  For the flux calculations > we use the thermal diffusion term (-N,z * T), the advective term > (rho_ref * N *T * N*V_z) and the viscous dissipation term which is > really nasty to write out, so I will assume it’s o.k.   (It is > included in the tables in the paper, I think…)  The dissipation terms > drops out in the business cases, of course.  So it seems like if BA is > good there are two things to check.  The dissipation term and whether > you use rho_ref in the advection term. > > Scott > >> On Sep 11, 2018, at 2:44 PM, Rene Gassmoeller >> > >> wrote: >> >> Scott, >> >> I have opened the pull request with the improved heat flux >> computation, and it is available here: >> https://github.com/geodynamics/aspect/pull/2660 . 
It already includes >> the improvements Wolfgang made to the advection stabilization term, >> so this should be a good version to start the testing. >> >> I have run into some issues trying to reproduce compressible >> benchmark results with the new method though (I explain them in the >> pull request), I do not suppose you have another magic fix for them >> at hand? That would be much appreciated :-). >> >> Best, >> >> Rene >> >> >> On 09/06/2018 07:18 PM, Scott King wrote: >>> Great!   Glad I could contribute instead of being my usual grumpy PITA. >>> >>> We’re all set up to run Zhong cases as quickly as we can get them >>> through the queues. >>> >>> Best >>> >>> Scott >>> >>> Sent from my iPhone >>> >>> On Sep 6, 2018, at 10:08 PM, Rene Gassmoeller >>> > >>> wrote: >>> >>>> Hi Scott, >>>> >>>> very interesting, thanks for sharing that thought! That looks like >>>> a significant improvement for the heat flux postprocessors. You >>>> were right, the changes to the postprocessor were not very >>>> complicated, and I will open a pull request for them tomorrow, when >>>> I have cleaned up a few things. I just wanted to share some first >>>> results with you: >>>> >>>> These are the original convergence studies for the Blankenbach 1a >>>> case with ASPECT: >>>> >>>> # Nu Vrms                    name (refinement level): >>>> 4.78661864e+00 4.34590432e+01 case1a_ref4.stat >>>> 4.87927972e+00 4.29377468e+01 case1a_ref5.stat >>>> 4.88993106e+00 4.28733838e+01 case1a_ref6.stat >>>> 4.88680525e+00 4.28659548e+01 case1a_ref7.stat >>>> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >>>> >>>> Both Nu and Vrms converge, but rather slowly for the very low >>>> Rayleigh number (10^4). Below are the values with Wolfgang's >>>> improvements in pull request 2650 (taking the max of artificial >>>> diffusion and physical diffusion instead of the sum): >>>> >>>> # Nu Vrms                    name (refinement level): >>>> 5.30885322e+00 4.28499932e+01 case1a_ref3.stat >>>> 5.06735289e+00 4.28656773e+01 case1a_ref4.stat >>>> 4.93712396e+00 4.28650353e+01 case1a_ref5.stat >>>> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >>>> >>>> As you can see the Vrms is now much closer to the reference value >>>> already at low resolutions (even at refinement level 3, which is >>>> only 8x8 cells). But the Nusselt number is now worse, and >>>> converging from above the reference value instead of from below. >>>> With your suggested improvements to the postprocessors (taking the >>>> volume averaged total heat flux in the boundary cell, instead of >>>> the conductive heat flux at the surface): >>>> >>>> # Nu Vrms                    name (refinement level): >>>> 4.89728221e+00 4.28499932e+01 case1a_ref3.stat >>>> 4.88535143e+00 4.28656773e+01 case1a_ref4.stat >>>> 4.88443365e+00 4.28650353e+01 case1a_ref5.stat >>>> 4.88440900e+00 4.28649470e+01 case1a_reference.stat >>>> >>>> The Vrms is not affected, because it is only a change in the >>>> postprocessor, but now the Nu number is significantly closer to the >>>> reference value even at low resolutions. All in all, we now get a >>>> better accuracy with a 16x16 grid, than with a 128x128 grid before >>>> the changes. I would say that is progress :-). >>>> The other Blankenbach cases show similar improvements (still >>>> running though), and I have not yet tested the behavior for other >>>> geometries, but I do not think there is a conceptual problem. 
I >>>> will not have time to do much more benchmarking, because I am >>>> traveling from the end of next week on, but do you think you or >>>> Grant would have some time to give a few of the cases of the Zhong >>>> 2008 paper another try once the changes are in the main version? >>>> >>>> Thanks again for the reference! >>>> >>>> Best, >>>> Rene >>>> >>>> On 09/05/2018 04:47 PM, Scott King wrote: >>>>> >>>>> As for calculating fluxes at the boundaries, I looked at the heat >>>>> flux code a bit and I’m wondering...  I will share this paper with >>>>> you all. >>>>> >>>>> https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1365-246X.1987.tb01375.x >>>>> >>>>> It might be less relevant to second-order elements than linear >>>>> elements but, a lot of the same arguments I’m seeing in the posts >>>>> the last few days are bringing back memories.   This is what is >>>>> done by default in CitcomS (unless you explicitly call the CFB >>>>> method).   I suspect it should be fairly trivial for someone who >>>>> is good with deal.ii to implement because it is pretty standard >>>>> finite element stuff, i.e., calculate fluxes at the internal >>>>> integration points and project the values to the nodes. >>>>> >>>>> Yes, the artificial diffusivity is an issue but I think this >>>>> explains why even when we turn it off we get relatively poor >>>>> Nusselt numbers while getting excellent agreement with >>>>> depth-averaged properties and mean values. >>>>> >>>>> Scott >>>>> >>>>>> On Sep 5, 2018, at 2:56 PM, Max Rudolph >>>>> > wrote: >>>>>> >>>>>> OK, you are right that there will always be some region with more >>>>>> artificial than physical heat transport. >>>>>> What about instead looking at the ratio of physical to artificial >>>>>> heat flux through each boundary? >>>>>> For Rene's anisotropic "SUEV" implementation, even in the >>>>>> presence of large entropy viscosity, the artificial heat >>>>>> transport can be very small as long as u.gradT is small. In >>>>>> particular, even though entropy viscosity is fairly large at the >>>>>> boundaries, the velocities are tangential to the boundary, so >>>>>> there is very little artificial diffusion. >>>>>> >>>>>> Max >>>>>> >>>>>> On Wed, Sep 5, 2018 at 10:41 AM Wolfgang Bangerth >>>>>> > wrote: >>>>>> >>>>>> On 09/05/2018 07:12 AM, Max Rudolph wrote: >>>>>> > >>>>>> > Rene and I discussed this idea on Monday and I don't think >>>>>> that this is >>>>>> > the right thing to do. It would lead to an unexpected >>>>>> relationship >>>>>> > between the temperature gradient (and hence temperature >>>>>> structure of the >>>>>> > lithosphere) and the physical thermal conductivity. Maybe >>>>>> more helpful >>>>>> > would be a separate output of the non-physical contribution >>>>>> to the heat >>>>>> > flux through each boundary, or within the entire domain as >>>>>> the ratio of >>>>>> > the norm of the artificial heat flux divided by the norm of >>>>>> the total >>>>>> > heat flux. I still think that a warning message when this >>>>>> quantity >>>>>> > exceeds, say, 1% would help users understand that they >>>>>> should expect >>>>>> > unphysical results. >>>>>> >>>>>> But this warning message would be printed on pretty much >>>>>> every single >>>>>> simulation in which the mesh does not completely resolve >>>>>> boundary and >>>>>> internal layers -- which is essentially every simulation ever >>>>>> done in >>>>>> the field of mantle convection. 
>>>>>> >>>>>> If it was a rare occasion where artificial viscosity is >>>>>> needed to make a >>>>>> simulation stable, then we wouldn't use it. But the reality >>>>>> is that all >>>>>> realistic global-scale simulations must necessarily have some >>>>>> kind of >>>>>> artificial diffusion (SUPG, EV, dG schemes, ...) that is >>>>>> larger than the >>>>>> physical diffusion at least in parts of the domain because >>>>>> resolving the >>>>>> boundary layers is not possible on a global scale and will >>>>>> not be >>>>>> possible for a long time to come. The idea of artificial >>>>>> diffusion >>>>>> schemes is to make boundary layers as large as the cells of >>>>>> the mesh so >>>>>> that they are resolved, rather than leading to >>>>>> over/undershoots. It is >>>>>> *needed* to avoid Gibb's phenomenon if you can't make the >>>>>> mesh small enough. >>>>>> >>>>>> That does not mean that (i) the scheme we currently use is >>>>>> the best >>>>>> idea, (ii) we can't improve the situation. But I do not think >>>>>> that >>>>>> printing a warning for essentially every single simulation is >>>>>> useful. >>>>>> >>>>>> (I'll note that we also use artificial diffusion schemes for the >>>>>> compositional fields for which the physical diffusion is zero >>>>>> -- so the >>>>>> artificial diffusion is *always* larger than the physical one.) >>>>>> >>>>>> Best >>>>>>   W. >>>>>> >>>>>> -- >>>>>> ------------------------------------------------------------------------ >>>>>> Wolfgang Bangerth          email: bangerth at colostate.edu >>>>>> >>>>>>                             www: >>>>>> http://www.math.colostate.edu/~bangerth/ >>>>>> >>>>>> _______________________________________________ >>>>>> Aspect-devel mailing list >>>>>> Aspect-devel at geodynamics.org >>>>>> >>>>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>>>>> >>>>>> _______________________________________________ >>>>>> Aspect-devel mailing list >>>>>> Aspect-devel at geodynamics.org >>>>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>>>> >>>>> >>>>> >>>>> _______________________________________________ >>>>> Aspect-devel mailing list >>>>> Aspect-devel at geodynamics.org >>>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>>> >>>> -- >>>> Rene Gassmoeller >>>> https://gassmoeller.github.io/ >> >> -- >> Rene Gassmoeller >> https://gassmoeller.github.io/ >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -- Rene Gassmoeller https://gassmoeller.github.io/ -------------- next part -------------- An HTML attachment was scrubbed... URL: From jperryh2 at uoregon.edu Fri Sep 14 08:08:03 2018 From: jperryh2 at uoregon.edu (Jonathan Perry-Houts) Date: Fri, 14 Sep 2018 08:08:03 -0700 Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure In-Reply-To: <8107E67F0A485F42A0545655001D992D97A6BA93@OC11EXPO26.exchange.mit.edu> References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> <7333fb72-c18c-de54-cb29-741dcba2a8c1@uoregon.edu> <8107E67F0A485F42A0545655001D992D97A6BA93@OC11EXPO26.exchange.mit.edu> Message-ID: After this thread I tried tracking down wtf was up with the uniform glyph distribution. Turns out it's a paraview bug. 
I fixed it, and I guess the patch will be part of the next release. https://gitlab.kitware.com/paraview/paraview/merge_requests/2733 However, the fix still relies on there being a node somewhere near each glyph, which means that for adaptively refined meshes some glyphs appear and reappear during the course of the model. If you want them to stay put, you still have to interpolate your model onto a uniform mesh. -JPH On 09/06/2018 07:30 PM, Adam Holt wrote: > Thanks Jonathan! After upgrading to the newer Paraview (5.5), the filter works nicely. Again, it didn't do the time-stepping for version 5.2, so I guess the issue is with the older version. Good to know. > > And totally agree about the "uniform spatial distribution" - It took me a while to figure out why this was not showing glyphs for my 2-D models... > > many thanks, > Adam > > ________________________________________ > From: Aspect-devel [aspect-devel-bounces at geodynamics.org] on behalf of Jonathan Perry-Houts [jperryh2 at uoregon.edu] > Sent: Thursday, September 06, 2018 7:41 PM > To: aspect-devel at geodynamics.org > Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure > > On 09/06/18 12:25, Adam Holt wrote: >> Hi all! >> >> I am a relatively new user of ASPECT, and have ran into some tricky things related to visualizing/post-processing ASPECT output using Paraview. I wondered if anyone had experience with the following: >> >> First, plotting evenly spaced velocity vectors for models with mesh refinement. As the element sizes vary dramatically, plotting vectors using "Every Nth Point" (Paraview Glyph option) produces very uneven vector coverage. I thought I had solved this by interpolating the data onto a plane (using Paraview function "Resample with dataset"), but the result of this function (an evenly spaced velocity field) does not time evolve with the simulation (at least for Paraview 5.2.0). Has anybody ran into a similar issue? > > That seems to work in Paraview 5.5.0 (it does evolve with the simulation > for me). Not sure if it matters, but I used a "Plane" source as the > uniform grid to Resample on. I attached a custom filter that works for > me (import it with Tools>Manage Custom Filters>Import). > > It's ridiculous that the glyph filter's "uniform spatial distribution" > option doesn't do this automatically. That's exactly what I would expect > it to do. Apparently it just selects uniformly spaced points, and if > there happens to be a node there, it will plot a vector, otherwise it > skips it. That's why it almost never works for "small" data sets like a > 2d mesh. > >> Second, I am interested in the dynamic pressure field and wondered how best to retrieve it from my (incompressible) models. For such models, I assume it can be computed by subtracting the horizontally-constant static pressure from the pressure outputted (the 'nonadiabatic pressure' output variable). Is this something that should be done in Paraview, or by writing a new post-processor plugin?> >> Thanks in advance for any input! >> Adam Holt >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >> > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > -- Jonathan Perry-Houts Ph.D. 
Candidate
Department of Earth Sciences
1272 University of Oregon
Eugene, OR 97403-1272

From bangerth at colostate.edu Fri Sep 14 08:29:29 2018
From: bangerth at colostate.edu (Wolfgang Bangerth)
Date: Fri, 14 Sep 2018 09:29:29 -0600
Subject: [aspect-devel] Evenly spaced vectors, dynamic pressure
In-Reply-To: 
References: <8107E67F0A485F42A0545655001D992D97A6BA2D@OC11EXPO26.exchange.mit.edu> <7333fb72-c18c-de54-cb29-741dcba2a8c1@uoregon.edu> <8107E67F0A485F42A0545655001D992D97A6BA93@OC11EXPO26.exchange.mit.edu>
Message-ID: <4e8807b5-eebe-7a2e-feab-32971950c5e9@colostate.edu>

On 09/14/2018 09:08 AM, Jonathan Perry-Houts wrote:
> After this thread I tried tracking down wtf was up with the uniform
> glyph distribution. Turns out it's a paraview bug. I fixed it, and I
> guess the patch will be part of the next release.
> https://gitlab.kitware.com/paraview/paraview/merge_requests/2733

Nice work, Jonathan!
W.

--
------------------------------------------------------------------------
Wolfgang Bangerth          email: bangerth at colostate.edu
                           www: http://www.math.colostate.edu/~bangerth/

From jgallagher at opendap.org Thu Sep 20 08:28:48 2018
From: jgallagher at opendap.org (James Gallagher)
Date: Thu, 20 Sep 2018 09:28:48 -0600
Subject: [aspect-devel] Writing a plugin for a new data format/source
Message-ID: <6CE4AEA6-2D96-47A1-9312-B45863CF173D@opendap.org>

Hello,

We are working on an NSF-funded project (BALTO) to develop a ‘data broker’, and our PI uses Aspect in her work, which serves as one of the use-cases for the project. We would like guidance on developing an Aspect plugin to read data not from a local ASCII file, but from a (remote) web service.

If a new plugin is not the best way forward, what would be the best way to extend Aspect so that it could read other data formats and/or data sources (i.e., from remote web services and not local files)?

We have experience in C++ and use OSX and Linux as our primary development platforms. See www.opendap.org. Our data server (which is being used for this brokering project) is written in C++ and Java and is available on GitHub (github.com/opendap). Incidentally, the server, like Aspect, makes extensive use of plugins written in C++.

If there’s a better contact point than this list, please let me know.

Thanks,
James

--
James Gallagher
jgallagher at opendap.org

-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 195 bytes
Desc: Message signed with OpenPGP
URL: 

From heister at clemson.edu Fri Sep 21 08:35:40 2018
From: heister at clemson.edu (Timo Heister)
Date: Fri, 21 Sep 2018 09:35:40 -0600
Subject: [aspect-devel] Writing a plugin for a new data format/source
In-Reply-To: <6CE4AEA6-2D96-47A1-9312-B45863CF173D@opendap.org>
References: <6CE4AEA6-2D96-47A1-9312-B45863CF173D@opendap.org>
Message-ID: 

James,

> We working on a NSF-funded project (BALTO) to develop a ‘data broker’ and our PI uses Aspect in her work,, which serves as one of the use-cases for the project. We would like guidance on developing an Aspect plugin to read data not from a local ASCII file, but from a (remote) web service.

We are reading various kinds of files and formats in different plugins (initial conditions, gravity, material model, to name a few).
Search for "data file name" in https://tjhei.github.io/aspect-www-manual/ for example. I am not sure what you are trying to do precisely.

> If a new plugin is not the best way forward, what would be the best way to extend Aspect so that it could read other data formats and/or data sources (i.e., from remote web services and not local files)?

A plugin would only allow one of the many ways a user can read data in to work. For example, you could write a RemoteFileGravityAsciiData plugin. I would think a better option would be to enhance read_and_distribute_file_content() in aspect/utilities.h to accept URLs and download them as needed. This function is used by various current and future plugins. You need to worry about a few things though:
- on a cluster you might not have internet access
- in a parallel run you don't want every single one of your 1000+ processes to download the files
- caching of large files?

Best,
Timo
--
Timo Heister
http://www.math.clemson.edu/~heister/

From ljhwang at ucdavis.edu Fri Sep 21 08:54:19 2018
From: ljhwang at ucdavis.edu (Lorraine Hwang)
Date: Fri, 21 Sep 2018 08:54:19 -0700
Subject: [aspect-devel] Writing a plugin for a new data format/source
In-Reply-To: 
References: <6CE4AEA6-2D96-47A1-9312-B45863CF173D@opendap.org>
Message-ID: 

Hi James,

Can you define what you mean by data? Is this ASPECT input files or output files? Can you describe a use case?

Best,
-Lorraine

*****************************
Lorraine Hwang, Ph.D.
Associate Director, CIG
530.752.3656

> On Sep 21, 2018, at 8:35 AM, Timo Heister wrote:
>
> James,
>
>> We working on a NSF-funded project (BALTO) to develop a ‘data broker’ and our PI uses Aspect in her work,, which serves as one of the use-cases for the project. We would like guidance on developing an Aspect plugin to read data not from a local ASCII file, but from a (remote) web service.
>
> We are reading various kinds of files and formats in different plugins
> (initial conditions, gravity, material model, to name a few). Search
> for "data file name" in https://tjhei.github.io/aspect-www-manual/ for
> example. I am not sure what you are trying to do precisely.
>
>> If a new plugin is not the best way forward, what would be the best way to extend Aspect so that it could read other data formats and/or data sources (i.e., from remote web services and not local files)?
>
> A plugin would only allow one of the many ways a user can read
> data in to work. For example, you could write a
> RemoteFileGravityAsciiData plugin. I would think a better option would
> be to enhance read_and_distribute_file_content() in aspect/utilities.h
> to accept URLs and download them as needed. This function is used
> by various current and future plugins. You need to worry about a few
> things though:
> - on a cluster you might not have internet access
> - in a parallel run you don't want every single one of your 1000+
> processes to download the files
> - caching of large files?
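As an illustration of the enhancement Timo describes above (fetch a remote file once on a single process, then hand its contents to every other process), a minimal sketch could look like the code below. This is an editorial sketch under stated assumptions, not code from the ASPECT repository: the helper name read_file_or_url(), the URL detection, and the use of libcurl are made up for illustration, and the real read_and_distribute_file_content() in aspect/utilities.h has a different signature and implementation.

// Editorial sketch: read a data file either from disk or, if the name looks
// like a URL, fetch it once on rank 0 with libcurl and broadcast the bytes
// to all other MPI ranks so that 1000+ processes do not hit the web server.
#include <curl/curl.h>
#include <mpi.h>
#include <algorithm>
#include <fstream>
#include <sstream>
#include <stdexcept>
#include <string>
#include <vector>

namespace
{
  // libcurl write callback: append the received bytes to a std::string.
  size_t append_to_string(char *ptr, size_t size, size_t nmemb, void *userdata)
  {
    static_cast<std::string*>(userdata)->append(ptr, size*nmemb);
    return size*nmemb;
  }
}

// Hypothetical helper, loosely modeled on the idea discussed above.
std::string read_file_or_url(const std::string &name, MPI_Comm comm)
{
  int rank;
  MPI_Comm_rank(comm, &rank);

  std::string content;
  if (rank == 0)
    {
      if (name.rfind("http://",0)==0 || name.rfind("https://",0)==0)
        {
          CURL *curl = curl_easy_init();
          if (curl == nullptr)
            throw std::runtime_error("could not initialize libcurl");
          curl_easy_setopt(curl, CURLOPT_URL, name.c_str());
          curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
          curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, append_to_string);
          curl_easy_setopt(curl, CURLOPT_WRITEDATA, &content);
          const CURLcode status = curl_easy_perform(curl);
          curl_easy_cleanup(curl);
          if (status != CURLE_OK)
            throw std::runtime_error("download of <" + name + "> failed");
        }
      else
        {
          // Local file: read it as before.
          std::ifstream file(name);
          std::ostringstream s;
          s << file.rdbuf();
          content = s.str();
        }
    }

  // Broadcast size, then the bytes themselves, from rank 0 to everyone.
  unsigned long size = content.size();
  MPI_Bcast(&size, 1, MPI_UNSIGNED_LONG, 0, comm);
  std::vector<char> buffer(size);
  if (rank == 0)
    std::copy(content.begin(), content.end(), buffer.begin());
  MPI_Bcast(buffer.data(), static_cast<int>(size), MPI_CHAR, 0, comm);
  return std::string(buffer.begin(), buffer.end());
}

Downloading on rank 0 only and broadcasting afterwards addresses the concern that every one of the 1000+ processes would otherwise contact the server individually; the lack of internet access on compute nodes and the caching of large files would still have to be handled separately, for example by writing the downloaded file next to the other input data and reusing it on later runs.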
> Best, > Timo > -- > Timo Heister > http://www.math.clemson.edu/~heister/ > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From bangerth at colostate.edu Fri Sep 21 10:36:49 2018 From: bangerth at colostate.edu (Wolfgang Bangerth) Date: Fri, 21 Sep 2018 17:36:49 +0000 Subject: [aspect-devel] Writing a plugin for a new data format/source In-Reply-To: <6CE4AEA6-2D96-47A1-9312-B45863CF173D@opendap.org> References: <6CE4AEA6-2D96-47A1-9312-B45863CF173D@opendap.org> Message-ID: James, > We working on a NSF-funded project (BALTO) to develop a ‘data broker’ and our PI uses Aspect in her work,, which serves as one of the use-cases for the project. We would like guidance on developing an Aspect plugin to read data not from a local ASCII file, but from a (remote) web service. > > If a new plugin is not the best way forward, what would be the best way to extend Aspect so that it could read other data formats and/or data sources (i.e., from remote web services and not local files)? Timo has already mentioned a number of issues, in particular the fact that on most clusters, individual nodes don't have access to the wider internet. This makes working with data sources located elsewhere awkward. Off the top of my head, what would be interesting would be, for example, to deal with large data sets that may be stored elsewhere. Data sets such as CRUST 1.0 might be interesting, or the STS elevation data for the earth surface. In all of these cases, however, if the data is too large to store locally, then it is also too large to download in its entirety from a WebDAV (or OpenDAP) server. A more interesting method would then of course be if the data could be obtained by sending a query to some server that then processes or subsets the data stored there. I don't know whether that's part of you're working on. An alternative -- maybe more interesting -- would be to not deal with input data but with output data. A common use case is if a simulation produces visualization data that is then visualized on some other system. Transferring this data after the run is often annoying and awkward. A nicer way would be if it is automatically transferred as part of the simulation run -- though one of course gets into the same issue with internet access from big clusters. So my recommendation would be to first find a good use case and then think about how to implement it, rather than the other way around. Best Wolfgang -- ------------------------------------------------------------------------ Wolfgang Bangerth email: bangerth at colostate.edu www: http://www.math.colostate.edu/~bangerth/ From jgallagher at opendap.org Mon Sep 24 16:44:31 2018 From: jgallagher at opendap.org (James Gallagher) Date: Mon, 24 Sep 2018 17:44:31 -0600 Subject: [aspect-devel] Writing a plugin for a new data format/source In-Reply-To: References: <6CE4AEA6-2D96-47A1-9312-B45863CF173D@opendap.org> Message-ID: <5E3141AC-FEF0-4593-A79C-AECD357B2154@opendap.org> > On Sep 21, 2018, at 09:35, Timo Heister wrote: > > James, > >> We working on a NSF-funded project (BALTO) to develop a ‘data broker’ and our PI uses Aspect in her work,, which serves as one of the use-cases for the project. We would like guidance on developing an Aspect plugin to read data not from a local ASCII file, but from a (remote) web service. 
> > We are reading various kind of files and formats in different plugins > (initial conditions, gravity, material model, to name a few). Search > for "data file name" in https://urldefense.proofpoint.com/v2/url?u=https-3A__tjhei.github.io_aspect-2Dwww-2Dmanual_&d=DwIFaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=c08Btfq4m9QEScXN3ZQwLZzzWQE7S8CYq1IYuzKV_Zk&m=M72AauGyerMMa0b-5d6UAdfFfqi5dhWbxYokW5N_y94&s=nv5Ev6xyH8qE7UV-Ga5U9axYfwBn2t1xTVUY0x-pY-A&e= for > example. I am not sure what you are trying to do precisely. > >> If a new plugin is not the best way forward, what would be the best way to extend Aspect so that it could read other data formats and/or data sources (i.e., from remote web services and not local files)? > > A plugin would only allow for one of the many ways a user can read > data in to work. For example, you could write a > RemoteFileGravityAsciiData plugin. > I would think a better option would > be to enhance read_and_distribute_file_content() in aspect/utilities.h > to accept URLs and downloading them as needed. This function is used > by various current and future plugins. You need to worry about a few > things though: > - on a cluster you might not have internet access > - in a parallel run you don't want every single of your 1000+ > processes to download the files > - caching of large files? This sounds like an option we should look into - it fits with the use case we have for the BALTO project. We’ll look into this and ask more questions as needed. You raise three good points, but we have experience with caching and sharing data among multiple processes. Thanks, James > Best, > Timo > -- > Timo Heister > http://www.math.clemson.edu/~heister/ > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -- James Gallagher jgallagher at opendap.org -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 195 bytes Desc: Message signed with OpenPGP URL: From jgallagher at opendap.org Mon Sep 24 16:53:00 2018 From: jgallagher at opendap.org (James Gallagher) Date: Mon, 24 Sep 2018 17:53:00 -0600 Subject: [aspect-devel] Writing a plugin for a new data format/source In-Reply-To: References: <6CE4AEA6-2D96-47A1-9312-B45863CF173D@opendap.org> Message-ID: <11D1C93F-BC03-4ACD-92E7-5EB7950BCAD6@opendap.org> > On Sep 21, 2018, at 09:54, Lorraine Hwang wrote: > > Hi James, > > Can you define what you mean by data? Is this ASPECT input files or output files? Can you describe a use case? Input data. Our collaborator has written a plugin that will read ASCII data from a local file. She would like to have that plugin read data from a web service instead. Thanks, James PS. I think the answer from Timo is going to get us started > > Best, > -Lorraine > > ***************************** > Lorraine Hwang, Ph.D. > Associate Director, CIG > 530.752.3656 > > > >> On Sep 21, 2018, at 8:35 AM, Timo Heister > wrote: >> >> James, >> >>> We working on a NSF-funded project (BALTO) to develop a ‘data broker’ and our PI uses Aspect in her work,, which serves as one of the use-cases for the project. We would like guidance on developing an Aspect plugin to read data not from a local ASCII file, but from a (remote) web service. 
>> >> We are reading various kind of files and formats in different plugins >> (initial conditions, gravity, material model, to name a few). Search >> for "data file name" in https://urldefense.proofpoint.com/v2/url?u=https-3A__tjhei.github.io_aspect-2Dwww-2Dmanual_&d=DwIFaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=c08Btfq4m9QEScXN3ZQwLZzzWQE7S8CYq1IYuzKV_Zk&m=M72AauGyerMMa0b-5d6UAdfFfqi5dhWbxYokW5N_y94&s=nv5Ev6xyH8qE7UV-Ga5U9axYfwBn2t1xTVUY0x-pY-A&e= for >> example. I am not sure what you are trying to do precisely. >> >>> If a new plugin is not the best way forward, what would be the best way to extend Aspect so that it could read other data formats and/or data sources (i.e., from remote web services and not local files)? >> >> A plugin would only allow for one of the many ways a user can read >> data in to work. For example, you could write a >> RemoteFileGravityAsciiData plugin. I would think a better option would >> be to enhance read_and_distribute_file_content() in aspect/utilities.h >> to accept URLs and downloading them as needed. This function is used >> by various current and future plugins. You need to worry about a few >> things though: >> - on a cluster you might not have internet access >> - in a parallel run you don't want every single of your 1000+ >> processes to download the files >> - caching of large files? >> Best, >> Timo >> -- >> Timo Heister >> http://www.math.clemson.edu/~heister/ >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -- James Gallagher jgallagher at opendap.org -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: signature.asc Type: application/pgp-signature Size: 195 bytes Desc: Message signed with OpenPGP URL: From jgallagher at opendap.org Mon Sep 24 16:59:04 2018 From: jgallagher at opendap.org (James Gallagher) Date: Mon, 24 Sep 2018 17:59:04 -0600 Subject: [aspect-devel] Writing a plugin for a new data format/source In-Reply-To: References: <6CE4AEA6-2D96-47A1-9312-B45863CF173D@opendap.org> Message-ID: > On Sep 21, 2018, at 11:36, Wolfgang Bangerth wrote: > > > James, > >> We working on a NSF-funded project (BALTO) to develop a ‘data broker’ and our PI uses Aspect in her work,, which serves as one of the use-cases for the project. We would like guidance on developing an Aspect plugin to read data not from a local ASCII file, but from a (remote) web service. >> >> If a new plugin is not the best way forward, what would be the best way to extend Aspect so that it could read other data formats and/or data sources (i.e., from remote web services and not local files)? > > Timo has already mentioned a number of issues, in particular the fact > that on most clusters, individual nodes don't have access to the wider > internet. This makes working with data sources located elsewhere awkward. Yes. I’ll talk with our collaborator about this. > > Off the top of my head, what would be interesting would be, for example, > to deal with large data sets that may be stored elsewhere. Data sets > such as CRUST 1.0 might be interesting, or the STS elevation data for > the earth surface. 
In all of these cases, however, if the data is too
> large to store locally, then it is also too large to download in its
> entirety from a WebDAV (or OpenDAP) server. A more interesting method
> would then of course be if the data could be obtained by sending a query
> to some server that then processes or subsets the data stored there. I
> don't know whether that's part of you're working on.

Our server, which will be the basis of the BALTO broker, does support subsetting data.

> An alternative -- maybe more interesting -- would be to not deal with
> input data but with output data. A common use case is if a simulation
> produces visualization data that is then visualized on some other
> system. Transferring this data after the run is often annoying and
> awkward. A nicer way would be if it is automatically transferred as part
> of the simulation run -- though one of course gets into the same issue
> with internet access from big clusters.

That does sound interesting but I’m not sure it is within the scope of our project.

Thanks,
James

> So my recommendation would be to first find a good use case and then
> think about how to implement it, rather than the other way around.
>
> Best
> Wolfgang
>
> --
> ------------------------------------------------------------------------
> Wolfgang Bangerth          email: bangerth at colostate.edu
>                            www: http://www.math.colostate.edu/~bangerth/
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel

--
James Gallagher
jgallagher at opendap.org

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: signature.asc
Type: application/pgp-signature
Size: 195 bytes
Desc: Message signed with OpenPGP
URL: 

From marine.lasbleis at elsi.jp Tue Sep 25 01:22:35 2018
From: marine.lasbleis at elsi.jp (Marine Lasbleis)
Date: Tue, 25 Sep 2018 10:22:35 +0200
Subject: [aspect-devel] Fwd: Installation on cluster CentOS
In-Reply-To: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr>
References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr>
Message-ID: 

Sorry, I sent the first message from the wrong email address. It would be better to use this one!

-------- Forwarded Message --------
Subject: Installation on cluster CentOS
Date: Tue, 25 Sep 2018 10:17:18 +0200
From: Marine Lasbleis
To: aspect-devel at geodynamics.org

Hi all,

I just moved to a new lab, and I am trying to install ASPECT on the cluster here. So far, it works on my new computer + the test computer, so I'm quite happy... But the installation on the cluster is more complicated.

I am stuck with the installation of deal.II.

The cluster is on CentOS, and uses modules. So far, I asked the staff to provide most of the required libraries, and I think we got almost all of them.

I am using candi to do the installation, and simply following the instructions.

I am using gcc 8.1.0, and we have lapack compiled with gcc 8.1.0. cmake is 3.9.6. I have tried to use hdf5 and petsc from the modules, but the system did not recognize the HDF5_DIR and PETSC_DIR, so I compiled them through candi.

I got all installed (parmetis, p4est, slepc, hdf5, trilinos), but the compilation of deal.II ends with an error (highlighted below). And I looked at the log file /home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log, but there is absolutely no error in the log file...
So I'm a little bit puzzled.

Any idea that could help me?

(attached file: log file. And below, last lines I got while running candi.sh)

Best,

Marine

[more things here] ........
-- Include /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_cpack.cmake
--
-- Include /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_custom_targets.cmake
--
-- Include /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_finalize.cmake
CMake Error at cmake/setup_finalize.cmake:95 (MESSAGE):

  Configuration error: Cannot compile a test program with the final set of
  compiler and linker flags:

      CXX flags (DEBUG): -pedantic -fPIC -Wall -Wextra -Wpointer-arith -Wwrite-strings -Wsynth -Wsign-compare -Wswitch -Woverloaded-virtual -Wno-placement-new -Wno-deprecated-declarations -Wno-literal-suffix -fopenmp-simd -std=c++17 -Wno-parentheses -Wno-unused-local-typedefs -Og -ggdb -Wa,--compress-debug-sections
      LD flags  (DEBUG): -Wl,--as-needed -rdynamic -fuse-ld=gold -ggdb
      LIBRARIES (DEBUG): [long list of Trilinos, HDF5, OCE, SLEPc/PETSc (with MUMPS, HYPRE, ScaLAPACK), ParMETIS, LAPACK/BLAS, OpenMPI, and p4est libraries under /home/LPGN/lasbleis-m/bin and /trinity/shared/apps]

Call Stack (most recent call first):
  cmake/macros/macro_verbose_include.cmake:19 (INCLUDE)
  CMakeLists.txt:132 (VERBOSE_INCLUDE)

-- Configuring incomplete, errors occurred!
See also "/home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log".
Failure with exit status: 1
Exit message: There was a problem configuring dealii v9.0.0.

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 
-------------- next part --------------
A non-text attachment was scrubbed...
Name: CMakeOutput.log
Type: text/x-log
Size: 99924 bytes
Desc: not available
URL: 

From maxrudolph at ucdavis.edu Tue Sep 25 11:31:08 2018
From: maxrudolph at ucdavis.edu (Max Rudolph)
Date: Tue, 25 Sep 2018 11:31:08 -0700
Subject: [aspect-devel] Fwd: Installation on cluster CentOS
In-Reply-To: 
References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr>
Message-ID: 

Are you using candi to install trilinos, p4est, and deal.ii? I found that this is much easier on our cluster than installing the packages manually.

On Tue, Sep 25, 2018 at 1:29 AM Marine Lasbleis wrote:

> Sorry, I sent the first message from the wrong email address. It would be
> better to use this one!
>
> -------- Forwarded Message --------
> Subject: Installation on cluster CentOS
> Date: Tue, 25 Sep 2018 10:17:18 +0200
> From: Marine Lasbleis
> To: aspect-devel at geodynamics.org
>
> Hi all,
>
> I just moved to a new lab, and I am trying to install ASPECT on the
> cluster here. So far, it works on my new computer + the test computer, so
> I'm quite happy... But the installation on the cluster is more complicated.
>
> I am stucked with the installation of deal_ii.
>
> The cluster is on CentOS, and uses modules. So far, I asked the staff to
> provide most of the required librairies, and I think we got almost all of
> them.
>
> I am using candi to do the installation, and simply following the
> instructions.
>
> I am using gcc 8.1.0, and we have lapack compiled with gcc8.1.0. cmake is
> 3.9.6. I have tried to use hdf5 and petsc from the modules, but the system
> did not recognize the HDF5_DIR and PETSC_DIR, so I compiled them through
> candi.
>
> I got all installed (parmetis, p4est, slepc, hdf5, trilinos), but the
> compilation of deal_ii end up with an error (highlighted below). And I
> looked at the log file
> /home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log,
> but there is absolutely no error in the log file... So I'm a little bit
> puzzled.
>
> Any idea that could help me?
>
> (attached file: log file. And below, last lines I got while running
> candi.sh)
>
> Best,
>
> Marine
>
> [CMake configure error output trimmed]
>
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel

-------------- next part --------------
An HTML attachment was scrubbed...
URL: 

From rene.gassmoeller at mailbox.org Tue Sep 25 12:00:12 2018
From: rene.gassmoeller at mailbox.org (Rene Gassmoeller)
Date: Tue, 25 Sep 2018 19:00:12 -0000
Subject: [aspect-devel] ASPECT Newsletter #64
Message-ID: <20180925185124.3D156AC1AFF@geodynamics.org>

Hello everyone!

This is ASPECT newsletter #64. It automatically reports recently merged features and discussions about the ASPECT mantle convection code.

## Below you find a list of recently proposed or merged features:

#2671: Fix bound preserving limiter in 3d (proposed by gassmoeller) https://github.com/geodynamics/aspect/pull/2671

#2670: Fix input check for bound preserving limiter option (proposed by gassmoeller; merged) https://github.com/geodynamics/aspect/pull/2670

#2669: Fix assert two merged boxes in point_in_domain on free surface.
(proposed by MFraters) https://github.com/geodynamics/aspect/pull/2669 #2668: Negative longitude in uniform radial particle generator (proposed by bartniday; merged) https://github.com/geodynamics/aspect/pull/2668 #2667: Fix reference temperature for discontinuous element face terms (proposed by gassmoeller) https://github.com/geodynamics/aspect/pull/2667 #2666: Add cpp material model (proposed by jperryhouts) https://github.com/geodynamics/aspect/pull/2666 #2665: www: citing.html re-add lost line (proposed by tjhei; merged) https://github.com/geodynamics/aspect/pull/2665 #2664: update script for database.js for citing.html (proposed by tjhei; merged) https://github.com/geodynamics/aspect/pull/2664 #2663: www: citing.html: fix escaping (proposed by tjhei; merged) https://github.com/geodynamics/aspect/pull/2663 #2662: fix umlauts in citation files (proposed by jdannberg) https://github.com/geodynamics/aspect/pull/2662 #2661: allow /rebuild in Jenkins (proposed by tjhei) https://github.com/geodynamics/aspect/pull/2661 #2659: Unify heat flux computation (proposed by gassmoeller; merged) https://github.com/geodynamics/aspect/pull/2659 #2652: Provide formulas for the material-statistics postprocessor. (proposed by bangerth; merged) https://github.com/geodynamics/aspect/pull/2652 ## And this is a list of recently opened or closed discussions: #2618: discuss: developer communication via chat (closed) https://github.com/geodynamics/aspect/issues/2618 A list of all major changes since the last release can be found at https://aspect.geodynamics.org/doc/doxygen/changes_current.html. Thanks for being part of the community! Let us know about questions, problems, bugs or just share your experience by writing to aspect-devel at geodynamics.org, or by opening issues or pull requests at https://www.github.com/geodynamics/aspect. Additional information can be found at https://aspect.geodynamics.org/, and https://geodynamics.org/cig/software/aspect/. -------------- next part -------------- An HTML attachment was scrubbed... URL: From marine.lasbleis at elsi.jp Tue Sep 25 13:17:52 2018 From: marine.lasbleis at elsi.jp (Marine Lasbleis) Date: Tue, 25 Sep 2018 22:17:52 +0200 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> Message-ID: Yes, candi! I also found it quite easy... But I really don't understand the error (especially as there is none in the log file) On 2018年09月25日 20:31, Max Rudolph wrote: > Are you using candi to install trilinos, p4est, and deal.ii? I found > that this is much easier on our cluster than installing the packages > manually. > > On Tue, Sep 25, 2018 at 1:29 AM Marine Lasbleis > > wrote: > > Sorry, I sent the first message from the wrong email address. It > would be better to use this one! > > > > -------- Forwarded Message -------- > Subject: Installation on cluster CentOS > Date: Tue, 25 Sep 2018 10:17:18 +0200 > From: Marine Lasbleis > > To: aspect-devel at geodynamics.org > > > > > Hi all, > > I just moved to a new lab, and I am trying to install ASPECT on > the cluster here. So far, it works on my new computer + the test > computer, so I'm quite happy... But the installation on the > cluster is more complicated. > > I am stucked with the installation of deal_ii. > > The cluster is on CentOS, and uses modules. So far, I asked the > staff to provide most of the required librairies, and I think we > got almost all of them. 
>
> I am using candi to do the installation, and simply following the
> instructions.
>
> [rest of the quoted message and CMake configure error output trimmed]
b/libsacado.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/librtop.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoskokkoscomm.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoskokkoscompat.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosremainder.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosnumerics.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoscomm.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosparameterlist.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoscore.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkosalgorithms.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkoscontainers.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkoscore.so;/home/LPGN/lasbleis-m/bin/hdf5-1.10.1/lib/libhdf5_hl.so;/home/LPGN/lasbleis-m/bin/hdf5-1.10.1/lib/libhdf5.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBO.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBool.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBRep.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKernel.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKFeat.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKFillet.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKG2d.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKG3d.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKGeomAlgo.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKGeomBase.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKHLR.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKIGES.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKMath.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKMesh.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKOffset.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKPrim.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKShHealing.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEP.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEPAttr.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEPBase.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEP209.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTL.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKTopAlgo.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKXSBase.so;mpi_usempif08;mpi_usempi_ignore_tkr;mpi_mpifh;mpi;c;gcc_s;gcc;/home/LPGN/lasbleis-m/bin/slepc-3.7.3/lib/libslepc.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libpetsc.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libcmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libdmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libsmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libzmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libmumps_common.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libpord.a;/home/LPGN/lasbleis-m/bin/parmetis-4.0.3/lib/libparmetis.so;/home/LPGN/lasbleis-m/bin/parmetis-4.0.3/lib/libmetis.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libHYPRE.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libscalapack.a;/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64/liblapack.a;/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64/libblas.a;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_usempif08.so;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_usempi_ignore_tkr.so;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_mpifh.so;gfortran;quadmath;/trinity/shared/apps/cv-standard/openmpi/ps
m2/gcc81/3.1.0/lib/libmpi_cxx.so;m;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi.so;pthread;dl;/home/LPGN/lasbleis-m/bin/p4est-2.0/DEBUG/lib/libp4est.so;/home/LPGN/lasbleis-m/bin/p4est-2.0/DEBUG/lib/libsc.so// > //// > // > //// > // > //Call Stack (most recent call first):// > //  cmake/macros/macro_verbose_include.cmake:19 (INCLUDE)// > //  CMakeLists.txt:132 (VERBOSE_INCLUDE)// > // > // > //-- Configuring incomplete, errors occurred!// > //See also > "/home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log".// > //Failure with exit status: 1// > //Exit message: There was a problem configuring dealii v9.0.0./ > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From jbnaliboff at ucdavis.edu Tue Sep 25 13:30:25 2018 From: jbnaliboff at ucdavis.edu (John Naliboff) Date: Tue, 25 Sep 2018 13:30:25 -0700 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> Message-ID: <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> Hi Marine, A couple questions to help with debugging the installation issue:   1. Can you send over the candi.cfg file that you used and any additional specified options used when executing candi.sh?   2. In the specified candi install directory the following folder should be present: tmp/build/deal.II-v9.0.0. Within that directory should be a file called candi_configure.log, which contains detailed information about the configuration process (including errors). If present, can you send that file over as well? As a side note, I often configure candi to only install p4est and Trilinos with deal.II. Unless you are using PETSc instead of Trilinos, there is no need to include it, slepsc and parmetis. Likely completely unrelated to the issue you encountered, but not building these packages will help speedup the installation. Cheers, John On 09/25/2018 01:17 PM, Marine Lasbleis wrote: > > Yes, candi! I also found it quite easy... But I really don't > understand the error (especially as there is none in the log file) > > > On 2018年09月25日 20:31, Max Rudolph wrote: >> Are you using candi to install trilinos, p4est, and deal.ii? I found >> that this is much easier on our cluster than installing the packages >> manually. >> >> On Tue, Sep 25, 2018 at 1:29 AM Marine Lasbleis >> > wrote: >> >> Sorry, I sent the first message from the wrong email address. It >> would be better to use this one! >> >> >> >> -------- Forwarded Message -------- >> Subject: Installation on cluster CentOS >> Date: Tue, 25 Sep 2018 10:17:18 +0200 >> From: Marine Lasbleis >> >> To: aspect-devel at geodynamics.org >> >> >> >> >> Hi all, >> >> I just moved to a new lab, and I am trying to install ASPECT on >> the cluster here. So far, it works on my new computer + the test >> computer, so I'm quite happy... But the installation on the >> cluster is more complicated. >> >> I am stucked with the installation of deal_ii. >> >> The cluster is on CentOS, and uses modules. So far, I asked the >> staff to provide most of the required librairies, and I think we >> got almost all of them. 
>> >> I am using candi to do the installation, and simply following the >> instructions. >> >> I am using gcc 8.1.0, and we have lapack compiled with gcc8.1.0. >> cmake is 3.9.6. I have tried to use hdf5 and petsc from the >> modules, but the system did not recognize the HDF5_DIR and >> PETSC_DIR, so I compiled them through candi. >> >> I got all installed (parmetis, p4est, slepc, hdf5, trilinos), but >> the compilation of deal_ii end up with an error (highlighted >> below). And I looked at the log file >> //home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log/, >> but there is absolutely no error in the log file... So I'm a >> little bit puzzled. / >> / >> >> Any idea that could help me? / >> / >> >> (attached file: log file. And below, last lines I got while >> running candi.sh) >> >> Best, >> >> Marine >> >> / >> / >> >> /[more things here] ......../ >> >> /-- Include >> /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_cpack.cmake// >> //-- // >> //-- Include >> /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_custom_targets.cmake// >> //-- // >> //-- Include >> /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_finalize.cmake// >> //CMake Error at cmake/setup_finalize.cmake:95 (MESSAGE):// >> //// >> // >> //*  Configuration error: Cannot compile a test program with the >> final set of*/*/ >> /**/    compiler and linker flags:/*/ >> //      CXX flags (DEBUG): -pedantic -fPIC -Wall -Wextra >> -Wpointer-arith -Wwrite-strings -Wsynth -Wsign-compare -Wswitch >> -Woverloaded-virtual -Wno-placement-new >> -Wno-deprecated-declarations -Wno-literal-suffix -fopenmp-simd >> -std=c++17 -Wno-parentheses -Wno-unused-local-typedefs -Og -ggdb >> -Wa,--compress-debug-sections// >> //      LD flags  (DEBUG): -Wl,--as-needed -rdynamic >> -fuse-ld=gold -ggdb// >> //      LIBRARIES (DEBUG): >> 
/usr/lib64/libz.so;rt;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libmuelu-adapters.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libmuelu-interface.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libmuelu.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteko.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosbelos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosaztecoo.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosamesos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosml.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosifpack.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libifpack2-adapters.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libifpack2.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libanasazitpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libModeLaplace.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libanasaziepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libanasazi.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libamesos2.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libbelostpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libbelosepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libbelos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libml.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libifpack.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libzoltan2.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libpamgen_extras.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libpamgen.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libamesos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libgaleri-xpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libgaleri-epetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libaztecoo.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libisorropia.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libxpetra-sup.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libxpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyratpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyraepetraext.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyraepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyracore.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetraext.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraext.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetrainout.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkostsqr.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetrakernels.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraclassiclinalg.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraclassicnodeapi.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraclassic.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtriutils.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libzoltan.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/li
b/libsacado.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/librtop.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoskokkoscomm.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoskokkoscompat.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosremainder.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosnumerics.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoscomm.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosparameterlist.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoscore.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkosalgorithms.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkoscontainers.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkoscore.so;/home/LPGN/lasbleis-m/bin/hdf5-1.10.1/lib/libhdf5_hl.so;/home/LPGN/lasbleis-m/bin/hdf5-1.10.1/lib/libhdf5.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBO.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBool.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBRep.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKernel.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKFeat.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKFillet.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKG2d.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKG3d.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKGeomAlgo.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKGeomBase.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKHLR.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKIGES.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKMath.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKMesh.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKOffset.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKPrim.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKShHealing.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEP.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEPAttr.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEPBase.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEP209.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTL.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKTopAlgo.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKXSBase.so;mpi_usempif08;mpi_usempi_ignore_tkr;mpi_mpifh;mpi;c;gcc_s;gcc;/home/LPGN/lasbleis-m/bin/slepc-3.7.3/lib/libslepc.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libpetsc.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libcmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libdmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libsmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libzmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libmumps_common.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libpord.a;/home/LPGN/lasbleis-m/bin/parmetis-4.0.3/lib/libparmetis.so;/home/LPGN/lasbleis-m/bin/parmetis-4.0.3/lib/libmetis.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libHYPRE.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libscalapack.a;/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64/liblapack.a;/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64/libblas.a;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_usempif08.so;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_usempi_ignore_tkr.so;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_mpifh.so;gfortran;quadmath;/trinity/shared/apps/cv-standard/openmpi/ps
m2/gcc81/3.1.0/lib/libmpi_cxx.so;m;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi.so;pthread;dl;/home/LPGN/lasbleis-m/bin/p4est-2.0/DEBUG/lib/libp4est.so;/home/LPGN/lasbleis-m/bin/p4est-2.0/DEBUG/lib/libsc.so// >> //// >> // >> //// >> // >> //Call Stack (most recent call first):// >> //  cmake/macros/macro_verbose_include.cmake:19 (INCLUDE)// >> //  CMakeLists.txt:132 (VERBOSE_INCLUDE)// >> // >> // >> //-- Configuring incomplete, errors occurred!// >> //See also >> "/home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log".// >> //Failure with exit status: 1// >> //Exit message: There was a problem configuring dealii v9.0.0./ >> >> >> >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >> >> >> >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From rene.gassmoeller at mailbox.org Tue Sep 25 14:15:37 2018 From: rene.gassmoeller at mailbox.org (Rene Gassmoeller) Date: Tue, 25 Sep 2018 14:15:37 -0700 Subject: [aspect-devel] ASPECT online user meeting, Oct 2nd, 9am PT In-Reply-To: <608819df-ae5d-7ed5-7b76-94866b762a35@mailbox.org> References: <608819df-ae5d-7ed5-7b76-94866b762a35@mailbox.org> Message-ID: <6dcec7f5-0cff-0014-cf57-e97e6ed75f28@mailbox.org> Hi all, this is an early reminder that we will have our ASPECT user meeting next week, Oct 2nd at 9 am PT. As always the meeting will be available using the Zoom software, at: https://zoom.us/j/691977614 Topics we could discuss are the recent developments in terms of using DG elements for temperature, the structure of the ASPECT assembler classes (how to introduce new terms into the equations that are solved), and maybe a sneak peak at a new form of communication with other project members. As always, if you are interested in a certain feature of ASPECT, or want to present a particular work of you related to ASPECT then let us know until the end of the week so that we can schedule it for next week. Looking forward to seeing you next week. Best, Rene From marine.lasbleis at elsi.jp Wed Sep 26 01:21:20 2018 From: marine.lasbleis at elsi.jp (Marine Lasbleis) Date: Wed, 26 Sep 2018 10:21:20 +0200 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> Message-ID: <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> Hi John, For candi: I used the exact candi.cfg, with only add for BLAS and LAPACK directories BLAS_DIR=/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64 LAPACK_DIR=/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64 And the only additional info I gave candi was the install directory (so ./candi.sh -p __path__) I join the file candi_configure from the deal_ii folder At first, I tried to remove the installation of petsc and a couple of things, but I wasn't sure how to ask deal_ii not to use them, so I ended up installing them anyway. 
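One way to drop those packages is to trim the PACKAGES list in candi.cfg before running candi.sh. A minimal sketch, assuming the same layout as the PACKAGES variable visible in the attached candi_configure file (entries and option names can differ between candi versions, so treat this as an illustration rather than a recipe):

  # candi.cfg (sketch): compared to the attached configuration, petsc, slepc,
  # parmetis and opencascade are removed; hdf5 is kept for graphical output
  PACKAGES="load:dealii-prepare once:hdf5 once:p4est once:trilinos dealii"
  BLAS_DIR=/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64
  LAPACK_DIR=/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64

  # then run candi as before, with <install-path> as a placeholder:
  ./candi.sh -p <install-path>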
It's good to know only p4est and trilinos are required! I will see how to do that. And yes, it will definitely speedup the installation :-) I'll update you later if I manage to do something while removing the installation of petsc and others. And let me know if anyone has an idea to help :-) Best, Marine On 2018年09月25日 22:30, John Naliboff wrote: > Hi Marine, > > A couple questions to help with debugging the installation issue: > >   1. Can you send over the candi.cfg file that you used and any > additional specified options used when executing candi.sh? > >   2. In the specified candi install directory the following folder > should be present: tmp/build/deal.II-v9.0.0. Within that directory > should be a file called candi_configure.log, which contains detailed > information about the configuration process (including errors). If > present, can you send that file over as well? > > As a side note, I often configure candi to only install p4est and > Trilinos with deal.II. Unless you are using PETSc instead of Trilinos, > there is no need to include it, slepsc and parmetis. Likely completely > unrelated to the issue you encountered, but not building these > packages will help speedup the installation. > > Cheers, > John > > On 09/25/2018 01:17 PM, Marine Lasbleis wrote: >> >> Yes, candi! I also found it quite easy... But I really don't >> understand the error (especially as there is none in the log file) >> >> >> On 2018年09月25日 20:31, Max Rudolph wrote: >>> Are you using candi to install trilinos, p4est, and deal.ii? I found >>> that this is much easier on our cluster than installing the packages >>> manually. >>> >>> On Tue, Sep 25, 2018 at 1:29 AM Marine Lasbleis >>> > wrote: >>> >>> Sorry, I sent the first message from the wrong email address. It >>> would be better to use this one! >>> >>> >>> >>> -------- Forwarded Message -------- >>> Subject: Installation on cluster CentOS >>> Date: Tue, 25 Sep 2018 10:17:18 +0200 >>> From: Marine Lasbleis >>> >>> To: aspect-devel at geodynamics.org >>> >>> >>> >>> >>> Hi all, >>> >>> I just moved to a new lab, and I am trying to install ASPECT on >>> the cluster here. So far, it works on my new computer + the test >>> computer, so I'm quite happy... But the installation on the >>> cluster is more complicated. >>> >>> I am stucked with the installation of deal_ii. >>> >>> The cluster is on CentOS, and uses modules. So far, I asked the >>> staff to provide most of the required librairies, and I think we >>> got almost all of them. >>> >>> I am using candi to do the installation, and simply following >>> the instructions. >>> >>> I am using gcc 8.1.0, and we have lapack compiled with gcc8.1.0. >>> cmake is 3.9.6. I have tried to use hdf5 and petsc from the >>> modules, but the system did not recognize the HDF5_DIR and >>> PETSC_DIR, so I compiled them through candi. >>> >>> I got all installed (parmetis, p4est, slepc, hdf5, trilinos), >>> but the compilation of deal_ii end up with an error (highlighted >>> below). And I looked at the log file >>> //home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log/, >>> but there is absolutely no error in the log file... So I'm a >>> little bit puzzled. / >>> / >>> >>> Any idea that could help me? / >>> / >>> >>> (attached file: log file. 
And below, last lines I got while >>> running candi.sh) >>> >>> Best, >>> >>> Marine >>> >>> / >>> / >>> >>> /[more things here] ......../ >>> >>> /-- Include >>> /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_cpack.cmake// >>> //-- // >>> //-- Include >>> /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_custom_targets.cmake// >>> //-- // >>> //-- Include >>> /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_finalize.cmake// >>> //CMake Error at cmake/setup_finalize.cmake:95 (MESSAGE):// >>> //// >>> // >>> //*  Configuration error: Cannot compile a test program with the >>> final set of*/*/ >>> /**/    compiler and linker flags:/*/ >>> //      CXX flags (DEBUG): -pedantic -fPIC -Wall -Wextra >>> -Wpointer-arith -Wwrite-strings -Wsynth -Wsign-compare -Wswitch >>> -Woverloaded-virtual -Wno-placement-new >>> -Wno-deprecated-declarations -Wno-literal-suffix -fopenmp-simd >>> -std=c++17 -Wno-parentheses -Wno-unused-local-typedefs -Og -ggdb >>> -Wa,--compress-debug-sections// >>> //      LD flags  (DEBUG): -Wl,--as-needed -rdynamic >>> -fuse-ld=gold -ggdb// >>> //      LIBRARIES (DEBUG): >>> /usr/lib64/libz.so;rt;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libmuelu-adapters.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libmuelu-interface.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libmuelu.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteko.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosbelos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosaztecoo.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosamesos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosml.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosifpack.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libifpack2-adapters.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libifpack2.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libanasazitpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libModeLaplace.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libanasaziepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libanasazi.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libamesos2.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libbelostpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libbelosepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libbelos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libml.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libifpack.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libzoltan2.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libpamgen_extras.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libpamgen.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libamesos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libgaleri-xpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libgaleri-epetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libaztecoo.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libisorropia.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libxpetra-sup.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libxpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyratpetra.
so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyraepetraext.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyraepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyracore.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetraext.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraext.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetrainout.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkostsqr.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetrakernels.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraclassiclinalg.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraclassicnodeapi.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraclassic.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtriutils.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libzoltan.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libsacado.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/librtop.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoskokkoscomm.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoskokkoscompat.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosremainder.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosnumerics.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoscomm.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosparameterlist.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoscore.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkosalgorithms.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkoscontainers.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkoscore.so;/home/LPGN/lasbleis-m/bin/hdf5-1.10.1/lib/libhdf5_hl.so;/home/LPGN/lasbleis-m/bin/hdf5-1.10.1/lib/libhdf5.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBO.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBool.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBRep.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKernel.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKFeat.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKFillet.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKG2d.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKG3d.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKGeomAlgo.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKGeomBase.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKHLR.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKIGES.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKMath.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKMesh.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKOffset.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKPrim.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKShHealing.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEP.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEPAttr.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEPBase.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEP209.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTL.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKTopAlgo.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKXSBase.so;mpi_usempif08;mpi_usempi
_ignore_tkr;mpi_mpifh;mpi;c;gcc_s;gcc;/home/LPGN/lasbleis-m/bin/slepc-3.7.3/lib/libslepc.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libpetsc.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libcmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libdmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libsmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libzmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libmumps_common.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libpord.a;/home/LPGN/lasbleis-m/bin/parmetis-4.0.3/lib/libparmetis.so;/home/LPGN/lasbleis-m/bin/parmetis-4.0.3/lib/libmetis.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libHYPRE.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libscalapack.a;/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64/liblapack.a;/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64/libblas.a;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_usempif08.so;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_usempi_ignore_tkr.so;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_mpifh.so;gfortran;quadmath;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_cxx.so;m;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi.so;pthread;dl;/home/LPGN/lasbleis-m/bin/p4est-2.0/DEBUG/lib/libp4est.so;/home/LPGN/lasbleis-m/bin/p4est-2.0/DEBUG/lib/libsc.so// >>> //// >>> // >>> //// >>> // >>> //Call Stack (most recent call first):// >>> //  cmake/macros/macro_verbose_include.cmake:19 (INCLUDE)// >>> //  CMakeLists.txt:132 (VERBOSE_INCLUDE)// >>> // >>> // >>> //-- Configuring incomplete, errors occurred!// >>> //See also >>> "/home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log".// >>> //Failure with exit status: 1// >>> //Exit message: There was a problem configuring dealii v9.0.0./ >>> >>> >>> >>> _______________________________________________ >>> Aspect-devel mailing list >>> Aspect-devel at geodynamics.org >>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>> >>> >>> >>> _______________________________________________ >>> Aspect-devel mailing list >>> Aspect-devel at geodynamics.org >>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >> >> >> >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: -------------- next part -------------- #!/usr/bin/env bash declare -x ARCH="x86_64" declare -x BAD="\\033[1;31m" declare -x BLAS_DIR="/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64" declare -x BOLD="\\033[1m" declare -x BUILDCHAIN="cmake" declare -x BUILDDIR="/home/LPGN/lasbleis-m/bin//tmp/build/deal.II-v9.0.0" declare -x BUILD_PATH="/home/LPGN/lasbleis-m/bin//tmp/build" declare -x CC="mpicc" declare -x CLEAN_BUILD="false" declare -x CMAKE_DIR="/trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/cmake/3.10.2" declare -x CMD_PACKAGES="" declare -x COL="\\033[1;32m" declare -x CONFIGURATION_PATH="/home/LPGN/lasbleis-m/bin//configuration" declare -x CONFOPTS=" -D CMAKE_BUILD_TYPE=DebugRelease -D DEAL_II_WITH_MPI:BOOL=ON -D DEAL_II_WITH_THREADS:BOOL=ON -D DEAL_II_FORCE_BUNDLED_THREADS:BOOL=OFF -D DEAL_II_COMPONENT_DOCUMENTATION:BOOL=OFF -D DEAL_II_WITH_LAPACK:BOOL=ON -D DEAL_II_WITH_UMFPACK:BOOL=ON -D DEAL_II_FORCE_BUNDLED_UMFPACK:BOOL=OFF -D DEAL_II_WITH_BOOST:BOOL=ON -D DEAL_II_FORCE_BUNDLED_BOOST:BOOL=OFF -D DEAL_II_WITH_ZLIB:BOOL=ON -D DEAL_II_WITH_METIS:BOOL=ON -D METIS_DIR=/home/LPGN/lasbleis-m/bin//parmetis-4.0.3 -D DEAL_II_WITH_P4EST:BOOL=ON -D P4EST_DIR=/home/LPGN/lasbleis-m/bin//p4est-2.0 -D DEAL_II_WITH_HDF5:BOOL=ON -D HDF5_DIR=/home/LPGN/lasbleis-m/bin//hdf5-1.10.1 -D DEAL_II_WITH_TRILINOS:BOOL=ON -D TRILINOS_DIR=/home/LPGN/lasbleis-m/bin//trilinos-release-12-10-1 -D DEAL_II_WITH_PETSC:BOOL=ON -D PETSC_DIR=/home/LPGN/lasbleis-m/bin//petsc-3.7.6 -D DEAL_II_WITH_SLEPC:BOOL=ON -D SLEPC_DIR=/home/LPGN/lasbleis-m/bin//slepc-3.7.3 -D DEAL_II_WITH_OPENCASCADE:BOOL=ON" declare -x CPATH="/trinity/shared/apps/cv-standard/petsc/3.10.0/include:/trinity/shared/apps/cv-standard/netcdf/4.4.1.1/include:/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/include:/trinity/shared/apps/cv-standard/valgrind/3.13.0/include:/trinity/shared/apps/cv-standard/hdf5/openmpi/gcc81/1.10.2/include:/trinity/shared/apps/cv-standard/gcc/8.1.0/include:/trinity/shared/apps/cv-standard/gcc/7.2.0/include" declare -x CVS_RSH="ssh" declare -x CXX="mpicxx" declare -x C_INCLUDE_PATH="/trinity/shared/apps/cv-standard/petsc/3.10.0/include:/trinity/shared/apps/cv-standard/netcdf/4.4.1.1/include:/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/include:/trinity/shared/apps/cv-standard/valgrind/3.13.0/include:/trinity/shared/apps/cv-standard/hdf5/openmpi/gcc81/1.10.2/include:/trinity/shared/apps/cv-standard/gcc/8.1.0/include:/trinity/shared/apps/cv-standard/gcc/8.1.0/lib/gcc/x86_64-pc-linux-gnu/8.1.0/include:/trinity/shared/apps/cv-standard/gcc/7.2.0/include" declare -x DATE_CMD="/usr/bin/date" declare -x DEAL_CONFOPTS="" declare -x DEAL_II_VERSION="v9.0.0" declare -x DEVELOPER_MODE="OFF" declare -x DISPLAY="localhost:25.0" declare -x DOWNLOADERS=" curl wget" declare -x DOWNLOAD_PATH="/home/LPGN/lasbleis-m/bin//tmp/src" declare -x EDITOR="vim" declare -x EXTRACTSTO="deal.II-v9.0.0" declare -x FC="mpif90" declare -x FF="mpif77" declare -x GOOD="\\033[1;32m" declare -x HDF5_DIR="/home/LPGN/lasbleis-m/bin//hdf5-1.10.1" declare -x HISTCONTROL="ignoredups" declare -x HISTSIZE="1000" declare -x HOME="/home/LPGN/lasbleis-m" declare -x HOSTNAME="jaws.cluster" declare -x HTTPS_PROXY="http://proxy-upgrade.univ-nantes.prive:3128" declare -x HTTP_PROXY="http://proxy-upgrade.univ-nantes.prive:3128" declare -x 
INCLUDE="/trinity/shared/apps/cv-standard/petsc/3.10.0/include:/trinity/shared/apps/cv-standard/netcdf/4.4.1.1/include:/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/include:/trinity/shared/apps/cv-standard/valgrind/3.13.0/include:/trinity/shared/apps/cv-standard/hdf5/openmpi/gcc81/1.10.2/include:/trinity/shared/apps/cv-standard/gcc/8.1.0/include:/trinity/shared/apps/cv-standard/gcc/8.1.0/lib/gcc/x86_64-pc-linux-gnu/8.1.0/include:/trinity/shared/apps/cv-standard/gcc/7.2.0/include" declare -x INFO="\\033[1;34m" declare -x INSTALL_PATH="/home/LPGN/lasbleis-m/bin//deal.II-v9.0.0" declare -x KDEDIRS="/usr" declare -x LANG="en_US.UTF-8" declare -x LAPACK_DIR="/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64" declare -x LC_ADDRESS="ja_JP.UTF-8" declare -x LC_IDENTIFICATION="ja_JP.UTF-8" declare -x LC_MEASUREMENT="ja_JP.UTF-8" declare -x LC_MONETARY="ja_JP.UTF-8" declare -x LC_NAME="ja_JP.UTF-8" declare -x LC_NUMERIC="ja_JP.UTF-8" declare -x LC_PAPER="ja_JP.UTF-8" declare -x LC_TELEPHONE="ja_JP.UTF-8" declare -x LC_TIME="ja_JP.UTF-8" declare -x LDSUFFIX="so" declare -x LD_LIBRARY_PATH="/trinity/shared/apps/cv-standard/petsc/3.10.0/lib:/trinity/shared/apps/cv-standard/netcdf/4.4.1.1/lib:/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib:/trinity/shared/apps/cv-standard/valgrind/3.13.0/lib:/trinity/shared/apps/cv-standard/hdf5/openmpi/gcc81/1.10.2/lib:/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64:/trinity/shared/apps/cv-standard/gcc/8.1.0/lib64:/trinity/shared/apps/cv-standard/gcc/8.1.0/lib:/trinity/shared/apps/cv-standard/gcc/7.2.0/lib64:/trinity/shared/apps/cv-standard/gcc/7.2.0/lib:/usr/mpi/gcc/openmpi-1.10.2/lib64" declare -x LESSOPEN="||/usr/bin/lesspipe.sh %s" declare -x LIBRARY_PATH="/trinity/shared/apps/cv-standard/petsc/3.10.0/lib:/trinity/shared/apps/cv-standard/netcdf/4.4.1.1/lib:/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib:/trinity/shared/apps/cv-standard/valgrind/3.13.0/lib:/trinity/shared/apps/cv-standard/hdf5/openmpi/gcc81/1.10.2/lib:/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64:/trinity/shared/apps/cv-standard/gcc/8.1.0/lib64:/trinity/shared/apps/cv-standard/gcc/8.1.0/lib:/trinity/shared/apps/cv-standard/gcc/7.2.0/lib64:/trinity/shared/apps/cv-standard/gcc/7.2.0/lib" declare -x LOADEDMODULES="gcc/7.2.0:cmake/3.10.2:gcc/8.1.0:lapack/3.8.0_gcc81:hdf5/openmpi/gcc81/1.10.2:valgrind/3.13.0:openmpi/psm2/gcc81/3.1.0:netcdf/4.4.1.1:petsc/3.10.0" declare -x LOGNAME="lasbleis-m" declare -x 
LS_COLORS="rs=0:di=38;5;27:ln=38;5;51:mh=44;38;5;15:pi=40;38;5;11:so=38;5;13:do=38;5;5:bd=48;5;232;38;5;11:cd=48;5;232;38;5;3:or=48;5;232;38;5;9:mi=05;48;5;232;38;5;15:su=48;5;196;38;5;15:sg=48;5;11;38;5;16:ca=48;5;196;38;5;226:tw=48;5;10;38;5;16:ow=48;5;10;38;5;21:st=48;5;21;38;5;15:ex=38;5;34:*.tar=38;5;9:*.tgz=38;5;9:*.arc=38;5;9:*.arj=38;5;9:*.taz=38;5;9:*.lha=38;5;9:*.lz4=38;5;9:*.lzh=38;5;9:*.lzma=38;5;9:*.tlz=38;5;9:*.txz=38;5;9:*.tzo=38;5;9:*.t7z=38;5;9:*.zip=38;5;9:*.z=38;5;9:*.Z=38;5;9:*.dz=38;5;9:*.gz=38;5;9:*.lrz=38;5;9:*.lz=38;5;9:*.lzo=38;5;9:*.xz=38;5;9:*.bz2=38;5;9:*.bz=38;5;9:*.tbz=38;5;9:*.tbz2=38;5;9:*.tz=38;5;9:*.deb=38;5;9:*.rpm=38;5;9:*.jar=38;5;9:*.war=38;5;9:*.ear=38;5;9:*.sar=38;5;9:*.rar=38;5;9:*.alz=38;5;9:*.ace=38;5;9:*.zoo=38;5;9:*.cpio=38;5;9:*.7z=38;5;9:*.rz=38;5;9:*.cab=38;5;9:*.jpg=38;5;13:*.jpeg=38;5;13:*.gif=38;5;13:*.bmp=38;5;13:*.pbm=38;5;13:*.pgm=38;5;13:*.ppm=38;5;13:*.tga=38;5;13:*.xbm=38;5;13:*.xpm=38;5;13:*.tif=38;5;13:*.tiff=38;5;13:*.png=38;5;13:*.svg=38;5;13:*.svgz=38;5;13:*.mng=38;5;13:*.pcx=38;5;13:*.mov=38;5;13:*.mpg=38;5;13:*.mpeg=38;5;13:*.m2v=38;5;13:*.mkv=38;5;13:*.webm=38;5;13:*.ogm=38;5;13:*.mp4=38;5;13:*.m4v=38;5;13:*.mp4v=38;5;13:*.vob=38;5;13:*.qt=38;5;13:*.nuv=38;5;13:*.wmv=38;5;13:*.asf=38;5;13:*.rm=38;5;13:*.rmvb=38;5;13:*.flc=38;5;13:*.avi=38;5;13:*.fli=38;5;13:*.flv=38;5;13:*.gl=38;5;13:*.dl=38;5;13:*.xcf=38;5;13:*.xwd=38;5;13:*.yuv=38;5;13:*.cgm=38;5;13:*.emf=38;5;13:*.axv=38;5;13:*.anx=38;5;13:*.ogv=38;5;13:*.ogx=38;5;13:*.aac=38;5;45:*.au=38;5;45:*.flac=38;5;45:*.mid=38;5;45:*.midi=38;5;45:*.mka=38;5;45:*.mp3=38;5;45:*.mpc=38;5;45:*.ogg=38;5;45:*.ra=38;5;45:*.wav=38;5;45:*.axa=38;5;45:*.oga=38;5;45:*.spx=38;5;45:*.xspf=38;5;45:" declare -x MAIL="/var/spool/mail/lasbleis-m" declare -x MAJORVER="1.10" declare -x MANPATH="/trinity/shared/apps/cv-standard/valgrind/3.13.0/share/man:/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/man:/trinity/shared/apps/cv-standard/gcc/8.1.0/share/man:/trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/cmake/3.10.2/man:/trinity/shared/apps/cv-standard/gcc/7.2.0/share/man:/usr/mpi/gcc/openmpi-1.10.2/share/man:" declare -x METIS_DIR="/home/LPGN/lasbleis-m/bin//parmetis-4.0.3" declare -x MINORVER="1" declare -x MIRROR="https://www.ces.clemson.edu/dealii/mirror/" declare -x MKL="OFF" declare -x MODULEPATH="/usr/share/Modules/modulefiles:/trinity/shared/modulefiles/ccipl:/trinity/shared/modulefiles/compilateurs_interpreteurs:/trinity/shared/modulefiles/calcul_parallele:/trinity/shared/modulefiles/profilage:/trinity/shared/modulefiles/bibliotheques_scientifiques:/trinity/shared/modulefiles/outils:/trinity/shared/modulefiles/arch-dependant/local-arch" declare -x MODULESHOME="/usr/share/Modules" declare -x MPI_HOME="/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0" declare -x MPI_ROOT="/usr/mpi/gcc/openmpi-1.10.2" declare -x MPI_RUN="/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/bin/mpirun" declare -x NAME="dealii.git" declare -x NetCDF_ROOT="/trinity/shared/apps/cv-standard/netcdf/4.4.1.1" declare -x OLDPWD="/home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0" declare -x OPA_JAVA="/opt/pgi/linux86-64/2016/java/jre1.8.0_45" declare -x OPENCASCADE_DIR="/home/LPGN/lasbleis-m/bin//oce-OCE-0.18.2" declare -x ORIG_CONFIGURATION_PATH="/home/LPGN/lasbleis-m/bin//configuration" declare -x ORIG_DIR="/home/LPGN/lasbleis-m/Dev/candi" declare -x ORIG_INSTALL_PATH="/home/LPGN/lasbleis-m/bin/" declare -x ORIG_PROCS="1" declare -x P4EST_DIR="/home/LPGN/lasbleis-m/bin//p4est-2.0" declare 
-x PACKAGE="dealii" declare -x PACKAGES="load:dealii-prepare once:opencascade once:parmetis once:hdf5 once:p4est once:trilinos once:petsc once:slepc dealii" declare -x PACKAGES_OFF="" declare -x PACKING="git" declare -x PARMETIS_DIR="/home/LPGN/lasbleis-m/bin//parmetis-4.0.3" declare -x PATH="/trinity/shared/apps/cv-standard/netcdf/4.4.1.1/bin:/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/bin:/trinity/shared/apps/cv-standard/valgrind/3.13.0/bin:/trinity/shared/apps/cv-standard/hdf5/openmpi/gcc81/1.10.2/bin:/trinity/shared/apps/cv-standard/gcc/8.1.0/bin:/trinity/shared/apps/ccipl/machine-dependant/nazare-dc/soft/cmake/3.10.2/bin:/trinity/shared/apps/cv-standard/gcc/7.2.0/bin:/usr/mpi/gcc/openmpi-1.10.2/bin:/usr/local/bin:/usr/bin:/usr/local/sbin:/usr/sbin" declare -x PETSC_DIR="/home/LPGN/lasbleis-m/bin//petsc-3.7.6" declare -x PLATFORM="deal.II-toolchain/platforms/supported/centos7.platform" declare -x PLATFORM_OSTYPE="linux" declare -x PREFIX="/home/LPGN/lasbleis-m/bin/" declare -x PREFIX_PATH="/home/LPGN/lasbleis-m/bin/" declare -x PROCS="1" declare -x PROJECT="deal.II-toolchain" declare -x PWD="/home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0" declare -x PYTHONVER="2.7" declare -x PYTHON_INTERPRETER="python2" declare -x QT_GRAPHICSSYSTEM_CHECKED="1" declare -x QT_PLUGIN_PATH="/usr/lib64/kde4/plugins:/usr/lib/kde4/plugins" declare -x RE="^[0-9]+\$" declare -x REPLY="" declare -x SCALAPACK_DIR="/home/LPGN/lasbleis-m/bin//petsc-3.7.6" declare -x SHELL="/bin/bash" declare -x SHLVL="2" declare -x SKIP="false" declare -x SLEPC_DIR="/home/LPGN/lasbleis-m/bin//slepc-3.7.3" declare -x SOURCE="https://github.com/dealii/" declare -x SSH_ASKPASS="/usr/libexec/openssh/gnome-ssh-askpass" declare -x SSH_CLIENT="172.16.13.38 44452 22" declare -x SSH_CONNECTION="172.16.13.38 44452 193.52.106.252 22" declare -x SSH_TTY="/dev/pts/18" declare -x STABLE_BUILD="true" declare -x STATUS="0" declare -x TERM="xterm-256color" declare -x TIC="1537797985" declare -x TIC_GLOBAL="1537797981" declare -x TIMINGS="\\ndealii-prepare: 0 s\\nopencascade: 0 s\\nparmetis: 0 s\\nhdf5: 0 s\\np4est: 0 s\\ntrilinos: 0 s\\npetsc: 0 s\\nslepc: 0 s" declare -x TOC="0" declare -x TRILINOS_DIR="/home/LPGN/lasbleis-m/bin//trilinos-release-12-10-1" declare -x TRILINOS_MAJOR_VERSION="AUTO" declare -x TRILINOS_PARMETIS_CONFOPTS=" -D HAVE_PARMETIS_VERSION_4_0_3=ON" declare -x UNPACK_PATH="/home/LPGN/lasbleis-m/bin//tmp/unpack" declare -x USER="lasbleis-m" declare -x VALGRIND_DIR="/trinity/shared/apps/cv-standard/valgrind/3.13.0" declare -x VALUE="/home/LPGN/lasbleis-m/bin//tmp/build/dealii.git" declare -x VAR="BUILDDIR" declare -x VERSION="v9.0.0" declare -x WARN="\\033[1;35m" declare -x XDG_RUNTIME_DIR="/run/user/286207" declare -x XDG_SESSION_ID="9859" declare -x _LMFILES_="/trinity/shared/modulefiles/compilateurs_interpreteurs/gcc/7.2.0:/trinity/shared/modulefiles/compilateurs_interpreteurs/cmake/3.10.2:/trinity/shared/modulefiles/compilateurs_interpreteurs/gcc/8.1.0:/trinity/shared/modulefiles/bibliotheques_scientifiques/lapack/3.8.0_gcc81:/trinity/shared/modulefiles/bibliotheques_scientifiques/hdf5/openmpi/gcc81/1.10.2:/trinity/shared/modulefiles/profilage/valgrind/3.13.0:/trinity/shared/modulefiles/calcul_parallele/openmpi/psm2/gcc81/3.1.0:/trinity/shared/modulefiles/bibliotheques_scientifiques/netcdf/4.4.1.1:/trinity/shared/modulefiles/bibliotheques_scientifiques/petsc/3.10.0" declare -x cmd_file="candi_configure" declare -x external_pkg="mumps" declare -x http_proxy="http://proxy-upgrade.univ-nantes.prive:3128" 
declare -x https_proxy="http://proxy-upgrade.univ-nantes.prive:3128" declare -x param="-p" set -e cmake -D CMAKE_BUILD_TYPE=DebugRelease -D DEAL_II_WITH_MPI:BOOL=ON -D DEAL_II_WITH_THREADS:BOOL=ON -D DEAL_II_FORCE_BUNDLED_THREADS:BOOL=OFF -D DEAL_II_COMPONENT_DOCUMENTATION:BOOL=OFF -D DEAL_II_WITH_LAPACK:BOOL=ON -D DEAL_II_WITH_UMFPACK:BOOL=ON -D DEAL_II_FORCE_BUNDLED_UMFPACK:BOOL=OFF -D DEAL_II_WITH_BOOST:BOOL=ON -D DEAL_II_FORCE_BUNDLED_BOOST:BOOL=OFF -D DEAL_II_WITH_ZLIB:BOOL=ON -D DEAL_II_WITH_METIS:BOOL=ON -D METIS_DIR=/home/LPGN/lasbleis-m/bin//parmetis-4.0.3 -D DEAL_II_WITH_P4EST:BOOL=ON -D P4EST_DIR=/home/LPGN/lasbleis-m/bin//p4est-2.0 -D DEAL_II_WITH_HDF5:BOOL=ON -D HDF5_DIR=/home/LPGN/lasbleis-m/bin//hdf5-1.10.1 -D DEAL_II_WITH_TRILINOS:BOOL=ON -D TRILINOS_DIR=/home/LPGN/lasbleis-m/bin//trilinos-release-12-10-1 -D DEAL_II_WITH_PETSC:BOOL=ON -D PETSC_DIR=/home/LPGN/lasbleis-m/bin//petsc-3.7.6 -D DEAL_II_WITH_SLEPC:BOOL=ON -D SLEPC_DIR=/home/LPGN/lasbleis-m/bin//slepc-3.7.3 -D DEAL_II_WITH_OPENCASCADE:BOOL=ON -DCMAKE_INSTALL_PREFIX=/home/LPGN/lasbleis-m/bin//deal.II-v9.0.0 /home/LPGN/lasbleis-m/bin//tmp/unpack/deal.II-v9.0.0 From heister at clemson.edu Wed Sep 26 05:43:48 2018 From: heister at clemson.edu (Timo Heister) Date: Wed, 26 Sep 2018 06:43:48 -0600 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> Message-ID: Marine, can you post your tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeError.log please? The errors at the end of the file should help us identify what is going wrong. On Wed, Sep 26, 2018 at 2:21 AM Marine Lasbleis wrote: > > Hi John, > > For candi: I used the exact candi.cfg, with only add for BLAS and LAPACK directories > > BLAS_DIR=/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64 > LAPACK_DIR=/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64 > > And the only additional info I gave candi was the install directory (so ./candi.sh -p __path__) > > I join the file candi_configure from the deal_ii folder > > At first, I tried to remove the installation of petsc and a couple of things, but I wasn't sure how to ask deal_ii not to use them, so I ended up installing them anyway. It's good to know only p4est and trilinos are required! I will see how to do that. And yes, it will definitely speedup the installation :-) > > > I'll update you later if I manage to do something while removing the installation of petsc and others. And let me know if anyone has an idea to help :-) > > Best, > Marine > > On 2018年09月25日 22:30, John Naliboff wrote: > > Hi Marine, > > A couple questions to help with debugging the installation issue: > > 1. Can you send over the candi.cfg file that you used and any additional specified options used when executing candi.sh? > > 2. In the specified candi install directory the following folder should be present: tmp/build/deal.II-v9.0.0. Within that directory should be a file called candi_configure.log, which contains detailed information about the configuration process (including errors). If present, can you send that file over as well? > > As a side note, I often configure candi to only install p4est and Trilinos with deal.II. Unless you are using PETSc instead of Trilinos, there is no need to include it, slepsc and parmetis. 
Likely completely unrelated to the issue you encountered, but not building these packages will help speedup the installation. > > Cheers, > John > > On 09/25/2018 01:17 PM, Marine Lasbleis wrote: > > Yes, candi! I also found it quite easy... But I really don't understand the error (especially as there is none in the log file) > > > On 2018年09月25日 20:31, Max Rudolph wrote: > > Are you using candi to install trilinos, p4est, and deal.ii? I found that this is much easier on our cluster than installing the packages manually. > > On Tue, Sep 25, 2018 at 1:29 AM Marine Lasbleis wrote: >> >> Sorry, I sent the first message from the wrong email address. It would be better to use this one! >> >> >> >> -------- Forwarded Message -------- >> Subject: Installation on cluster CentOS >> Date: Tue, 25 Sep 2018 10:17:18 +0200 >> From: Marine Lasbleis >> To: aspect-devel at geodynamics.org >> >> >> Hi all, >> >> I just moved to a new lab, and I am trying to install ASPECT on the cluster here. So far, it works on my new computer + the test computer, so I'm quite happy... But the installation on the cluster is more complicated. >> >> I am stucked with the installation of deal_ii. >> >> The cluster is on CentOS, and uses modules. So far, I asked the staff to provide most of the required librairies, and I think we got almost all of them. >> >> I am using candi to do the installation, and simply following the instructions. >> >> I am using gcc 8.1.0, and we have lapack compiled with gcc8.1.0. cmake is 3.9.6. I have tried to use hdf5 and petsc from the modules, but the system did not recognize the HDF5_DIR and PETSC_DIR, so I compiled them through candi. >> >> I got all installed (parmetis, p4est, slepc, hdf5, trilinos), but the compilation of deal_ii end up with an error (highlighted below). And I looked at the log file /home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log, but there is absolutely no error in the log file... So I'm a little bit puzzled. >> >> Any idea that could help me? >> >> (attached file: log file. And below, last lines I got while running candi.sh) >> >> Best, >> >> Marine >> >> >> [more things here] ........ 
>> >> -- Include /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_cpack.cmake
>> -- Include /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_custom_targets.cmake
>> -- Include /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_finalize.cmake
>> CMake Error at cmake/setup_finalize.cmake:95 (MESSAGE):
>>
>>   Configuration error: Cannot compile a test program with the final set of
>>   compiler and linker flags:
>>
>> [The quoted CXX flags, LD flags and the full LIBRARIES (DEBUG) list are identical to those in the original report, which is quoted in full later in this thread, and are elided here.]
>>
>> Call Stack (most recent call first):
>>   cmake/macros/macro_verbose_include.cmake:19 (INCLUDE)
>>   CMakeLists.txt:132 (VERBOSE_INCLUDE)
>>
>> -- Configuring incomplete, errors occurred!
>> See also "/home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log".
>> Failure with exit status: 1
>> Exit message: There was a problem configuring dealii v9.0.0.
>>
>
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
>
https://urldefense.proofpoint.com/v2/url?u=http-3A__lists.geodynamics.org_cgi-2Dbin_mailman_listinfo_aspect-2Ddevel&d=DwIFaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=c08Btfq4m9QEScXN3ZQwLZzzWQE7S8CYq1IYuzKV_Zk&m=-BQ7o919kgwREBbg5KGcJrTh74nJ644nNel14gZGGwk&s=fMDmQ2ePzI1vCwo1QpZl6bae-2y-hRgNDqQsfQqAAyM&e= -- Timo Heister http://www.math.clemson.edu/~heister/ From marine.lasbleis at elsi.jp Wed Sep 26 06:44:08 2018 From: marine.lasbleis at elsi.jp (Marine Lasbleis) Date: Wed, 26 Sep 2018 15:44:08 +0200 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> Message-ID: <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> Hi Timo, There is no file with this name. It really looks like it is not even trying to compile... I have the CMakeOutput.log (attached to the first email), but no Error.log... On 2018年09月26日 14:43, Timo Heister wrote: > Marine, > > can you post your tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeError.log > please? The errors at the end of the file should help us identify what > is going wrong. > > On Wed, Sep 26, 2018 at 2:21 AM Marine Lasbleis wrote: >> Hi John, >> >> For candi: I used the exact candi.cfg, with only add for BLAS and LAPACK directories >> >> BLAS_DIR=/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64 >> LAPACK_DIR=/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64 >> >> And the only additional info I gave candi was the install directory (so ./candi.sh -p __path__) >> >> I join the file candi_configure from the deal_ii folder >> >> At first, I tried to remove the installation of petsc and a couple of things, but I wasn't sure how to ask deal_ii not to use them, so I ended up installing them anyway. It's good to know only p4est and trilinos are required! I will see how to do that. And yes, it will definitely speedup the installation :-) >> >> >> I'll update you later if I manage to do something while removing the installation of petsc and others. And let me know if anyone has an idea to help :-) >> >> Best, >> Marine >> >> On 2018年09月25日 22:30, John Naliboff wrote: >> >> Hi Marine, >> >> A couple questions to help with debugging the installation issue: >> >> 1. Can you send over the candi.cfg file that you used and any additional specified options used when executing candi.sh? >> >> 2. In the specified candi install directory the following folder should be present: tmp/build/deal.II-v9.0.0. Within that directory should be a file called candi_configure.log, which contains detailed information about the configuration process (including errors). If present, can you send that file over as well? >> >> As a side note, I often configure candi to only install p4est and Trilinos with deal.II. Unless you are using PETSc instead of Trilinos, there is no need to include it, slepsc and parmetis. Likely completely unrelated to the issue you encountered, but not building these packages will help speedup the installation. >> >> Cheers, >> John >> >> On 09/25/2018 01:17 PM, Marine Lasbleis wrote: >> >> Yes, candi! I also found it quite easy... But I really don't understand the error (especially as there is none in the log file) >> >> >> On 2018年09月25日 20:31, Max Rudolph wrote: >> >> Are you using candi to install trilinos, p4est, and deal.ii? I found that this is much easier on our cluster than installing the packages manually. 
>> >> On Tue, Sep 25, 2018 at 1:29 AM Marine Lasbleis wrote:
>>> Sorry, I sent the first message from the wrong email address. It would be better to use this one!
>>>
>>> -------- Forwarded Message --------
>>> Subject: Installation on cluster CentOS
>>> Date: Tue, 25 Sep 2018 10:17:18 +0200
>>> From: Marine Lasbleis
>>> To: aspect-devel at geodynamics.org
>>>
>>> Hi all,
>>>
>>> I just moved to a new lab, and I am trying to install ASPECT on the cluster here. So far, it works on my new computer + the test computer, so I'm quite happy... But the installation on the cluster is more complicated.
>>>
>>> I am stuck with the installation of deal_ii.
>>>
>>> The cluster is on CentOS, and uses modules. So far, I asked the staff to provide most of the required libraries, and I think we got almost all of them.
>>>
>>> I am using candi to do the installation, and simply following the instructions.
>>>
>>> I am using gcc 8.1.0, and we have lapack compiled with gcc 8.1.0. cmake is 3.9.6. I have tried to use hdf5 and petsc from the modules, but the system did not recognize the HDF5_DIR and PETSC_DIR, so I compiled them through candi.
>>>
>>> I got everything installed (parmetis, p4est, slepc, hdf5, trilinos), but the compilation of deal_ii ends up with an error (highlighted below). And I looked at the log file /home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log, but there is absolutely no error in the log file... So I'm a little bit puzzled.
>>>
>>> Any idea that could help me?
>>>
>>> (attached file: log file. And below, the last lines I got while running candi.sh)
>>>
>>> Best,
>>>
>>> Marine
>>>
>>> [more things here] ........
>>>
>>> -- Include /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_cpack.cmake
>>> --
>>> -- Include /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_custom_targets.cmake
>>> --
>>> -- Include /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_finalize.cmake
>>> CMake Error at cmake/setup_finalize.cmake:95 (MESSAGE):
>>>
>>>   Configuration error: Cannot compile a test program with the final set of
>>>   compiler and linker flags:
>>>       CXX flags (DEBUG): -pedantic -fPIC -Wall -Wextra -Wpointer-arith -Wwrite-strings -Wsynth -Wsign-compare -Wswitch -Woverloaded-virtual -Wno-placement-new -Wno-deprecated-declarations -Wno-literal-suffix -fopenmp-simd -std=c++17 -Wno-parentheses -Wno-unused-local-typedefs -Og -ggdb -Wa,--compress-debug-sections
>>>       LD flags (DEBUG): -Wl,--as-needed -rdynamic -fuse-ld=gold -ggdb
>>>       LIBRARIES (DEBUG): 
/usr/lib64/libz.so;rt;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libmuelu-adapters.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libmuelu-interface.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libmuelu.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteko.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosbelos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosaztecoo.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosamesos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosml.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libstratimikosifpack.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libifpack2-adapters.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libifpack2.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libanasazitpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libModeLaplace.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libanasaziepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libanasazi.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libamesos2.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libbelostpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libbelosepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libbelos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libml.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libifpack.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libzoltan2.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libpamgen_extras.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libpamgen.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libamesos.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libgaleri-xpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libgaleri-epetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libaztecoo.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libisorropia.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libxpetra-sup.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libxpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyratpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyraepetraext.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyraepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libthyracore.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetraext.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraext.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetrainout.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkostsqr.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetrakernels.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraclassiclinalg.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraclassicnodeapi.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtpetraclassic.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libtriutils.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libzoltan.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetra.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/li
b/libsacado.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/librtop.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoskokkoscomm.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoskokkoscompat.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosremainder.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosnumerics.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoscomm.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchosparameterlist.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libteuchoscore.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkosalgorithms.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkoscontainers.so;/home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libkokkoscore.so;/home/LPGN/lasbleis-m/bin/hdf5-1.10.1/lib/libhdf5_hl.so;/home/LPGN/lasbleis-m/bin/hdf5-1.10.1/lib/libhdf5.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBO.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBool.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKBRep.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKernel.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKFeat.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKFillet.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKG2d.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKG3d.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKGeomAlgo.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKGeomBase.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKHLR.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKIGES.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKMath.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKMesh.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKOffset.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKPrim.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKShHealing.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEP.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEPAttr.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEPBase.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTEP209.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKSTL.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKTopAlgo.so;/home/LPGN/lasbleis-m/bin/oce-OCE-0.18.2/lib/libTKXSBase.so;mpi_usempif08;mpi_usempi_ignore_tkr;mpi_mpifh;mpi;c;gcc_s;gcc;/home/LPGN/lasbleis-m/bin/slepc-3.7.3/lib/libslepc.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libpetsc.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libcmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libdmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libsmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libzmumps.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libmumps_common.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libpord.a;/home/LPGN/lasbleis-m/bin/parmetis-4.0.3/lib/libparmetis.so;/home/LPGN/lasbleis-m/bin/parmetis-4.0.3/lib/libmetis.so;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libHYPRE.a;/home/LPGN/lasbleis-m/bin/petsc-3.7.6/lib/libscalapack.a;/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64/liblapack.a;/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64/libblas.a;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_usempif08.so;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_usempi_ignore_tkr.so;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi_mpifh.so;gfortran;quadmath;/trinity/shared/apps/cv-standard/openmpi/ps
m2/gcc81/3.1.0/lib/libmpi_cxx.so;m;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi.so;pthread;dl;/home/LPGN/lasbleis-m/bin/p4est-2.0/DEBUG/lib/libp4est.so;/home/LPGN/lasbleis-m/bin/p4est-2.0/DEBUG/lib/libsc.so >>> >>> >>> >>> >>> Call Stack (most recent call first): >>> cmake/macros/macro_verbose_include.cmake:19 (INCLUDE) >>> CMakeLists.txt:132 (VERBOSE_INCLUDE) >>> >>> >>> -- Configuring incomplete, errors occurred! >>> See also "/home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log". >>> Failure with exit status: 1 >>> Exit message: There was a problem configuring dealii v9.0.0. >>> >>> >>> >>> _______________________________________________ >>> Aspect-devel mailing list >>> Aspect-devel at geodynamics.org >>> https://urldefense.proofpoint.com/v2/url?u=http-3A__lists.geodynamics.org_cgi-2Dbin_mailman_listinfo_aspect-2Ddevel&d=DwIFaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=c08Btfq4m9QEScXN3ZQwLZzzWQE7S8CYq1IYuzKV_Zk&m=-BQ7o919kgwREBbg5KGcJrTh74nJ644nNel14gZGGwk&s=fMDmQ2ePzI1vCwo1QpZl6bae-2y-hRgNDqQsfQqAAyM&e= >> >> >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> https://urldefense.proofpoint.com/v2/url?u=http-3A__lists.geodynamics.org_cgi-2Dbin_mailman_listinfo_aspect-2Ddevel&d=DwIFaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=c08Btfq4m9QEScXN3ZQwLZzzWQE7S8CYq1IYuzKV_Zk&m=-BQ7o919kgwREBbg5KGcJrTh74nJ644nNel14gZGGwk&s=fMDmQ2ePzI1vCwo1QpZl6bae-2y-hRgNDqQsfQqAAyM&e= >> >> >> >> >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> https://urldefense.proofpoint.com/v2/url?u=http-3A__lists.geodynamics.org_cgi-2Dbin_mailman_listinfo_aspect-2Ddevel&d=DwIFaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=c08Btfq4m9QEScXN3ZQwLZzzWQE7S8CYq1IYuzKV_Zk&m=-BQ7o919kgwREBbg5KGcJrTh74nJ644nNel14gZGGwk&s=fMDmQ2ePzI1vCwo1QpZl6bae-2y-hRgNDqQsfQqAAyM&e= >> >> >> >> >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> https://urldefense.proofpoint.com/v2/url?u=http-3A__lists.geodynamics.org_cgi-2Dbin_mailman_listinfo_aspect-2Ddevel&d=DwIFaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=c08Btfq4m9QEScXN3ZQwLZzzWQE7S8CYq1IYuzKV_Zk&m=-BQ7o919kgwREBbg5KGcJrTh74nJ644nNel14gZGGwk&s=fMDmQ2ePzI1vCwo1QpZl6bae-2y-hRgNDqQsfQqAAyM&e= >> >> >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> https://urldefense.proofpoint.com/v2/url?u=http-3A__lists.geodynamics.org_cgi-2Dbin_mailman_listinfo_aspect-2Ddevel&d=DwIFaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=c08Btfq4m9QEScXN3ZQwLZzzWQE7S8CYq1IYuzKV_Zk&m=-BQ7o919kgwREBbg5KGcJrTh74nJ644nNel14gZGGwk&s=fMDmQ2ePzI1vCwo1QpZl6bae-2y-hRgNDqQsfQqAAyM&e= From jbnaliboff at ucdavis.edu Wed Sep 26 10:58:31 2018 From: jbnaliboff at ucdavis.edu (John Naliboff) Date: Wed, 26 Sep 2018 10:58:31 -0700 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> Message-ID: <34b20089-9f5d-d4c2-fde6-43427f2df343@ucdavis.edu> Hi Marine, You can exclude certain packages by commenting out the associated lines in candi.cfg . 
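As a rough illustration (a sketch only; the exact lines may differ between candi versions, so adapt it to the candi.cfg you actually have), the relevant part is the PACKAGES list, where unwanted packages can simply be commented out:

  PACKAGES="load:dealii-prepare"
  #PACKAGES="${PACKAGES} once:hdf5"
  #PACKAGES="${PACKAGES} once:parmetis"
  #PACKAGES="${PACKAGES} once:petsc"
  #PACKAGES="${PACKAGES} once:slepc"
  PACKAGES="${PACKAGES} once:p4est"
  PACKAGES="${PACKAGES} once:trilinos"
  PACKAGES="${PACKAGES} dealii"

With a list like this, only p4est, Trilinos and deal.II get built.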
The following wiki page shows how to do this in addition to other relevant steps: https://github.com/geodynamics/aspect/wiki/Compiling-and-Running-ASPECT-on-TACC-Stampede2 When executing the candi installer, are you specifying the install directory? For example, on the link above the following command is used to specify the number of processors available for the installation and the install directory: ./candi.sh -j 48 --prefix="$WORK"/stampede2/software/candi/install/ If the install directory is not specified, candi places it in some default location. Perhaps this is part of the issue? My suggestion is to delete the old install directory and then specify a directory where you would like the packages to be installed. tmp/build/deal.II-v9.0.0/candi_configure.log will be located in the install directory. Unfortunately hard to debug what is going wrong without more info. Cheers, John On 09/26/2018 01:21 AM, Marine Lasbleis wrote: > > Hi John, > > For candi: I used the exact candi.cfg, with only add for BLAS and > LAPACK directories > > BLAS_DIR=/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64 > LAPACK_DIR=/trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64 > > And the only additional info I gave candi was the install directory > (so ./candi.sh -p __path__) > > I join the file candi_configure from the deal_ii folder > > At first, I tried to remove the installation of petsc and a couple of > things, but I wasn't sure how to ask deal_ii not to use them, so I > ended up installing them anyway. It's good to know only p4est and > trilinos are required! I will see how to do that. And yes, it will > definitely speedup the installation :-) > > > I'll update you later if I manage to do something while removing the > installation of petsc and others. And let me know if anyone has an > idea to help :-) > > Best, > Marine > > On 2018年09月25日 22:30, John Naliboff wrote: >> Hi Marine, >> >> A couple questions to help with debugging the installation issue: >> >>   1. Can you send over the candi.cfg file that you used and any >> additional specified options used when executing candi.sh? >> >>   2. In the specified candi install directory the following folder >> should be present: tmp/build/deal.II-v9.0.0. Within that directory >> should be a file called candi_configure.log, which contains detailed >> information about the configuration process (including errors). If >> present, can you send that file over as well? >> >> As a side note, I often configure candi to only install p4est and >> Trilinos with deal.II. Unless you are using PETSc instead of >> Trilinos, there is no need to include it, slepsc and parmetis. Likely >> completely unrelated to the issue you encountered, but not building >> these packages will help speedup the installation. >> >> Cheers, >> John >> >> On 09/25/2018 01:17 PM, Marine Lasbleis wrote: >>> >>> Yes, candi! I also found it quite easy... But I really don't >>> understand the error (especially as there is none in the log file) >>> >>> >>> On 2018年09月25日 20:31, Max Rudolph wrote: >>>> Are you using candi to install trilinos, p4est, and deal.ii? I >>>> found that this is much easier on our cluster than installing the >>>> packages manually. >>>> >>>> On Tue, Sep 25, 2018 at 1:29 AM Marine Lasbleis >>>> > wrote: >>>> >>>> Sorry, I sent the first message from the wrong email address. >>>> It would be better to use this one! 
>>>> >>>> >>>> >>>> -------- Forwarded Message -------- >>>> Subject: Installation on cluster CentOS >>>> Date: Tue, 25 Sep 2018 10:17:18 +0200 >>>> From: Marine Lasbleis >>>> >>>> To: aspect-devel at geodynamics.org >>>> >>>> >>>> >>>> >>>> Hi all, >>>> >>>> I just moved to a new lab, and I am trying to install ASPECT on >>>> the cluster here. So far, it works on my new computer + the >>>> test computer, so I'm quite happy... But the installation on >>>> the cluster is more complicated. >>>> >>>> I am stucked with the installation of deal_ii. >>>> >>>> The cluster is on CentOS, and uses modules. So far, I asked the >>>> staff to provide most of the required librairies, and I think >>>> we got almost all of them. >>>> >>>> I am using candi to do the installation, and simply following >>>> the instructions. >>>> >>>> I am using gcc 8.1.0, and we have lapack compiled with >>>> gcc8.1.0. cmake is 3.9.6. I have tried to use hdf5 and petsc >>>> from the modules, but the system did not recognize the HDF5_DIR >>>> and PETSC_DIR, so I compiled them through candi. >>>> >>>> I got all installed (parmetis, p4est, slepc, hdf5, trilinos), >>>> but the compilation of deal_ii end up with an error >>>> (highlighted below). And I looked at the log file >>>> //home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log/, >>>> but there is absolutely no error in the log file... So I'm a >>>> little bit puzzled. / >>>> / >>>> >>>> Any idea that could help me? / >>>> / >>>> >>>> (attached file: log file. And below, last lines I got while >>>> running candi.sh) >>>> >>>> Best, >>>> >>>> Marine >>>> >>>> / >>>> / >>>> >>>> /[more things here] ......../ >>>> >>>> /-- Include >>>> /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_cpack.cmake// >>>> //-- // >>>> //-- Include >>>> /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_custom_targets.cmake// >>>> //-- // >>>> //-- Include >>>> /home/LPGN/lasbleis-m/bin/tmp/unpack/deal.II-v9.0.0/cmake/setup_finalize.cmake// >>>> //CMake Error at cmake/setup_finalize.cmake:95 (MESSAGE):// >>>> //// >>>> // >>>> //*  Configuration error: Cannot compile a test program with >>>> the final set of*/*/ >>>> /**/    compiler and linker flags:/*/ >>>> //      CXX flags (DEBUG): -pedantic -fPIC -Wall -Wextra >>>> -Wpointer-arith -Wwrite-strings -Wsynth -Wsign-compare -Wswitch >>>> -Woverloaded-virtual -Wno-placement-new >>>> -Wno-deprecated-declarations -Wno-literal-suffix -fopenmp-simd >>>> -std=c++17 -Wno-parentheses -Wno-unused-local-typedefs -Og >>>> -ggdb -Wa,--compress-debug-sections// >>>> //      LD flags  (DEBUG): -Wl,--as-needed -rdynamic >>>> -fuse-ld=gold -ggdb// >>>> //      LIBRARIES (DEBUG): >>>> 
>>>> [The LIBRARIES (DEBUG) list quoted here is identical to the one reproduced earlier in this thread; it is elided except for its final entries, which follow.]
>>>> 
m2/gcc81/3.1.0/lib/libmpi_cxx.so;m;/trinity/shared/apps/cv-standard/openmpi/psm2/gcc81/3.1.0/lib/libmpi.so;pthread;dl;/home/LPGN/lasbleis-m/bin/p4est-2.0/DEBUG/lib/libp4est.so;/home/LPGN/lasbleis-m/bin/p4est-2.0/DEBUG/lib/libsc.so// >>>> //// >>>> // >>>> //// >>>> // >>>> //Call Stack (most recent call first):// >>>> //cmake/macros/macro_verbose_include.cmake:19 (INCLUDE)// >>>> //  CMakeLists.txt:132 (VERBOSE_INCLUDE)// >>>> // >>>> // >>>> //-- Configuring incomplete, errors occurred!// >>>> //See also >>>> "/home/LPGN/lasbleis-m/bin/tmp/build/deal.II-v9.0.0/CMakeFiles/CMakeOutput.log".// >>>> //Failure with exit status: 1// >>>> //Exit message: There was a problem configuring dealii v9.0.0./ >>>> >>>> >>>> >>>> _______________________________________________ >>>> Aspect-devel mailing list >>>> Aspect-devel at geodynamics.org >>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>>> >>>> >>>> >>>> _______________________________________________ >>>> Aspect-devel mailing list >>>> Aspect-devel at geodynamics.org >>>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >>> >>> >>> >>> _______________________________________________ >>> Aspect-devel mailing list >>> Aspect-devel at geodynamics.org >>> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel >> >> >> >> _______________________________________________ >> Aspect-devel mailing list >> Aspect-devel at geodynamics.org >> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From heister at clemson.edu Wed Sep 26 11:09:58 2018 From: heister at clemson.edu (Timo Heister) Date: Wed, 26 Sep 2018 14:09:58 -0400 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> Message-ID: > There is no file with this name. It really looks like it is not even > trying to compile... > > I have the CMakeOutput.log (attached to the first email), but no > Error.log... That is weird. cmake should create a .log file for successful and one for failing output. The precise error should be in that file. Are you running out of disk space? Can you just run "./candi.sh -j 4 --packages="dealii"" and look at the output (specify some path with "-p")? -- Timo Heister http://www.math.clemson.edu/~heister/ From marine.lasbleis at elsi.jp Wed Sep 26 11:18:13 2018 From: marine.lasbleis at elsi.jp (Marine Lasbleis) Date: Wed, 26 Sep 2018 20:18:13 +0200 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> Message-ID: <34d6eca1-7769-634b-9677-7818db13ca9e@elsi.jp> Hi John and Timo, Thanks for the help. I'll try with doing a new install tomorrow (this time lag is quite weird ;) ). So far I used only 1 core for the installation, I'll try with a couple of them instead. Space should definitely not be a problem... 
We looked at what should trigger the error "cannot compile a test program with the final set of compiler and linker flags", but the answer was quite obscur for us. And yes, I specified the path for the installation, so this also should not be a problem. With a clean install, I'll tell you if I can get more infos. Best, Marine On 2018年09月26日 20:09, Timo Heister wrote: >> There is no file with this name. It really looks like it is not even >> trying to compile... >> >> I have the CMakeOutput.log (attached to the first email), but no >> Error.log... > That is weird. cmake should create a .log file for successful and one > for failing output. The precise error should be in that file. Are you > running out of disk space? > > Can you just run "./candi.sh -j 4 --packages="dealii"" and look at the > output (specify some path with "-p")? > > > From maxrudolph at ucdavis.edu Wed Sep 26 11:27:49 2018 From: maxrudolph at ucdavis.edu (Max Rudolph) Date: Wed, 26 Sep 2018 11:27:49 -0700 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: <34d6eca1-7769-634b-9677-7818db13ca9e@elsi.jp> References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> <34d6eca1-7769-634b-9677-7818db13ca9e@elsi.jp> Message-ID: I have seen this error message when trying to use certain compiler and mpi combinations on our cluster. I think that it is because the module files for gcc 8 on our cluster were not set up properly, so I am stuck using an older gcc with intel MPI. On Wed, Sep 26, 2018 at 11:18 AM Marine Lasbleis wrote: > Hi John and Timo, > > Thanks for the help. I'll try with doing a new install tomorrow (this > time lag is quite weird ;) ). So far I used only 1 core for the > installation, I'll try with a couple of them instead. Space should > definitely not be a problem... We looked at what should trigger the > error "cannot compile a test program with the final set of compiler and > linker flags", but the answer was quite obscur for us. > > And yes, I specified the path for the installation, so this also should > not be a problem. > > With a clean install, I'll tell you if I can get more infos. > > > Best, > > Marine > > > On 2018年09月26日 20:09, Timo Heister wrote: > >> There is no file with this name. It really looks like it is not even > >> trying to compile... > >> > >> I have the CMakeOutput.log (attached to the first email), but no > >> Error.log... > > That is weird. cmake should create a .log file for successful and one > > for failing output. The precise error should be in that file. Are you > > running out of disk space? > > > > Can you just run "./candi.sh -j 4 --packages="dealii"" and look at the > > output (specify some path with "-p")? > > > > > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From marine.lasbleis at elsi.jp Thu Sep 27 08:14:01 2018 From: marine.lasbleis at elsi.jp (Marine Lasbleis) Date: Thu, 27 Sep 2018 17:14:01 +0200 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> <34d6eca1-7769-634b-9677-7818db13ca9e@elsi.jp> Message-ID: Hi, I sent a message with the logs, but apparently the files were too big and the email is awaiting moderation. In between, I looked at the errors, and it seems that the main problem are a couple of functions from LAPACK, used by trilinos, that are deprecated in the most up-to-date versions of LAPACK (if version >= 3.6) Internet says to add a flag to the installation of trilinos (to use deprecated functions). Here, we decided to switch to an older version of LAPACK, so I am running the installation with LAPACK 3.5 (and still gcc8). So far, it's going better, and it is building right now. I'll let you know if I manage to install deal_ii! Best, Marine On 2018年09月26日 20:27, Max Rudolph wrote: > I have seen this error message when trying to use certain compiler and > mpi combinations on our cluster. I think that it is because the module > files for gcc 8 on our cluster were not set up properly, so I am stuck > using an older gcc with intel MPI. > > On Wed, Sep 26, 2018 at 11:18 AM Marine Lasbleis > > wrote: > > Hi John and Timo, > > Thanks for the help. I'll try with doing a new install tomorrow (this > time lag is quite weird ;) ). So far I used only 1 core for the > installation, I'll try with a couple of them instead. Space should > definitely not be a problem... We looked at what should trigger the > error "cannot compile a test program with the final set of > compiler and > linker flags", but the answer was quite obscur for us. > > And yes, I specified the path for the installation, so this also > should > not be a problem. > > With a clean install, I'll tell you if I can get more infos. > > > Best, > > Marine > > > On 2018年09月26日 20:09, Timo Heister wrote: > >> There is no file with this name. It really looks like it is not > even > >> trying to compile... > >> > >> I have the CMakeOutput.log (attached to the first email), but no > >> Error.log... > > That is weird. cmake should create a .log file for successful > and one > > for failing output. The precise error should be in that file. > Are you > > running out of disk space? > > > > Can you just run "./candi.sh -j 4 --packages="dealii"" and look > at the > > output (specify some path with "-p")? > > > > > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... 
URL: From marine.lasbleis at elsi.jp Thu Sep 27 08:18:03 2018 From: marine.lasbleis at elsi.jp (Marine Lasbleis) Date: Thu, 27 Sep 2018 17:18:03 +0200 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> <34d6eca1-7769-634b-9677-7818db13ca9e@elsi.jp> Message-ID: Yes! dealii.git has now been installed Now it's time to try ASPECT... On 2018年09月26日 20:27, Max Rudolph wrote: > I have seen this error message when trying to use certain compiler and > mpi combinations on our cluster. I think that it is because the module > files for gcc 8 on our cluster were not set up properly, so I am stuck > using an older gcc with intel MPI. > > On Wed, Sep 26, 2018 at 11:18 AM Marine Lasbleis > > wrote: > > Hi John and Timo, > > Thanks for the help. I'll try with doing a new install tomorrow (this > time lag is quite weird ;) ). So far I used only 1 core for the > installation, I'll try with a couple of them instead. Space should > definitely not be a problem... We looked at what should trigger the > error "cannot compile a test program with the final set of > compiler and > linker flags", but the answer was quite obscur for us. > > And yes, I specified the path for the installation, so this also > should > not be a problem. > > With a clean install, I'll tell you if I can get more infos. > > > Best, > > Marine > > > On 2018年09月26日 20:09, Timo Heister wrote: > >> There is no file with this name. It really looks like it is not > even > >> trying to compile... > >> > >> I have the CMakeOutput.log (attached to the first email), but no > >> Error.log... > > That is weird. cmake should create a .log file for successful > and one > > for failing output. The precise error should be in that file. > Are you > > running out of disk space? > > > > Can you just run "./candi.sh -j 4 --packages="dealii"" and look > at the > > output (specify some path with "-p")? > > > > > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: From marine.lasbleis at elsi.jp Thu Sep 27 01:24:06 2018 From: marine.lasbleis at elsi.jp (Marine Lasbleis) Date: Thu, 27 Sep 2018 10:24:06 +0200 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> <34d6eca1-7769-634b-9677-7818db13ca9e@elsi.jp> Message-ID: <321d508a-a32c-df8a-d399-025edadfeb84@elsi.jp> Hi all, I have re-run a clean installation with candi, removing everything except trilinos and p4est. Error is same... But this time I got error log from Cmake! So here are the error log (and the associated output.log). I can't really understand what is happening... Any idea? I'll try later to switch to an older compiler, if I can find associated openmpi and lapack libraries. 
Also, at the end of the compilation, the system tries to look at a serie of DEAL_II_WITH_*** (like DEAL_II_WITH_SUNDIALS), but most of them except trilinos and p4est have "unmet external dependencies." It's OK, right? Best, Marine On 2018年09月26日 20:27, Max Rudolph wrote: > I have seen this error message when trying to use certain compiler and > mpi combinations on our cluster. I think that it is because the module > files for gcc 8 on our cluster were not set up properly, so I am stuck > using an older gcc with intel MPI. > > On Wed, Sep 26, 2018 at 11:18 AM Marine Lasbleis > > wrote: > > Hi John and Timo, > > Thanks for the help. I'll try with doing a new install tomorrow (this > time lag is quite weird ;) ). So far I used only 1 core for the > installation, I'll try with a couple of them instead. Space should > definitely not be a problem... We looked at what should trigger the > error "cannot compile a test program with the final set of > compiler and > linker flags", but the answer was quite obscur for us. > > And yes, I specified the path for the installation, so this also > should > not be a problem. > > With a clean install, I'll tell you if I can get more infos. > > > Best, > > Marine > > > On 2018年09月26日 20:09, Timo Heister wrote: > >> There is no file with this name. It really looks like it is not > even > >> trying to compile... > >> > >> I have the CMakeOutput.log (attached to the first email), but no > >> Error.log... > > That is weird. cmake should create a .log file for successful > and one > > for failing output. The precise error should be in that file. > Are you > > running out of disk space? > > > > Can you just run "./candi.sh -j 4 --packages="dealii"" and look > at the > > output (specify some path with "-p")? > > > > > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel > > > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel -------------- next part -------------- An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: CMakeError.log Type: text/x-log Size: 33399 bytes Desc: not available URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: CMakeOutput.log Type: text/x-log Size: 213144 bytes Desc: not available URL: From mxliu at math.tsinghua.edu.cn Thu Sep 27 06:12:54 2018 From: mxliu at math.tsinghua.edu.cn (mxliu) Date: Thu, 27 Sep 2018 21:12:54 +0800 Subject: [aspect-devel] Error on running subduction models Message-ID: An HTML attachment was scrubbed... URL: -------------- next part -------------- A non-text attachment was scrubbed... Name: 490_error_message Type: application/octet-stream Size: 94613 bytes Desc: not available URL: From heister at clemson.edu Thu Sep 27 08:48:21 2018 From: heister at clemson.edu (Timo Heister) Date: Thu, 27 Sep 2018 11:48:21 -0400 Subject: [aspect-devel] Error on running subduction models In-Reply-To: References: Message-ID: Mengxue, this is an error that happened during parallel I/O in the graphical output. 
This can have several reasons: - you are running out of disk space - the file system you are writing to is not usable for parallel IO (most clusters have a slow home directory on NSF and a fast parallel filesystem for I/O) - something else, like your MPI does not support MPI I/O. Things to try: - You can try to run one of the cookbooks in parallel and see if graphical output works. - You can change the output directory. - You can change the "Number of grouped files" to 0 to see if output works without using parallel I/O. On Thu, Sep 27, 2018 at 11:38 AM mxliu wrote: > > Hi all, > I'm now running a subduction model with aspect-2.0(dealii-9.0), I set Nonlinear solver tolerance = 1e-3, it seems to converge on timestep 0(on 3 levels), however, it failed to do with Postprocessing, here is part of the error message: > > ....... > > Number of active cells: 124,000 (on 3 levels) > Number of degrees of freedom: 5,116,081 (998,182+125,171+499,091+499,091+499,091+499,091+499,091+499,091+499,091+499,091) > > Number of free surface degrees of freedom: 250342 > *** Timestep 0: t=0 years > Solving mesh velocity system... 0 iterations. > Solving temperature system... 0 iterations. > Solving ContinentalUpperCrust system ... 0 iterations. > Solving ContinentalLowerCrust system ... 0 iterations. > Solving ContinentalMantle system ... 0 iterations. > Solving Sediment system ... 0 iterations. > Solving OceanicCrust system ... 0 iterations. > Solving OceanicMantle system ... 0 iterations. > Solving WeakZone system ... 0 iterations. > Rebuilding Stokes preconditioner... > Solving Stokes system... 200+9 iterations. > Relative nonlinear residual (Stokes system) after nonlinear iteration 1: 1 > > Rebuilding Stokes preconditioner... > Solving Stokes system... 200+12 iterations. > Relative nonlinear residual (Stokes system) after nonlinear iteration 2: 0.00559706 > ...... > > Rebuilding Stokes preconditioner... > Solving Stokes system... 200+3 iterations. > Relative nonlinear residual (Stokes system) after nonlinear iteration 28: 0.000800572 > > > Postprocessing: > > > ---------------------------------------------------- > Exception on MPI process <16> while running postprocessor : > > > ---------------------------------------------------- > Exception on MPI process <19> while running postprocessor : > > -------------------------------------------------------- > An error occurred in line <6632> of file in function > void dealii::DataOutInterface::write_vtu_in_parallel(const char*, MPI_Comm) const [with int dim = 2; int spacedim = 2; MPI_Comm = ompi_communicator_t*] > The violated condition was: > ierr == MPI_SUCCESS > Additional information: > deal.II encountered an error while calling an MPI function. > The description of the error provided by MPI is "MPI_ERR_OTHER: known error not in list". > The numerical value of the original error code is 16. > -------------------------------------------------------- > > Aborting! > ---------------------------------------------------- > > Any ideas on how this might be solved? Thanks in advance. 
> > Cheers, > Mengxue > > _______________________________________________ > Aspect-devel mailing list > Aspect-devel at geodynamics.org > https://urldefense.proofpoint.com/v2/url?u=http-3A__lists.geodynamics.org_cgi-2Dbin_mailman_listinfo_aspect-2Ddevel&d=DwIGaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=R5lvg9JC99XvuTgScgbY_QFS80R7PEA2q0EPwDy7VQw&m=3YF5UBWv7QtvW2xEDKaM5Lc9Ws5MU4W3zChJR4c29Vg&s=_nX_-72XBVEzLgYRh4AQglxsFszSzHpwgc92PPG-Tbg&e= -- Timo Heister http://www.math.clemson.edu/~heister/ From heister at clemson.edu Thu Sep 27 08:50:50 2018 From: heister at clemson.edu (Timo Heister) Date: Thu, 27 Sep 2018 11:50:50 -0400 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: <321d508a-a32c-df8a-d399-025edadfeb84@elsi.jp> References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> <34d6eca1-7769-634b-9677-7818db13ca9e@elsi.jp> <321d508a-a32c-df8a-d399-025edadfeb84@elsi.jp> Message-ID: > Error is same... But this time I got error log from Cmake! > > So here are the error log (and the associated output.log). I can't really understand what is happening... Any idea? The error is (at the bottom): > /home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetra.so: error: undefined reference to 'dggsvd_' > /home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetra.so: error: undefined reference to 'sggsvd_' This means your LAPACK library does not export all symbols that Trilinos needs. > Also, at the end of the compilation, the system tries to look at a serie of DEAL_II_WITH_*** (like DEAL_II_WITH_SUNDIALS), but most of them except trilinos and p4est have "unmet external dependencies." It's OK, right? Yes, that is perfectly normal. -- Timo Heister http://www.math.clemson.edu/~heister/ From maxrudolph at ucdavis.edu Thu Sep 27 08:55:52 2018 From: maxrudolph at ucdavis.edu (Max Rudolph) Date: Thu, 27 Sep 2018 08:55:52 -0700 Subject: [aspect-devel] Fwd: Installation on cluster CentOS In-Reply-To: References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> <34d6eca1-7769-634b-9677-7818db13ca9e@elsi.jp> <321d508a-a32c-df8a-d399-025edadfeb84@elsi.jp> Message-ID: Would it be hard to add packages so that candi could install blas and lapack? ATLAS? On Thu, Sep 27, 2018 at 8:51 AM Timo Heister wrote: > > Error is same... But this time I got error log from Cmake! > > > > So here are the error log (and the associated output.log). I can't > really understand what is happening... Any idea? > > The error is (at the bottom): > > /home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetra.so: > error: undefined reference to 'dggsvd_' > > /home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetra.so: > error: undefined reference to 'sggsvd_' > > This means your LAPACK library does not export all symbols that Trilinos > needs. > > > Also, at the end of the compilation, the system tries to look at a serie > of DEAL_II_WITH_*** (like DEAL_II_WITH_SUNDIALS), but most of them except > trilinos and p4est have "unmet external dependencies." It's OK, right? > > Yes, that is perfectly normal. 
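(A quick way to check this kind of mismatch, just a sketch with the liblapack.a path taken from the link line quoted above, is to list the symbols the installed LAPACK actually provides:

  nm /trinity/shared/apps/cv-standard/lapack/3.8.0_gcc81/lib64/liblapack.a | grep -i ggsvd

If dggsvd_ and sggsvd_ do not appear, that LAPACK build lacks the deprecated routines this Trilinos release still references, which matches the downgrade to LAPACK 3.5 reported earlier in the thread.)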
>
>
> --
> Timo Heister
> http://www.math.clemson.edu/~heister/
> _______________________________________________
> Aspect-devel mailing list
> Aspect-devel at geodynamics.org
> http://lists.geodynamics.org/cgi-bin/mailman/listinfo/aspect-devel
-------------- next part --------------
An HTML attachment was scrubbed...
URL:

From heister at clemson.edu Thu Sep 27 09:02:39 2018
From: heister at clemson.edu (Timo Heister)
Date: Thu, 27 Sep 2018 12:02:39 -0400
Subject: [aspect-devel] Fwd: Installation on cluster CentOS
In-Reply-To:
References: <5fb7391d-dfdf-0c71-0082-8479c016688b@ens-lyon.fr> <97167a09-2d74-e206-287d-90fb90fbf3e7@ucdavis.edu> <3409460e-6c9a-5711-8365-d4cd2e63f843@elsi.jp> <4161d2d1-1b0a-bba1-f389-b737be041319@elsi.jp> <34d6eca1-7769-634b-9677-7818db13ca9e@elsi.jp> <321d508a-a32c-df8a-d399-025edadfeb84@elsi.jp>
Message-ID:

> Would it be hard to add packages so that candi could install blas and lapack? ATLAS?

Not very difficult, but it requires testing. I created an issue for it:
https://urldefense.proofpoint.com/v2/url?u=https-3A__github.com_dealii_candi_issues_94&d=DwIBaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=c08Btfq4m9QEScXN3ZQwLZzzWQE7S8CYq1IYuzKV_Zk&m=Syz3NIsW6nLvnHef8Ydt-PEz7F7bdVEDp4H2XzqhUpI&s=SF5r1ri4DK-tn1vF4lwRbE5urrk1HxaCESSKiLvK6w4&e=

--
Timo Heister
http://www.math.clemson.edu/~heister/

From jbnaliboff at ucdavis.edu Thu Sep 27 13:17:28 2018
From: jbnaliboff at ucdavis.edu (John Naliboff)
Date: Thu, 27 Sep 2018 13:17:28 -0700
Subject: [aspect-devel] Error on running subduction models
In-Reply-To:
References:
Message-ID: <38905394-98ea-e7a9-2969-b5485c7a6657@ucdavis.edu>

Hi all,

I am familiar with the cluster in question, and using "set Number of grouped files = 0" should indeed fix the issue below. To write restart files one also needs to disable MPI-IO in p4est.

Cheers,
John

On 09/27/2018 08:48 AM, Timo Heister wrote:
> Mengxue,
>
> this is an error that happened during parallel I/O in the graphical output. This can have several reasons:
> - you are running out of disk space
> - the file system you are writing to is not usable for parallel I/O (most clusters have a slow home directory on NFS and a fast parallel filesystem for I/O)
> - something else, like your MPI does not support MPI I/O.
>
> Things to try:
> - You can try to run one of the cookbooks in parallel and see if graphical output works.
> - You can change the output directory.
> - You can change the "Number of grouped files" to 0 to see if output works without using parallel I/O.
>
> On Thu, Sep 27, 2018 at 11:38 AM mxliu wrote:
>> Hi all,
>> I'm now running a subduction model with aspect-2.0 (dealii-9.0). I set Nonlinear solver tolerance = 1e-3 and the model seems to converge on timestep 0 (on 3 levels); however, it fails during Postprocessing. Here is part of the error message:
>>
>> .......
>>
>> Number of active cells: 124,000 (on 3 levels)
>> Number of degrees of freedom: 5,116,081 (998,182+125,171+499,091+499,091+499,091+499,091+499,091+499,091+499,091+499,091)
>>
>> Number of free surface degrees of freedom: 250342
>> *** Timestep 0: t=0 years
>> Solving mesh velocity system... 0 iterations.
>> Solving temperature system... 0 iterations.
>> Solving ContinentalUpperCrust system ... 0 iterations.
>> Solving ContinentalLowerCrust system ... 0 iterations.
>> Solving ContinentalMantle system ... 0 iterations.
>> Solving Sediment system ... 0 iterations.
>> Solving OceanicCrust system ... 0 iterations.
>> Solving OceanicMantle system ... 0 iterations.
>> Solving WeakZone system ... 0 iterations.
>> Rebuilding Stokes preconditioner...
>> Solving Stokes system... 200+9 iterations.
>> Relative nonlinear residual (Stokes system) after nonlinear iteration 1: 1
>>
>> Rebuilding Stokes preconditioner...
>> Solving Stokes system... 200+12 iterations.
>> Relative nonlinear residual (Stokes system) after nonlinear iteration 2: 0.00559706
>> ......
>>
>> Rebuilding Stokes preconditioner...
>> Solving Stokes system... 200+3 iterations.
>> Relative nonlinear residual (Stokes system) after nonlinear iteration 28: 0.000800572
>>
>>
>> Postprocessing:
>>
>>
>> ----------------------------------------------------
>> Exception on MPI process <16> while running postprocessor :
>>
>>
>> ----------------------------------------------------
>> Exception on MPI process <19> while running postprocessor :
>>
>> --------------------------------------------------------
>> An error occurred in line <6632> of file in function
>> void dealii::DataOutInterface::write_vtu_in_parallel(const char*, MPI_Comm) const [with int dim = 2; int spacedim = 2; MPI_Comm = ompi_communicator_t*]
>> The violated condition was:
>> ierr == MPI_SUCCESS
>> Additional information:
>> deal.II encountered an error while calling an MPI function.
>> The description of the error provided by MPI is "MPI_ERR_OTHER: known error not in list".
>> The numerical value of the original error code is 16.
>> --------------------------------------------------------
>>
>> Aborting!
>> ----------------------------------------------------
>>
>> Any ideas on how this might be solved? Thanks in advance.
>>
>> Cheers,
>> Mengxue
>>
>> _______________________________________________
>> Aspect-devel mailing list
>> Aspect-devel at geodynamics.org
>> https://urldefense.proofpoint.com/v2/url?u=http-3A__lists.geodynamics.org_cgi-2Dbin_mailman_listinfo_aspect-2Ddevel&d=DwIGaQ&c=Ngd-ta5yRYsqeUsEDgxhcqsYYY1Xs5ogLxWPA_2Wlc4&r=R5lvg9JC99XvuTgScgbY_QFS80R7PEA2q0EPwDy7VQw&m=3YF5UBWv7QtvW2xEDKaM5Lc9Ws5MU4W3zChJR4c29Vg&s=_nX_-72XBVEzLgYRh4AQglxsFszSzHpwgc92PPG-Tbg&e=
-------------- next part --------------
An HTML attachment was scrubbed...
URL:
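
For readers trying the "Number of grouped files" suggestion from Timo and John above, the lines below are a minimal .prm sketch, assuming ASPECT 2.0's usual Postprocess/Visualization parameter layout; the output directory name, postprocessor list, and output interval are only example values to adapt to your own model.

    # write output to a directory on the fast parallel filesystem (name is only an example)
    set Output directory = output-subduction

    subsection Postprocess
      set List of postprocessors = visualization
      subsection Visualization
        set Time between graphical output = 1e6
        # 0 disables file grouping via MPI I/O and writes one .vtu file per MPI process
        set Number of grouped files = 0
      end
    end

Note that this only affects graphical output; the restart files John mentions are written through p4est, so avoiding MPI I/O there is a build-time choice for p4est rather than a .prm setting.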
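
On the Trilinos/LAPACK link error in the installation thread above, one quick way to see whether the LAPACK that Epetra links against really lacks the generalized-SVD routines is to inspect its dynamic symbol table. This is only a sketch using standard binutils tools; the libepetra.so path is copied from the error message, and the LAPACK path is a placeholder to replace with whatever ldd reports on your system.

    # which BLAS/LAPACK does libepetra.so resolve at run time?
    ldd /home/LPGN/lasbleis-m/bin/trilinos-release-12-10-1/lib/libepetra.so | grep -i -E 'lapack|blas'

    # does that LAPACK export the symbols Epetra needs? (library path is a placeholder)
    nm -D /usr/lib64/liblapack.so.3 | grep -i ggsvd

If ggsvd does not show up, a likely explanation is that dggsvd/sggsvd belong to the deprecated routines that newer LAPACK releases no longer build by default; building against the reference LAPACK, or one configured with deprecated routines enabled, should then resolve the undefined references.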