[CIG-SEISMO] techniques for generating large meshes with CUBIT
Brad Aagaard
baagaard at usgs.gov
Wed Jan 26 10:14:53 PST 2011
CUBIT users:
We appear to have resolved most of the problems we were encountering
generating a large mesh with CUBIT. Our refinement groups were not being
created correctly, so they included a much larger region than necessary.
This resulted in much higher memory use; I believe it is related to the
overhead of forming the dual graph (how cells are connected to one
another), because memory use peaks during refinement. We set up the
refinement at a coarser resolution and then perform a global uniform
refinement using the refine command on all volumes with the numsplit
option. This approach allowed us to generate the mesh we wanted.
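For anyone trying to reproduce this, the workflow looks roughly like the
journal fragment below. This is only a sketch: the group name
(REFINEGROUP) and the sizes are placeholders, and you should check the
exact refine syntax against the CUBIT documentation for your version.

    # Mesh everything at a coarser resolution (placeholder size).
    volume all size 2000
    mesh volume all

    # Local refinement around the regions of interest
    # (REFINEGROUP is a group we created earlier; name is a placeholder).
    refine node in REFINEGROUP depth 1

    # One global uniform refinement of all volumes to reach the
    # target resolution; numsplit 1 splits each cell once.
    refine volume all numsplit 1

Doing the local refinement on the coarse mesh first keeps the cell count
(and the dual-graph overhead) low during the expensive refine step; the
final uniform split then brings the whole mesh to the target resolution.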
I was curious and tried both the 32-bit and 64-bit Linux versions of
CUBIT 12.2 on a 2-D test problem; the maximum memory used by the 64-bit
version was a little less than 2x that used by the 32-bit version. This
is more overhead than I would have expected.
Brad
> On Friday, January 21, 2011, Brad Aagaard<baagaard at usgs.gov> wrote:
>> Hi all:
>>
>> We are running into some issues generating large meshes (few million hex
>> cells) for PyLith using CUBIT. We want a mesh in which some regions have
>> a finer resolution mesh and have been using
>>
>> refine node in REFINEGROUP depth 1
>>
>> to refine the mesh in the desired regions. We appear to be running out
>> of memory on a machine with 16GB of RAM when the number of cells
>> exceeds a few million.
>>
>> If anyone has experience generating a hex (or tet) mesh with 5-10
>> million cells or more using CUBIT in which some regions have a finer
>> resolution, I would appreciate learning what procedure you used and how
>> much memory was required.
>>
>> Thanks,
>> Brad
>> _______________________________________________
>> CIG-SEISMO mailing list
>> CIG-SEISMO at geodynamics.org
>> http://geodynamics.org/cgi-bin/mailman/listinfo/cig-seismo
>>
>