Posted: November 16, 2009, 05:45 GMT-5
I believe this problem is very similar to one I encountered when solving very finely meshed problems (COMSOL said: out of memory during LU factorization). The operation's demand for memory simply exceeds your RAM. I can't say why it is not possible to use swap or temp space for data storage, but I think you can solve it by making the mesh less fine (if the model remains reliable, of course).
Posted: November 17, 2009, 12:24 GMT-5
Hi Hoa,
I have also had some experience with this problem. Try the following:
- Make the mesh less fine in sub-domains that are not important, or in all sub-domains
- Use the symmetry of the geometry to model only a small part of the device or machine; this can reduce the required memory significantly, and your program will run faster.
Good luck!
Hung Vu Xuan
Posted: November 18, 2009, 08:52 GMT-5
Hi Hung,
Thank you all. I'll try it.
Posted: December 13, 2009, 18:38 GMT-5
On page 124 of the v3.5a User's Guide there is a description of how to increase the default Java heap space by changing the MAXHEAP variable in the file comsol.opts, which is read at launch. The default value that shipped with v3.5 was 256 MB. Try increasing this, taking into account how much memory your machine has. Let us know how this works.
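For reference, the edit is just a one-line change to that file. The excerpt below is hypothetical (the exact file location, value syntax, and surrounding contents vary by COMSOL version and platform, so verify against the User's Guide page mentioned above before changing anything):

```text
# Hypothetical excerpt from comsol.opts in the COMSOL installation directory.
# MAXHEAP sets the maximum Java heap size; raise it with your physical RAM in mind.
MAXHEAP=768m
```

Note that this only enlarges the Java heap used by the GUI/application layer; as discussed later in this thread, it does not give the native solvers more memory.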
Posted: December 13, 2009, 18:43 GMT-5
Another approach might be to experiment with different linear solvers. Some of them require a contiguous heap, some don't, and at least one of them reportedly does the calculation out of core. This presumably means the solver breaks the problem up into blocks and swaps parts in and out from disk as needed. It would be slow, but from my reading it may be the only way to handle really large problems.
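To make the out-of-core idea concrete, here is a toy sketch in Python (with NumPy): the matrix stays on disk and only one block of rows is pulled into RAM at a time. This is not COMSOL's solver, just an illustration of the trade-off (much less memory, more disk traffic) that an out-of-core solver makes:

```python
import os
import tempfile

import numpy as np


def out_of_core_row_sums(path, shape, block_rows=2, dtype=np.float64):
    """Row sums of a matrix stored on disk, touching one block of rows at a time."""
    m = np.memmap(path, dtype=dtype, mode="r", shape=shape)
    sums = np.empty(shape[0], dtype=dtype)
    for start in range(0, shape[0], block_rows):
        # Copy only this block into RAM; the rest of the matrix stays on disk.
        block = np.array(m[start:start + block_rows])
        sums[start:start + block.shape[0]] = block.sum(axis=1)
    del m  # release the file mapping
    return sums


# Write a small matrix to a temporary file and process it block by block.
a = np.arange(12, dtype=np.float64).reshape(3, 4)
fd, path = tempfile.mkstemp()
os.close(fd)
a.tofile(path)
sums = out_of_core_row_sums(path, (3, 4))
os.remove(path)
```

A real out-of-core factorization does the same thing with blocks of the factor matrix instead of row sums, which is why it is slow but can handle problems far larger than RAM.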
Posted: March 19, 2010, 10:12 GMT-4
Hi - since I am running into the same problem of too little memory, I am looking for answers and found your comment. I don't think that increasing Java's heap space solves the problem, because the solvers don't use the heap as far as I know. Therefore, it might even be better to decrease the size of the heap to free more RAM for the solver. Any experiences with that?
PARDISO out-of-core also ran out of memory in one of my problems, and I cannot make the mesh coarser or else the solution is not correct. I keep trying other solvers...
Posted: March 19, 2010, 12:27 GMT-4
Hi,
MAXHEAP will not solve the problem; it is not a magic parameter. MAXHEAP is also limited by the available memory and has nothing to do with the solver.
There is a limit on your memory that is tied to the physical side, the hardware that you have.
The only things you can do are to play with the mesh and to switch between direct and iterative solvers.
The radical, or ultimate, solution is to ask for/buy more memory.
Good luck
Posted: March 19, 2010, 13:57 GMT-4
I have received a similar error with very few, coarse mesh elements on a very simple geometry. I have 3 GB of available memory and a >2 GHz dual-core CPU! Isn't that odd? I used to run other commercial programs with much larger meshes and more complicated geometry on a weaker machine. So I think it may not necessarily be only a memory problem.
[QUOTE]
Hi,
MAXHEAP will not solve the problem; it is not a magic parameter. MAXHEAP is also limited by the available memory and has nothing to do with the solver.
There is a limit on your memory that is tied to the physical side, the hardware that you have.
The only things you can do are to play with the mesh and to switch between direct and iterative solvers.
The radical, or ultimate, solution is to ask for/buy more memory.
Good luck
[/QUOTE]
Jim Freels
mechanical side of nuclear engineering, multiphysics analysis, COMSOL specialist
Posted: March 20, 2010, 22:58 GMT-4
I would suggest first using a coarser mesh if possible. If that is not possible, then get a 64-bit computer with more memory. You can also try the segregated solver and the iterative solvers to reduce memory requirements. Direct solvers use the most memory.
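The memory difference between direct and iterative solvers can be sketched in a few lines of Python with SciPy. Conjugate gradients only needs the sparse matrix and a handful of work vectors, whereas a direct LU factorization must also store fill-in; the 1D Laplacian below is just a small stand-in for a FEM stiffness matrix, not a COMSOL model:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

n = 200
# 1D Laplacian: tridiagonal, symmetric positive definite, ~3n nonzeros.
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)

# CG keeps memory use at O(number of nonzeros) plus a few length-n vectors.
x, info = cg(A, b)  # info == 0 means the iteration converged
residual = np.linalg.norm(A @ x - b)
```

By contrast, factoring the same matrix with a direct sparse LU stores extra fill-in entries, and on a 3D mesh that fill-in grows much faster than the original matrix, which is why the direct solvers run out of memory first.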
Posted: March 21, 2010, 05:23 GMT-4
Since posting my comment above I bought a new quad-core machine with 24 GB of memory, running Win7 64-bit. The old out-of-memory problems disappeared, but it's easy to make them reappear with a fine enough mesh: hit refine-mesh enough times and eventually you run out of memory. For me that now happens at 38M nodes. Even for "only" 8M nodes, the solvers ramp memory usage all the way up to 24 GB, and then the machine starts swapping.
However, there is now another problem, not directly related to solving a particular problem but rather to rendering it on screen. For mesh sizes that are too large you can get another, different kind of out-of-memory error that reflects the size of your VRAM. This can be worked around by turning off automatic rendering, running your problem, and then reducing the size of the mesh used to display the solution.
Posted: March 21, 2010, 05:34 GMT-4
On a slightly different set of limitations - I wanted to know where a simulation was spending most of its time, so I exported it to an M-file and ran it from MATLAB with the profiler turned on. It turned out it was spending most of its time in the Java garbage collector. So one wonders whether changing the heap size might at least have an effect on the computation time.