Posted: 26 Aug 2009, 16:30 GMT-4
Hi Mohammad,
Yes, your computer is running out of memory. Are you using the UMFPACK or the SPOOLES solver? SPOOLES is supposed to use less memory, though it is also less stable.
Posted: 26 Aug 2009, 18:36 GMT-4
Hi Fatima,
Many thanks. I tried both solvers (UMFPACK and SPOOLES) but still couldn't resolve the problem. How about using different mesh elements?
Mohammed
Posted: 26 Aug 2009, 22:41 GMT-4
You might want to go for a coarser mesh. I had the same memory problem for some analyses, but it worked when I scaled my geometry, which reduced the number of degrees of freedom to solve for.
Manohar
Ivar KJELBERG
COMSOL Multiphysics(r) fan, retired, former "Senior Expert" at CSEM SA (CH)
Posted: 27 Aug 2009, 04:05 GMT-4
Hi
There is another way to get a little more RAM that has worked well for me: the server-client approach. The client can then be swapped out during the solve, which has allowed me to roughly double the DoF I can solve on my laptop (back when I was using a laptop; now I'm on a larger server with much more RAM and haven't used this approach for some time).
This does not mean that you should coarsen the mesh to make your model smaller, with the risk of losing accuracy.
Playing with the solvers can also help, but that again depends largely on the problem you are solving: symmetric or not, linear or not, and so on. There is quite a bit about this in the documentation, sometimes with "local" application mode tricks in the specific application mode documentation, generally under the title "solver".
Good luck
Ivar
Posted: 31 Aug 2009, 15:36 GMT-4
Hi,
I have had better luck running COMSOL on a 64-bit machine (using Ubuntu Linux). 64 bits makes a big difference.
Josh
Ivar KJELBERG
COMSOL Multiphysics(r) fan, retired, former "Senior Expert" at CSEM SA (CH)
Posted: 31 Aug 2009, 16:28 GMT-4
Hi
What you mean, I believe, is that with more addressable RAM, as you get on a 64-bit (addressable) CPU, you can digest larger models.
That is certainly true, just as you can also run multiple processors in parallel and substantially speed things up.
Unfortunately, you will not gain any numerical precision, as the "64 bits" normally does not imply any change to the floating-point unit.
But I have heard that GPU parallel processing might soon be used too, so if you are really looking at large models, they may soon be as "easy" to solve as simple 2D sketches are today.
Ivar
Posted: 13 Sep 2009, 07:37 GMT-4
I had the same problem with Windows XP on a quad-core processor. I have 64-bit Ubuntu Linux on the same computer and it works great.
Eyal
Posted: 26 Nov 2009, 07:56 GMT-5
Hi,
Recently I have had some quite strange experiences with this. When I run a simulation on Windows XP, I get an "out of memory" error, but when I run the same simulation on the same machine under Ubuntu 9.04, the computer manages to accommodate the data in RAM and, judging by the sounds coming out of the machine (the HDD running for its life), also in the swap partition. Not surprisingly, the simulation runs slowly because of the hard drive's throughput limit. So it seems that Windows XP somehow can't use the paging file. I tried increasing its size to 8 GB, with no effect. During a simulation that crashes for lack of memory, the page file grows by about 100 MB and no more, even though RAM usage grows past 1 GB before the "out of memory" error appears, so use of the paging file would be expected.
Another interesting thing: when I run a simulation on Windows XP that is small enough to fit in RAM alone, and which therefore finishes successfully, it runs significantly faster than on Ubuntu 9.04 (approx. 40% faster; same task, same solver, etc.). Looking at the CPU usage graphs during the calculation on both systems shows why the computing time differs so much. On Windows XP both of my cores (I have a 32-bit dual-core processor) show almost the same load during the run. On Ubuntu the cores never both run at 100% at the same time: the 100% load jumps from one core to the other roughly every five seconds (irregularly), and the second core is never used more than 30%.
So, long story short: in Windows XP the system fails to use the paging file during my calculation (and generates an "out of memory" error), while Ubuntu 9.04 fails to split the calculation effectively across the CPU cores and runs significantly slower.
Posted: 26 Nov 2009, 10:59 GMT-5
By default, 32-bit Windows XP limits each process to 2 GB of address space and reserves the rest for the kernel.
This can be changed with the /3GB option in boot.ini:
http://www.comsol.com/support/knowledgebase/866/
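For reference, here is a minimal sketch of what the relevant boot.ini entry can look like with the switch added. The disk/partition path and the OS description are placeholders that will differ on your machine, so follow the knowledge base article above rather than copying this verbatim:
[quote]
[operating systems]
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Microsoft Windows XP Professional" /fastdetect /3GB
[/quote]
With /3GB, a 32-bit process built as large-address-aware can use up to 3 GB of virtual address space instead of 2 GB; the article above covers the COMSOL-specific details.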
Posted: 19 May 2010, 05:33 GMT-4
Dear Mohammad,
Could you solve your problem? I encounter "out of memory" and "out of memory during LU decomposition" errors and don't know how to deal with them. Please guide me.
And peace and the mercy of Allah be upon you.
With Best regards,
Fatemeh Sharifi
Posted: 19 May 2010, 07:16 GMT-4
When your model grows in complexity (degrees of freedom), you invariably run into out-of-memory errors, because there is no limit to how elaborate a model you can create or conceive, but there is a very definite limit on the hardware resources you have.
I would suggest you check out the iterative solvers described in the solver-related sections of the Multiphysics user guide. I use the RF module, and most of the time I find these iterative solvers able to do the job, though sometimes I have run out of patience and cancelled the solution process!
Posted: 2 Oct 2010, 10:56 GMT-4
How can I activate the server-client approach?
Ivar KJELBERG
COMSOL Multiphysics(r) fan, retired, former "Senior Expert" at CSEM SA (CH)
Posted: 3 Oct 2010, 03:44 GMT-4
Hi
Read the explanations in the install & operation documentation
(comsol_iog.pdf for V4, install.pdf in V3.5a)
--
Good luck
Ivar
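To give a rough idea of what those documents describe, the sketch below shows the general sequence. The command names and the default port are assumptions that differ between releases (and the V4 menu path may be named slightly differently), so treat this only as orientation and follow comsol_iog.pdf / install.pdf for your version:
[quote]
# Assumed command names; check the installation guide for your release.
comsol server       # V3.5a: start the COMSOL Multiphysics server (listens on a TCP port, 2036 by default)
comsol client       # V3.5a: start the client and point it at the server's host name and port

comsol mphserver    # V4.x: the corresponding server command; then start the COMSOL Desktop
                    # and connect to the server from the File menu (client-server connection),
                    # giving the server's host name and port when prompted.
[/quote]
The gain Ivar describes comes from the fact that only the server process does the heavy solving, so the client's memory can be paged out while the solver runs.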
Posted: 25 Dec 2011, 03:27 GMT-5
I once solved a laser welding model. Solving that same model again now runs into this error.
Neither changing the mesh nor the solver helped...
My system: quad-core, 4 GB RAM, 64-bit Windows 7, COMSOL 4.2.
And does anybody know why UMFPACK is absent in COMSOL 4.2?
Thanks for the help...
Posted: 8 Mar 2013, 06:35 GMT-5
[QUOTE]
Hi Fatima,
Many thanks. I tried both solvers (UMFPACK and SPOOLES) but still couldn't resolve the problem. How about using different mesh elements?
Mohammed
[/QUOTE]
How can I add these solvers (UMFPACK and SPOOLES) to COMSOL?
Posted: 8 Mar 2013, 10:45 GMT-5
Hello Mohammed
I had the same problem a few weeks ago. Try making your mesh elements bigger (i.e. a coarser mesh) ... this helped me.
Greetings from Germany
Posted: 21 Oct 2013, 06:33 GMT-4
I am a new member of the COMSOL community. I am trying to solve a simple 3D problem, an AC/DC 3D transformer model. However, I am getting the message "out of memory during LU factorization." Is my computer running out of memory for this particular problem, or should I use a better solver? Could anyone share their experience with COMSOL?
Posted: 19 Feb 2014, 14:16 GMT-5
On Linux you can shut down the desktop (killing every graphical process) and then run the study from the command-line interface.
For example, using Debian or Ubuntu and assuming GNOME is the desktop running:
[quote]
press Ctrl + Alt + F1
login with your username and password
execute: [i]sudo service gdm3 stop[/i]
execute [i]comsol batch -inputfile in.mph -outputfile out.mph -batchlog out.log[/i]
[/quote]
What I don't know is whether this is better or worse than the client/server approach.
Does anybody know?
Felipe BM
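For convenience, here is that sequence wrapped in a small script, using exactly the commands quoted above. The display-manager service name (gdm3) and the .mph file names are assumptions that depend on your distribution and model, so adjust them to your setup:
[quote]
#!/bin/sh
# Run this from a text console (Ctrl+Alt+F1) after logging in.
sudo service gdm3 stop       # stop the desktop session to free its RAM (service name varies by distro)
comsol batch -inputfile in.mph -outputfile out.mph -batchlog out.log    # solve the model in batch mode
sudo service gdm3 start      # restart the desktop once the job has finished
[/quote]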
Posted: 5 Jun 2014, 02:23 GMT-4
What is the server-client approach? Do you have a tutorial on this? I am using Windows 7; can I apply the same on Windows 7?
Posted: 27 Jul 2016, 01:47 GMT-4
Hi Ivar,
My case is a little bit different, even though my failed job gives me the same error message, "Out of memory during LU factorization."
In my case, I had 4 jobs running that use about 40 GB of RAM in total. Then I wanted to run one more, which requires 22 GB, but it fails right at the start with the out-of-memory error, while I am pretty sure there is plenty of memory left (my workstation has 256 GB of RAM in total). Another sign of this malfunction is that in Task Manager there is a discrepancy between the total memory usage (31%) and the sum over all users (17%); it turns out that about 16% x 256 GB = 40 GB goes somewhere I cannot figure out.
This issue (the out-of-memory failure together with the memory discrepancy) occurred once before, three months ago, when I ran 4 or 5 jobs at the same time; the solution then was to restart my workstation.
By the way, no other software is open or running except COMSOL.
I should probably ask the COMSOL technicians about this, but your answers always hit the target precisely. If you have ever encountered this kind of problem before, or have any ideas, please let me know.
I would greatly appreciate it.
Thanks,
Yi