Author Topic: ATKError: bad allocation


Offline lserrato

  • New QuantumATK user
  • Posts: 3
  • Reputation: 0
ATKError: bad allocation
« on: May 21, 2010, 17:05 »
I'm optimizing a geometry and I get this message:

Traceback (most recent call last):
  File "c:/docume~1/config~1/temp/tmpd3egc8.nl", line 457, in ?
  geometric_constraints = (0,1,2,3,4,5,6,7,8,9,10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27,28,29,68,69,70,71,72,73,74,75,76,77,78,79,80,81,82,83,84,85,86,87,88,89,90,91,92,93,94,95,96,97,)
ATKError: bad allocation
Terminated Abnormally

I have no idea what this error means. Can somebody help me? Thanks very much.


Offline Anders Blom

  • QuantumATK Staff
  • Supreme QuantumATK Wizard
  • Posts: 5576
  • Country: dk
  • Reputation: 96
Re: ATKError: bad allocation
« Reply #1 on: May 21, 2010, 21:40 »
This means the program ran out of memory.

Offline Po-Wen

  • New QuantumATK user
  • Posts: 2
  • Reputation: 0
Re: ATKError: bad allocation
« Reply #2 on: May 24, 2010, 07:39 »
I also had this problem. I reran the calculation and monitored the system load (CPU and memory) at the same time, but the memory used by the calculation was well below what is available. I have 12 GB of memory, yet the problem still occurs. Why is that?

Offline Anders Blom

  • QuantumATK Staff
  • Supreme QuantumATK Wizard
  • Posts: 5576
  • Country: dk
  • Reputation: 96
Re: ATKError: bad allocation
« Reply #3 on: May 24, 2010, 08:43 »
This error is frequently reported, but the only way it can arise is if the machine runs out of memory. If you are running the calculation in parallel and placing several MPI processes on the same machine, you will quickly ramp up the memory usage, since each process uses the same amount of memory as a serial run would.
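As a quick illustration (a minimal Python sketch, not ATK-specific; the per-process figure is a hypothetical value you would read off from e.g. top during a serial run):

Code:
# N identical MPI ranks on one node need roughly N times the memory
# of a single serial run of the same job.
mem_per_process_gb = 4.0   # hypothetical: observed usage of one serial run
ranks_on_this_node = 4     # MPI processes placed on the same machine
node_ram_gb = 12.0         # physical RAM of the node

total_gb = mem_per_process_gb * ranks_on_this_node
print("Estimated footprint: %.1f GB of %.1f GB" % (total_gb, node_ram_gb))
if total_gb > node_ram_gb:
    print("Likely to fail with 'bad allocation' - reduce ranks per node.")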

Offline rahulprajesh

  • Regular QuantumATK user
  • Posts: 23
  • Reputation: 0
Re: ATKError: bad allocation
« Reply #4 on: May 20, 2011, 11:23 »
Quote from: Anders Blom on May 24, 2010, 08:43
"This error is frequently reported, but the only way it can arise is if the machine runs out of memory. [...]"

Sir, in that case, which computer configuration would be sufficient? How much memory and how many processors should I have?

Offline Anders Blom

  • QuantumATK Staff
  • Supreme QuantumATK Wizard
  • Posts: 5576
  • Country: dk
  • Reputation: 96
Re: ATKError: bad allocation
« Reply #5 on: May 20, 2011, 11:43 »
The amount of memory needed for a calculation depends on many factors. However, as a general rule, if you run into memory problems, take care to assign only one MPI process per physical node on the cluster, and make sure you have that node to yourself; otherwise other programs may run on it and limit the available RAM.
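To check what a node actually has available before launching, a small sketch like this works on Linux (assuming /proc/meminfo exists; the values there are in kB):

Code:
# Report total and currently free RAM on a Linux node.
def meminfo_gb(field):
    with open("/proc/meminfo") as f:
        for line in f:
            if line.startswith(field + ":"):
                return int(line.split()[1]) / (1024.0 * 1024.0)
    return None  # field not found on this system

total_gb = meminfo_gb("MemTotal")
free_gb = meminfo_gb("MemFree")
print("Node RAM: %.1f GB total, %.1f GB free" % (total_gb, free_gb))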

Offline rahulprajesh

  • Regular QuantumATK user
  • Posts: 23
  • Reputation: 0
Re: ATKError: bad allocation
« Reply #6 on: May 20, 2011, 12:01 »
Quote from: Anders Blom on May 20, 2011, 11:43
"The amount of memory needed for a calculation depends on many factors. [...]"

Sir, I have a six-core 3.0 GHz processor and 12 GB of memory. Isn't that sufficient?

Offline Anders Blom

  • QuantumATK Staff
  • Supreme QuantumATK Wizard
  • Posts: 5576
  • Country: dk
  • Reputation: 96
Re: ATKError: bad allocation
« Reply #7 on: May 20, 2011, 14:03 »
Sufficient for what system? How many atoms, k-points, which method, which boundary conditions? Parallel or serial? How many MPI processes per node? What else is running on the node? You see, all these things matter :)
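To get a feel for how a few of these factors enter, here is a toy back-of-envelope sketch (my own illustration with made-up parameters, not a QuantumATK formula; a dense N x N complex double-precision matrix takes 16*N^2 bytes):

Code:
# Rough dense-matrix and grid memory estimate for an LCAO-type calculation.
# Every number below is a hypothetical input to adjust for your own system.
n_atoms = 98            # at least 98 atoms, judging by the indices in the first post
orbitals_per_atom = 13  # hypothetical basis-set size
n_matrices = 3          # e.g. Hamiltonian, overlap, density matrix (a guess)
grid_points = 200 * 200 * 400   # hypothetical real-space grid
grid_arrays = 4                 # density, potentials, ... (a guess)

n = n_atoms * orbitals_per_atom
matrix_gb = n_matrices * 16.0 * n * n / 1024.0**3      # 16 bytes per complex element
grid_gb = grid_arrays * 8.0 * grid_points / 1024.0**3  # 8 bytes per double
print("Matrix dimension N = %d" % n)
print("Dense matrices: %.2f GB, grid arrays: %.2f GB" % (matrix_gb, grid_gb))

Multiply further by the number of k-points held in memory at once and by the MPI ranks placed on the same node, where applicable.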