Alireza
Out-of-memory error despite allocating 1000 GB of memory!
« on: October 26, 2021, 16:29 »
Dear experts,

I am trying to calculate the electronic transport properties of a large system containing 2604 atoms using a DFTB calculation.
First I tried the following SLURM settings:
--nodes=1
--ntasks-per-node=24
--cpus-per-task=1
--mem-per-cpu=2540
and the job ran into an out-of-memory error (see the batch-script sketch below).
I then increased the number of nodes to 20, which gives 480 cores and 20 × 24 × 2540 MB = 1,219,200 MB (roughly 1.2 TB) of total memory. Unfortunately, the calculation ran into the same error again. The attached figure shows the memory utilization of that job: at around 37 minutes the semi-empirical calculation finished and the transmission calculation started, and this is exactly where the job ran out of memory.
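
For reference, here is a minimal sketch of the kind of batch script being submitted, assuming the standard mpiexec + atkpython launch; the job name, module name, and input-script name transport.py are placeholders rather than my exact setup:

#!/bin/bash
#SBATCH --job-name=dftb_transport     # placeholder job name
#SBATCH --nodes=1                     # 20 in the second attempt
#SBATCH --ntasks-per-node=24
#SBATCH --cpus-per-task=1
#SBATCH --mem-per-cpu=2540            # in MB; 24 x 2540 MB is about 61 GB per node

# Load the QuantumATK environment (module name is site-specific).
module load quantumatk

# One MPI process per allocated task; transport.py stands in for the
# actual ATK Python input script.
mpiexec -n $SLURM_NTASKS atkpython transport.py

With --cpus-per-task=1 each MPI process gets a single core, so the memory available per process is just the --mem-per-cpu value.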

I would appreciate any comments or suggestions.

Cheers, A