Author Topic: Problem calculating the energy of the hydrogen atom using LDA and GGA  (Read 3744 times)


Offline wot19920302

  • QuantumATK Guru
  • Posts: 124
  • Country: cn
  • Reputation: 0
Dear QuantumWise staff,
          I notice that the calculated total energy of a hydrogen atom ranges between -12 and -14 eV instead of the exact -13.6 eV when using LDA or GGA. The combinations of exchange-correlation functional, pseudopotential, and basis set I used are as follows:
         GGA, FHI, Tight Tier 1:    -12.53 eV
         GGA, OMX, High:            -14.33 eV
         LDA, FHI, Tight Tier 1:    -12.19 eV
         LDA, OMX, High:            -13.96 eV
         The mesh cutoff I selected is 200 Hartree for all of these calculations. I would like to know what causes such a deviation of the calculated energy for the hydrogen atom. Is it a limitation of the pseudopotential, or do other factors account for this deviation?
         Yours
« Last Edit: August 19, 2018, 17:34 by wot19920302 »
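For reference, the "exact" value quoted in the question is the nonrelativistic ground-state energy of the hydrogen atom, one Rydberg:

$$ E_1 = -\frac{m_e e^4}{2(4\pi\varepsilon_0)^2\hbar^2} = -1~\mathrm{Ry} \approx -13.606~\mathrm{eV} $$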

Offline Ulrik G. Vej-Hansen

  • QuantumATK Staff
  • Supreme QuantumATK Wizard
  • Posts: 426
  • Country: dk
  • Reputation: 9
In QuantumATK, as in most other DFT codes, the absolute zero of energy is determined from the average potential in the system. So *any* difference between two calculations will change the position of the energy zero. It is therefore expected that different basis sets and pseudopotentials give different total energies.
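To spell out what that convention means (assuming QuantumATK follows the usual periodic-DFT practice, in which the undefined G = 0 component of the electrostatic potential is dropped), the energy zero is fixed by requiring the cell-averaged potential to vanish:

$$ \bar V = \frac{1}{\Omega}\int_\Omega V(\mathbf{r})\,\mathrm{d}^3r = 0 $$

Total energies are therefore only meaningful relative to this reference, and only energy differences between calculations with identical basis sets and pseudopotentials are directly comparable.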

Offline Anders Blom

  • QuantumATK Staff
  • Supreme QuantumATK Wizard
  • Posts: 5576
  • Country: dk
  • Reputation: 96
It is essential to use a spin-polarized calculation for the single hydrogen atom. If you do that, pretty much any basis set or functional gives you -13.6 eV (I just tried all default settings except a lower Fermi broadening, corresponding to 100 K, and got -13.59025 eV).
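A minimal ATK-Python sketch of such a spin-polarized single-atom calculation follows. The class and parameter names (MoleculeConfiguration, LCAOCalculator, SGGA.PBE, electron_temperature, and so on) follow the NanoLanguage API as commonly documented, but they differ between QuantumATK releases, so check the manual for your version rather than treating this as a verbatim recipe.

Code:
from QuantumATK import *

# A single hydrogen atom in a molecule configuration.
molecule = MoleculeConfiguration(
    elements=[Hydrogen],
    cartesian_coordinates=[[0.0, 0.0, 0.0]] * Angstrom
)

# SGGA.PBE is the spin-polarized GGA-PBE functional; the spin-polarized
# LDA variant is named analogously. A spin-restricted functional will not
# reproduce -13.6 eV for atomic hydrogen.
calculator = LCAOCalculator(
    exchange_correlation=SGGA.PBE,
    numerical_accuracy_parameters=NumericalAccuracyParameters(
        density_mesh_cutoff=200 * Hartree,
        electron_temperature=100 * Kelvin,  # low Fermi broadening, per the reply above
    ),
)

molecule.setCalculator(calculator)
molecule.update()  # run the self-consistent calculation

# Expect a total energy close to -13.6 eV for the spin-polarized atom.
total_energy = TotalEnergy(molecule)
nlprint(total_energy)

The key point is the spin-polarized treatment: a spin-restricted calculation forces equal spin-up and spin-down occupation of the 1s level, which yields a noticeably less negative total energy, consistent with the values quoted in the question.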