QuantumATK Forum

QuantumATK => General Questions and Answers => Topic started by: wot19920302 on August 19, 2018, 17:28

Title: The problem on calculating energy of hydrogen atom using LDA and GGA
Post by: wot19920302 on August 19, 2018, 17:28
Dear QuantumWise staff,
          I notice that the calculated energy of a hydrogen atom ranges between -12 and -14 eV, instead of the exact -13.6 eV, when using LDA or GGA. The exchange-correlation functionals, basis sets, and pseudopotentials I use are as follows.
         GGA, FHI pseudopotential, Tight Tier 1 basis set:   -12.53 eV
         GGA, OMX pseudopotential, High basis set:           -14.33 eV
         LDA, FHI pseudopotential, Tight Tier 1 basis set:   -12.19 eV
         LDA, OMX pseudopotential, High basis set:           -13.96 eV
         The mesh cutoff I selected is 200 Hartree for all of these calculations. I would like to know what causes such a deviation in the calculated energy of the hydrogen atom. Is it a limitation of the pseudopotentials, or do other factors cause this deviation?
         Yours
Title: Re: The problem on calculating energy of hydrogen atom using LDA and GGA
Post by: Ulrik G. Vej-Hansen on August 23, 2018, 09:32
In QuantumATK, as in most other DFT codes, the absolute zero of the energy scale is determined from the average potential in the system. So *any* difference between two calculations will change the position of the energy zero. It is therefore expected that different basis sets and pseudopotentials give different values.
Title: Re: The problem on calculating energy of hydrogen atom using LDA and GGA
Post by: Anders Blom on October 2, 2018, 22:22
It's essential to use a spin-polarized calculation for the single hydrogen atom. If you do that, pretty much any basis set or functional gives you -13.6 eV. I just tried all default settings except a lower Fermi broadening, corresponding to 100 K, and got -13.59025 eV.
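For reference, a minimal ATK-Python sketch of such a spin-polarized calculation is shown below. This is an illustration, not Anders' actual script: it uses the standard QuantumATK scripting classes (MoleculeConfiguration, LCAOCalculator, LSDA.PZ, NumericalAccuracyParameters, InitialSpin, TotalEnergy), but some parameter names, such as electron_temperature, have changed between releases, so check the manual for your version.

# Single hydrogen atom as a molecule configuration.
molecule_configuration = MoleculeConfiguration(
    elements=[Hydrogen],
    cartesian_coordinates=[[0.0, 0.0, 0.0]]*Angstrom,
)

# Spin-polarized LDA (LSDA); use SGGA.PBE instead for spin-polarized GGA.
exchange_correlation = LSDA.PZ

# 200 Hartree mesh cutoff as in the original post; a low electron
# temperature (100 K) to reduce the Fermi broadening.
numerical_accuracy_parameters = NumericalAccuracyParameters(
    density_mesh_cutoff=200*Hartree,
    electron_temperature=100*Kelvin,
)

calculator = LCAOCalculator(
    exchange_correlation=exchange_correlation,
    numerical_accuracy_parameters=numerical_accuracy_parameters,
)

# Give the atom an initial net spin so the SCF loop can find the
# spin-polarized ground state.
initial_spin = InitialSpin(scaled_spins=[1.0])

molecule_configuration.setCalculator(calculator, initial_spin=initial_spin)
molecule_configuration.update()

# Print the total energy; it should come out close to -13.6 eV.
total_energy = TotalEnergy(molecule_configuration)
nlprint(total_energy)

Without the spin polarization (the default), the single electron is treated as a half-filled, spin-degenerate state, and the total energy deviates from -13.6 eV, which is consistent with the spread reported in the original post.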