Recent Posts

81
General Questions and Answers / Re: HartreeDifferencePotential calculation
« Last post by Anders Blom on September 3, 2025, 02:54 »
Maybe check the electron temperature: the default was 300 K when the tutorial was written, but it is now 1000 K, and that may cause convergence problems. We will also check on our side whether the calculation settings are still correct; the tutorial is 9 years old, and some adjustments may be needed for newer versions of the code.
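
If the temperature turns out to be the cause, you can pin it explicitly in the script rather than rely on the default. A minimal sketch, assuming the classic NumericalAccuracyParameters API with an electron_temperature argument (newer versions may expose this through an occupation-method setting instead):

Code
# Runs under atkpython, where the ATK classes are available without imports.
# Pin the electron temperature to the tutorial-era value instead of the
# current default (the electron_temperature argument follows the older API
# and is an assumption for recent versions).
numerical_accuracy_parameters = NumericalAccuracyParameters(
    electron_temperature=300.0*Kelvin,  # tutorial assumed 300 K; default is now 1000 K
)
calculator = LCAOCalculator(
    numerical_accuracy_parameters=numerical_accuracy_parameters,
)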
82
As mentioned before, there is no difference between up and down spin unless you have a magnetic axis in the system, which defines up/down relative to some reference direction. In the collinear case there is no such direction, and since gold is non-magnetic there is not even a parallel or antiparallel spin setup, so your up and down for Ni will remain degenerate and cannot be distinguished. In fact, I don't think you would see a difference even in a noncollinear simulation, since the magnetic field is not actually part of the simulation, and I honestly don't see physically how it would change anything in the experiment either, if the field is turned off before the measurement.
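
For reference, this is roughly how a collinear spin-polarized setup with an initial moment on the Ni atom looks; a minimal sketch, where ni_index is a hypothetical placeholder for the position of the Ni atom in your configuration:

Code
# Collinear spin-polarized setup (runs under atkpython, no imports needed).
# Give the Ni atom an initial moment; all other atoms start non-magnetic.
scaled_spins = [0.0] * len(configuration.elements())
scaled_spins[ni_index] = 1.0  # hypothetical index of the Ni atom

initial_spin = InitialSpin(scaled_spins=scaled_spins)

# Without a magnetic axis in the Hamiltonian, up and down still come out
# degenerate for the non-magnetic Au host, as noted above.
configuration.setCalculator(calculator, initial_spin=initial_spin)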
83
Sorry, I don't understand the question; please try to rephrase what you actually need.

The basics are simple: you need a network connection from the machine running the software to the machine hosting the license. The license can be tied to either the Ethernet or the Wi-Fi adapter; both work, and the adapter does not even have to match the network you connect over. The MAC address is only used to ensure the license is valid for that specific machine.
84
OK, but it's more of an operating-system thing than anything we can control in our code.

I am not sure whether the issue I am facing is similar to this one, but I would still like to request your guidance. In my case, I successfully ran the HDF5-based convergence IV simulation for 10 hours, but after that I cannot visualize the HDF5 file from the "Data" section of the software. I have tried closing and reopening the software, and I have also copied the file into the output folder. Please help me retrieve the IV data, at least into an Excel file. Thank you.
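
In the meantime, you can recover the numbers by script instead of through the GUI. A minimal sketch, assuming the file holds an IVCurve analysis object; the file name is a placeholder, and the accessor names (biases(), currents()) may differ between versions:

Code
# Runs under atkpython. Read the IV data from the HDF5 file and write a
# CSV file that Excel can open directly.
iv_curves = nlread('results.hdf5', IVCurve)  # placeholder path; use your own file
iv = iv_curves[-1]                           # last IVCurve object in the file

with open('iv_data.csv', 'w') as f:
    f.write('bias (V),current (A)\n')
    for bias, current in zip(iv.biases(), iv.currents()):
        f.write('%g,%g\n' % (bias.inUnitsOf(Volt), current.inUnitsOf(Ampere)))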
85
I have attached the issue. Please help me resolve it; I cannot visualize the HDF5 file.
86
Hi Pshinyeong, QuantumATK ships its own version of Intel MPI. The version shipped with QuantumATK X-2025.06 is Intel MPI 2021.15, which comes with Intel oneAPI 2024.2 (I believe); that is newer than the one you load in your job script. Newer versions of Intel MPI are not necessarily compatible with older ones, so that could be why it fails.

I suggest that you do not use a custom Intel MPI version unless you really, REALLY know what you're doing. So instead of loading oneAPI or OpenMPI, as you perhaps would for other academic software, you simply don't load anything. QuantumATK is a self-contained, plug-and-play solution that works as-is without any separately installed libraries or tools.

So I recommend you remove the module loads and simply have this line to launch your job:

Code
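# QuantumATK's bundled Intel MPI is picked up automatically; no module loads needed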
srun /path/to/atkpython $PYTHON_SCRIPT > $LOG_FILE

If in doubt, use the built-in Job Manager GUI to set up submission scripts for Slurm.
87
Hi,

Try closing and reopening QuantumATK. Also, make sure you have selected the right folder under the project icon (top left).

Yes, I have done that, but the issue persists.
88
Hi,

Try closing and reopening QuantumATK. Also, make sure you have selected the right folder under the project icon (top left).
90
Hi Pshinyeong,
From what I can see, you have commented out the following lines:
#mpirun ~/QuantumATK23/quantumatk/V-2023.09/bin/atkpython_system-mpi $PYTHON_SCRIPT > $LOG_FILE
#mpirun /home/edrl_05/QuantumATK/QuantumATK-U-2022.12-SP1/bin/atkpython $PYTHON_SCRIPT > $LOG_FILE

Also, instead of using mpirun, I would recommend using QuantumATK's built-in mpiexec.hydra for parallelization and atkpython for execution.
You also need to update the paths, so your SLURM script will look something like this:
Code
#!/bin/bash

#SBATCH --job-name=QuantumATK
#SBATCH --ntasks=60
#SBATCH --ntasks-per-node=60
#SBATCH --nodes=1
#SBATCH --cpus-per-task=1
#SBATCH --output=%x-%j.out
#SBATCH --error=%x-%j.err
#SBATCH --partition=normal
#SBATCH --mem=210GB
# With --nodes=1, --nodelist should name at most one node (the original listed n16,n15,n14)
#SBATCH --nodelist=n16

cd $SLURM_SUBMIT_DIR
export ATK=/home/edrl_05/QuantumATK/quantumatk/X-2025.06/bin/atkpython
export MPI=/home/edrl_05/QuantumATK/quantumatk/X-2025.06/mpi/bin/mpiexec.hydra
export MPIE=/home/edrl_05/QuantumATK/quantumatk/X-2025.06/mpi/bin/mpiexec  # plain mpiexec; unused below
export MKL_DYNAMIC=TRUE
export OMP_NUM_THREADS=1
export MKL_NUM_THREADS=1
export LM_LICENSE_FILE="path"
export SNPSLMD_LICENSE_FILE="path"

# -n matches the SLURM allocation (--ntasks) explicitly
${MPI} -n ${SLURM_NTASKS} ${ATK} in.py > out.log