We have designed a workaround for this problem which should work in most cases. The issue is memory, so the way to resolve it is to reduce the size of the grids. Transmission eigenstates are particularly demanding since they are complex-valued, and thus consume twice the memory of potentials and densities.
ATK uses the mesh cut-off on the SCF object to determine the grid spacing. A mesh cut-off of 50-100 Ry is often sufficient for plotting the grids. However, it is certainly not always possible to converge the calculation at such small values - and even if you can, the results may not be accurate enough!
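To get a feeling for how much memory the cut-off costs, here is a rough back-of-the-envelope estimate. It assumes the plane-wave relation h ≈ π/√E_cut between grid spacing (Bohr) and cut-off (Ry), which is only an approximation of how ATK actually discretizes the cell, and the cell dimensions are made up for illustration:

```python
import math

def grid_memory_mb(cell_lengths_bohr, cutoff_ry, complex_valued=False):
    """Rough memory estimate for one real-space grid.

    Assumes grid spacing h = pi / sqrt(E_cut) (lengths in Bohr,
    E_cut in Rydberg), so the point count grows as E_cut**1.5.
    Complex grids (e.g. transmission eigenstates) take 16 bytes
    per point, real-valued ones (potentials, densities) 8 bytes.
    """
    h = math.pi / math.sqrt(cutoff_ry)
    points = 1
    for length in cell_lengths_bohr:
        points *= math.ceil(length / h)
    bytes_per_point = 16 if complex_valued else 8
    return points * bytes_per_point / 2**20

cell = (30.0, 30.0, 60.0)  # hypothetical two-probe cell, in Bohr
for cutoff in (200.0, 100.0, 80.0):
    mb = grid_memory_mb(cell, cutoff, complex_valued=True)
    print("%.0f Ry -> %.1f MB" % (cutoff, mb))
```

Since the point count scales as E_cut^{3/2}, halving the cut-off cuts the grid memory by roughly a factor of 2.8 - which is exactly what the workaround below exploits.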
Therefore, we need to override the value of the mesh cut-off on the SCF object after the calculation has finished, to fool ATK into using a different (larger) grid spacing when it evaluates the grids to be plotted. To do this, insert the following line of code anywhere between the point where the SCF object is created and the point where you use it to evaluate the grids:
scf._attributeContainer().getAttributeContainer('SetupAttributes').setDouble(100.,'MeshCutoff')
Here we used the value 100 Ry (the unit is implicit); if the grid is still too large for VNL, reduce the value to 80 and try again, and so on.
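To make the call chain in that one-liner concrete, here is a toy stand-in - not ATK code, just an illustration of the interface the line appears to rely on. In particular, note the argument order of setDouble: the value comes first, the attribute name second.

```python
class _ToyAttributeContainer:
    """Minimal stand-in for ATK's internal attribute container.

    Illustration only: in ATK the real object is reached via
    scf._attributeContainer(), and 'SetupAttributes' is one of its
    sub-containers.
    """
    def __init__(self, values):
        self._values = dict(values)

    def getAttributeContainer(self, name):
        # For this toy version, just hand back the same container.
        return self

    def setDouble(self, value, name):
        # Value first, attribute name second - same order as the
        # workaround line uses.
        self._values[name] = value

    def getDouble(self, name):
        return self._values[name]


class ToySCF:
    """Stand-in for the SCF object, exposing the same call chain."""
    def __init__(self, mesh_cutoff_ry):
        self._attrs = _ToyAttributeContainer({'MeshCutoff': mesh_cutoff_ry})

    def _attributeContainer(self):
        return self._attrs


scf = ToySCF(200.0)  # pretend the calculation converged at 200 Ry
# The workaround line, exactly as it would appear in an ATK script:
scf._attributeContainer().getAttributeContainer('SetupAttributes').setDouble(100., 'MeshCutoff')
```

After this call, any later grid evaluation that reads 'MeshCutoff' sees 100 Ry instead of the converged value.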
This works whether the SCF object was generated in the same script (by executeSelfConsistentCalculation()) or restored from a checkpoint/NetCDF file (by restoreSelfConsistentCalculation()).
In future versions of ATK, I imagine it will be possible to specify a custom mesh cut-off directly to the routines that evaluate the grids. And hopefully the memory usage in VNL will be lower, too!