Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - hhspace

Pages: 1 [2]
16
About Linux
LSB Version:    :core-3.0-amd64:core-3.0-ia32:core-3.0-noarch:graphics-3.0-amd64:graphics-3.0-ia32:graphics-3.0-noarch
Distributor ID: RedHatEnterpriseAS
Description:    Red Hat Enterprise Linux AS release 4 (Nahant Update 6)
Release:        4
Codename:       NahantUpdate6

About MPI
mpich2-1.0.7rc1

17
I am confused about running parallel calculations with ATK. The manual says that if ATK is properly configured to run in parallel on your system, test_mpi.py should be written like this:

 from ATK.MPI import *
 
 if processIsMaster():
     print '# Master node'
 else:
     print '# Slave node'

After running the command

mpiexec -n 2 $ATK_BIN_DIR/atk /home/myusername/test_mpi.py

the result should be:

Master node
Slave node
However, when I ran the same job I got:

Master node
Master node

Now if I run a parallel job with this setup, is it still a real parallel calculation?
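ATK.MPI is proprietary, but presumably processIsMaster() just checks whether the process's MPI rank is 0. A minimal sketch of that logic (the helper name node_role is mine, not ATK's):

```python
def node_role(rank):
    # Mirrors the manual's test script: MPI rank 0 is the master,
    # every other rank is a slave.
    return '# Master node' if rank == 0 else '# Slave node'

# A correctly configured 2-process run assigns distinct ranks 0 and 1:
print([node_role(r) for r in range(2)])
# ['# Master node', '# Slave node']

# If ATK was built against a different MPI than the mpiexec used to
# launch it, each process initializes its own one-process MPI world and
# gets rank 0 -- exactly the "two Master node lines" symptom above:
print([node_role(0) for _ in range(2)])
# ['# Master node', '# Master node']
```

If that mismatch is what is happening, mpiexec is simply starting two independent serial copies of ATK, so it would not be a real parallel calculation.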



18
Thank you for your reply. I will follow your advice and try some changes. Thanks a lot again!

19
When I calculate a two-probe system with 313 atoms in the central region, the message 'ATKError: St9bad_alloc' always appears after the electrode calculation has converged. What does 'ATKError: St9bad_alloc' really mean?
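For what it's worth, 'St9bad_alloc' looks like the Itanium-ABI mangled name of C++'s std::bad_alloc, the exception thrown when a memory allocation fails, which would suggest the 313-atom central region is exhausting available memory. A toy decoder for just this one mangling pattern ('St' abbreviates the std:: namespace, followed by a length-prefixed identifier; the helper name is mine):

```python
def demangle_simple(name):
    # Toy decoder for the Itanium-ABI pattern "St<len><identifier>":
    # "St" stands for std::, then a decimal length prefix gives the
    # number of characters in the identifier that follows.
    if name.startswith('St'):
        rest = name[2:]
        i = 0
        while i < len(rest) and rest[i].isdigit():
            i += 1
        n = int(rest[:i])
        return 'std::' + rest[i:i + n]
    return name

print(demangle_simple('St9bad_alloc'))
# std::bad_alloc
```

So the error is the C++ runtime reporting an out-of-memory condition rather than anything specific to the two-probe setup itself.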
