The mixing of elements in EAM potentials generally works quite well, although it always depends on the particular elements and potential. I don't see any reason in principle why EAM_Zhou_2004 should not be suited to model the diffusion in such an array. But without knowing the exact elements, I would suggest that you run some benchmark calculations on a small test system against DFT to see whether it works sufficiently well for your elements. In general, EAM should be a lot better suited than Tersoff for simulating diffusion, as Tersoff (e.g., as used for carbon) can have unphysically large bond-breaking barriers.
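If it helps, here is a minimal LAMMPS-style sketch of such a benchmark, assuming you use the eam/alloy form of the Zhou (2004) potentials; the file name, element, and lattice constant are placeholders you would replace with your actual system. It just relaxes a small cell and prints the cohesive energy per atom, which you can compare directly against a DFT value (the same kind of setup extends to vacancy formation energies or migration barriers):

```
# Minimal benchmark sketch: cohesive energy of a small fcc cell with a
# Zhou (2004) EAM alloy potential. Potential file, element, and lattice
# constant are placeholders -- adapt them to your system.
units           metal
boundary        p p p
lattice         fcc 4.05                      # placeholder lattice constant
region          box block 0 2 0 2 0 2
create_box      1 box
create_atoms    1 box

pair_style      eam/alloy
pair_coeff      * * Zhou04_AlCu.eam.alloy Al  # placeholder file name and element

fix             relax all box/relax iso 0.0 vmax 0.001   # also relax the cell
minimize        1.0e-8 1.0e-10 1000 10000

variable        ecoh equal pe/atoms
print           "Cohesive energy per atom: ${ecoh} eV"
```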
Yes, the approach to simulate diffusion into the deposited layer would be to couple the substrate to a thermostat and simulate the deposited atoms in NVE, as described in the vapor deposition tutorial.
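Assuming you are using LAMMPS, the split between thermostatted substrate and NVE-integrated deposited atoms would look roughly like the sketch below; all group names, regions, temperatures, and deposit parameters are placeholders:

```
# Rough sketch: thermostat the substrate only, keep the deposited atoms in
# plain NVE. Atoms created by fix deposit are added to its group (addatoms),
# so they are picked up by the NVE integrator automatically.
region   subreg block INF INF INF INF INF 10.0 units box     # substrate slab (placeholder)
group    substrate region subreg
group    addatoms  type 2                                    # type of the deposited atoms

fix      nve_sub  substrate nve
fix      lang_sub substrate langevin 500.0 500.0 0.1 48279   # thermostat substrate only
fix      nve_dep  addatoms  nve                              # deposited atoms: no thermostat

region   insreg block 0 20 0 20 30 35 units box              # insertion volume above the film
fix      dep addatoms deposit 500 2 250 12345 region insreg near 1.0 vz -2.0 -2.0
```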
If you don't see any diffusion into the deposited layer during the deposition simulation, you could separate the deposition and diffusion simulations: stop the deposition once the layer has reached a sufficient thickness, then set up a constant-temperature simulation in which you couple the entire surface to a thermostat and run it for a long time.
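A possible continuation of the sketch above for that second stage, again with placeholder values: remove the deposition and the split integrators, thermostat everything, and run for as long as you can afford:

```
# Hypothetical continuation of the deposition sketch: switch to a long
# constant-temperature anneal of substrate + film.
unfix    dep                          # stop inserting atoms
unfix    nve_sub
unfix    lang_sub
unfix    nve_dep

fix      anneal all nvt temp 500.0 500.0 0.1   # thermostat the whole system
reset_timestep 0
timestep 0.002                        # 2 fs in metal units
run      5000000                      # ~10 ns; extend as far as feasible
```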
In practice, however, the accessible MD time scale might still be too short if the diffusion process is associated with high barriers. In that case you could try an adaptive kinetic Monte Carlo (AKMC) simulation.