The easy access and availability of Virtual Reality (VR) technologies is opening up more and more application fields. Unfortunately, current VR applications take acoustics into account only rudimentarily, if at all, so that much of the technology's potential remains untapped. In this paper, we present the results of our experiments on localizing audio signals and presence using VR technologies. We used the virtual representation of a realistic scene (a 360° movie shot) to accurately place sound signals in three-dimensional space using object-based audio. We manipulated the volume and the configuration of the environmental sound (different positions and types of audio signals). Additionally, we compared the effect of simple stereo sound with spatial stereo, which exploits the overlap of the fields of hearing (spatial sound), similar to the techniques used for stereoscopic depth perception in 3D movie theaters. Participants used depth cameras, a gyroscope, VR glasses with an eye tracker, and flying-stick controllers to point out the sources of the audio signals. We tracked participants' reaction time, accuracy, eye movements, and whole-body movements. In light of our experimental results, we discuss implications for practice, cognitive science, and future VR research.