Acoustic Vector Sensing

Undergraduate Research by Ian Beil and Evan Nixon

Results

At this point in our research we have completed several tasks:

 

1. We have created a working GUI in LabView

2. We are able to successfully simulate data and use it to track a simulated source

3. We are able to successfully use real data to track the position of a sound source, albeit with several flaws

 

1) Our LabView interface (as detailed in the LabView section) takes in all inputs and works with MATLAB to produce the desired output.

 

2) Data simulated by a MATLAB script can be 'fed' into our system to produce the correct output. This served mostly as a diagnostic tool for fixing bugs in our MATLAB and LabView code, but it could also be helpful in other scenarios. It is also a good demonstration of how the system will work when it is fully functional.
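The simulation idea can be sketched as follows. This is an illustrative Python version, not the project's MATLAB script: the function names (`simulate_avs`, `doa_estimate`) and the intensity-based bearing estimator are assumptions. It relies on the standard far-field AVS model, in which particle velocity is proportional to pressure times the direction cosines of the arrival angle.

```python
import numpy as np

def simulate_avs(theta_deg, n_samples=1000, snr_db=20, rng=None):
    """Simulate one AVS's channels (pressure p, velocities vx, vy) for a
    far-field source arriving from angle theta (measured from the +X axis).
    For a plane wave, vx = p*cos(theta) and vy = p*sin(theta), plus noise."""
    rng = rng or np.random.default_rng(0)
    theta = np.deg2rad(theta_deg)
    p = rng.standard_normal(n_samples)          # broadband source signal
    noise = 10 ** (-snr_db / 20)                # noise amplitude from SNR
    vx = p * np.cos(theta) + noise * rng.standard_normal(n_samples)
    vy = p * np.sin(theta) + noise * rng.standard_normal(n_samples)
    p = p + noise * rng.standard_normal(n_samples)
    return p, vx, vy

def doa_estimate(p, vx, vy):
    """Intensity-based bearing estimate: average the pressure-velocity
    cross products over time, then take the arctangent."""
    return np.rad2deg(np.arctan2(np.mean(p * vy), np.mean(p * vx)))

p, vx, vy = simulate_avs(theta_deg=60.0)
print(doa_estimate(p, vx, vy))  # bearing estimate, close to 60 degrees
```

Feeding known angles through the estimator this way makes it easy to verify the rest of the processing chain before trusting real sensor data.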

 

3) Using filtering and Capon spectrum beamforming, we were able to track a moving source in two dimensions with triangulationGraph. However, we sometimes experienced an echo effect in the calculated position: the source was occasionally placed on the opposite side of the sensor line from its true location, but was otherwise correct (Y direction flipped, X direction correct).

Experimental Setup

 

Our experimental setup is partially pictured above. The equipment used in the experiment includes two acoustic vector sensors, two signal conditioner boxes, one data acquisition box, and one computer. Each AVS is hooked up to its own pre-calibrated signal conditioner box via a manufacturer-supplied cable. Each signal conditioner box has four channels of output via BNC cable that hook up to our data acquisition box, which connects 16 BNC inputs and 8 1/4" jacks to a Texas Instruments USB interface. The data acquisition box then interacts with the LabView software.

 

Each AVS is oriented very specifically: the sensors are placed 1 m apart at the same height. The blue side of each sensor points perpendicular to the imaginary line connecting the sensors; this is denoted the positive Y direction. Looking in the +Y direction, the sensor on the left is designated the first sensor and the sensor on the right the second.

 

When the experiment is run, two plots are generated:

The top plot, generated by Capon_Spectrum.m, shows the direction of the source from each sensor. The bottom plot shows where the source is estimated to be based on our triangulation.
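The triangulation step can be sketched as intersecting the two bearing rays. This Python sketch is illustrative only (the project's triangulationGraph is a MATLAB script, and the function name `triangulate` and the angle convention are assumptions): sensors sit on the X axis 1 m apart, and bearings are measured counter-clockwise from +X, so the +Y half-plane is the side the sensors face.

```python
import numpy as np

def triangulate(theta1_deg, theta2_deg, s1=(0.0, 0.0), s2=(1.0, 0.0)):
    """Estimate the source position by intersecting the two bearing rays.
    Solves s1 + r1*d1 = s2 + r2*d2 for the ray lengths r1, r2."""
    d1 = np.array([np.cos(np.deg2rad(theta1_deg)),
                   np.sin(np.deg2rad(theta1_deg))])
    d2 = np.array([np.cos(np.deg2rad(theta2_deg)),
                   np.sin(np.deg2rad(theta2_deg))])
    A = np.column_stack([d1, -d2])
    b = np.array(s2) - np.array(s1)
    r = np.linalg.solve(A, b)        # fails if the bearings are parallel
    return np.array(s1) + r[0] * d1

# a source at (0.5, 1.0) is seen at about 63.4 deg from s1 and 116.6 deg from s2
print(triangulate(63.435, 116.565))  # approximately [0.5, 1.0]
```

Note that negating both bearings intersects the rays at the mirror point across the sensor line, which is the same Y-flipped "echo" position described in the results.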

 

Conclusion

 

The AVS array is designed to provide real-time acoustic source localization. Our efforts this semester involved creating a robust LabView Virtual Instrument to streamline this process. The VI can either simulate sensor data or read real data from a DAQ unit in order to locate the sound source. Both options have been proven to work in the two-dimensional case, for both stationary and moving targets, using two sensors.

Future work will focus on improving the VI to locate acoustic sources in three dimensions. We will also attempt to use all four AVS units in a single array in order to improve spatial resolution. Eventually, we hope to distinguish between two sound sources and locate them separately at the same time.