Future Directions
Having constructed a working robotic acoustic source localization system with adaptive microphone movement, we see a number of possible directions for continuing this project. One notable concept to which adaptive acoustic source localization lends itself is using a single robotic microphone pair to estimate source position. With robotic movement, one microphone pair could measure an angle of arrival, move to a nearby position, and measure a second angle; the source position could then be estimated from the intersection of the two bearings.
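As a rough illustration of this idea, the Python sketch below (the function name, positions, and angles are hypothetical, not taken from the actual system) estimates a source location by intersecting the two bearing lines measured from two stops of a single microphone pair.

    import numpy as np

    def intersect_bearings(p1, theta1, p2, theta2):
        # p1, p2: (x, y) positions of the microphone pair at the two stops
        # theta1, theta2: angles of arrival (radians) measured at those positions
        u1 = np.array([np.cos(theta1), np.sin(theta1)])
        u2 = np.array([np.cos(theta2), np.sin(theta2)])
        # Solve p1 + t1*u1 = p2 + t2*u2 for the ranges t1, t2 along each bearing
        A = np.column_stack((u1, -u2))
        b = np.asarray(p2, float) - np.asarray(p1, float)
        t = np.linalg.lstsq(A, b, rcond=None)[0]   # least squares tolerates nearly parallel bearings
        return np.asarray(p1, float) + t[0] * u1

    # Example: a source at (2, 3) observed from stops at (0, 0) and (1, 0)
    src = np.array([2.0, 3.0])
    p1, p2 = np.array([0.0, 0.0]), np.array([1.0, 0.0])
    th1 = np.arctan2(src[1] - p1[1], src[0] - p1[0])
    th2 = np.arctan2(src[1] - p2[1], src[0] - p2[0])
    print(intersect_bearings(p1, th1, p2, th2))    # approximately [2. 3.]

In practice the two stops would need to be far enough apart that the bearing lines are not nearly parallel; otherwise small angle errors translate into large position errors.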
Another notable direction involves redesigning the system so that localization can occur while the robots are moving. Currently the system is organized so that a robot must stop before the microphone data used in localization can be processed. This is necessary because the current system cannot determine robot position accurately until robotic movement has ended, and without accurate knowledge of the array geometry, useful estimates cannot be made. One way to acquire data and move concurrently would be to place a visual sensor on each robot and provide a track with cues (such as lines) for the sensor to read, so that the system can determine robot position during motion. The effect that sensing during movement would have on the cross correlation used to determine delay and angle of arrival would also have to be considered in this scenario. If a chirp of significant duration is used as the incoming sound wave, a moving microphone would receive the chirp over a range of positions, which would likely affect the cross correlation functions involved in delay estimation.
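For reference, the delay estimation step itself is straightforward to prototype offline. The Python sketch below (the sample rate, chirp parameters, and delay are illustrative, not taken from the actual system) picks the peak of the cross correlation between a chirp and a delayed copy of it, which is the stationary-microphone baseline against which any motion effects would be compared.

    import numpy as np
    from scipy.signal import chirp, correlate, correlation_lags

    def estimate_delay_samples(x, y):
        # Peak of the cross correlation; a positive result means y lags x
        corr = correlate(y, x, mode="full")
        lags = correlation_lags(len(y), len(x), mode="full")
        return lags[np.argmax(corr)]

    fs = 48000                                # assumed sample rate (Hz)
    t = np.arange(0, 0.05, 1.0 / fs)
    s = chirp(t, f0=1000, t1=t[-1], f1=4000)  # illustrative linear chirp
    x = s                                     # microphone 1
    y = np.roll(s, 12)                        # microphone 2: same chirp, 12 samples later
    y[:12] = 0.0
    d = estimate_delay_samples(x, y)
    print(d, d / fs)                          # ~12 samples and the corresponding delay in seconds

A natural experiment along the lines described above would be to repeat this with the received chirp smeared to mimic microphone motion and observe how the correlation peak degrades.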
Besides the case allowing estimation with robotic rotation, the simulation has also been extended to allow robot movement in the two-dimensional plane rather than only along a one-dimensional track. Building such a system would give the robots more degrees of freedom and would let the microphone array reach new orientations that improve the resolution of the estimate. An algorithm controlling adaptive movement in this situation is currently being designed.
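One way to see why new orientations help is the far-field delay model tau = (d / c) * cos(theta), where d is the microphone spacing, c the speed of sound, and theta the arrival angle measured from the baseline: the sensitivity of the delay to the angle vanishes at endfire and is largest at broadside, so repositioning the baseline relative to the source sharpens the angle estimate. A small numeric sketch (the spacing and angles are illustrative):

    import numpy as np

    c = 343.0    # speed of sound (m/s)
    d = 0.30     # assumed microphone spacing (m)

    # Angular sensitivity |d(tau)/d(theta)| = (d / c) * sin(theta):
    # near zero at endfire (0 deg), largest at broadside (90 deg)
    for deg in (10, 45, 90):
        theta = np.radians(deg)
        sens_us_per_rad = (d / c) * np.sin(theta) * 1e6
        print(f"{deg:3d} deg: {sens_us_per_rad:6.1f} microseconds per radian")

An adaptive 2D movement algorithm could exploit this by steering the robots toward positions where the source sits closer to broadside of the microphone baseline.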
This project is part of a multi-team Robotic Sensing project in the Department of Electrical and Systems Engineering. The Robotic Sensing Project focuses on undergraduate research on the application of a variety of sensing modalities in robotic systems of different kinds. With this in mind, the LabVIEW interface developed in this project, and the experience gained more generally, could be helpful in developing adaptive robotic systems with other sensors, including chemical, visual, acoustic vector sensor, RF electromagnetic, and infrared modalities.