Neurala awarded new contract from the US Air Force Research Labs

Neurala has been awarded a contract from the Air Force Research Labs (AFRL) to design and field neuromorphic algorithms for multimodal processing and fusion. Sensory processing of visual, auditory, and other sensor information (e.g., LIDAR, RADAR) is conventionally "stovepiped": each modality is processed in isolation, with little interaction between modules.

In biological systems, on the other hand, sensory information is continuously fused: one sense (e.g., vision) informs other senses (e.g., audition) of what is important and should be attended to in the environment, based on the task at hand. For instance, understanding someone's speech is much easier when the auditory system is informed of the speaker's location by the visual system. Similarly, visually identifying a relevant object in the environment (e.g., a person or a vehicle) is aided by the auditory system, which provides the visual system with the rough position of objects, including objects outside the field of view.
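One simple way to picture this kind of cross-modal fusion is as one modality supplying a spatial attention prior that re-weights another modality's evidence. The sketch below is a toy illustration only, not Neurala's actual algorithm: a hypothetical audio localizer biases visual detection scores over a one-dimensional azimuth grid, so an ambiguous visual target near the audio-localized direction wins out over an equally strong visual distractor elsewhere.

```python
import numpy as np

# Toy sketch of cross-modal attention (illustrative assumption, not the
# fielded system): audio localization provides a spatial prior that
# re-weights visual detection scores over azimuth bins.

def gaussian_prior(angles, source_angle, sigma=15.0):
    """Spatial attention prior centered on an audio-localized source."""
    w = np.exp(-0.5 * ((angles - source_angle) / sigma) ** 2)
    return w / w.sum()

def fuse(visual_scores, audio_prior):
    """Multiply visual evidence by the audio prior and renormalize."""
    fused = visual_scores * audio_prior
    return fused / fused.sum()

angles = np.arange(-90, 91, 5, dtype=float)        # azimuth bins (degrees)
visual = np.full_like(angles, 0.02)                # weak, ambiguous detections
visual[angles == 30.0] = 0.06                      # true target
visual[angles == -60.0] = 0.06                     # equally strong distractor
audio = gaussian_prior(angles, source_angle=30.0)  # mic array points to ~30 deg

posterior = fuse(visual, audio)
best = angles[np.argmax(posterior)]
print(f"Most salient azimuth after fusion: {best:.0f} deg")
```

Because the audio prior concentrates probability mass near 30 degrees, the fused posterior picks the true target even though the two visual peaks were identical in isolation.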

As a result, biological systems achieve a dramatically better signal-to-noise ratio than state-of-the-art artificial visual and auditory recognition systems. Over the next 18 months, Neurala will work with AFRL to design and field such systems, testing them on mobile robots for increased robot situational awareness.