SEVGI ZÜBEYDE GÜRBÜZ / research / why cognitive sensing?

Nowadays, sensors are everywhere - from satellite imagers to airborne devices, in our homes, cars, personal devices, and even in implants - ready for exploitation in a plethora of applications dedicated to improving human quality of life. Combined, all of these resources form an exceedingly large distributed network, often linked through wireless technologies. The sensing environment is complex and dynamic: as a function of time, the movement and characteristics of the target may change; mission requirements may change; environmental characteristics, such as interference, obstructions, and clutter, may change; and spectrum constraints driven by the need to share congested frequency bands may also change.

Thus, one of the main challenges in sensing is how to efficiently (in terms of time and energy) process this deluge of data to yield the greatest performance in a multi-mission scenario: the network may not perform the same function all the time, but instead may switch between multiple functions, such as detection, localization, tracking, classification, dynamic pattern recognition, and communications. However, in current sensing schemes, neither the network nor the sensors themselves have the ability to adapt to changing requirements and environmental conditions: regardless of whether they are active or passive, sensors always acquire data with the same static process. Currently, only the processing and fusion of the acquired data is adaptive. Data science algorithms strive to extract the most information from the data acquired, but how can one build an effective sensor network if the configuration of the network and its nodes is such that relevant phenomena are missed entirely? An important contribution to the big data problem is first ensuring that data is acquired in such a way that the maximum amount of relevant information exists in the data.
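To make this contrast concrete, here is a minimal, purely illustrative sketch; all names, parameters, and the toy sensor model are hypothetical and do not represent any particular system. It contrasts a static pipeline, where acquisition parameters are fixed forever, with a cognitive perception-action cycle, where each observation feeds back into how the next one is acquired:

```python
def acquire(scene, params):
    """Toy sensor model (assumed): SNR grows with dwell time and bandwidth."""
    snr = scene["reflectivity"] * params["dwell_time"] * params["bandwidth"]
    return {"snr": snr, "params": dict(params)}

def static_pipeline(scene, n_steps=5):
    """Static sensing: the acquisition parameters never change."""
    params = {"dwell_time": 1.0, "bandwidth": 1.0}  # fixed for all time
    return [acquire(scene, params) for _ in range(n_steps)]

def cognitive_pipeline(scene, n_steps=5):
    """Cognitive sensing: each observation informs the next acquisition."""
    params = {"dwell_time": 1.0, "bandwidth": 1.0}
    history = []
    for _ in range(n_steps):
        obs = acquire(scene, params)
        history.append(obs)
        # Feedback step: if the scene is poorly resolved, spend more time
        # and bandwidth on it; otherwise release resources for other tasks.
        if obs["snr"] < 10.0:
            params["dwell_time"] *= 2.0
            params["bandwidth"] *= 1.5
        else:
            params["dwell_time"] = max(1.0, params["dwell_time"] * 0.5)
    return history
```

The design point is the feedback arrow, not the particular update rule: in the static case the loop body never touches `params`, while in the cognitive case the acquisition itself becomes a decision variable.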

Could you imagine a person functioning that way? How comical would it be if we heard a sound coming from the right, but continued to look left when trying to identify what we heard? Or if we closed our ears whenever we used our eyes?

Humans are highly cognitive beings, and we use our senses in harmony and cooperation, changing how we use them to best understand the world. We decide which of our senses is best suited to understanding our surroundings, and when our eyes and ears (our most often used, preferred sensors) are insufficient, we apply our senses of smell and taste. When our natural senses fail, we reach for mechanical equipment to augment our capabilities, and we use our senses simultaneously, each sense informing and directing the use of the others.

Thus, the answer to "why cognitive sensing?" is simply that the current static approach fails to meet the challenges posed by complex environments, and that the cognitive functions of humans and other intelligent lifeforms have served to inspire a better approach: wouldn't it be nifty if each sensor, and indeed the network as a whole, were itself a sentient being, capable of thinking and "figuring out" the best way to collect data or transmit signals? Just imagine... sensors functioning as fully-sentient robots!

Mimicking Cognitive Processes