This year, the Long Night of the Sciences took place entirely digitally – a good opportunity to reflect on the digitisation of our senses! Perceptronics went live four times and reported on the topic “Digitisation of the Senses”. Did you miss it? Watch the edited video (German) of our livestream now on our YouTube channel: https://www.youtube.com/channel/UC1BYnEWVKe4Z7XVOPzFHIIQ.
Digitisation of the Senses
Our five senses enable us to perceive and interact with our environment. We have long been accustomed to technology interpreting stimuli from the environment as well: almost everyone nowadays carries a smartphone that recognises our faces or spots cats in photos, responds to touch and interprets speech input. So technology can SEE, FEEL and HEAR. It is more difficult to SMELL, that is, to interpret our “chemical environment”. Although prototypes of so-called electronic noses have existed for a long time, the sense of smell is still far from being digitised for mass use.

This is a serious limitation in many ways, as the sense of smell plays an important role in our daily lives: without it, our food tastes bland; smells warn us of dangerous situations such as fire or spoiled food; and the body odour of our fellow human beings tells us whether we are attracted to a person, whether they are sick or healthy, or even what emotions they are experiencing. In fact, by smell alone, people can tell whether sweat comes from physical exercise or from the nerves of examinees before an oral exam!

But why is smelling so much harder to implement technologically than the other senses? What does it take to teach sensors to smell, and what would a world look like in which technology can smell? With a short tour through the “digitisation of the senses”, we want to find answers to these questions, share amazing facts about our senses and debate whether the toaster of the future will be able to smell.