Not one to be sniffed at, this interactive nose sculpture from multimedia artist Adnan Aga – called the Adnose – can ‘smell’ any object held in front of it.
How do we know? I mean, how does a nose on its own even tell us what it’s smelling? Well, a paper poetically describing the smell is printed from one of the nostrils!
Dubbed a ‘unique sensory experience’, the nose uses a Raspberry Pi camera module (with the addition of a fisheye lens) to take a snapshot of whatever is placed under its nostril.
Once the photo is processed via Google Lens (which helps it identify the object in question), it’s then up to a GPT-4 language model to produce a delightful description.
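Based on that description, the pipeline is roughly: snapshot → object recognition → language model → printout. Here's a minimal Python sketch of that flow – every function is a hypothetical stub (the real build presumably talks to the Pi camera, Google Lens, and the GPT-4 API, none of which are shown here):

```python
# Hypothetical sketch of the Adnose pipeline as described in the article.
# All stages are stubbed stand-ins, not the artist's actual code.

def capture_snapshot() -> bytes:
    # Stand-in for the Raspberry Pi camera module (with fisheye lens).
    return b"fake-image-bytes"

def identify_object(image: bytes) -> str:
    # Stand-in for Google Lens-style recognition of the photo.
    return "a ripe lemon"

def describe_smell(label: str) -> str:
    # Stand-in for a GPT-4 prompt along the lines of:
    # "Poetically describe the smell of {label}."
    return f"A bright citrus sharpness drifts up from {label}."

def smell(print_fn=print) -> str:
    # Glue the stages together; the real sculpture prints the
    # description on paper from one of the nostrils.
    image = capture_snapshot()
    label = identify_object(image)
    poem = describe_smell(label)
    print_fn(poem)
    return poem

if __name__ == "__main__":
    smell()
```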
You can watch the video below to get a better idea of the process – and listen to the nose’s voice, because despite what I said above, it actually has one of those… somehow:
It’s quite a charming interactive art installation, and that’s without even knowing the background of the artist, who was actually born with the inability to smell – a condition known as anosmia. This nose now does the job for him, just like a prosthetic – which is how Adnan himself has described it.
He hopes it will raise awareness of the anosmic community, which is often overlooked.
The Adnose was well received at its debut, which took place at a ‘smell-and-tell’ event at the Olfactory Art Keller in Chinatown, New York – where it fits right in with various other smell-related pieces.
Adnan actually created this art installation as part of his graduate thesis, and you can find out more about it via his LinkedIn page. You can also still visit the talking nose at the Olfactory Art Keller’s Cubiculum Odoratus where it’s currently residing.
What do we think – another great use of AI?
Source: ReviewGeek