Not to freak you out or anything, but your phone definitely knows about your NSFW selfies.
This week, a viral tweet revealed that the Photos app on Apple’s iOS and macOS operating systems recognises bras – and lets you search for them in your camera roll.
The app includes AI that can recognise thousands of search terms – including ‘brassiere’ (obviously our phones are far classier than we are).
Even when Chrissy Teigen tried it, her phone found all the cleavage pics in her camera roll.
But when I tested the feature, the results were noticeably less specific. Searching ‘brassiere’ found photos of the very sexy heat-rash on my chest from summer and, err, my ankle tattoo (I feel I should make it clear that I do not have a tattoo of a bra on my foot).
I mean, I guess it makes sense – to the untrained eye, the roundness of an ankle bone could pass for a very sorry boob. But what’s even more confusing is that there are photos in my camera roll where my cleavage is proudly and artfully on display that the AI chose to ignore. Basically, my phone thinks my ankle is more worthy of being a boob than my actual boobs are.
But the weirdness of Apple’s AI doesn’t stop there. While we common folk have only discovered #brassieregate this week, our phones have been categorising photos since the launch of iOS 10 (over a year ago). They can recognise more than 4,000 objects and scenes, from figs to taprooms (that’s ‘boozer’ to you and me), without us labelling any photos ourselves.
In fact, just browsing through my app’s categories sheds a whole new light on my world. According to my phone, not only is my ankle a boob, but my mum’s garden is a tomb, my best friend is a baby and my boyfriend is a toilet.
Sorry guys, but the Apple AI has spoken.