Apple Scanning Images • Download This Show
18 August 2021 6:40 pm
Apple scanning images on users’ devices always seemed like an inevitable move, but did we really not expect it, and did Apple not expect the backlash?
In an incredibly interesting week in the world of social media, I was once again over the moon to join Marc Fennell on Download This Show.
Since its inception, Apple has been a tech company that prides itself on privacy. However, the tech giant hasn’t always walked the walk. In a move that has raised huge security concerns in the tech world, Apple announced plans this week for a system dubbed “NeuralMatch” in early reports (Apple’s own documentation calls the underlying technology NeuralHash).
What exactly does this system do? Well, essentially, Apple plans to scan photos on your device, before they’ve been uploaded anywhere, comparing them against a database of known child sexual abuse images. If enough matches are detected, the account can be flagged for review and that information can then be passed on to the appropriate authorities.
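To make the idea concrete, here is a minimal sketch of how this kind of image matching works in general. This is not Apple’s actual algorithm (NeuralHash uses a neural network); it substitutes a toy “average hash” to illustrate the principle that visually similar images produce similar fingerprints, which can then be compared against a database of known hashes. All names and values here are illustrative.

```python
# Toy perceptual-hash matching sketch. NOT Apple's NeuralHash: this uses a
# simple "average hash" stand-in to show the general technique of matching
# image fingerprints against a database of known hashes.

def average_hash(pixels):
    """Hash a grayscale image (list of rows of 0-255 ints): each bit records
    whether that pixel is brighter than the image's mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return tuple(1 if p > mean else 0 for p in flat)

def hamming(a, b):
    """Count the bit positions where two hashes differ."""
    return sum(x != y for x, y in zip(a, b))

def matches_known_database(pixels, known_hashes, threshold=2):
    """True if the image's hash is within `threshold` bits of any known hash,
    so small edits (brightness, compression) still match."""
    h = average_hash(pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# A "known" image, a slightly brightened copy, and an unrelated layout.
known = [[10, 200], [220, 30]]
variant = [[14, 205], [225, 33]]    # small edits keep the hash close
unrelated = [[200, 10], [30, 220]]  # different layout: hash differs

db = {average_hash(known)}
print(matches_known_database(variant, db))    # → True
print(matches_known_database(unrelated, db))  # → False
```

The key design point, and the source of the privacy debate, is that the comparison can run on the device itself against an opaque database: the user never sees what is in the database, which is exactly the censorship concern raised below.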
While this is certainly a noble goal from the powers that be at Apple, it also opens up a massive can of worms in terms of privacy and even government censorship. Who knows what Apple scanning images is really going to look like going forward? However, it’s definitely something I will be following closely.
In Instagram news, creatives and influencers are up in arms, as the company has recently claimed that it is no longer a photo-sharing app; it is a video-sharing app. While this is great news for video content creators, it leaves the huge percentage of content creators who focus solely on images in the dust.
Will content creators who don’t want to make the move to video look elsewhere for another social media platform to highlight what they do? Will this see a resurgence of people heading to Facebook for things other than checking someone’s birthday? Probably not.
Check out my other appearances on Download This Show