Presumably, this tweet is expressing a paranoia that a massive corporation like Apple can poke through your pictures using machine learning and fancy algorithms to know exactly what you’ve taken photos of, without you ever asking it to do so.

Theoretically, all those folders that your iPhone has created on your behalf are stored locally on your device. Apple likes to tout this feature as something that’ll help you sort through your thousands of photos, searching for keywords rather than scrolling around endlessly. But that paranoia isn’t totally unwarranted. It’s certainly possible that third parties could use a similar algorithm for more nefarious purposes.

For example, a new feature in iOS 11 for app developers called “Core ML” allows programmers to use machine learning algorithms to learn all sorts of things about you based on what’s in your phone. The app Nude, for instance, uses Core ML to scan your photos for nudity, helping you pre-emptively file away any photos you wouldn’t want wandering eyes to see. (However, as Melanie Ehrenkranz wrote for Gizmodo, Nude’s algorithm is a little overeager.)

Apple has rules in place to prevent developers from exploiting that data maliciously, but, as Wired noted, something sketchy could sneak through. “Think of a photo filter or editing app that you might grant access to your albums,” Lily Hay Newman wrote for Wired. “With that access secured, an app with bad intentions could provide its stated service, while also using Core ML to ascertain what products appear in your photos, or what activities you seem to enjoy, and then go on to use that information for targeted advertising.” Newman noted this particular case would violate Apple’s guidelines - but hey, it’s possible.

It’s also worth noting that the algorithm Apple uses to identify what’s in your photos is far from perfect. Why does Apple think I took a picture of a “badger dog”?
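To make Wired’s scenario concrete, here is a minimal sketch of how any app that has been granted photo access could classify an image on-device with Core ML and the Vision framework on iOS 11 or later. `MobileNetV2` is an assumption standing in for whatever `.mlmodel` file the hypothetical app bundles (Xcode generates that class from the model file); the rest uses standard Vision APIs.

```swift
import UIKit
import CoreML
import Vision

/// Returns up to three classification labels for an image.
/// Assumes a bundled Core ML model; "MobileNetV2" is a placeholder name.
func labels(for image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: MobileNetV2().model) else {
        completion([])
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Each observation pairs a label (e.g. "espresso") with a confidence score.
        let results = (request.results as? [VNClassificationObservation]) ?? []
        completion(results.prefix(3).map { $0.identifier })
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The point of the sketch is that nothing here requires any permission beyond the photo-library access the user already granted: once the app can read an image, it can run any classifier over it silently and do whatever it likes with the resulting labels.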