andrerm 23 days ago [-]
We, the ones who fully understand how technology works, must stand up for the ones who don't and can't make a truly informed decision when using technology. Tech companies must stop the dark patterns, information obfuscation, and misleading, and start informing all their users in an exhaustive, clear, and transparent manner.
bradknowles 23 days ago [-]
One key difference here is that Apple was apparently actively trying to minimize any personal or sensitive information that leaked through to the “graders”, whereas I don’t think any other company gave a damn.

I certainly understand why ML systems need to be trained and once in production, they need ongoing training and tweaking at all levels.

On the whole, I don’t think Apple did anything wrong here, with the exception of running this service and not telling their users it was being done. They should have been more open about the need for ongoing training, and the extent to which they would go to anonymize the information being gathered.

I still would have opted out, just like I’ve opted out of all voice recognition/assistant systems from all other sources. But at least then Apple would have had a decent chance of keeping this service in operation.

pitcher 23 days ago [-]
How is it even humanly possible to listen to 1,000 recordings in a shift?
msie 23 days ago [-]
One whistleblower cost 300 people their jobs.