Linked – AI and Mass Spying
Imagine, if you will, your smart TV or home assistant listening in on conversations you’ve been having about layoffs in your industry, and that data being shared with a financial institution that then decides you’re not a good credit risk. The AI took that conversation, combined it with a ton of financial information from other people who work in your industry, and made that call. Is it accurate? Probably not, but when you start grabbing data from all over the place and building these huge algorithmic models, things can get messy. You become less of an individual and more of a conglomeration of all the people who do things like you, and when you add in a little spying, that can lead to all sorts of disastrous consequences.
Do we want governments and corporations to have that much power? No, but as Bruce rightly points out, we haven’t done much of anything to stop them from taking it so far.
