AI Weekly: Facebook fiasco proves we need a better approach to personal data

Above: Mark Zuckerberg testifies in front of the Senate Judiciary and Commerce Committees on April 10, 2018.

Recent revelations about Cambridge Analytica’s use of Facebook data have a lot of people rightfully concerned about how their personal information is collected and used online. But while Facebook is trying to position the Cambridge Analytica breach (and the collection of user data by other third-party apps) as the work of bad actors on its platform, those actors wouldn’t have been drawn to Facebook data in the first place if it weren’t so powerful.

In other words, the cardinal sin behind the Cambridge Analytica breach isn’t unethical developer behavior. The cardinal sin is how we as a society have allowed the tech industry to collect and handle user data. Facebook controls a significant component of its users’ social lives, both online and off. Google controls what we know and how we get work done.

This control extends not only to the data we see — like photos, videos, articles, and screeds from conspiracy theorist relatives — but also to the data we don’t see. All of these companies can view how we engage with data: what we find worthwhile, who we find interesting, and so on.

To get a touch dramatic, this is incredibly concerning from a philosophical standpoint, since we don’t have control over information about who we are. Our interactions online are as significant and real as those we have in meatspace, but we only genuinely control information about the latter. It’s something that makes me sincerely worried about the future.

There’s another reason to be worried from a far less intellectual standpoint, however: Our lack of control over this data makes it far harder for us to benefit from it. For example, Siri may never be able to make a decision based on information stored in my Google and Facebook accounts, and there’s nothing I can do about it.

I’d happily give Siri all the data I could about my dining preferences if Apple assured me it would be used only for booking me reservations through my Apple Watch. As it stands, I can’t do that, because that information is trapped inside Google, Foursquare, Yelp, OpenTable, Resy, and yes, Facebook.

Sure, Apple’s assistant integrates with two of the companies on that list to help arrange dining for its users. But that doesn’t provide the sort of deep, personal understanding necessary to turn “Hey Siri, book me dinner for two tonight” into a reservation that perfectly fits my schedule and preferences without further intervention.

Part of this has to do with tech companies wanting to cement their power using the network effects of their data. If I can’t get the information and functionality I want through Siri, I might be willing to switch to the Google Assistant or Alexa. Data control leads to revenue — just look at the quarterly financials of tech companies like Google, Facebook, and Microsoft. But the Cambridge Analytica fiasco also shows how bad actors can abuse data portability under false pretenses.

So what do we do? Data isolation harms the creation of intelligent experiences and increases the power of gigantic companies. Opening up data access provides the potential for abuse. Both are problems worth tackling, in equal measure.

In an ideal world, I’d like to see us shift to a centralized…
