SL: Well, like a lot of things, the idea to do this kind of stuff came way before we had the computer power and AI and sensor power to actually implement it. Back in the 1970s, Nicholas Negroponte had the precursor to the Media Lab. He had this room where you'd go in, and without any hardware on you, you'd be able to control a computer interface. It took a very long time before we could do that, but now we have games that pick up your motions, and it's pretty accurate. Facebook has specifically argued that it's time now to tackle the remaining hurdles for that to be real. They have this lab in Seattle which is trying to make those remaining breakthroughs, which they feel they've identified, to make this stuff into reality.
LG: I couldn’t have said it better than Steven.
MC: All right. Well, we’re going to take a break right now, and then when we come back, we’ll talk a little bit more about Facebook.
MC: Welcome back. If the idea of giving Facebook access to your neural pathways makes you uneasy, you are probably not alone. Facebook has been criticized for violating user privacy and allowing disinformation to fester on its platforms. Public trust in the company isn’t exactly great. Now, Steven, you wrote the book on Facebook. So I’m going to ask you, should we be worried about wearing something made by Facebook?
SL: Well, I think we have to look at it with great scrutiny. A few years ago, Facebook came out with a device called the Portal, and basically, they put a camera in your living room, bedroom, wherever you put the thing. Everyone jumped on it saying, "This is the last company we want to buy this device from." I think originally it didn't sell well, but eventually, because the technology wasn't too bad, it started selling more. I think, though, Facebook's gathering of information is sort of coming to a head with government regulation, and maybe in the long term, by the time some of these things we're talking about now come to fruition, Facebook will be constrained in what it can do with our information. So we'll just have to worry about anyone reading our brains, not Facebook in particular.
LG: Yeah. One of the things that I'm finding interesting about this particular corner of Facebook, their Reality Labs, is that the executive who's leading the team, Andrew Bosworth, is someone who seems to be pretty open. This team has posted blog posts fairly frequently about their big, sweeping plans, their 10-year vision, and Andrew Bosworth, or Boz, as he is known throughout the industry, or boztank, as he is known on Twitter, is actually an active Twitterer, and he engages a lot on that platform. He engages with reporters, and sometimes it gets pretty feisty. Sometimes he tweets things and I'm like, "I am just really not quite sure what he is talking about there, and I would love to poke at that a little bit more."
But what's happening, I think, is kind of what I referred to earlier: Facebook knows it has a trust problem, so it is trying to engage on some of these emerging technologies early, but there's also this dynamic where they say a lot without saying much. That's certainly the experience I had earlier this week when I had the chance to ask Andrew Bosworth about this new wrist device. I said, "Should people trust Facebook?" And one of the things he said is, "Well, we have to earn trust," and then he kind of went off about that, and I'm paraphrasing. Then I said, "So how do you earn that trust?" And the answer, there were some platitudes there. It was, "Oh, you can't surprise people, and it takes a long time, and trust arrives on horseback…" What is it? "Trust arrives on foot and leaves on horseback…"