The New York Times has published a stunningly boot-licking article about Facebook’s Chief Technology Officer that aims to humanize the executive’s plight through his own tears.

Jesus Christ. If this man — architect of the platform that connected 2 billion people around the world so that they could yell at, and, like, actually murder, each other — doesn’t have something to cry about, then nobody does.

The profile documents the rising, then spluttering, star of Mike Schroepfer, the current CTO of Facebook. It contrasts his oh-so-humble journey as a technology whiz makin’ it in the Valley with the problem he now faces: developing artificial intelligence to stop humans from spewing their deadly bile across the world.

The takeaway of the article is Pollyanna-ish in its revelation that — whoa! — A.I. might not solve all of Facebook’s hateful, misleading, and violent content on its own. Humans, Schroepfer concedes, are dispensing toxicity in ways so novel that A.I. has a hard time keeping up. This is apparently a stunning revelation because it comes from a man who feels bad about the scope of the problem.

How this profile fails to connect the dots between what Schroepfer innocently helped build and what he has to deal with today is beyond the understanding of this reader.

Schroepfer came to Facebook in 2008 a wide-eyed tech problem solver. His job was to scale the platform, which he described as being “like a bus rolling downhill on fire with four flat tires. Like: How do we keep it going?” Facebook grew exponentially in the decade since; under Schroepfer’s supervision, it has expanded from 2 million people to 2 billion.

Scaling the platform was no longer a technical challenge, so Schroepfer turned to a new task: building out an A.I. team and considering “the future.”

To Schroepfer and Zuckerberg, that meant developing A.I. for the noble aims of better facial recognition in photos and videos and “better targeting ads.”

Lo and behold, after Schroepfer led the technical engine that scaled without problems and used artificial intelligence to identify, learn about, and manipulate humans, things started to go wrong. After the 2015 Paris terror attacks, it occurred to Zuckerberg to task Schroepfer with using A.I. to combat terrorist content.

That mission only became more urgent after the 2016 revelations that Facebook allowed the spread of misinformation and echo chambers that may have led to the combative polarization of America and the rise of Donald Trump. The Cambridge Analytica scandal added fuel to the fire, and Schroepfer found himself accountable to government bodies around the world for issues he helped create.

Now, Schroepfer is apparently emotionally overwhelmed by the scope of the problems facing Facebook, his team, and the world. That is understandable: Schroepfer is a human person, and has, along with the rest of us, watched tragedy unfold and our social fabric shift in heartbreaking ways, live on Facebook. Though, admittedly, he had a better view.

Of course Schroepfer should be crying. A man crying is not so admirable, so unusual a phenomenon, that it deserves to be the centerpiece of a humanizing profile. And it’s not our job to take this piece of pitiable P.R. bait, and heap a dose of forgiveness and understanding onto the people now tasked with solving the problems they themselves created, just because they’re sad about it, too.

Facebook is not a faceless corporation. Like every company, it is a conglomeration of people who are probably just trying to do their best. But Facebook should have afforded its users the same consideration its P.R. stunts now ask of them as it struggles to keep the wheels strapped to its fiery bus: the recognition that its users were humans, with human flaws, ready to be exploited for capital gain and, apparently, humanitarian loss. Perhaps it should have considered the challenges — human and technical — that it now faces before monetizing human connection at all costs.
