Say what you will about PandoDaily, but their tale of Facebook’s disappointing Platform is honest, fair, and perfectly contextualized: http://pandodaily.com/2013/07/23/move-fast-break-things-the-sad-story-of-platform-facebooks-gigantic-missed-opportunity/
Facebook’s inconstant behavior on Platform, however, has never been malicious. Rather, it is a result of an engineering-led culture. Facebook’s platform team started off small, and it was led by programmers. Whenever possible, they wanted to find solutions that didn’t require human intervention. That way the operation could stay lean and move fast.
That approach left Platform without a clear set of policies that would have provided the stability and sense of security so crucial to a development environment. It also meant that humans had a minimal role in the quality control process. Unlike Apple, which requires that all apps intended for distribution in its App Store be approved by actual people, Facebook relied on changes to its algorithms to combat things like spam and the overexposure of certain apps.
“What we should have realized is we should have hired someone to go and make a judgment call,” says a former Facebook executive, who doesn’t want to be named because he retains close ties to the company. He says the app store, or lack thereof, was one of the Platform’s single biggest points of failure. “The team that was driving the platform was the engineering team and the technical product team. We knew how to develop products, but we didn’t know how to build a payment system or build an organization of human judges.”
I can understand why Facebook went that direction. They wanted a robotically, algorithmically level playing field, and if the rules required a human judgment call, then they weren’t the right rules in the first place. Ensuring that the rules are enforced consistently is an honorable goal, but there’s still a human on the other side of that API, and humans are nuanced. Circumstances matter. Motivation matters.
Say what you will about the US judicial system, but it was also designed to take nuance into consideration, with judges who interpret the law and apply it to the many complex situations we humans get ourselves into.
In the months before Pando’s article went up, I’d heard that Facebook had posted several job openings for “Platform Integrity Risk” managers. Those roles are filled (or canceled) now, so I’d like to think that Facebook has learned from its failures and is already taking corrective action. As the article notes, the company clearly still has momentum despite Platform’s shortcomings.
If you have developer policies, take Facebook’s story as a warning and make sure you have a high court in place — whether it be a community manager or a dedicated policy review team — who can stand up to internal politicians, balance the shifting sands of a growing product, and earn developer trust.