In mid-2007, when Facebook set out to build a revenue stream, it produced a manifesto of social advertising. It boiled down to inserting sales pitches into conversations among friends. The idea was to take real stories that advertisers could then sponsor, sending them to everyone on a user's friend list with an implicit recommendation. It worked under the premise that Facebook is all about learning about friends; learning about products and services through the lens of friends, especially if the ads carried relevant information about those friends, seemed like it should work too. That was the theme of Facebook's ad business, which it code-named Panda, a hybrid of the words Pages And Ads. Later the name morphed into something more ubiquitous these days: Pandemic.
Pages would allow companies to have profiles, something previously forbidden by Facebook's policy of permitting only individuals to have accounts. Pages would be like storefronts, or even websites, within Facebook. They helped users find things they needed, and they provided value to businesses by helping them find more customers. Pages could also attract both organic and paid traffic.
The idea that people's most important conversations were with one another was a big part of Facebook's strategy and a distinguishing feature of its social ads. Another part would prove even more significant: Facebook changed its ad system to be more about targeting ads to the right people and less about how many people saw them. It created a system where advertisers could bid against each other to place ads in the sidebars alongside News Feed. The metric advertisers paid for would be engagement-based rather than exposure-based; in other words, they would pay not for the number of eyeballs that grazed an ad but for clicks. It was very much like Google AdWords, except that where Google used keywords as bidding criteria, Facebook used demographic information. This part of Pandemic became a full-scale package of business features, and it set the tone for an ad model that Facebook still uses today.
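The mechanics described above can be illustrated with a minimal sketch. This is emphatically not Facebook's actual system; every name, data structure, and the first-price rule are hypothetical simplifications, chosen only to show the two ideas in the passage: advertisers bid on demographic criteria rather than keywords, and they pay per click rather than per impression.

```python
# Toy sketch of a pay-per-click ad auction keyed on demographics.
# All names and the first-price rule are illustrative assumptions,
# not a description of Facebook's real auction.

from dataclasses import dataclass, field

@dataclass
class Ad:
    advertiser: str
    bid_per_click: float                    # offered price per click, not per view
    targeting: dict = field(default_factory=dict)  # e.g. {"age_range": "18-24"}

def matches(ad: Ad, user: dict) -> bool:
    """An ad is eligible only if every targeting criterion matches the user."""
    return all(user.get(k) == v for k, v in ad.targeting.items())

def run_auction(ads: list, user: dict):
    """Pick the highest-bidding eligible ad for this user (first-price, for simplicity)."""
    eligible = [ad for ad in ads if matches(ad, user)]
    return max(eligible, key=lambda ad: ad.bid_per_click, default=None)

ads = [
    Ad("shoe_store", 0.50, {"age_range": "18-24"}),
    Ad("car_dealer", 0.90, {"age_range": "35-44"}),
    Ad("music_app", 0.30, {"age_range": "18-24"}),
]

user = {"age_range": "18-24", "city": "Palo Alto"}
winner = run_auction(ads, user)
# car_dealer bids highest overall but targets the wrong demographic,
# so the highest *eligible* bidder wins; it is charged only on a click.
print(winner.advertiser)  # shoe_store
```

The key contrast with keyword advertising is the `targeting` dictionary: eligibility is decided by who the user is, not by what the user searched for.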
Pandemic used a form of non-celebrity endorsement called Beacon as a means of spreading the firm's ethic to the web and tying clients to Facebook. Facebook struck deals with partners to put monitors, called beacons, on their web pages that flagged activity back to Facebook. When somebody bought something from a partner site, the news would be shared on their friends' News Feeds. This broke ground that should have remained unbroken. Previously, users' interests were self-reported; although Facebook did automate some news, it was based on activity that happened on Facebook itself. Beacon, by contrast, would stealthily track purchases people made elsewhere on the web and then circulate the news to their friends. A pop-up would warn users of the actions needed to disable it, but the history of user experience indicated that most users would ignore such warnings. If you didn't respond, Facebook interpreted your inaction as consent, and Beacon would let all your friends know about your purchase. The prospects looked so promising that in the run-up to the Pandemic announcement, Microsoft struck an investment deal with Facebook for $240 million in exchange for 1.6 percent of the company. When Microsoft began cashing out, that 1.6 percent slice would be worth $8 billion.
After Facebook made the Pandemic announcement, the headlines from the launch focused on social ads and microtargeting, but attention soon shifted to Beacon's disregard for privacy. To some it was quite an innovative ad product; to others it was an incredible violation of privacy. The tech press warned that automatically spreading news of purchases could have unintended consequences, and rightly so: people started complaining about their purchases appearing on their friends' News Feeds. For days, Facebook did not respond to the criticism, because there was so much disagreement within the company over which direction to take.
Opt-in or opt-out
Facebook had a big internal debate over whether Beacon should be opt-in or opt-out. Under opt-in, Facebook would ask people whether they wanted to participate, and sharing would take effect only if they expressed interest. Under opt-out, Facebook would share purchase information by default. Facebook believed that once the feature was implemented, people might come to like it, just as they had with News Feed; and if they didn't, Facebook could always pull it back. While the debate over opt-in, opt-out, and variations in between continued, people began thinking that maybe Facebook was something they could not trust. Facebook finally decided on opt-in, reversing the default so that users had to consent proactively before a story was published. But that did not quell the objections.
Experts then discovered more disturbing aspects. Beacon transmitted data even if you opted out. Moreover, it gave Facebook a great deal of information about what users did on outside websites, and even about people who had never signed up for Facebook. By then the press, privacy advocates, and users were demanding the firm kill Beacon outright. A week later, Facebook installed a privacy control to turn Beacon off completely. The outcry died down, and purchases no longer showed up on News Feed. That was the final fix to contain the Pandemic.
Adapted from the book "Facebook: The Inside Story" by Steven Levy, published in 2020 by Penguin Random House UK.