Broken Code: Inside Facebook and the Fight to Expose Its Harmful Secrets Hardcover – November 14, 2023
"Broken Code fillets Facebook’s strategic failures to address its part in the spread of disinformation, political fracturing and even genocide. The book is stuffed with eye-popping, sometimes Orwellian statistics and anecdotes that could have come only from the inside." —New York Times Book Review
Once the unrivaled titan of social media, Facebook held a singular place in culture and politics. Along with its sister platforms Instagram and WhatsApp, it was a daily destination for billions of users around the world. Inside and outside the company, Facebook extolled its products as bringing people closer together and giving them voice.
But in the wake of the 2016 election, even some of the company’s own senior executives came to consider those claims pollyannaish and simplistic. As a succession of scandals rocked Facebook, they—and the world—had to ask whether the company could control, or even understand, its own platforms.
Facebook employees set to work in pursuit of answers. They discovered problems that ran far deeper than politics. Facebook was peddling and amplifying anger, looking the other way at human trafficking, enabling drug cartels and authoritarians, allowing VIP users to break the platform’s supposedly inviolable rules. They even raised concerns about whether the product was safe for teens. Facebook was distorting behavior in ways no one inside or outside the company understood.
Enduring personal trauma and professional setbacks, employees successfully identified the root causes of Facebook's viral harms and drew up concrete plans to address them. But the costs of fixing the platform—often measured in tenths of a percent of user engagement—were higher than Facebook's leadership was willing to pay. With their work consistently delayed, watered down, or stifled, those who best understood Facebook’s damaging effect on users were left with a choice: to keep silent or go against their employer.
Broken Code tells the story of these employees and their explosive discoveries. Expanding on “The Facebook Files,” his blockbuster, award-winning series for The Wall Street Journal, reporter Jeff Horwitz lays out in sobering detail not just the architecture of Facebook’s failures, but what the company knew (and often disregarded) about its societal impact. In 2021, the company would rebrand itself Meta, promoting a techno-utopian wonderland. But as Broken Code shows, the problems spawned around the globe by social media can’t be resolved by strapping on a headset.
- Print length: 336 pages
- Language: English
- Publisher: Doubleday
- Publication date: November 14, 2023
- Dimensions: 6.3 x 1.3 x 9.5 inches
- ISBN-10: 0385549180
- ISBN-13: 978-0385549189
Editorial Reviews
Review
"Broken Code offers a comprehensive, briskly reported examination of key systems governing [Facebook] and their many failings... A smartly reported investigation into the messy internal machinations of one of the world’s most important and least understood companies."
—Washington Post
“Jeff Horwitz has written a blockbuster exposé of Facebook, the notoriously secretive social media giant whose benign mission—connecting people—masked a growing propensity towards some of humanity’s worst impulses. Populated by concerned, brave employees who defied their employer and leaked thousands of pages of internal documents to Horwitz, with the imperious, remote Mark Zuckerberg and his top lieutenants at the center, Broken Code is brilliant reporting and a page-turning narrative of immense importance.”
—James B. Stewart, Pulitzer Prize-winning investigative journalist and New York Times bestselling author
“A dogged and meticulous reporter, Jeff Horwitz is at the height of his powers in Broken Code, a penetrating portrait of one of the most significant companies in the world and of one of the great new challenges of this technological era.”
—Ronan Farrow, Pulitzer Prize-winning investigative journalist and New York Times bestselling author
"An unsettling account... Stories of executives bumbling their way through or outright ignoring issues within the company are breathtaking and troubling... Horwitz’s reporting shines... This convincingly makes the case that Facebook’s pursuit of growth at any cost has had disastrous offline consequences."
—Publishers Weekly
"Readers interested in the ethics of the internet and technology, the business aspects of social media, and social media's impact on society at large will be fascinated. Horwitz has created an essential resource."
—Booklist
"A well-researched, disturbing study of a tech behemoth characterized by arrogance, hypocrisy, and greed."
—Kirkus Reviews
"Impressive reporting... A thoroughly documented portrait of a company that recognizes its products have harmed people yet declines to meaningfully change them."
—San Francisco Chronicle
Excerpt. © Reprinted by permission. All rights reserved.
Arturo Bejar’s return to Facebook’s Menlo Park campus in 2019 felt like coming home. The campus was bigger than when he’d left in 2015—Facebook’s staff doubled in size every year and a half—but the atmosphere hadn’t changed much. Engineers rode company bikes between buildings, ran laps on a half-mile trail through rooftop gardens, and met in the nooks of cafés that gave Facebook’s yawning offices a human scale.
Bejar was back because he suspected something at Facebook had gotten stuck. In his early years away from the company, as bad press rained down upon it and then accumulated like water in a pit, he’d trusted that Facebook was addressing concerns about its products as best it could. But he had begun to notice things that seemed off, details that made it seem like the company didn’t care about what its users experienced.
Bejar couldn’t believe that was true. Approaching fifty, he considered his six years at Facebook to be the highlight of a tech career that could only be considered charmed. He’d been a Mexico City teenager writing computer games for himself in the mid-1980s when he’d gotten a chance introduction to Apple co-founder Steve Wozniak, who was taking Spanish lessons in Mexico.
After a summer being shown around by a starstruck teenage tour guide, Wozniak left Bejar an Apple computer and a plane ticket to come visit Silicon Valley. The two stayed in touch, and Wozniak paid for Bejar to earn a computer science degree in London.
“Just do something good for people when you can,” Wozniak told him.
Success followed. After working on a visionary but doomed cybercommunity in the 1990s, Bejar spent more than a decade as the “Chief Paranoid” in Yahoo’s once-legendary security division. Mark Zuckerberg hired him as a Facebook director of engineering in 2009 after an interview held in the CEO’s kitchen.
Though Bejar’s expertise was in security, he’d embraced the idea that safeguarding Facebook’s users meant more than just keeping out criminals. Facebook still had its bad guys, but the engineering work that Facebook required was as much social dynamics as code.
Early in his tenure, Sheryl Sandberg, Facebook’s chief operating officer, asked Bejar to get to the bottom of skyrocketing user reports of nudity. His team sampled the reports and saw they were overwhelmingly false. In reality, users were encountering unflattering photos of themselves, posted by friends, and attempting to get them taken down by reporting them as porn. Simply telling users to cut it out didn’t help. What did was giving users the option to report not liking a photo of themselves, describing how it made them feel, and then prompting them to share that sentiment privately with their friend.
Nudity reports dropped by roughly half, Bejar recalled.
A few such successes led Bejar to create a team called Protect and Care. A testing ground for efforts to head off bad online experiences, promote civil interactions, and help users at risk of suicide, the work felt both groundbreaking and important. The only reason Bejar left the company in 2015 was that he was in the middle of a divorce and wanted to spend more time with his kids.
Though he was away from Facebook by the time the company’s post-2016 election scandals started piling up, Bejar’s six years there instilled in him a mandate long embedded in the company’s official code of conduct: “assume good intent.” When friends asked him about fake news, foreign election interference, or purloined data, Bejar stuck up for his former employer. “Leadership made mistakes, but when they were given the information they always did the right thing,” he would say.
But, truth be told, Bejar didn’t think of Facebook’s travails all that much. Because he had joined the company three years before its IPO, money wasn’t a concern, and Bejar was busy with nature photography, a series of collaborations with the composer Philip Glass, and restoring cars with his daughter Joanna, who at fourteen wasn’t yet old enough to drive. She documented their progress restoring a Porsche 914—a 1970s model derided for having the aesthetics of a pizza box—on Instagram, which Facebook had bought in 2012.
Joanna’s account became moderately successful, and that’s when things got a little dark. Most of her followers were enthused about a girl getting into car restoration, but some showed up with rank misogyny, like the guy who told Joanna she was getting attention “just because you have tits.”
“Please don’t talk about my underage tits,” Joanna Bejar shot back before reporting the comment to Instagram. A few days later, Instagram notified her that the platform had reviewed the man’s comment. It didn’t violate the platform’s community standards.
Bejar, who had designed the predecessor to the user-reporting system that had just shrugged off the sexual harassment of his daughter, told her the decision was a fluke. But a few months later, Joanna mentioned to Bejar that a kid from a high school in a neighboring town had sent her a picture of his penis via an Instagram direct message. Most of Joanna’s friends had already received similar pics, she told her dad, and they all just tried to ignore them.
Bejar was floored. The teens exposing themselves to girls who they had never met were creeps, but they presumably weren’t whipping out their dicks when they passed a girl in a school parking lot or in the aisle of a convenience store. Why had Instagram become a place where it was accepted that these boys occasionally would—or that young women like his daughter would have to shrug it off?
Bejar’s old Protect and Care team had been renamed and reshuffled after his departure, but he still knew plenty of people at Facebook. When he began peppering his old colleagues with questions about the experience of young users on Instagram, they responded by offering him a consulting agreement. Maybe he could help with some of the things he was concerned about, Bejar figured, or at the very least answer his own questions.
That was how Arturo Bejar found himself back on Facebook’s campus. Just shy of fifty and highly animated—Bejar’s reaction to learning something new and interesting is a gesture meant to evoke his head exploding—he had unusual access due to his easy familiarity with Facebook’s most senior executives. Dubbing himself a “free-range Mexican,” he began poring over internal research and setting up meetings to discuss how the company’s platforms could better support their users.
The mood at the company had certainly darkened in the intervening four years. Yet, Bejar found, everyone at Facebook was just as smart, friendly, and hardworking as they had been before, even if no one any longer thought that social media was pure upside. The company’s headquarters—with its free laundry service, cook-to-order meals, on-site gym, recreation and medical facilities—remained one of the world’s best working environments. It was, Bejar felt, good to be back.
That nostalgia probably explains why it took him several months to check in on what he considered his most meaningful contribution to Facebook—the revamp of the platform’s system for reporting bad user experiences.
It was the same impulse that had led him to avoid setting up meetings with some of his old colleagues from the Protect and Care team. “I think I didn’t want to know,” he said.
Bejar was at home when he finally pulled up his team’s old system. The carefully tested prompts that he and his colleagues had composed—asking users to share their concerns, understand Facebook’s rules, and constructively work out disagreements—were gone. Instead, Facebook now demanded that people allege a precise violation of the platform’s rules by clicking through a gauntlet of pop-ups. Users determined enough to complete the process arrived at a final screen requiring them to reaffirm their desire to submit a report. If they simply clicked a button saying “done,” rendered as the default in bright Facebook blue, the system archived their complaint without submitting it for moderator review.
What Bejar didn’t know then was that, six months prior, a team had redesigned Facebook’s reporting system with the specific goal of reducing the number of completed user reports so that Facebook wouldn’t have to bother with them, freeing up resources that could otherwise be invested in training its artificial intelligence–driven content moderation systems. In a memo about efforts to keep the costs of hate speech moderation under control, a manager acknowledged that Facebook might have overdone its effort to stanch the flow of user reports: “We may have moved the needle too far,” he wrote, suggesting that perhaps the company might not want to suppress them so thoroughly.
The company would later say that it was trying to improve the quality of reports, not stifle them. But Bejar didn’t have to see that memo to recognize bad faith. The cheery blue button was enough. He put down his phone, stunned. This wasn’t how Facebook was supposed to work. How could the platform care about its users if it didn’t care enough to listen to what they found upsetting?
There was an arrogance here, an assumption that Facebook’s algorithms didn’t even need to hear about what users experienced to know what they wanted. And even if regular users couldn’t see that like Bejar could, they would end up getting the message. People like his daughter and her friends would report horrible things a few times before realizing that Facebook wasn’t interested. Then they would stop.
When Bejar next stepped onto Facebook’s campus, he was still surrounded by smart, earnest people. He couldn’t imagine any of them choosing to redesign Facebook’s reporting features with the goal of tricking users into depositing their complaints in the trash; but clearly they had.
“It took me a few months after that to wrap my head around the right question,” Bejar said. “What made Facebook a place where these kinds of efforts naturally get washed away, and people get broken down?”
Unbeknownst to Bejar, a lot of Facebook employees had been asking similar questions. As scrutiny of social media ramped up from without and within, Facebook had accumulated an ever-expanding staff devoted to studying and addressing a host of ills coming into focus.
Broadly referred to as integrity work, this effort had expanded far beyond conventional content moderation. Diagnosing and remediating social media’s problems required not just engineers and data scientists but intelligence analysts, economists, and anthropologists. This new class of tech workers had found themselves up against not just outside adversaries determined to harness social media for their own ends but senior executives’ beliefs that Facebook usage was by and large an absolute good. When ugly things transpired on the company’s namesake social network, these leaders pointed a finger at humanity’s flaws.
Staffers responsible for addressing Facebook’s problems didn’t have that luxury. Their jobs required understanding how Facebook could distort its users’ behavior—and how it was sometimes “optimized” in ways that would predictably cause harm. Facebook’s integrity staffers became the keepers of knowledge that the outside world didn’t know existed and that their bosses refused to believe.
As a small army of researchers with PhDs in data science, behavioral economics, and machine learning was probing how their employer was altering human interaction, I was busy grappling with far more basic questions about how Facebook worked. I had recently moved back to the West Coast to cover Facebook for the Wall Street Journal, a job that came with the unpleasant necessity of pretending to write with authority about a company I did not understand.
Still, there was a reason I wanted to cover social media. After four years of investigative reporting in Washington, the political accountability work I was doing felt pointless. The news ecosystem was dominated by social media now, and stories didn’t get traction unless they appealed to online partisans. There was so much bad information going viral, but the fact-checks I wrote seemed less like a corrective measure than a weak attempt to ride bullshit’s coattails.
Covering Facebook was, therefore, a capitulation. The system of information sharing and consensus building of which I was a part was on its last legs, so I might as well get paid to write about what was replacing it.
The surprise was how hard it was to even figure out the basics. Facebook’s public explainers of the News Feed algorithm—the code that determined which posts were surfaced before billions of users—relied on phrases like “We’re connecting you to who and what matters most.” (I’d later learn there was a reason why the company glossed over the details: focus groups had concluded that in-depth explanations of News Feed left users confused and unsettled—the more people thought about outsourcing “who and what matters most” to Facebook, the less comfortable they got.)
In a nod to its immense power and societal influence, the company created a blog called Hard Questions in 2017, declaring in its inaugural post that it took “seriously our responsibility—and accountability—for our impact and influence.” But Hard Questions never delved into detail, and after a couple of bruising years of public scrutiny, the effort was quietly abandoned.
By the time I started covering Facebook, the company’s reluctance to field reporters’ queries had grown, too. Facebook’s press shop—a generously staffed team of nearly four hundred—had a reputation for being friendly, professional, and reticent to answer questions. I had plenty of PR contacts, but nobody who wanted to tell me how Facebook’s “People You May Know” recommendations worked, which signals sent controversial posts viral, or what the company meant when it said it had imposed extraordinary user-safety measures amid ethnic cleansing in Myanmar. The platform’s content recommendations shaped what jokes, news stories, and gossip went viral across the world. How could it be such a black box?
The resulting frustration explains how I became a groupie of anyone who had a passing familiarity with Facebook’s mechanics. The former employees who agreed to speak to me said troubling things from the get-go. Facebook’s automated enforcement systems were flatly incapable of performing as billed. Efforts to engineer growth had inadvertently rewarded political zealotry. And the company knew far more about the negative effects of social media usage than it let on.
This was wild stuff, far more compelling than the perennial allegations that the platform unfairly censored posts or favored President Trump. But my ex-Facebook sources couldn’t offer much in the way of proof. When they’d left the company, they’d left their work behind Facebook’s walls.
I did my best to cultivate current employees as sources, sending hundreds of notes that boiled down to two questions: How does a company that holds sway over billions of people actually work? And why, so often, does it seem like it doesn’t?
Other reporters did versions of this too, of course. And from time to time we obtained stray documents indicating that Facebook’s powers, and problems, were greater than it let on. I had the luck of being there when the trickle of information became a flood.
Product details
- Publisher : Doubleday (November 14, 2023)
- Language : English
- Hardcover : 336 pages
- ISBN-10 : 0385549180
- ISBN-13 : 978-0385549189
- Item Weight : 2.31 pounds
- Dimensions : 6.3 x 1.3 x 9.5 inches
Customer reviews
Top reviews from the United States
- Reviewed in the United States on June 27, 2024
After 30+ years of IT experience with large companies, the last 2 of which were in the thick of the front-end, client-side, e-commerce space, I am fortunate to have a decent understanding of the mechanics behind how and why Meta is able to cause such horrible atrocities without any consequences. What Jeff and his sources have provided here is very consistent with that understanding.
I completely agree with the other 5-star reviewers! Jeff and the people who have helped him bravely expose what we NEED TO HEAR about have now given us hope and laid the groundwork for a new generation of companies, gaining an advantage on their competitors by positioning themselves as "Meta-Free" businesses, winning over today's and tomorrow's consumers who are increasingly demanding a privacy-centric and socially responsible user experience! We are already starting to see signs of this, thanks to some businesses which have moved past 2017 and learned that even indirectly contributing to genocide, human trafficking, teenage eating disorders and political polarization is not a particularly prudent advertising model.
When a large company does very, very bad things, we need to stop using its products. If a company is using FB plugins on its website, which nearly anyone can detect using basic browser tools, we will now run to that company's competitors. Fortunately, some companies and customers are starting to understand all of this, but we have a long way to go.
Thank you, Jeff Horwitz, for sacrificing so much to help Frances Haugen and the others who have come forward to keep us informed, so we can eventually get out of this mess. While it will take all of us doing more to protect our own privacy, and choosing companies NOT using Meta products as much as possible, we would never even have such an opportunity if it was not for those who brought us this knowledge, especially Jeff Horwitz.
- Reviewed in the United States on September 2, 2024
Horwitz has done yeoman’s work on this book. It avoids speculation in favor of data and facts. It was clearly carefully researched. Kudos to Jeff Horwitz and the WSJ for developing a great source in Frances Haugen.
- Reviewed in the United States on July 19, 2024
Horwitz gives a behind-the-scenes glimpse of Facebook’s repeated decisions to place business over society at large. By cultivating an impressive roster of former Facebook insiders (including the mother of all sources, Frances Haugen), the book paints a disturbing picture of Facebook’s inner workings. Horwitz gives numerous insider accounts of executives (Zuckerberg especially) repeatedly choosing increased usage metrics despite evidence of actual societal harm, e.g. the Rohingya genocide, Indian ethnic violence, the Jan 6th attack on the U.S. Capitol, etc.
- Reviewed in the United States on December 11, 2023
I am in the process of reading this book. The author's choice of vocabulary seems less than eloquent. For example, he uses words like "incent" meaning to incentivize and "liaise" meaning to establish a liaison.
The subject matter, Facebook's CEO's gross lack of ethics while claiming to serve humanity, is also very depressing.
- Reviewed in the United States on January 9, 2024
The main takeaway: meta-face sounds like a horrible place to work. The book is a solid reminder that "social media" is neither social nor media. The "leadership" of these tech companies seems to lack even the most basic humanity. Just reminds me of why I got clean of Facebook's toxic sludge.
- Reviewed in the United States on August 27, 2024
A great overview of how good intentions, when met with profit-seeking, can lead to disaster. This is a definite must-read for anyone in a company leadership position. It shows a slow slide and erosion of values that risks undermining a society.
- Reviewed in the United States on November 25, 2023
I won't spoil the story, but if you have ever wondered whether Facebook does enough to protect its users, this book clearly shows it does not. If you've ever wondered whether Facebook is safe, this book clearly shows it is not.
- Reviewed in the United States on February 7, 2024
Well-researched account showing Facebook leadership’s repeated failures to address polarization and the spread of radicalizing content. Please read it.
Top reviews from other countries
- Cranky Spider, reviewed in the United Kingdom on July 16, 2024
5.0 out of 5 stars What goes on under the covers in a technology company
Interesting investigation into the difference between Facebook's public statements about its societal impact and what it actually prioritises. It's all about engagement rather than feeling any responsibility for how their platforms are used, or gamed by bad actors. If you were to turn on your tap and raw sewage came out, the water companies would get sued... Facebook enables a lot of toxic usage under the guise that it isn't there to manage the quality of user content. That's why so much inappropriate content doesn't violate its usage policies: essentially it hasn't really got any, and anything goes. More people should read this sort of investigative journalism, as we are about to repeat it all, on steroids, with AI.
- Peter Suwara, reviewed in Australia on July 19, 2024
1.0 out of 5 stars Repeats itself ad nauseam...
After the first engaging chapter, the second felt like a repeat, but with different words and different people. The entire book ended up being chapter after chapter on political opinions and their influence on the platform.
This is more a discourse on personal political opinions rather than actual insight into how the platform works, its flaws in management and the technical issues faced by the teams.
It became so exhausting after the third chapter, repeating the same talking points of the previous two, that I just gave up.
I don't really care about US politics and their elections, maybe that's why I felt it was just pointless.
Once you read the first chapter you essentially have read the whole thing.
Let me save you the headache of reading this book.
TLDR : Facebook is a mess and it's on the way out...