An Ugly Truth

Prologue: At Any Cost


An Ugly Truth

by Sheera Frenkel and Cecilia Kang


What is the book An Ugly Truth about?

Sheera Frenkel and Cecilia Kang's An Ugly Truth meticulously chronicles Facebook's relentless growth under Mark Zuckerberg and Sheryl Sandberg, detailing how their prioritization of scale over safety enabled misinformation and global crises. It is essential reading for anyone concerned with corporate power and technology's impact on democracy.


1 Page Summary

An Ugly Truth chronicles Facebook's journey from a college social network to a global behemoth as a story of growth at any cost. The book meticulously details how the company's core leadership, particularly Mark Zuckerberg and Sheryl Sandberg, prioritized relentless expansion and engagement above all else, embedding a culture where solving problems of scale trumped addressing societal harms. This foundational ethos allowed misinformation, hate speech, and political manipulation to flourish on the platform, with the leadership repeatedly choosing to downplay crises, from Russian election interference to the genocide in Myanmar, in favor of protecting the company's image and unfettered growth.

The narrative is built around the pivotal and often strained partnership between Zuckerberg, the product-focused visionary with ultimate control, and Sandberg, the architect of the lucrative advertising empire. It reveals how their "growth first" partnership created a systemic blindness, where warnings from employees and researchers about platform abuses were sidelined or ignored. Key events, such as the Cambridge Analytica scandal and the unchecked spread of COVID-19 misinformation, are presented not as isolated failures but as inevitable outcomes of a business model fundamentally built on harvesting user attention and data, with insufficient safeguards.

The lasting impact of Facebook's choices, as argued by the authors, is the profound damage inflicted on democratic institutions and social cohesion worldwide. The book concludes that the company's immense power and Zuckerberg's unassailable control have made it resistant to meaningful reform, posing an ongoing threat. An Ugly Truth serves as a damning indictment, illustrating how the very algorithms and policies designed to connect the world became tools for division, radicalization, and violence, leaving a legacy of societal fracture that defines the digital age.

Chapter 1: Prologue: At Any Cost

Overview

The chapter opens at a pivotal moment in Facebook's history: December 9, 2020. On this day, the Federal Trade Commission and a coalition of 48 attorneys general file landmark antitrust lawsuits against the company, seeking its breakup. This event represents the materialization of one of Mark Zuckerberg's deepest fears: regulatory action that could dismantle his empire. The legal complaints serve as a sweeping indictment of Facebook's entire business model and its leadership, accusing the company of crushing competition and harming users in a relentless pursuit of growth.

The Legal Onslaught

The lawsuits, led by New York Attorney General Letitia James, paint a damning picture. They frame Facebook's history as a "buy-or-bury" strategy designed to eliminate any potential rival. Mark Zuckerberg is cited over a hundred times, depicted as a founder who used bullying, deception, and espionage to maintain dominance. The complaint details how he entered "destroy mode" against competitors and allegedly broke commitments to the founders of acquired companies like Instagram and WhatsApp. The core allegation is that Facebook became a powerful monopoly that stifled innovation, reduced consumer choice, and degraded privacy.

The Partnership's Design

Alongside Zuckerberg, Sheryl Sandberg is presented as an essential architect of Facebook's success and its problematic model. Described as Zuckerberg's perfect foil, she built the immensely profitable advertising engine that treats personal data as a commodity to be traded. The lawsuits argue that her "behavioral advertising prototype" created a dangerous feedback loop where more user engagement meant more data extraction. Sandberg's polished public persona is portrayed as a strategic distraction from the fundamental issues of the business model, with one official even labeling her the "Typhoid Mary" who brought surveillance capitalism from Google to Facebook.

A Singular Investigation

The authors position their book as the result of a 15-year investigation, offering an inside view that challenges the common narrative of Facebook as a company that accidentally lost its way. They argue that from the moment Zuckerberg and Sandberg joined forces in 2007, they deliberately and methodically built an "unstoppable" global business. The book focuses on a critical five-year period between U.S. elections where the consequences of this design—including the proliferation of misinformation and data scandals—were fully exposed. The authors conclude that the story is not one of a rogue algorithm, but of a complex and intentional creation.

Key Takeaways
  • Facebook faced an existential threat in December 2020 from unprecedented government lawsuits seeking to break up the company for anticompetitive practices.
  • Regulators portrayed Mark Zuckerberg as a ruthless monopolist and Sheryl Sandberg as the architect of a surveillance-based advertising model, arguing their partnership was central to Facebook's harmful impact.
  • The authors contend that Facebook's problems are not accidental; they are the direct result of a business model deliberately built for unstoppable growth, with profound costs to society.

Key concepts: Prologue: At Any Cost

1. Prologue: At Any Cost

The Existential Legal Threat

  • December 9, 2020: FTC and 48 states file landmark antitrust lawsuits seeking Facebook's breakup
  • The lawsuits represent the materialization of Mark Zuckerberg's deepest fear of regulatory action
  • Legal complaints serve as a sweeping indictment of Facebook's entire business model and leadership
  • Accusations center on crushing competition and harming users in relentless pursuit of growth

Facebook's 'Buy-or-Bury' Strategy

  • Lawsuits frame Facebook's history as eliminating potential rivals through acquisition or destruction
  • Mark Zuckerberg cited over 100 times as using bullying, deception, and espionage to maintain dominance
  • Allegations detail 'destroy mode' against competitors and broken commitments to founders of Instagram and WhatsApp
  • Core claim: Facebook became a monopoly that stifled innovation, reduced choice, and degraded privacy

The Zuckerberg-Sandberg Partnership

  • Sheryl Sandberg presented as essential architect of Facebook's success and problematic model
  • Built the profitable advertising engine that treats personal data as a commodity to be traded
  • Her 'behavioral advertising prototype' created dangerous feedback loop of engagement and data extraction
  • Public persona seen as strategic distraction from fundamental issues of the surveillance business model

The Book's Investigative Perspective

  • Based on 15-year investigation offering inside view challenging accidental failure narrative
  • Argues Zuckerberg and Sandberg deliberately built an 'unstoppable' global business from 2007 onward
  • Focuses on critical five-year period between U.S. elections where consequences were fully exposed
  • Conclusion: Story is not of rogue algorithm but of complex and intentional creation with profound societal costs

Chapter 2: Chapter 1: Don’t Poke the Bear

Overview

The chapter opens with a stark, personal example of the systemic privacy and security failures embedded within Facebook's culture. It follows a late-night incident where a Facebook engineer, leveraging his nearly unrestricted access to user data, spies on a woman he briefly dated. This breach, while a fireable offense, was merely a symptom of a much larger problem: Facebook's foundational ethos of radical transparency and unfettered engineer access had created a surveillance apparatus that thousands of employees could misuse. The narrative then shifts to the corporate and political consequences of this permissive culture, examining how the company's leadership grappled with a defining crisis: whether to remove a virally popular video from Donald Trump's campaign calling for a "total and complete shutdown of Muslims entering the United States."

The Culture of Unchecked Access

The engineer’s story is not an anomaly. Between 2014 and 2015, fifty-two employees were fired for similar data abuses, most commonly men looking up women they were interested in. Some cases escalated to stalking and real-world confrontation. These incidents were only discovered because the employees used work laptops for their personal searches, triggering automated alerts. The true scale of the problem was unknown. This vulnerability existed because Mark Zuckerberg's founding principle was to eliminate "red tape" for engineers, allowing them fast, independent access to user data to build and test products. Even as the company grew to thousands of employees, this open system was never fundamentally redesigned. Executives sold this access as a perk to new engineering recruits, framing Facebook as the "world's biggest testing lab."

A Security Chief Sounds the Alarm

The systemic risk was formally presented to Zuckerberg and his top executives in September 2015 by new Chief Security Officer Alex Stamos. In a blunt assessment, Stamos revealed that Facebook’s security was dangerously fragmented, its apps were inadequately protected, and its encryption efforts lagged behind peers. Most critically, he highlighted the ongoing pattern of employee data abuse as a crisis the company was ignoring. Stamos argued that firing offenders after the fact was insufficient; Facebook needed to proactively strip the vast majority of its 16,744 data-privileged employees of their access, limiting sensitive data to under 100 people. While Zuckerberg agreed changes were needed, engineering leaders like Jay Parikh and Pedro Canahuati pushed back, warning that such restrictions would cripple their teams' speed and innovation. A compromise was tasked to Stamos and Canahuati, but Stamos made powerful enemies in the process.

"Don’t Poke the Bear": The Trump Dilemma

Parallel to the internal security struggle, Facebook faced an external political firestorm. In December 2015, Donald Trump’s campaign posted his "Muslim ban" speech video, which quickly went viral on the platform. The post forced Facebook's leadership into a fraught debate. While many employees and even Zuckerberg initially felt the hateful speech might warrant removal, Vice President of Global Public Policy Joel Kaplan advised extreme caution. Arguing from a position of political pragmatism, Kaplan warned that removing a presidential candidate's post would be seen as censorship and proof of liberal bias, further alienating Republicans already distrustful of the platform. His mantra was "Don’t poke the bear."

Crafting a Justification in Real-Time

In a tense videoconference, executives labored to find a legal and policy rationale for leaving the video up. They ultimately invented a "newsworthiness" exception on the spot, deciding that political speech from a major candidate deserved extra protection so the public could see a candidate's "unedited views." This ad-hoc policy creation was viewed by some participants as "bullshit" and "making it up on the fly." However, it served the immediate business need: avoiding a massive controversy with a leading presidential candidate who was also a major advertiser and power user. The decision cemented Kaplan’s influence, proving his value in navigating the threat from a disruptive political figure the establishment-oriented team didn't understand.

Employee Backlash and Zuckerberg's Defense

The internal employee reaction was one of anger and confusion. At a company-wide Q&A, an employee directly challenged Zuckerberg, asking why clear hate speech from a political candidate was allowed to stand. Zuckerberg defaulted to a core libertarian principle, framing the issue as one of free expression and the First Amendment. He argued that removing the post would be too drastic, emphasizing Facebook's role as a platform for a "cacophony of sparring voices." This stance overlooked a critical modern reality: the problem wasn't just hosting speech, but algorithmically amplifying it. As the chapter notes, "There is no right to algorithmic amplification. In fact, that’s the very problem that needs fixing."

Key Takeaways
  • Engineer Privilege as a Systemic Risk: Facebook’s culture of radical transparency and unlimited engineer access to user data was a profound security and privacy failure, enabling widespread employee misconduct that the company only addressed reactively.
  • Growth Outpaced Governance: Foundational policies designed for a small startup were dangerously obsolete in a global corporation with billions of users, yet changing them was resisted by leaders who prioritized engineering speed over security.
  • Business Over Principle: The decision on Trump's "Muslim ban" video was a pivotal moment where political and business calculations—fear of conservative backlash and losing a major advertiser—trumped the consistent enforcement of the company's own hate speech policies.
  • The Ad-Hoc Policy Playbook: Faced with a novel crisis, Facebook's executives invented a major speech policy ("newsworthiness") in real-time to justify a business decision, revealing a lack of prepared, principled frameworks for content governance.
  • The Free Speech Shield: Zuckerberg consistently used a broad, simplistic interpretation of free speech to defend controversial decisions, deflecting deeper questions about Facebook's active role in amplifying and targeting harmful content.

Key concepts: Chapter 1: Don’t Poke the Bear

2. Chapter 1: Don’t Poke the Bear

Opening Case: Engineer Data Abuse

  • Facebook engineer spies on a woman using unrestricted access to user data
  • Incident symptomatic of systemic privacy and security failures
  • Foundational ethos of radical transparency created surveillance apparatus
  • Thousands of employees had potential to misuse data access

Culture of Unchecked Engineer Access

  • 52 employees fired for similar data abuses between 2014-2015
  • Common pattern: men looking up women they were interested in
  • Vulnerability rooted in Zuckerberg's principle of eliminating 'red tape'
  • Open system never redesigned despite company growth
  • Unrestricted access sold as perk to engineering recruits

Security Chief's Warning and Internal Resistance

  • Alex Stamos presented systemic security risks in September 2015
  • Revealed fragmented security, weak app protection, lagging encryption
  • Recommended limiting sensitive data access to under 100 people
  • Engineering leaders warned restrictions would cripple innovation speed
  • Stamos made powerful enemies while pushing for reform

The Trump 'Muslim Ban' Video Crisis

  • Trump's viral video called for shutdown of Muslim entry to US
  • Forced leadership debate on removing hateful political speech
  • Joel Kaplan advocated caution with mantra 'Don't poke the bear'
  • Warned removal would be seen as censorship and liberal bias
  • Feared alienating Republicans already distrustful of platform

Ad-Hoc Policy Creation: The Newsworthiness Exception

  • Executives invented 'newsworthiness' exception in real-time
  • Decided political speech from major candidates deserved extra protection
  • Rationale: public should see candidate's 'unedited views'
  • Some participants viewed process as 'making it up on the fly'
  • Served business need of avoiding controversy with powerful candidate

Employee Backlash and Free Expression Defense

  • Employees reacted with anger and confusion to decision
  • Zuckerberg defended stance using libertarian free expression principles
  • Framed issue around First Amendment and platform for diverse voices
  • Overlooked critical distinction: hosting speech vs. algorithmically amplifying it
  • Failed to address that 'algorithmic amplification is the very problem that needs fixing'

The Culture of Radical Transparency and Engineer Privilege

  • Engineers had near-universal access to user data, treating it as an open resource for debugging and product development.
  • This access was a systemic security and privacy risk, enabling widespread internal misconduct like stalking and data snooping.
  • The company only addressed abuses reactively, failing to implement proactive controls as it scaled.

Obsolete Startup Policies in a Global Corporation

  • Foundational access and privacy policies were designed for a small, trusted team and became dangerously obsolete.
  • Attempts to update these policies were resisted by leadership who prioritized engineering velocity and agility.
  • This created a critical governance gap where growth vastly outpaced the necessary security and ethical frameworks.

The 'Muslim Ban' Video Decision as a Business Calculation

  • The decision to leave Trump's video up was driven by fear of backlash from conservative politicians and media.
  • There was significant concern about losing a major advertiser (the Trump campaign) during a critical revenue period.
  • This marked a moment where business and political interests overrode the consistent application of stated hate speech policies.

Inventing Policy in Real-Time: The Newsworthiness Exception

  • Faced with a novel crisis, executives created a major new speech policy ('newsworthiness') ad-hoc to justify a business decision.
  • This revealed a lack of prepared, principled frameworks for content governance at the highest levels of the company.
  • The move set a precedent for making consequential policy decisions reactively based on the identity of the speaker.

Strategic Use of Free Speech as a Deflection

  • Mark Zuckerberg employed a broad, simplistic interpretation of free speech to defend controversial content decisions.
  • This rhetoric served as a shield to deflect criticism and avoid deeper accountability.
  • It obscured Facebook's active role in algorithmically amplifying and targeting harmful content, framing the company as a passive platform.


Chapter 3: Chapter 2: The Next Big Thing

Overview

The chapter paints a portrait of a young Mark Zuckerberg, whose competitive nature was evident from his school days, first manifesting in technical one-upmanship at Exeter. This drive followed him to Harvard, where projects like FaceMash revealed an early pattern of causing disruption and then retreating behind a claim of unintended consequences. More importantly, his private conversations there laid bare a core philosophy: he was less interested in building functional tools than creating a place where people would casually "waste time," openly sharing personal information. He intuitively grasped that this open-ended data collection was the key to power, famously dismissing trusting users with crude contempt.

His focus on scale over revenue became clear during a meeting with The Washington Post, and he fully embraced the Silicon Valley playbook by choosing a venture capital offer from Accel Partners over a traditional media partnership. This deal fueled aggressive growth and an internal culture of "Domination!" in which Zuckerberg consolidated control and made audacious bets, like rejecting a billion-dollar buyout. That defiant attitude paved the way for his most transformative idea: the News Feed. He envisioned an algorithmically driven, personalized stream designed to maximize engagement, and he pushed a small team to build it over nearly a year.

The launch triggered an immediate and massive user backlash, with millions protesting the sudden loss of privacy. Ironically, the protest's own viral spread was powered by the News Feed algorithm itself, proving its formidable engagement engine. Zuckerberg's public response mixed apology with admonishment, setting a template for future crises by insisting the feature was ultimately for users' benefit. This period also exposed the company's ad hoc approach to the problems it was unleashing. Content moderation and advertising policies were made on the fly by a team preoccupied with growth, unaware of the ethical quagmires ahead. As competition from platforms like Twitter loomed, Facebook raced to scale and monetize, with Zuckerberg himself wrestling with the tensions between his visionary ambitions and the gritty realities of running a global business.

Competitive Instincts at Exeter

The chapter opens with Mark Zuckerberg as a seventeen-year-old at Phillips Exeter Academy, already establishing himself as the campus computer geek. His competitive nature was evident during all-night coding binges and programming races he often won. When a fellow student, Kristopher Tillery, created an online version of the school’s paper student directory called “the Facebook,” Zuckerberg’s first instinct was to test its limits. He inserted code into his own profile that crashed the browsers of anyone who tried to view it, a move Tillery saw as a competitive flag-planting, demonstrating Zuckerberg's superior technical skills.

Controversy and Data at Harvard

The narrative moves to Harvard, where the well-known origin of “FaceMash”—a site for rating female classmates—is recounted with an important nuance. While popular, it immediately drew criticism from student groups like Fuerza Latina and the Association of Black Harvard Women, who voiced concerns about privacy invasion. Zuckerberg’s response, an apology framed as surprise at the site’s viral spread, established a recurring pattern: portraying disruptive projects as unintended experiments to avoid serious consequences.

His competitive focus then shifted to fellow student Aaron Greenspan, who had launched “the Face Book.” In their private chats, Zuckerberg revealed his core philosophy. He argued against Greenspan’s more formal, functional design, stating he wanted to create a place to “waste time” where users would share information more openly and casually. He intuitively understood that open-ended data collection was key, telling Greenspan, “In a site where people give personal information for one thing, it then takes a lot of work and precaution to use that information for something else.” His ambition was power through data accumulation, a point starkly illustrated by a leaked chat where he boasted about the personal information users had submitted to him, calling them “dumb fucks.”

The Washington Post and a Turning Point

By January 2005, a 20-year-old Zuckerberg, having moved to Palo Alto, met with Washington Post chairman Donald Graham. Nervous and awkward, Zuckerberg explained his platform’s mechanics. Graham immediately saw its potential to disrupt local advertising, like that in Harvard’s Crimson newspaper. Crucially, Zuckerberg clarified he wasn’t focused on revenue but on acquiring more users—prioritizing scale and engagement above all else.

Graham, impressed, offered $6 million for a 10% stake. However, Zuckerberg soon faced a “moral dilemma” when venture firm Accel Partners offered more than double that amount. Accel’s philosophy of growing at all costs without pressure for immediate profitability aligned perfectly with Zuckerberg’s instincts and the Silicon Valley ethos he was absorbing from mentors like Peter Thiel. He accepted Accel’s offer, a decisive move that prioritized aggressive, founder-controlled growth over traditional media partnership.

Building a Culture of "Domination"

Following the Accel deal, Facebook’s valuation and reputation soared, fueled by its unprecedented treasure trove of user data. Internally, Zuckerberg consolidated control, restructuring shares to dilute early co-founder Eduardo Saverin’s stake—an early display of a ruthless “killer instinct” in pursuit of his vision.

The company’s scrappy, intense culture took shape, defined by all-night hackathons and a lack of formal management. Zuckerberg’s ambition crystallized when he rejected a $1 billion buyout offer from Yahoo in 2006, a move that caused his entire management team to quit in protest. Despite this low point, his audacity bolstered Facebook’s aura as the next monumental tech company, attracting talent from giants like Google. He rallied his remaining employees with the chant “Domination!”, focusing development on a new idea he believed would ensure victory: a customized central landing page to aggregate friend updates.

The Creation and Launch of News Feed

Mark Zuckerberg's vision for News Feed was to transform Facebook from a static directory into a dynamic, personalized stream of information. He imagined an algorithm that prioritized content based on "interesting-ness," with posts about the user themselves at the top, followed by content from close friends and then groups. To bring this complex idea to life, he enlisted early engineer Ruchi Sanghvi for the technical work and placed Chris Cox, a charismatic Stanford graduate with a background in natural language processing, in charge of translating the vision. The team spent nearly a year coding what became Facebook's most intricate system yet, aiming to maximize user "sessions" and keep people connected for hours. On September 5, 2006, at 1 a.m. PST, News Feed launched abruptly with a single "Awesome" button, instantly replacing the old interface and catching users off guard.

User Backlash and Viral Protests

The immediate reaction was fierce. Users felt exposed as relationship updates and photos were broadcast in a public stream, leading to protests both online and offline. A Facebook group called "Students Against Facebook News Feed," created by a Northwestern University junior, rapidly gained 7% of Facebook's user base within forty-eight hours. Investors panicked, privacy advocates rallied, and demonstrators gathered outside Facebook's Palo Alto office, forcing the hiring of the company's first security guard. Ironically, the protest's virality was powered by News Feed itself: the more people joined, the more the algorithm promoted it, showcasing the feature's inherent power to drive engagement. Chris Cox noted the surge in activity, dismissing the backlash as a typical knee-jerk reaction to new technology, akin to early fears about telephones.

Zuckerberg's Calculated Response

Under pressure, Zuckerberg issued a public apology on Facebook titled "Calm Down. Breathe. We Hear You." almost 24 hours after launch. His 348-word post struck a tone more admonishing than contrite, emphasizing that privacy settings hadn't changed and suggesting users were responsible for what they shared. He framed News Feed as an evolving product that users would eventually appreciate, setting a precedent for how he would handle future crises. While the note aimed to soothe outrage, it subtly reinforced Facebook's control, and soon after, engineers added tools to let users restrict information access, acknowledging the need for adjustments.

The Ad Hoc World of Early Content Moderation

With News Feed, Facebook stumbled into the complexities of content governance without a playbook. Unlike traditional newspapers with editorial standards, Facebook relied on informal, gut-feel decisions made by a handful of moderators. This ad hoc approach extended to advertising, where the new monetization team, led by Tim Kendall, faced its first tough call with graphic political ads from Middle East groups. Kendall hastily drafted a policy against hate or violence incitement without legal review or executive approval, highlighting how speech issues were an afterthought. Employees recalled being preoccupied with growth and product, not foreseeing the serious dilemmas ahead. "We didn't understand what we had in our hands at the time," one admitted, reflecting the company's narrow focus on its "frivolous college website" image.

Facebook's Growing Pains and Future Ambitions

Amidst the turmoil, broader trends loomed. Twitter's rise in 2006 signaled a shift toward real-time information streams, pushing Facebook to scale rapidly. Zuckerberg's audacious goal was to connect every internet user, but monetization remained a daunting challenge. Expenses soared with hiring and infrastructure, and investors' patience for losses was thinning. In a candid moment from a documentary, a young Zuckerberg mused about potentially handing off CEO duties to focus on "cool ideas," revealing his reluctance for the managerial grind. This introspection underscored the tension between his visionary drive and the practical demands of building a sustainable business, setting the stage for Facebook's next evolution.

Key Takeaways
  • News Feed revolutionized social media by introducing a personalized, algorithm-driven stream that prioritized engagement, despite initial user backlash over privacy concerns.
  • The viral protest against News Feed ironically demonstrated its power, cementing Facebook's shift from a static profile site to a dynamic platform centered on constant interaction.
  • Zuckerberg's response to crises combined apology with assertion, establishing a pattern of defending product changes as ultimately beneficial for users.
  • Early content and ad policies were developed reactively and informally, highlighting Facebook's unpreparedness for the ethical complexities of global platform governance.
  • As competition intensified, Facebook faced pressing needs to monetize and scale, with Zuckerberg grappling with his role as CEO amid ambitions to connect the world.

Key concepts: Chapter 2: The Next Big Thing

3. Chapter 2: The Next Big Thing

Competitive Nature and Early Philosophy

  • Technical one-upmanship at Exeter established Zuckerberg's competitive drive
  • FaceMash controversy revealed a pattern of disruption followed by claims of unintended consequences
  • Core philosophy: building a place to 'waste time' for open-ended data collection
  • Viewed trusting users with contempt, seeing data accumulation as key to power

Strategic Choices: Scale Over Revenue

  • Washington Post meeting revealed focus on user acquisition rather than immediate revenue
  • Chose Accel Partners' venture capital over traditional media partnership
  • Embraced Silicon Valley 'growth at all costs' ethos aligned with his instincts
  • Decision prioritized aggressive, founder-controlled expansion

Internal Culture and Control Consolidation

  • Post-Accel deal created culture of 'Domination!' focused on growth
  • Zuckerberg consolidated control by restructuring shares to dilute co-founder stakes
  • Displayed ruthless 'killer instinct' in pursuit of his vision
  • Made audacious bets like rejecting billion-dollar buyout offers

The News Feed: Vision and Backlash

  • Transformative idea: algorithmically-driven, personalized stream to maximize engagement
  • Small team built it over nearly a year despite internal skepticism
  • Launch triggered massive user backlash over privacy invasion
  • Ironically, protest spread virally through the News Feed itself, proving its engagement power

Crisis Response and Growing Problems

  • Zuckerberg's response mixed apology with admonishment, setting template for future crises
  • Insisted controversial features were ultimately for users' benefit
  • Content moderation and advertising policies were made ad hoc by growth-focused team
  • Company unaware of ethical quagmires ahead while racing to scale and monetize

Facebook's Defining Culture and Ambition

  • Scrappy, intense culture formed through all-night hackathons and lack of formal management
  • Zuckerberg rejected a $1 billion Yahoo buyout in 2006, causing his entire management team to quit
  • Rallied remaining employees with 'Domination!' chant, focusing on a central landing page for friend updates

The Vision and Development of News Feed

  • Zuckerberg aimed to transform Facebook from static directory to dynamic, personalized information stream
  • Algorithm prioritized content based on 'interesting-ness' with user's own posts first, then close friends
  • Complex year-long development led by Ruchi Sanghvi and Chris Cox, launched abruptly in September 2006

Immediate User Backlash and Viral Protests

  • Users felt exposed as relationship updates and photos were broadcast in a public-facing stream
  • 'Students Against Facebook News Feed' group gained 7% of user base within 48 hours
  • Ironically, the protest's virality was powered by News Feed itself, demonstrating its engagement power

Zuckerberg's Crisis Management Approach

  • Issued public apology titled 'Calm Down. Breathe. We Hear You.' 24 hours after launch
  • Tone was more admonishing than contrite, emphasizing unchanged privacy settings
  • Set precedent for defending product changes as ultimately beneficial despite initial user resistance

Ad Hoc Content Moderation and Policy Development

  • Facebook stumbled into content governance without playbook or editorial standards
  • Relied on informal, gut-feel decisions by handful of moderators
  • First advertising policy against hate/violence incitement drafted hastily without legal review

Strategic Challenges and Future Direction

  • Twitter's rise in 2006 pushed Facebook to scale rapidly toward real-time information
  • Monetization remained daunting challenge despite soaring expenses and investor pressure
  • Zuckerberg expressed reluctance for managerial grind, considering handing off CEO duties

Chapter 3: What Business Are We In?

Overview

The chapter traces Sheryl Sandberg's journey from a public-service-minded Harvard graduate to the operational architect of Facebook's business empire. It begins with her formative years under mentor Lawrence Summers and her initial career in Washington, highlighting how a chance encounter with tech executives like Eric Schmidt opened her eyes to Silicon Valley. Schmidt’s famous advice—if offered a seat on a rocket ship, get on—propelled her to Google, where she honed her skills before seeking a new challenge.

Her fateful partnership with Mark Zuckerberg was built on a complementary fit: his product vision required her expertise in scaling organizations and navigating politics. However, upon joining Facebook, she faced a chaotic, male-dominated "hacker" culture skeptical of "growing up." Sandberg's first major strategic move was to pose the foundational question: "What business are we in?" The answer, advertising, led her to define Facebook's core advantage against Google. She framed Google as capturing existing demand at the bottom of the marketing funnel, while Facebook’s treasure trove of real identities and social connections positioned it for demand creation through engaging, shareable brand campaigns.

Implementing this vision was fraught with internal tension. Zuckerberg, focused on product and growth, often resisted her requests for resources, leading to a company split between "Sheryl people" and "Mark people." Externally, Sandberg aggressively courted major brands by pitching Facebook as the world’s largest word-of-mouth platform, even partnering with Nielsen to measure ad effectiveness. Yet, her data-driven advertising model was built on a controversial foundation. An earlier program called Beacon had already revealed Facebook's willingness to monetize user activity with minimal consent, a warning sign privacy advocates flagged. The later introduction of the Like button revolutionized data collection, turning casual clicks into a detailed psychological profile and a powerful web-tracking tool.

Feeling competitive pressure from Twitter, Zuckerberg forced a major shift in 2009, making user profiles publicly searchable by default. This privacy change, framed as a simplification, sparked user confusion and outrage. Zuckerberg’s personal disconnect became a critical liability; his privileged experience left him unable to grasp the risks his platform posed for vulnerable groups, even as he meticulously guarded his own privacy. This hypocrisy, coupled with a dismissive and defensive approach to early regulatory scrutiny from officials like Senator Chuck Schumer, poisoned Facebook's relationship with Washington.

The backlash culminated in a formal Federal Trade Commission complaint, triggering Facebook's first major federal investigation. While this established a framework for future oversight, it did not immediately curb the company's trajectory, setting the stage for far greater crises ahead. The chapter ultimately reveals how the relentless drive for growth and a culture of privileged blindness embedded fundamental tensions between monetization, user privacy, and accountability from the company's earliest days.

Sandberg's Formative Years and Ascent

The narrative begins not with Sandberg at Facebook, but as an undergraduate at Harvard, where she first caught the attention of economics professor Lawrence Summers. Despite not being a vocal participant in his challenging Public Sector Economics course, she earned the top grade on the midterm, revealing a sharp, understated intellect. Summers noted her preparedness, respectful demeanor, and distinct lack of the pretension common among high-achieving peers. This led to her first professional role as his research assistant at the World Bank after graduation.

Her path was not immediately toward business. Coming from a family of physicians and nonprofit advocates, public service seemed a natural fit. However, a mentor at the World Bank, Lant Pritchett, saw her leadership potential and steered her toward business school instead of law school. After a brief, unsuccessful early marriage and a stint at McKinsey & Company that failed to engage her, Sandberg rejoined Summers at the U.S. Treasury Department.

The Allure of Silicon Valley

At Treasury, Sandberg's exposure to the tech world through executives like Eric Schmidt proved transformative. She was struck by the informal, idea-driven culture of Silicon Valley, a stark contrast to the formal protocols of Washington and Wall Street. Schmidt, then at Novell, famously advised her, "If you're offered a seat on a rocket ship, get on, don't ask what seat." This mindset, coupled with Google's idealistic mission to organize the world's information, convinced her to join the company in 2001.

At Google, she thrived, building the nascent advertising business into a multi-billion dollar engine. Yet, she eventually hit a ceiling, facing promotional inequities compared to male peers. Despite being recruited for a senior role at the Washington Post by Don Graham, she sought a new challenge with greater scale and impact.

A Fateful Meeting and a New Partnership

Sandberg and Mark Zuckerberg met at a Christmas party in 2007. Their conversation was immediately substantive, focusing on Facebook's staggering growth potential and the business infrastructure required to support it. Over a series of secret dinners, they found a complementary fit: Zuckerberg, the visionary product-focused founder, needed an operator who excelled at scaling organizations, managing business operations, and navigating politics—all of Sandberg's strengths. Don Graham, close to both, encouraged the union.

Zuckerberg was acutely aware of the coming regulatory storms around data privacy and saw Sandberg's Washington experience as a vital asset. In March 2008, she was officially named Facebook's Chief Operating Officer.

Navigating Facebook's "Hacker" Culture

Sandberg entered a chaotic, young, and intensely male-dominated engineering culture. Offices were littered with bottles and coding books, hallways doubled as skateboard runs, and the atmosphere was fiercely competitive. The environment could be hostile for women, with demeaning comments often dismissed. Figures like the intimidating engineer Andrew "Boz" Bosworth embodied this hard-charging ethos and were skeptical of the company "growing up."

Zuckerberg's awkward introduction of Sandberg to staff (noting she had "good skin") highlighted the cultural chasm. However, she deftly won over skeptics like Boz by affirming her goal was to scale the company without destroying its core culture. She promised they would get "better, not worse."

The Foundational Question: Defining the Business

A month into her role, with Zuckerberg traveling on a mentor-suggested global trip, Sandberg convened a critical dinner with the nascent ads and growth teams. She posed the essential, strategic question on a whiteboard: "What business are we in?"

The consensus was clear: Facebook would remain free, making advertising its revenue foundation. Sandberg pressed further, beginning to frame the fundamental distinction between Facebook and Google. It hinged on the type of data each company collected—a crucial insight that would define Facebook's future advertising model. This session marked the start of her mission to build a business machine capable of funding Zuckerberg's social vision.

Shifting Facebook's Advertising Strategy

Sandberg framed the advertising challenge using the classic marketing "funnel" metaphor. She positioned Google at the bottom, capturing users with clear purchase intent via search data. In contrast, Facebook’s unique advantage was its position earlier in the funnel. The platform wasn't just a directory of user profiles; it was a treasure trove of real-time activity and social connections. This allowed Facebook to move beyond uninspired banner ads and create demand through brand engagement. Advertisers could design interactive campaigns—polls, quizzes, brand pages—that users would willingly share with friends, turning them into unwitting brand advocates. If Google filled existing demand, Facebook’s new mission was to create it.

Friction with Zuckerberg and Internal Divisions

Despite Zuckerberg approving her broad vision, Sandberg quickly encountered resistance. The CEO was far more interested in product development and user growth than in monetization, often brushing aside her requests for more staff, budget, and engineering resources to build new ad tools. This tension crystallized into a company-wide bifurcation. Teams informally split into "Sheryl people" (often business hires from Google and her network) and "Mark people" (the original product and engineering-focused employees). This divide bred mutual distrust, with engineers often dismissive of the business side's needs.

Courting Brands and Overcoming Skepticism

Undeterred, Sandberg began a relentless campaign to woo major brands like Ford and Coca-Cola. Her pitch centered on Facebook’s unparalleled, authentic data: real identities and real conversations. She argued it was the world's largest word-of-mouth platform. When advertisers expressed skepticism—particularly about measurement and the effectiveness of social ads—Sandberg acted swiftly. She secured a partnership with Nielsen to measure ad attention on Facebook. However, she found her efforts hampered without Zuckerberg's active participation in key pitches and continued to struggle for internal resources, eventually appealing to board member Don Graham for support.

The Beacon Controversy: A Privacy Warning Ignored

The chapter reveals that Sandberg’s aggressive data-driven advertising vision was built upon a foundation already laid by a prior, controversial program: Beacon. Launched in 2007, Beacon automatically shared users' off-site purchases and activities on their friends' News Feeds, turning them into involuntary brand endorsers. Privacy advocate Jeff Chester immediately recognized it as a dangerous escalation of surveillance-based advertising. A public outcry led by MoveOn.org forced Zuckerberg to apologize and change Beacon to an "opt-in" program, but the episode was framed by Facebook as a mere misstep in user control. Chester and other critics saw the deeper truth: Facebook was committed to monetizing user data and behavior, with or without meaningful consent, a fundamental issue the apology did not address.

The Like Button: A Data Collection Revolution

In 2009, Facebook introduced the "Like" button, a feature that would become a cornerstone of its data empire. Internally popular for increasing engagement, it gave users a quick way to express approval. More importantly, it created a powerful new stream of psychological and preference data. Later, when deployed across the entire web, the Like button became a ubiquitous tracking device, allowing Facebook to monitor user behavior far beyond its own walls. Unlike Beacon, it faced no public resistance; users willingly traded clicks for social validation, unaware of the detailed profile of their interests being assembled.

Opening the Network to Rival Twitter

Feeling competitive pressure from Twitter's public, real-time model, Zuckerberg made a decisive move in late 2009. He forced a major shift in user privacy settings, making previously private information (like profiles and photos) publicly searchable by default. Confusing prompts led many users to accidentally accept these broader settings. The move was internally controversial, with one policy employee warning it would be a privacy disaster. While framed as a simplification, the change was a strategic effort to make Facebook more of a public "town square" for real-time conversation and to better compete with Twitter's open ecosystem, further prioritizing growth and data accessibility over user privacy.

Regulatory Scrutiny and a Disconnected Leader

Government officials and privacy regulators, reading critical news coverage, began to press Facebook on the changes. The company’s first registered lobbyist, Tim Sparapani—poached from the ACLU for this very fight—defended the new settings as a privacy enhancement. This stance was directly challenged by advocates like Jeff Chester, who argued the changes were deceptive and illegal, noting Facebook’s “self-serving and narrow” definition of privacy that obscured sophisticated data harvesting for marketing.

Mark Zuckerberg’s personal worldview became a central point of conflict. He projected a carefree attitude about online sharing, declaring it a new “social norm,” and seemed genuinely perplexed by user outrage. His own privileged life experience—a safe, elite trajectory through Exeter, Harvard, and Silicon Valley—left him unable to empathize with the systemic risks others faced, such as predatory ads targeting marginalized groups. This disconnect was starkly revealed in an interview with journalist Jose Antonio Vargas, who shared the real-world danger of his secret sexuality; Zuckerberg responded with a blank, uncomprehending stare.

Despite preaching openness, Zuckerberg was highly protective of his own privacy, carefully curating his Facebook friends and even buying surrounding houses in Palo Alto to create a private compound. This hypocrisy underscored a core tension: he valued “authenticity” as a commodity for the platform while being shielded from the vulnerabilities his system created for others.

A Confrontation in Washington

The political pressure crystallized when aides to Senator Chuck Schumer complained directly to Facebook. This prompted a high-stakes visit to Washington by Sparapani and Elliot Schrage, Sheryl Sandberg’s confidant and the head of policy. Schrage’s dismissive, impatient, and defensive posture during the meeting—where he insisted Facebook had the strongest privacy policies—proved counterproductive. He irritated Schumer’s staff and established a confrontational tone, signaling that Facebook was not truly listening to regulatory concerns.

The FTC Steps In

In the wake of consumer and political uproar, Facebook partially walked back its changes. However, the damage was done, attracting the focused attention of the Federal Trade Commission. Its new chairman, Jonathan Leibowitz, had already expressed deep concern over the “unfettered collection” of consumer data. In December 2009, a coalition of privacy groups filed a formal FTC complaint alleging Facebook’s practices were deceptive.

The FTC’s response was swift and unusually public, signaling “particular interest” in the case. This marked the beginning of Facebook’s first major federal investigation, which would eventually lead to a historic settlement imposing twenty years of privacy audits. Yet, in the near term, this regulatory action did not slow Facebook’s dominant trajectory; the government would later greenlight its acquisitions of Instagram and WhatsApp. The full ramifications of this initial FTC scrutiny would only resurface dramatically during the platform’s crisis period beginning in 2016.

Key Takeaways
  • Privileged Blind Spot: Zuckerberg’s personal experience of safety and success created a profound inability to understand the privacy risks and systemic harms his platform posed for more vulnerable populations.
  • Hypocrisy in Practice: While publicly advocating for radical transparency, Zuckerberg meticulously guarded his own personal information, revealing a double standard.
  • Failed Diplomacy: Facebook’s early approach to regulators, epitomized by Elliot Schrage’s confrontational style, was dismissive and defensive, poisoning the well in Washington and signaling an unwillingness to be constrained.
  • Regulatory Ignition: The 2009 privacy changes directly triggered the first major federal investigation into Facebook by the FTC, establishing a framework of oversight that would become crucial years later.

Key concepts: Chapter 3: What Business Are We In?

4. Chapter 3: What Business Are We In?

Sandberg's Formative Years and Early Career

  • Harvard education under mentor Lawrence Summers, revealing sharp intellect and preparedness
  • Initial career path in public service and research at World Bank and U.S. Treasury
  • Mentorship redirects her toward business school over law school
  • Early professional experiences at McKinsey and Treasury shape operational skills

Transition to Silicon Valley and Google

  • Exposure to tech executives like Eric Schmidt transforms her career outlook
  • Schmidt's 'rocket ship' advice propels move to Silicon Valley
  • Thrives at Google building advertising into multi-billion dollar business
  • Hits promotional ceiling and seeks new challenge with greater scale

Partnership with Mark Zuckerberg

  • Fateful 2007 meeting reveals complementary fit: Zuckerberg's vision needs Sandberg's operational expertise
  • Zuckerberg recognizes need for business infrastructure and Washington experience for coming regulatory storms
  • Don Graham encourages the union between visionary founder and experienced operator
  • Sandberg joins as COO in March 2008 to scale Facebook's organization

Defining Facebook's Business Strategy

  • Sandberg's foundational question: 'What business are we in?' leads to advertising focus
  • Strategic framing: Google captures existing demand, Facebook creates demand through social connections
  • Leverages Facebook's advantage of real identities and social connections for brand campaigns
  • Pitches Facebook as world's largest word-of-mouth platform to major brands

Internal Tensions and Cultural Challenges

  • Faces chaotic, male-dominated 'hacker' culture skeptical of business growth
  • Company splits between 'Sheryl people' (business/operations) and 'Mark people' (product/growth)
  • Zuckerberg resists resource requests despite needing business infrastructure
  • Sandberg navigates tension between scaling organization and preserving startup culture

Data Collection and Privacy Foundations

  • Beacon program reveals early willingness to monetize user activity with minimal consent
  • Like button revolutionizes data collection, creating detailed psychological profiles
  • Data-driven advertising model built on extensive user tracking across web
  • Privacy advocates flag warning signs about user consent and data practices

Privacy Controversies and User Backlash

  • 2009 shift making profiles publicly searchable by default sparks user outrage
  • Zuckerberg's privileged experience leaves him unable to grasp risks for vulnerable groups
  • Hypocrisy between Zuckerberg's personal privacy and platform's default public settings
  • User confusion and anger over privacy changes framed as 'simplification'

Regulatory Scrutiny and Early Investigations

  • Dismissive approach to early regulatory scrutiny from officials like Senator Chuck Schumer
  • Poisoned relationship with Washington despite Sandberg's political experience
  • Formal FTC complaint triggers first major federal investigation
  • Establishes oversight framework but doesn't curb company's growth trajectory

Embedded Tensions and Future Implications

  • Relentless growth drive conflicts with user privacy and accountability
  • Culture of privileged blindness to platform risks from earliest days
  • Fundamental tensions between monetization and user protection established early
  • Sets stage for far greater crises despite early regulatory attention

Navigating Facebook's 'Hacker' Culture

  • Sandberg entered a chaotic, male-dominated engineering culture hostile to women and resistant to corporate maturity.
  • She won over skeptics like Andrew 'Boz' Bosworth by promising to scale the company without destroying its core culture.
  • Zuckerberg's awkward introduction highlighted the cultural chasm she needed to bridge.

Defining the Core Business Strategy

  • Sandberg posed the foundational question 'What business are we in?' to establish a revenue model around free access and advertising.
  • The session framed the crucial distinction between Facebook and Google based on the type of data each company collected.
  • This marked the start of building a business machine to fund Zuckerberg's social vision.

Shifting the Advertising Model: Creating vs. Filling Demand

  • Sandberg positioned Facebook earlier in the marketing funnel than Google, using social data to create demand rather than capture existing intent.
  • Facebook's advantage was real-time activity and social connections, enabling interactive brand campaigns users would share.
  • The strategy moved beyond banner ads to turn users into unwitting brand advocates through engagement.

Internal Tensions and Resource Struggles

  • Zuckerberg prioritized product and growth over monetization, creating resistance to Sandberg's requests for resources.
  • The company split into 'Sheryl people' (business-focused) and 'Mark people' (product/engineering), breeding mutual distrust.
  • Sandberg appealed to board member Don Graham for support amid ongoing resource constraints.

The Beacon Controversy: Early Privacy Conflicts

  • Beacon automatically shared users' off-site purchases, turning them into involuntary brand endorsers and sparking a privacy outcry.
  • The program revealed Facebook's commitment to monetizing user data with or without meaningful consent.
  • Despite an apology and opt-in change, critics saw it as a warning about surveillance-based advertising.

The Like Button: Data Collection Revolution

  • Introduced in 2009, the Like button created a powerful stream of psychological and preference data.
  • It became a ubiquitous tracking device across the web, monitoring behavior beyond Facebook's walls.
  • Unlike Beacon, it faced no public resistance as users traded clicks for social validation.

Courting Advertisers and Overcoming Skepticism

  • Sandberg pitched Facebook as the world's largest word-of-mouth platform with authentic identity data.
  • She secured a Nielsen partnership to measure ad attention, addressing advertiser concerns about effectiveness.
  • Her efforts were hampered without Zuckerberg's active participation in key brand pitches.

The 2009 Privacy Shift: A Strategic Move Against Twitter

  • Zuckerberg forced a major change to user privacy settings, making previously private information publicly searchable by default.
  • The change was a strategic effort to make Facebook more of a public 'town square' to compete with Twitter's real-time, open model.
  • Confusing user interface prompts led many to accidentally accept the broader settings, causing internal controversy and warnings of a 'privacy disaster.'
  • The move prioritized platform growth and data accessibility over user privacy, framing it as a simplification.

Regulatory and Advocacy Pushback

  • Government officials and privacy regulators, prompted by critical news, began pressing Facebook on the changes.
  • Facebook's lobbyist defended the new settings as a privacy enhancement, a stance challenged by advocates who called them deceptive and illegal.
  • Critics argued Facebook used a 'self-serving and narrow' definition of privacy to obscure sophisticated data harvesting for marketing.

Zuckerberg's Personal Disconnect and Hypocrisy

  • Zuckerberg projected a carefree attitude about online sharing as a new 'social norm' and was perplexed by user outrage.
  • His privileged life trajectory left him unable to empathize with systemic risks faced by marginalized groups on his platform.
  • Despite preaching openness, he meticulously guarded his own privacy, curating his friends list and buying houses for a private compound.
  • This revealed a core hypocrisy: valuing 'authenticity' as a platform commodity while being shielded from the vulnerabilities it created for others.

Confrontational Approach to Washington

  • After complaints from Senator Chuck Schumer's office, Facebook executives made a high-stakes visit to Washington.
  • Elliot Schrage's dismissive, defensive, and impatient posture proved counterproductive, irritating congressional staff.
  • The meeting established a confrontational tone, signaling Facebook was not truly listening to regulatory concerns.

The FTC Investigation and Lasting Consequences

  • Following partial walkbacks and ongoing uproar, privacy groups filed a formal FTC complaint in December 2009.
  • The FTC, under Chairman Jonathan Leibowitz, signaled 'particular interest,' launching Facebook's first major federal investigation.
  • This led to a historic settlement imposing twenty years of privacy audits, though it didn't immediately slow Facebook's dominant trajectory.
  • The ramifications of this initial scrutiny would resurface dramatically during the platform's crisis period beginning in 2016.
