Adam Mosseri has a broad view of Facebook, its capabilities, and its challenges. And Facebook has a lot of challenges right now: the attack on the Capitol was at least partly driven by conspiracy theories and misinformation on social media; all of the major platforms have banned or restricted Donald Trump, prompting a reckoning about content moderation; the Federal Trade Commission and 48 attorneys general across the US have filed a major antitrust lawsuit that seeks to break Facebook up entirely; and there is new competition from TikTok, which has chipped away at Instagram’s cultural relevance in surprising ways.

Mosseri is the guest on this week’s TA'cast. Currently the Facebook executive in charge of Instagram, he started at Facebook over a decade ago as a designer, and he’s held a number of important roles since then — at one point he was responsible for Facebook’s News Feed and the algorithms behind it. He came to Instagram as head of product and then took over the platform when founders Kevin Systrom and Mike Krieger left in late 2018.

Mosseri and I talked about all these challenges, but of course we spent most of our time talking about how to run a creative platform like Instagram at scale while keeping users — and democracy — safe. Mosseri told me he agrees with the decision to ban President Trump from Facebook. He’s worried about the fallout and isn’t happy the decision had to be made, but he agrees with it. We also talked about how much responsibility the platforms have for what their ranking algorithms promote, and where he thinks the government should step in and set clear rules.

And, of course, we talked about Instagram as a product. Mosseri told me he’s not yet happy with how the TikTok competitor Reels is doing, and that there are too many video formats on Instagram between Reels, stories, the feed, and IGTV. It sounds like things are going to get simplified this year.

One thing that really jumped out at me about this conversation is how Mosseri is trying to manage a three-part feedback loop: Instagram not only gives people the creative tools to express themselves in new ways, but also a huge distribution platform with access to a massive audience — and that means Instagram also has to police what people share and what it amplifies. Pay attention to how the first half of this conversation is about content moderation and making certain things harder to share, and how the second half is about competing with TikTok and making video easier to create and share. The balance between the two feels like the hardest problem for social platforms to solve in the years to come.


This transcript has been lightly edited for clarity.

Social platforms are in the news in a lot of different ways lately, but I like to start by asking people for their decision-making frameworks. You have been at Facebook for 12 years. You’ve had a number of roles there; I’m assuming your framework has evolved over time. You’re now the head of Instagram. Give me a sense of how you make decisions and how that has changed as you’ve moved through your career at Facebook.

I think the first thing to put out there is I try and delegate as many decisions as I can. I think that a risk that anybody who runs a large organization has is becoming a bottleneck for decisions. And the truth is, the breadth of decisions we need to make at Instagram is so wide that there’s no way I can, one, be on top of all of them, and two, actually be deep enough in the details to make really informed or insightful decisions.

So, first and foremost, I try to delegate. I don’t think I did a particularly good job of that earlier in my career. I was a stereotypical early lead, micromanaging, passionate, hothead type.

But for the decisions I do make, I try and make sure I keep top of mind what our values are. I try to make sure I bias towards longer-term thinking. I think it’s really easy to get pulled into a more reactionary capacity. And I try to make sure I think about what Instagram is most focused on. Usually, the two groups that we care about most are young people and creators. So I often try to think through, “Is this decision going to benefit those groups or not?” Because we often put those groups first, whenever we can. So all those things go into making any given decision, but hopefully, I make fewer and fewer over time.

I’ve been paging through the Barack Obama memoir, and he says that one of the things he learned about being president is that by the time a decision gets to him, it’s an impossible decision. Because if there was a right answer, someone else would’ve made it, somewhere else in the government. But if it gets to him, it means there’s not a good choice, and he’s screwed. Is that how you feel? It seems like that’s what you’re describing, in a more positive way.

I definitely feel that way.

The decisions I make don’t have the same gravity that his did, but when I first became head of Instagram, one of the first things that surprised me was how often a decision would be escalated to me. There would be two people who I deeply trusted, who I thought were both brilliant, and just completely disagreed on something important, and I would end up having to make the call. So yes, I think that the more senior you get, the more likely that the decisions you have to make are really just deciding between what’s the least bad option. They’re almost never easy calls, but that’s the job.

The reason I leapt to the Obama comparison is we often think of Facebook as a state. It’s big. It touches a lot of the people in the world. It has an outsized power, and perceived power, over things that happen. And it’s often easy to think of [Facebook CEO] Mark Zuckerberg as a head of state — I think sometimes he carries himself that way. What is your relationship, as the head of Instagram, to the larger “state” of Facebook? Do you think of Instagram as a constituent part of that larger entity, in the way that Illinois is a constituent part of the United States? Or is it more separate?

Oh, there’s an analogy! Yeah, I could see that, maybe.

We’re obviously hesitant to liken ourselves to governments for obvious reasons, but I do think there are similar challenges. We have a large group of people, our community or constituency, that relies on us. Different groups have different values. They disagree. And we’re always trying to manage that and be responsive to our community, but also be true to our own values.

To connect your first and second question, Mark and I would like to move more decisions from us to actual governments. Decisions like what can and cannot be on our platform. So, pick an issue: inciting violence, hate speech, et cetera. It would be great if a lot of these things were covered by broad standards that were agreed upon and set by, if not every government in every country around the world, then at least a few of them, for organizations like ours. That would actually make more sense in a lot of ways.

But to try to answer your question directly — how does Instagram relate to the broader company? I think maybe you could think of us as a state. It doesn’t quite work because there’s a lot of people who use Instagram and use Facebook, too. And you can’t be in Illinois and in Florida at the same time, but we try and share as much as we can. For instance, on content policy, we have the same rules because it helps us keep more people safe and make fewer mistakes. To just destroy the analogy, there are some federal laws and rules, and there’s some local laws and rules, and that’s okay.

We try to share as much as we can because, honestly, the Instagram team is lean and mean, and we want to stay that way, but we want to get as much leverage from the broader company as we can.

Another Facebook company has been in the news this week: there was a big controversy with WhatsApp and its privacy policy changes around messaging. People are unhappy. They’ve delayed the rollout [of the new policy].

Instagram has already merged its messaging with Facebook. How has that gone? Are people reacting to it?

Not in the same way. But actually, that’s been going relatively well. To be clear on WhatsApp, they changed their terms of service. I don’t think we effectively communicated what was changing. Privacy did not change in terms of your messages, your personal messages, or your messages between friends.

The only real substantive change was how data is stored when you message an official business account. So there’s a bunch of misinformation flying around. The irony is not lost on me. It’s a nice, humbling meta moment for us. But I think the real issue wasn’t just that we didn’t communicate well, it was that we didn’t make it clear what the value was in the moment.

I think that when you make a change, people are going to be nervous, but any large app is going to have to change. Otherwise, they become irrelevant over time. The people who use it a lot and care about it a lot get upset when things change. Change is uncomfortable.

So when you do make a change, you need to make sure that the value is immediately evident. So when we started to merge [Instagram] with Messenger’s experience, it became clear that there was value because suddenly you had all of these features that hadn’t come to Instagram yet. You could reply to messages as opposed to just having a linear chat. You had a bunch of new reactions and customization tools, vanishing mode, et cetera. So there was a lot of evident value. People, by and large, seem to really enjoy it and that’s been great.

Whereas when WhatsApp changed its terms of service this week, it was not clearly articulated and there was no immediately evident value. That’s a bad combination. That’s a recipe for misinformation. So that’s how we got [there]. We’re now scrambling to correct the record.

One of the things I’ve actually heard Mark [Zuckerberg] say in the past is [there are] more people at Facebook working on trust and safety than work on Instagram. There’s a bigger team working on policy issues across the company than working on Instagram. How big is Instagram? How many people work at Instagram?

It really depends on how you count, because there’s a lot of people who work on Instagram some of their time, but not all of their time. There’s people who report to me directly. There’s people who report in a dotted line to me — but none of that really matters.

When I joined Instagram, I wasn’t running it. I was the head of product. I had come from running News Feed at Facebook for a number of years. And I told everybody I was going to be a sponge, and I wasn’t going to push for any change for a couple months while I ramped up and tried to better understand Instagram, the product, the employee base, and the values.

But the one place where I almost immediately broke my promise was on safety and integrity. I was pretty interested in the details, having spent the last couple of years being responsible for fake news on Facebook and a bunch of other gnarly safety problems.

I found for the most part that [Instagram was] just running our own stuff, and our team was tiny. And so I made the team pivot and essentially [integrate] the technology and work with the engineers who worked on safety across the rest of the company. I actually lost a bunch of people because of that. Not because they disagreed that it was a better way to keep people safe on Instagram over the long run, but it just wasn’t why they signed up to be on the team. So it was pretty painful for six months on that front. And I lost some credibility with some of the people.

But now safety and integrity is one of our strongest teams. So I’m proud of that. I think it’s good. But otherwise, yeah, we are lean and mean, just a few thousand people. Which is a ton of people, but not a lot considering that over a billion people use Instagram.

The last public number you’ve given for Instagram’s user base, two years ago, was 1 billion users. You were very proud of the milestone. You just said, “over a billion.” What is the latest number?

I can’t share the latest numbers, as much as I’d love to beat our chest about our growth. I will say I’ve been really excited about our momentum this past year, particularly in terms of how fast we’ve grown. Most of our growth is not in the US. Our bubble is probably very US-biased, but for all these platforms like ours, most of the growth is outside of the country in which we’re headquartered.

Where’s your fastest growth?

India is probably the fastest right now. India is coming online fast. You’re seeing hundreds of millions of people coming online over the next couple of years. The access to data is going up. The cost of data is going down.

Reliance Jio is a carrier out there, for those who are not familiar. They basically offered free data for half a year, or maybe even more, which created a price war between all the carriers. That was a fun story because the man behind Jio has a brother who ran another carrier, and it basically became a family feud. There’s all sorts of fun drama.

So we’ve seen an immense amount of momentum there, especially as we started to focus more on our Android experience a couple years ago.

Fastest growth in India. You’re over a billion. Are you under 2 billion?

I can’t say. I love how leading that is.

I like to start with a bunch of frameworks, because I like to see how those frameworks are expressed in complicated decisions. Instagram is growing really fast. It’s part of this larger entity, Facebook, that has controversy associated with it all the time.

Specific to Instagram, you have rolled out a lot of new features. You have a big competitor in TikTok. There is the Facebook antitrust lawsuit, and there is a content moderation debate in the United States that feels like it’s at a tipping point.

I think we should start there because I think it feeds into almost everything else that we will talk about. Just give people a sense, right now: what is the status of Trump’s Instagram presence, Trump’s Facebook relationship, and what happens next?

We have no plans to reinstate his account on any of the platforms, but if I’m honest, we’re less focused on a specific account and more focused on making sure that we do everything we can to avoid our platforms being used to coordinate violence at the inauguration coming up in a couple of days. I mean, there was a violent mob attack on the Capitol last week. So we’re very focused on doing everything we can, changing our policies, updating our enforcement, building products, and just trying to reduce the likelihood that anything is going on [on] our platform. That’s the primary focus. Once we get through the next week or two, then I think we can pop back up and talk about the president’s account, but we don’t have any plans right now to reinstate him.

How much of your time do you spend on moderation decisions and policy decisions?

Less than I used to because I pushed so hard to centralize most of that work. Now we draft as much as we can off of Facebook company policy decisions. Sometimes I have to dive into the details because there are things that are different about Instagram. For instance, we have parts of the app like Explore where you just have a grid of photos. Facebook doesn’t really have that.

A lot of the content policies assume that you can see all the context around a piece of content, including the caption, but in Explore we’re showing photos without the caption in that context. So your policies around self-injury have to be different, for instance. You might talk about self-injury and show a scar and say, “I’m 40 days clean.” That’s a way of getting support and celebrating your journey and your safety, but if you show a scar without that context, it might not be okay.

So I get involved from time to time, often trying to make sure that our policies — that have often been written for Facebook first because of its age and size — are adapted, appropriate, and responsible in the Instagram context, which has some important differences.

Were you in the room when the decision around Trump was made?

Well, there are no rooms anymore, so it’s easy to say no to that [laughs]. Ultimately, this was a Facebook company call, not an Instagram call, and I think that’s the right thing.

You just got an email like, “It’s done,” and you’re like, “Okay, moving on.”

No, no. [Laughs] It wasn’t quite that simple. I would be okay with that. [But] I agree with the decision, is the more important thing. I’m really worried about it, I’m not happy about it, but I definitely agree with it.

Sheryl Sandberg gave a quote at a Reuters conference, where she said the attack was not planned on the Facebook platform. And then I heard from a lot of people at Facebook and Facebook companies saying, “What? We’re spending all of our time trying to tamp this down. It is happening here. It did happen here. We’re working on it.” You just said you’re working on it. Have you tamped it down to zero? Or are you seeing it pop up, and you’re trying to push it down?

It’s impossible to tamp anything down to zero. There’s over a billion people on Instagram, and there’s billions of people on our other apps between WhatsApp, Facebook, and Messenger. If success is perfection, where nothing bad happens on the platform, then we’re always going to end up failing because at some level, social media broadly, and messaging apps and technology, are a reflection of humanity. We communicated offline, and all of a sudden now we’re also communicating online. Because we’re communicating online, we can see some of the ugly and gnarly things we missed before. Some of the great and wonderful things, too.

But there’s never going to be perfection there. Now, if you read Sheryl’s full quote, she said that most of the coordination was happening on platforms other than ours, because of the work that we’ve done. She got hit pretty bad for that statement on Twitter, but you should look at the whole conversation. The conversation pretty quickly moved from “Was the mob coordinated on the platform?” to “Were people radicalized on the platform and then they participated in this thing?”

That is an important, subtle shift. From where I sit, it’s important that we take a look at whether social media radicalizes people. And if so, what can we do about it? I think that social media isn’t good or bad, like any technology, it just is. But social media is specifically a great amplifier. It can amplify good and bad. It’s our responsibility to make sure that we amplify more good and less bad.

But it’s not just about social media. If we’re going to talk about the events of the last week, why do conspiracy theories and these radical right fringe narratives appeal to so many people? It can’t be completely blamed on social media. Yes, I think we have to look at ourselves and make sure that we’re being thoughtful, and that we’re not making anything worse, but why do people trust the government in this country so little that those narratives are resonating in the first place? That’s a bigger societal problem than just any one platform or industry.

So I think it’s important to look at the whole picture. Now, if you work in tech, you’re under a ton of scrutiny, and you have been for years. I think that’s fundamentally a good thing. It’s super uncomfortable and unpleasant at times, but at the end of the day, you learn, you weed out the noise, you focus on the signal, and you get better. I think we should also scrutinize the whole system because there’s a lot that’s broken right now. And it’s not just technology.

I think that conspiracy theories have always been part of the fabric of American culture, but rarely have they felt so dominant, and very rarely have they expressed themselves in such a mass acceptance of violence. But I can’t look back and say, “Well, social media played only a small role in that.” It feels like the amplification power of social media, not just on your platform, but on every platform — YouTube, Twitter, what have you — has fed into that in a pretty real way.

So one of the questions, broadly, is how much responsibility should a platform like yours take for what it amplifies? There’s a pretty unsophisticated debate about the algorithm, but Facebook runs on algorithmic amplification. You used to run the News Feed. I always just imagined that you had a huge board full of dials in front of you…

Yeah, just twisting them as fast as we can.

Turn ThinkAuthority up some days, and we’d get a lot of traffic, and you’d turn it down some other days.

Yeah. [Laughs]

How much responsibility do you feel for what the algorithm promotes and what it amplifies?

I’ll say a few different things and then I’ll answer your question more directly. When we talk about responsibility, particularly with regards to amplification, algorithmic transparency, and algorithmic bias, the thing that I care about most in the US is polarization. I think what we’re seeing is the country becoming increasingly polarized year after year.

By the way, that trend dates as far back as we’ve measured polarization, long before technology and the internet, 60 to 70 years at this point. Now, can technology make that worse? Sure. So we have to make sure that we’re not doing our part to make it worse, but I think that there’s a lot of things at play. I actually think Ezra Klein’s book, Why We’re Polarized, is probably the best book I’ve read on the issue, though I’m still trying to read as much as I can find.

For our responsibility, when it comes to ranking, I still really believe in it. I think ranking is a really good way of making the most of people’s time. There’s way more out there than you and I could consume on a given day.

And ranking means when you open up Facebook or Instagram and the algorithm ranks what you see?

I like to call algorithms “ranking” because “algorithm” just sounds like this robot that has its own agency, making decisions behind the scenes like the Wizard of Oz. And I also feel like companies hide behind algorithms as a way to absolve responsibility, where really all an algorithm does is optimize for what you build it to optimize for. So you’re responsible for being transparent about that and making sure that you’re being thoughtful and responsible in what you’re doing. I say “ranking” because I try to make sure that we are not separating the work and our own responsibility.

What we do is we try to look at all the things you could see because you followed someone on Instagram, and we sort them by how interested we think you might be in them. Recency is an important input, but not the only input. This means that if my sister got engaged, and she lives in Europe by the way, that’s probably more interesting to me than if my brother ate a po’boy, even if he posted it a minute ago. So we’ll show my sister’s posts first, hopefully over my brother’s.

Now, that has consequences, right? That affects what people see. And that means that we are expressing some sort of value judgment. We are valuing relevancy, for lack of a better concept. And we have to be careful because we can’t really know what matters to you or what’s relevant to you, so we have to use proxies, and those proxies can be gamed and they can lead to problematic outcomes.

I think there’s a responsibility there, but it’s not at a content level. It’s not about deciding what to say or what to write. What issue is the most important? What news stories are the most important on a given day? It’s how you build what you build where you have to be responsible. And it’s the outcomes [that] you have to be thoughtful about as well. It’s more complicated than if we were a newspaper and we were figuring out what’s on the front page. But there’s a lot of responsibility and I don’t want to shy away from that.

One of the things you said about your decision-making framework, you said you take into account young people and creators, and you also said you think about the values of Instagram. And one of the things that I wonder about is — big Silicon Valley companies, they have a very clear set of internal values, right? You want your team to be inclusive, you want to be diverse — internal company values are pretty uniform. Do you think your product expresses those values? Because the product reflects the user base in a different way, right?

I think it does. It’s just more indirect. So, we value speech, obviously. We take a lot of flak for defending things we don’t agree with, and people’s right to say those things. We value safety, particularly around creating an environment where people feel safe to express themselves. Those things are clearly in tension at times and we have to navigate that tension.

At Instagram specifically, we value young people. We value creators. We value visual expression. We value creativity. We value simplicity. We value craft. I think those things are all reflected in the product to a certain degree. Does that mean that everything in the product aligns with those values completely? Absolutely not. There’s all sorts of content on Instagram and people on Instagram who have different values, and their experiences are reflected in those values.

But I do think some of those threads are widely there across the platform. For instance, we don’t support links or text posts in the feed, and that’s because we’re focused more on visual communication. Everyone’s feeds are more visual and less text-heavy. Is that good or bad? I don’t think it’s really either, it’s just what Instagram is. It’s us trying to differentiate from Facebook and from Twitter and from other platforms.

But it’s hard to wrap your head around what it is as a platform because it’s personalized, which is another value of ours. We value personalization. So your Instagram is going to be very different than mine, and that’s okay, but that does make it harder to really put your finger on what exactly Instagram is. It’s a different thing for everybody who uses it.

Jack Dorsey, the CEO of Twitter, put up a tweet thread saying he thought the decision to ban Trump was correct based on the threats to physical safety that Twitter had. I read that thread as feeling kind of angsty and kind of worried about what happens next. And that’s kind of where you are too?

I’m more explicit about my concerns, maybe. I’m definitely worried about what happens next. Look, last week was a pretty extraordinary situation. We had a sitting president incite a riot and point a mob at the Capitol building to try and prevent the peaceful transfer of power to a democratically elected official. We care about free speech a lot. We take a lot of criticism about that, but caring about free speech only really works on a foundation of democracy. That’s kind of the context in which that sits.

So when democracy itself is attacked, that is a big deal and kind of changes the game for us. So we’re being very intense this week about pulling levers and trying to make sure that we do everything we can to reduce the likelihood of any more violence, particularly directed at undermining democracy. We obviously changed our policies around “stop the steal” and how we designate that term, but we’ve got more stuff coming out next week.

This has been a huge focus of ours and a lot of what we’ve talked about at the leadership level at the company over the last week. I think it’s shedding a bunch of light on the fact that platforms like ours have a lot of power. And I think that scares people. I think that’s reasonable.

You also saw a bunch of other companies do things. To see Amazon, Apple, and Google essentially take out Parler further down the stack is a whole other type of content decision. I think if there’s a silver lining, it’s that we’ll hopefully talk about where it is appropriate for companies to make decisions, and where there should be regulation. Hopefully that debate will be more informed because of this, but this is going to bring an immense amount of scrutiny on the whole industry and I think it’s going to be really painful for all of us. Plus, I just think it’s a dark day when you have social media companies having to take down the president’s accounts. It’s just bad news for everybody, honestly.

You said the leadership of Facebook and you are focused on the next week and the violence that might occur at the inauguration. Does it feel like a palpable shift in your sense of responsibility, or Facebook’s sense of responsibility, to democracy?

I think that we’ve always cared a lot about democracy and about safety. I think that the big shift was really in the wake of the 2016 presidential election with all that scrutiny that we were under. I think we did work to try to protect that election, but we were focused on the wrong things. And I think that in general, one of our biggest mistakes is that we didn’t get as serious as we should have, as early as we should have, around safety and integrity.

Any new startup isn’t going to focus on safety issues. You’re just trying to make sure what you have works. When I joined Facebook, we were just trying to catch up with MySpace — that’s how long I’ve been here — but at some point you get to a scale where your responsibility gets significant enough that you need to really embrace safety, and I think we should’ve done that years before we did.

Over the last four years, we’ve done a lot and I’m proud of that work. Are we where we need to be? No, this work never ends. I think the answer to that question is always going to be no, but we’ve made an immense amount of progress.

I think that one of the reasons why we could react as quickly as we did is because we built up a bunch of processes, technologies, and guidelines to sort of react on the fly. So I don’t think this marks a shift in how much responsibility we feel. I do think this is going to mark a big shift in how people think about — and how governments, regulators, and policymakers think about — technology, content moderation, and safety online.

I was talking to a friend of mine who runs a trust and safety team for a much smaller platform. And what struck me about it is just how sad she feels about this inflection point. I’m just going to read what she wrote to me, and I’m curious about your reaction.

“We all want to run creative platforms and now we literally have to fight Nazis all day, and they’re claiming to be free speech champions.” What she was expressing was [that] she began her career feeling like a defender of free speech and free expression, and now she’s in the opposite mode.

Do you have that same feeling of sadness or angst that all the processes you’re describing are designed to clamp down, even though what you’re trying to protect is the core of expression?

Not quite the same feeling, but having spent a lot of time on these issues myself, I think that the tone of being deflated, maybe, resonates more specifically than that idea.

I think that we’re going to continue to try and bias towards allowing speech on our platforms. I’m sure we’re going to be criticized for that. I think that’s okay. I think at the end of the day, we have to do what we think is right and best.

I think that there’s a lot of real good that comes out of giving people a voice and allowing people to express themselves. And we don’t talk about that as much because it’s not really good news, but I think it’s important. I try to keep focus on that.

I’m from a family of artists, in some way. My mom is an architect, my brother’s a musician, my sister’s a furniture designer. I used to be a designer, even though I wasn’t that good at it. Instagram tries to be a platform where creators can really express themselves, and I think we’ve got a lot of strength with visual creatives particularly. I want to make sure that we don’t lose sight of that. Yes, we need to identify problems and address them, but we also need to not forget that we’re here to help people connect with those they love and be inspired by the world around them. We’re not just here to find hate speech and take it off the platform, though that is a responsibility that we have to live up to as well.

The thing that honestly jumps to my mind though, as you were reading her quote, is just, when you build these safety and integrity teams, you often move people from other teams onto them, even if they signed up to build a new version of Events or make a cool AR filter. It’s a different mindset. It’s a different skill set. It’s a different approach entirely.

So I’ve tried to make sure that my teams all bring a bit more of an adversarial mindset to what they do, so when they build something, they think about not only how it can be used, but how it can be abused. For the safety and integrity team specifically, I’ve tried to hire people who are passionate about keeping people safe and who have those specific skill sets and understanding of the intersection of policy and technology, because you have to build sustainable teams that are going to stay around forever. You don’t finish these problems, right? The people who are trying to abuse platforms change their tactics as you shut down vectors for abuse. So you have to build teams that can be permanent. And you’ve got to find people who get energy from this work rather than get deflated.

We’ve talked about lawmakers a few times and the sense that regulation is coming. What kind of help do you think you need from governments, particularly the United States government, and what are you anticipating now that there’s a new administration?

I don’t know. It’d be interesting to see what the new administration focuses on now that the Democrats are going to control both houses. My sense is they’re mostly focused on the pandemic right now and vaccine distribution, and I think that’s good. We want to work closely with regulators wherever we can. We think that they have a really important role to play. And we can, in certain cases, collaborate in really productive ways as well.

The one I’m particularly interested in is content and having there be clear guidelines about what is and what is not allowed. You have to be careful because if you go too far, you start censoring people. If you don’t go far enough, you have safety issues. It’s a really delicate balance there.

The intersection of what the new administration is focused on and where I feel like we have an opportunity, is definitely the vaccine and the pandemic. We’ve tried to do a lot to support people in these times, pushing them to get information when it wasn’t clear what was going on, encouraging people to stay home, accelerating a bunch of commerce tools to try and create support for small businesses that are really suffering. I think that those opportunities and challenges are going to change over the next year.

I’m not an expert on this, but it seems pretty clear that the vaccine is distribution-constrained, not production-constrained. That’ll probably change as we get it together. And then eventually we’ll probably be constrained by the number of people who won’t even get the vaccine, right? A bunch of people are worried about it who aren’t necessarily anti-vaxxers. They’re nervous.

So what can we do at Instagram and Facebook? And what are the risks that are going to change as we go through these different phases? How can we help people get good information and find places to get vaccinated? How can we deal with the inevitable misinformation that comes up? How do we deal with more difficult things, like real information that is encouraging people not to take the vaccine? For instance, these things are stored at really low temperatures. Something’s probably going to go wrong somewhere because someone’s going to open the door of the freezer truck too soon, and then someone’s going to get sick. Then there’s going to be a bunch of coverage of that. And that’s going to be real information, but it’s going to be abused. What do we do in that state?

Right now we’re trying to think through all of these potential opportunities and issues, and figure out how we can look around some corners and get ready to do our part. We’re really focused right now on the transfer of power here in the US, and that makes sense; but there’s a global pandemic out there. There’s people dying. There’s businesses going under. LA is in terrible shape. London’s in terrible shape. So we want to make sure we don’t lose sight of that broader picture and our responsibilities and opportunities to be part of the solution.

Let me ask you a really threshold, sort of mechanical question. How deeply can you hash meaning out of a post or video on Instagram? Say I post an Instagram story saying, “Hey, everybody, the freezer door at the local hospital was left open overnight. Don’t go get vaccinated today.” Can you extract value from that and understand what I’m saying and know when and how to promote it?

The short answer is no.

I think that we can definitely try to understand things about a video or a photo, but the truth is our systems are way less sophisticated than most people think they are. Most of how ranking works is based on taking a look at what you do. So you tend to like photos; this is a photo. You tend not to like Adam’s photos, so we’re going to rank this one lower. Most of it’s that simple. I mean, even in ads, people focus on targeting a lot, but most of it is based on what ads you clicked before.

And then we do a lot of recommendations, so things like Explore, where you’ve seen something not because you follow an account, but because you just showed up in a space and we’re recommending something. A lot of that’s based on what’s called collaborative filtering. Basically, say you like surfing videos. We don’t even know that they’re surfing videos, you just like watching these videos that happen to be about surfing. Here, we’re looking to find a bunch of other people who like those videos. What other videos did they like? We fan out from there.
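Mosseri’s description of collaborative filtering, finding users with overlapping likes and fanning out to what else they liked, can be sketched in a few lines. The user names and video IDs below are hypothetical illustrations; a real system works at vastly larger scale, but the key point holds: nothing here knows what the videos are about.

```python
# Minimal sketch of item recommendation via collaborative filtering,
# as described above: no semantic understanding of the videos, only
# co-engagement. Users and video IDs are made up for illustration.
from collections import Counter

# Who liked which videos (the IDs carry no meaning to the system).
likes = {
    "alice": {"v1", "v2", "v3"},   # these happen to be surfing videos
    "bob":   {"v2", "v3", "v4"},
    "carol": {"v1", "v3", "v5"},
    "dave":  {"v6", "v7"},
}

def recommend(user, likes, top_n=2):
    """Fan out: find users whose likes overlap with `user`'s, then
    count what else those similar users liked that `user` hasn't seen."""
    seen = likes[user]
    scores = Counter()
    for other, other_likes in likes.items():
        if other == user:
            continue
        overlap = len(seen & other_likes)
        if overlap == 0:
            continue  # no shared-taste signal from this user
        for video in other_likes - seen:
            scores[video] += overlap  # weight by degree of similarity
    return [video for video, _ in scores.most_common(top_n)]

# alice gets v4 and v5: the unseen videos liked by the users
# (bob, carol) whose tastes overlap hers.
```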

We’re trying to build up [a] more sophisticated or semantic understanding of content, but that’s usually at the topic level. Is this about baking, or is this about soccer, or is this about politics? Honestly, most of how ranking works is much simpler and less sophisticated than that.

It’s funny how the ghost in the machine always seems a lot smarter [than reality]. That’s how I always think about ad targeting. But your semantic understanding of the content is not at the level where, I literally post a video to the grid being like, “vaccines are available,” and that you can understand it and promote that specifically.

The reason I ask is on the flip side. QAnon is an idea. It’s not a set of keywords like “stop the steal.” It’s not even a specific set of images. How do you approach something like that, which is a lot more diffuse if you can’t semantically read into the posts?

It’s a good question. I think that one of the things I want to call out is that all these systems that you build to try and address a problem like QAnon — or to try and just make the feed more interesting — they are almost always a hybrid between technology and people. And often these things are put in opposition to each other. I was asked to speak at a panel about that and that’s how they framed it.

But the truth is, technology is good at certain things and bad at others. It’s particularly good at scale. There’s a lot of people on Instagram doing a lot of things. So we need technology to meet scale, and people are better at nuance.

We rely on people to make a lot of content decisions about what to take down or what not to take down. Whatever your goal might be, whether it’s to make the feed more interesting or to remove hate speech, achieving it means leveraging both: [you] leverage technology and leverage people for their strengths to get the best possible outcome. Almost everything is always a combination.

For instance, let’s say we wanted to build a sports section in Explore, and you wanted to automatically populate that with all the cool sports content that you might like. You first have to define “sports,” which is harder than it sounds. Then you’d have to write a bunch of guidelines for [human] labelers to be able to implement that consistently. Then you would have them label tons of things, tens of thousands of images, and say, “This is sports. This is not sports. This is sports. This is not sports.” You double check and make sure that they’re labeling well. Then you’d end up with this big, what we would call a dataset, a bunch of examples of images with labels. Then you go write a bunch of code to try and figure out how to do that automatically. And you train that code on that dataset. Then you start to run that code and you say, “All right, here’s a new image, classifier.” And then [the] classifier would usually spit back: “This is 70 percent likely to be sports.”

Then you would evaluate the code with people again. You would have people say how right and how wrong the code was, so that you could then give the feedback back to the engineers and change the code. So it’s people at the beginning and at the end, and code in the middle. And around and around you go. It’s always both.
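The loop Mosseri walks through, human labeling, training on the dataset, a classifier emitting a probability, then human evaluation feeding back in, can be sketched end to end. The word-counting “model” below is a deliberately trivial, hypothetical stand-in for a real image classifier, and the captions and labels are invented, but the shape of the pipeline is the same: people at the beginning and at the end, code in the middle.

```python
# Toy version of the label -> train -> classify -> evaluate loop
# described above. All examples and the scoring method are
# hypothetical stand-ins for a real labeling pipeline and model.
from collections import Counter

# 1. Human labelers produce the dataset: (caption, is_sports).
dataset = [
    ("goal in the final minute", True),
    ("match highlights tonight", True),
    ("goal celebration on the pitch", True),
    ("sourdough fresh from the oven", False),
    ("new kitchen mixer review", False),
]

def train(dataset):
    """2. 'Write code and train it on that dataset': here, just count
    how often each word appears in sports vs. non-sports examples."""
    sports, other = Counter(), Counter()
    for text, label in dataset:
        (sports if label else other).update(text.split())
    return sports, other

def classify(model, text):
    """3. The classifier spits back a probability, e.g.
    'this is 70 percent likely to be sports'."""
    sports, other = model
    s = sum(sports[word] for word in text.split())
    o = sum(other[word] for word in text.split())
    if s + o == 0:
        return 0.5  # no evidence either way
    return s / (s + o)

model = train(dataset)
p = classify(model, "late goal wins the match")
# 4. Humans review outputs like this, score how right or wrong the
# code was, and the corrections feed back into the dataset --
# "around and around you go".
```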

To get to your question more specifically about QAnon, it’s tough, because the context is even more important. It’s easier to build a classifier for something like nudity. It’s not as easy as it should be, of course — you’d think a nipple is a nipple, but a baby photo can get caught up all the time. But that’s a lot easier than building a classifier for something like hate speech. What if I said something that might be inappropriate? And if you said it, it might not be, based on who we are or what context in which we said it. That doesn’t mean it’s impossible. It just means it takes longer, and you’re going to make more mistakes.

But QAnon is an idea. You can look for the word, the letters QAnon, but context is super important, which is why it’s more difficult to identify an idea at scale.

Let’s talk about Instagram as a product. You’ve launched a bunch of new features. You’ve launched Reels. You’ve launched IGTV. There’s a bunch of shopping features.

I want to start with Reels. Reels launched first in, I believe, India. You brought it to the United States. Right now, it feels like TikTok is the center of the cultural conversation in a huge way. It’s where the dances come from. Half of American teenagers are singing sea shanties this week. Is Reels on the path to compete with it? Are you happy with it?

No, I’m not yet happy with it.

Reels has momentum. We’re growing both in terms of how much people are sharing and how much people are consuming, but we have a long way to go. And we have to be honest that TikTok is ahead. They get a lot of credit for really pioneering the format, and we’re still mostly focused on table stakes, making sure that it’s performing, it’s reliable, that there are good creative tools, that we’re decent at ranking content or video.

I think there’s a lot of interesting ways in which we can differentiate over time. And I’m excited about those probably coming later this year, but right now we’re mostly focused on what I would call table stakes.

Do you see the sort of flood of re-uploaded TikToks on Reels as good, bad, or neutral?

I get excited every time I see a creator switch from re-uploading TikToks to Reels to doing them natively. And I’ve seen more and more over time, so I appreciate that. I think, like you said, a lot of culture happens on TikTok, but I think a lot of culture also happens on Instagram. Culture isn’t just videos. It’s all sorts of important things in what we would think of as emerging culture and new culture. We’re making a lot more progress in certain countries than others, but we are growing around the world.

Give me a status update on IGTV, which was the previous big launch. I think you’re now incentivizing creators directly by paying them to make longer IGTV videos. How’s that going?

IGTV, and more broadly video, is doing well. There’s a lot of demand for video.

I do think that it’s not clear to most people what IGTV is. And to date, IGTV has really just been longer Instagram videos. That’s probably too nuanced a distinction to resonate with anybody, so we’re looking at how we can — not just with IGTV, but across all of Instagram — simplify and consolidate ideas, because last year we placed a lot of new bets. I think this year we have to go back to our focus on simplicity and craft.

But overall, video is doing really well, particularly during the pandemic, but in general it’s been growing across all major platforms for years now. I think the shift to video is, in some ways, as important as the shift to mobile.

When you say “consolidate the interface” — there’s a lot of Instagram now. There’s the grid, there’s Stories, there’s IGTV, there’s Reels. As I’ve talked to you and other folks at Instagram before, there’s a lot of pride in unshipping features and keeping the app lean. Would you unship IGTV and say, “We’re just not doing this thing over there.” Is the conversation that broad?

I mean, we would consider things like that. I don’t think we’re going to unship IGTV, though. I think that we just need to evolve it. Right now, there’s too many different types of video, and I think the distinctions aren’t that important. There are other things, too. I think the profile is too complicated. There’s a bunch of cleanup we need to do with the nav change.

There’s things to clean up, and I think that we take pride in removing things, even though we do inevitably make someone angry. You should see my DMs after we removed the Following tab in Activity. It was just so much hate for six weeks, but I think it’s good practice.

Some of the core features of TikTok kind of fit into the content moderation conversation, right? The joy of the platform comes from emergent behaviors from users. You would never write a policy that predicts teenagers in America are going to do sea shanties for a week. You can’t do that.

Can’t do it.

But the product elements of TikTok that have allowed that to happen are things like Duet and Stitch, right? Is that kind of thing the stuff you’re looking at building in Reels?

This is the space of creative tools. So I think what you’re trying to do is build new tools, to allow people to take those tools and then turn them into creative ways of storytelling that you might not have expected. That’s kind of hard, because you don’t know what you’re going to get at the outset, but we’ve seen this happen. We have this app called Threads, which is just focused on sharing with your close friends. It automatically captions your videos and writes them out.

And it just happens to bleep out curses, both in the text and in the audio. It blew up over Thanksgiving, because a bunch of TikTok stars thought it was cool. And literally Threads was No. 1 in the App Store on iOS in the US on Thanksgiving Day, because people were excited about that tool as a funny way of making funny videos.

So that, I think, is exciting. I think we have to get better at building more powerful and creative tools that aren’t necessarily a meme or a sort of moment in a package, but give people who are more creative than us and make content for a living the ability to make something that’s going to pop.

One of the things with Instagram creators in general is that their ability to monetize has expressed itself in many different ways, but there’s no native monetization on Instagram the way there is with YouTube, or sort of with TikTok and their fund. I think Taylor Lorenz had a story in the [New York] Times today, that Snapchat is just giving lottery money out to people who go viral. Have you thought about native monetization for Instagram influencers?

I think there’s a bunch of different ways in which we can help creators monetize. They really fall into three buckets. The first is commerce. I think there’s multiple pieces to commerce. There’s branded content, which by the way, is the economic engine behind most of the creator ecosystem today. But there’s also affiliate marketing, and there’s merchandise. I think there’s a bunch of interesting stuff we can do across all three of those, but that’s all within just the bucket of commerce.

Then there’s “user pay” products, so things like tipping, subscriptions, or exclusive content, and I think that’s pretty exciting. I kind of like [it] a little bit more because maybe it’s not as big an industry, but it doesn’t feel like a tax. When Steph Curry plugs a Brita water filter, that kind of feels like a tax on the experience, whereas if I could pay Steph for some exclusive content, that feels more like a win.

Then there’s the third bucket, which is just revenue share. That is most relevant for video creators, but I think there’s other things that we could do for non-video creators too. I want to make sure that we build meaningful services across all of those buckets, because if we want to be the No. 1 place for creators, we need to make sure that we offer a suite of services that they find meaningful and valuable as opposed to just one type of unstable value, which is distribution. I don’t want to have our eggs in one basket.

Instagram feels like the platform that is trying out the most modes of creative expression. There’s the grid, which is the Instagram core product, which is what drove the growth at the beginning, but then there’s Stories — LinkedIn has Stories now.

Pinterest has Stories.

It’s ridiculous. Stories, it’s like the new away message or something. You just have to have it. But then—

There’s Fleets now—

I’ve never opened a Fleet. I’ll put that out there. I don’t think I ever will.

But then you’ve also got IGTV, which is kind of in the YouTube zone, and now you’ve got Reels. They’re just kind of in the TikTok zone. Can you have it all? Can you do all of those things? Because the biggest, most successful platforms, Instagram included, have driven the majority of their growth with one focused sharing dynamic.

I think you run the risk of spreading yourself too thin and not doing anything that well, if you try to do too many things. That’s something that we worry about, which is another reason why I think it’s important that we look for opportunities to consolidate ideas and products.

But I do think there’s a lot of examples of products that have been really successful despite the fact that they do way more things. The Facebook app is much bigger than Instagram and it does a lot of things. But if you look towards Asia, you see a bunch of apps like WeChat from Tencent that do even more things than Facebook and are incredibly successful. Maybe Asia is different or maybe China’s different in that specific example, or maybe it’s a bellwether. It’s hard to say.

I don’t want us to do too many things, to be clear. I think that over time, we’re going to differentiate by focusing on fewer things. I think we’re going to focus on visual communication and we’re going to focus on commerce. In terms of the audience, we’re going to focus on young people and we’re going to focus on creators. We’re looking for opportunities to do less and do it better. But last year was, [in] a lot of ways, reacting to the world and placing a bunch of new bets. This year has to be about delivering on those commitments and simplifying the experience.

I lifted the “social platforms do one thing well, and they drive the growth” strategy from a guy named Mark Zuckerberg, who wrote that in an email to his board members when he was thinking about buying Instagram. He said, “We’ve got to buy a new sharing dynamic, and Instagram is what I want to buy.” That has led to an antitrust lawsuit [from] 48 states [and] the federal government.

I’m curious for your read on that, but I want to ask just a much simpler, abstract question. We started this conversation by talking about TikTok and whether you can be as relevant as TikTok. [So] there’s obviously some competition in the market.

If I went to the CEO of Ford and said, “I don’t like the F-150,” he’d be like, “Great.” Even though it’s the best-selling vehicle for 40-some years, he’d be like, “Great, go buy a Chevy,” and that would be the end of that conversation. Why doesn’t it feel like we can say that to Facebook and Instagram?

I think it’s because of a few things. For a bunch of people or a bunch of different groups of people, social media is just deeply involved in your life or what you do. So if you are a politician, a big part of engaging with your constituency now is on social media. That is a shift in power that is probably fundamentally uncomfortable. If you’re in the media or in the news industry, obviously the internet has turned the business models of the news industry on their head. A lot of news is distributed through platforms like Facebook and Twitter. There’s a shift in power there. So it’s not something you’re going to let go.

If you’re just a normal person, you use Instagram or iMessage or Snapchat to connect in a very personal way with the people who matter most to you in your life. So it’s a very intimate involvement. It’s harder to step away and just say, “I’m just going to buy a Silverado and not an F-150,” and I think that’s the emotional side.

The rational side is, look, we’re large. There’s a lot of people who use our platform. That means that we have a lot of responsibility, and it’s important that we, as a society, scrutinize that and figure out what the right regulation is and what the right long-term healthy state is.

But I think this is not a new story. Every new technology has gone through these waves. First, there’s elation. Everyone’s like, “This is awesome and new,” and then everyone is freaked out about it for a while. Then you get to some sort of stable state. That happened with VHS. That happened with writing. That happened with bicycles. So we’re just in that phase and I think that’s uncomfortable, but fundamentally healthy.

There’s a great book called Brilliant: The Evolution of Artificial Light, with a long section about the controversy over the color temperature of lighting when electric lights came out. I think about that a lot.

Last time on TA'cast, I talked to Daphne Keller about platform regulation and content moderation. She pointed out to me over and over again, that the content moderation debate is deeply connected to the competition debate. Outside of the antitrust lawsuit, do you think there’s enough competition for Facebook, for Instagram, and for your services? Are you too big or do you feel the pressures of competition?

I think we have a ton of competition. Our core reason to be is to connect people with their friends. That’s the primary reason why people use Instagram. They stay to be inspired, but if you look at the research, one of the core use cases is just sharing with your close friends. Most of the growth in that business is in messaging. So I think that messaging presents an existential threat to broadcast-based social media products like Feeds or Stories.

How are you looking at the explosive rise in Signal this week? Is that on your radar?

Signal is definitely on our radar. Telegram is way bigger than Signal. iMessage is way bigger than Telegram. There’s just a ton of competition in the messaging space, particularly here in the US, and I think that’s scary. We’re trying to figure out how we can offer the most compelling messaging product that we can, given products like iMessage that are installed by default. They don’t have to ask for your permission to send you notifications. It just sort of works out of the box. How can we compete with that? It’s tough. So yeah, we have a lot of competition.

My last question for you: What’s next for Instagram? Obviously, there’s next week and that’s difficult, but I want to focus: Over the course of 2021, what should people be looking for?

This year, it’s going to be about delivering on the bets that we made last year. It’s going to be about getting Reels to a good place and starting to differentiate it more. It’s going to be about shopping, particularly shopping through the eyes of creators, which we think is the future of shopping. It’s going to be about just what we’ve always been about, creative expression, but we’re really going to be looking to double down on the new things we launched last year. There’s so much stuff that would be fun to build that we haven’t even started yet, but I really think we have a responsibility to make what we’ve already committed to great before we take on much that’s new.