"Privacy, Publicity, and Visibility"

danah boyd
Microsoft Tech Fest
March 4, 2010

[This is a rough unedited crib of the actual talk]

Citation: boyd, danah. 2010. "Privacy, Publicity, and Visibility." Microsoft Tech Fest. Redmond, March 4.


As an ethnographer, I spend most of my time systematically traipsing across physical and digital landscapes in an attempt to understand emergent cultural practices. In the field, I'm always asking two questions: What is it that people do? And why? Upon leaving the field, I try to step back and examine those practices, locate them into a historical and theoretical context, and work out why they matter. Today's talk is one of those stepping back exercises.

My goal today is to take what I've learned in studying youth and social media and think through three interwoven concepts that I believe are critical to understanding social media: privacy, publicity, and visibility. I want to complicate these notions and also provide you with a framework for thinking about how they operate in mediated life. As designers, developers, marketers, and even users of social systems, you are all going to face challenges to privacy, publicity, and visibility. I can't hand you a magic wand for addressing these topics, but I can give you a way of thinking about them. So I want you to walk out of this talk with a better sense of how these issues are shaping life today.

I'm going to keep circling back to three big claims:

1) Privacy is not dead. People care just as much about privacy online as they care about it offline, but the relationship between privacy and publicity has been transformed through social media.

2) People have always entered public space for a variety of reasons. Yet, just because something is publicly accessible doesn't mean that it should be publicized.

3) Even when something is publicly visible, not everyone is looking at it. Furthermore, those who are looking may not be those who need to be looking.

So let's begin...


No matter how many times a privileged straight white male technology executive pronounces the death of privacy, Privacy Is Not Dead. People of all ages care deeply about privacy. But what privacy means may not be what you think.

Fundamentally, privacy is about having control over how information flows. It's about being able to understand a specific social context and behave or share information in a manner where the flow of information can be understood. To do so, people must trust their interpretation of the context, including the people in the room and the architecture that defines the setting. People believe that their privacy has been violated when they lack the control they'd like over a situation.

People care just as much about privacy online as they care about privacy offline. For this reason, it's important to understand how they navigate privacy in everyday situations.

When I'm casually chatting with you and I tell you something deeply personal, I may explicitly ask you not to share it with anyone or I may implicitly assume that you understand that this is not to be shared. If you decide to spread what I share, I will feel as though my privacy has been violated. And, more importantly, I will lose faith in you, especially if I explicitly told you not to tell anyone. The issue is never whether or not you COULD tell others. Unless I did something to render you comatose, the possibility always exists that you could open your mouth. But my sense of privacy is wrapped up in a set of expectations and trust that I've placed in you.

We don't just hold people accountable for helping us maintain privacy; we also hold the architecture around us accountable. The notion that "these walls have ears" dates back to at least Chaucer.

"But sooth is seyd, go sithen many yeres, / That feeld hath eyen and the wode hath eres."
-- "Knight's Tale" by Chaucer (1387), lines 1521-1522

While this idiom seemingly blames the architecture - the walls, the field, the wood - it's really about the fact that the architecture signals false expectations about the social context. It's quite socially disruptive to think that one has privacy - or control - only to learn that this is not the case. Yet, in both Chaucer's times and our own, people take their chances, hoping that their privacy will be upheld but idiomatically knowing that there's always a possibility that they'll have control taken away, most notably by those who have power over them, those who have the power to control the architecture - the ears in the wall.

Why do we seek privacy? Primarily to understand the social context. Whenever we interact with other people, we must account for them in choosing how to behave and what to say. It's hard to speak to a broad audience. But we also make ourselves vulnerable to others in order to gain social support and as a means of intimacy. Often, privacy isn't about hiding; it's about creating space to open up.

Technologically mediated environments change the equation. What changes is the architecture, not people's desires (although people may prioritize other desires if the system enables them to achieve those). What people must account for when interacting within a mediated system stems from the properties and dynamics of that particular system. Part of becoming media literate is understanding different systems' affordances and how to work around them to achieve one's desires.

Think about email. There are new ways in which the walls could have been listening - the unencrypted message could have been sniffed as it went across the network, or someone could've accessed it from the servers without either of us knowing. This is how we typically think about privacy. But the social issues are more salient. It's a lot easier for you to forward a message than to try to spread a rumor through word-of-mouth. In both cases, because the technology amps up the ability to spread a message, the potential scale of privacy violations is greater. But just because you COULD spread what I said and the walls COULD have ears doesn't mean this is the case; for my own sanity, I need to trust both you and the system.


Interestingly, children learn who to trust and how to negotiate privacy through trial and error. Classic example: Susie is upset because Mary told Bobby something that Susie shared with Mary in confidence. We learn who we can trust through experience. Yet, there's another critical lesson that tends to come in this process. At some point, children also learn that if they _want_ something to be spread widely, they should tell Mary not to tell anyone, knowing that she will then tell everyone. This, mind you, is one of the reasons why middle school is an emotional disaster. Tweens - and especially tween girls - spend a lot of time working out how they can manipulate situations to achieve social goals, often in response to failures of trust.

Now, add Facebook or MySpace or text messaging to the mix. These systems are architected to enable publicity, to allow participants to spread content further, faster, and more efficiently. Put two and two together and you can see why such systems can be the site of a social explosion. As many teens have explained, technology creates "drama central."

New forms of social media are especially tricky because they haven't stabilized as technologies. With email, most of us have learned what the various privacy issues are and find that the biggest risks to our expectations are the very people we're communicating with and their failure to respect our expectation of privacy. But even with email, there are moments when the technology gets in the way. Think about that mailing list whose default Reply-To is Reply-To-List instead of Reply-To-Individual. Decades in, people are still accidentally communicating to a larger audience than they expected. Of course, let me take a moment to shout out to whoever in Office wisely created that feature in Outlook that subtly informs me about the size of the mailing lists I'm blasting out to. You Rock!
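The Reply-To pitfall described above can be sketched in a few lines. This is an illustrative example only - the addresses are made up, and real mailing-list software varies - but it shows the mechanism: list software rewrites the Reply-To header to point at the list itself, and mail clients prefer Reply-To over From when composing a reply.

```python
# Sketch of the mailing-list Reply-To pitfall: the list rewrites
# Reply-To so a plain "reply" goes to every subscriber, not the sender.
# Addresses are hypothetical.
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.org"
msg["To"] = "discuss@example.org"        # the list address
# The list software sets Reply-To to itself:
msg["Reply-To"] = "discuss@example.org"

def reply_target(message):
    # Mail clients generally prefer Reply-To over From when replying.
    return message["Reply-To"] or message["From"]

print(reply_target(msg))  # discuss@example.org, not alice@example.org
```

Alice hits "reply" intending to answer one person, and her message goes to the whole list - exactly the accidental broadcast the talk describes.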

Social network sites like Facebook haven't stabilized their features and users are still trying to navigate a moving system. In December, Facebook made news when it prompted users to reconsider their privacy settings. The first instantiation of the process asked users to consider various types of content and choose whether to make that content available to "Everyone" or to keep their old settings. The default new choice was "Everyone." Many users encountered this pop-up when they logged in and just clicked on through because they wanted to get to Facebook itself. In doing so, these users changed all of their settings to public without realizing it. When challenged by the Federal Trade Commission, Facebook proudly announced that 35% of users had altered their privacy settings when they had encountered this popup. They were proud of this because, as research has shown, very few people actually change the defaults. But it also means that the other 65% of users clicked through, accepting the "Everyone" default and making their settings public.

If one believes that people really wanted to be public, one might believe that Facebook users encountered this pop-up, appreciated the new public-facing option, and agreed to it, which is why they didn't change anything. Anyone who believes this has their head in the sand. Facebook built its name and reputation on being a closed network that enabled privacy in new ways, something that its users deeply value and STILL believe is the case. Are there Facebook users who want their content to be publicly accessible? Of course. But 65% of them? No way.

Many of the people that I've spoken with in my research are completely confused about what is visible to whom and in what contexts on Facebook. Since December, I've been asking average non-techy users about their privacy settings on Facebook. I've asked them what they think their settings are and then asked them to look at them with me. I have yet to find someone whose belief matched up with their reality. That is not good news. Facebook has implemented ears into its walls. Most users are aware that Facebook can be listening to what it is that they do, but they have no idea that Facebook has invited all sorts of powers that be to listen in too.


Keeping in mind that privacy is about a sense of control, the notion of whether or not some content is 'public' or 'private' often misses the point. If people don't expect it, publicizing public content takes away their sense of control. Yet, as technologists, we often think that content is simply public or private and that we have the right to use anything that is public. On the development side, most of our models of privacy have to do with access controls where 'private' means only accessible to the user and 'public' means accessible by anyone. In between, there are access control lists dictating which users or groups of users should have access. But boiling intent and expectation down to ACLs turns out to be socially messy.
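The access-control model described above can be sketched in a few lines. This is a minimal illustration, not any real system's API; the class and names are invented. The point is that 'private', 'public', and an in-between list all reduce to a read check - and notice that nothing in this model can express the user expectation the talk describes, namely "accessible to anyone, but please don't publicize it."

```python
# Minimal sketch of the ACL-style privacy model: 'private' means
# owner-only, 'public' means anyone, and in between sits a list of
# permitted viewers. All names here are illustrative.

PUBLIC = object()  # sentinel meaning "anyone may view"

class Content:
    def __init__(self, owner, acl=None):
        self.owner = owner
        # acl: None -> private (owner only); PUBLIC -> everyone;
        # otherwise a set of usernames allowed to view.
        self.acl = acl

    def can_view(self, viewer):
        if viewer == self.owner:
            return True          # owners always see their own content
        if self.acl is PUBLIC:
            return True          # 'public': accessible to anyone
        if self.acl is None:
            return False         # 'private': accessible to no one else
        return viewer in self.acl  # in between: an access control list

photo = Content(owner="susie", acl={"mary"})
assert photo.can_view("susie")      # owner
assert photo.can_view("mary")       # on the ACL
assert not photo.can_view("bobby")  # not on the ACL
```

Note what's missing: `can_view` answers only "may this person access it?" It has no vocabulary for intent, expectation, or how far the content should spread - which is exactly why, as the talk argues, boiling expectations down to ACLs turns out to be socially messy.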

Don't get me wrong: people certainly see information that they share with no one as private information. And there are also times and places where they understand the notion of making something available to a "group." But where things get really fraught is over what it means to be 'public.' When people consciously choose to make something public online, they invite the possibility that someone outside of a specified list of individuals might access this content. That is intentional. But the someone that they imagine is not just any old person, let alone a search engine, the paparazzi, or EVERYONE across all space and all time.

Think about the offline equivalent. When you go out to a cafe, you invite the possibility of running into people. Yet, the reality of physical spaces is that you're more likely to run into certain people than others. You're more likely to run into people who live nearby and people for whom cafes are enjoyable places (who are presumably people like you). There are also people you presumably don't want to run into even though you are in public. Perhaps you've chosen that cafe because you don't think that your ex is likely to patronize the establishment at the same time. Physics also means that you never have an expectation of running into everyone in the entire world, or even everyone you've ever met. Even if you run into an old friend, the probability of running into everyone you attended high school with at the same time is so infinitesimal that you would probably go insane on the spot if such an unlikely event were to occur.

When people interact online, they too have certain expectations and probability calculations. They make guesses about who may or may not stumble across them as they're producing content and they also assume that older content will fade away as new content emerges. Their calculations are completely reasonable, as it's an efficient way of getting a decent handle on the social context, even if they are sometimes wrong. Early adopters of a particular public technology are not always prepared for the future audiences that come along as the technology gets more popular, even if they know that might be the case. And even late adopters may not be prepared for what happens when the technology itself changes in ways that make public content more public, even though the company made no promise that things would stay as-is.

Security through obscurity is also not as stupid as people often believe. Most people out there never get much attention, even when they are desperately seeking it. A few years back, Technorati pointed out that the average blog was read by six people. Just because something can be accessed, doesn't mean that it will be. And for that reason, people regularly calculate that there's not much lost in making something public and, more importantly, a high possibility of gaining something if a friend or potential friend happens to stumble by.

Here's where there's an interesting equation in people's heads, one that is inflected by age and socio-economic position. For youth in particular, they are thinking about all that they have to gain. For older adults, and particularly privileged older adults, they are thinking about all that they have to lose. I've met countless teens who put things up online because they're hoping that someone important will find them and save them from their boredom. Don't get me wrong - there are teens who are thinking about the "future" which, btw, is a substitute for privileged kids thinking about their Ivy League possibilities starting at age 5. Most kids - and many adults - are more interested in what possibilities are out there in the public world.

Of course, keep in mind that no one really wants to be public to EVERYONE. Adults and teens alike can list off people that they don't want to encounter, publics that they don't want to enter. Just as is the case in physical environments.

And here's where technological decisions can be costly. Just because someone has decided to make content publicly accessible to everyone does NOT mean that they want that content to be publicized. When a new feature comes along that takes something that is purportedly public and makes it MORE public, it destabilizes people's sense of control and, thus, their sense of privacy. Over and over again, technology companies do this, arguing that the content was public in the first place. And over and over again, consumers are irate and complaining about privacy. They're speaking past one another because they're operating with two different models of privacy. Fundamentally, users are upset about the fact that they've lost control over the content.


To get at the privacy and publicity challenges, let's talk about where things went horribly awry for Google after they launched Buzz. It's the perfect case study for seeing how technological logic is not necessarily interpreted by the public as technologists expected.

For the uninitiated, Google introduced a new service called Buzz that is basically a stream (a la Twitter or Facebook's News Feed) with content populated by the people an individual chooses to follow. The service is situated within Gmail, requiring users to access it via the Gmail interface. When first launched, new users were invited to check out Buzz on the way into Gmail. If they agreed, they were prompted to give information that would result in the creation of a publicly accessible profile, if they didn't already have one. And they were given a popup of users that Buzz calculated they'd most like to follow. While any suggested user could be unchecked, the default was that all of them were checked, so clicking through resulted in users automatically following these people. The default also meant that a user's list of followees would be listed on their publicly accessible profile, though there was an option to uncheck this. Likewise, if the user used the public features of other Google products - such as Reader - these too would be integrated into the user's public profile, even though there was always a way to disconnect these sites.

Nothing that the Buzz team did was technologically wrong. There were all sorts of opt-outs available - opt out of Buzz, opt out of the default lists, opt out of displaying the lists, etc. Yet, the service resulted in a PR disaster. Why? Google made 5 key mistakes.

1) Google launched a public-facing service inside a service that people understand as extremely private. Mixing Gmail and Buzz was a huge mistake as it created a cognitive disconnect in users' minds, resulting in all sorts of panics about Google making email publicly viewable. While this was never the case, the integration confused people and left them with the wrong impression.

2) Google told users what they wanted rather than asking them. As technologists, it's easy to assume that optimizing a situation is always best. But think about social rituals. We don't go through the niceties of "Hi, How are you?" because it's optimal for communication; we do it because to do otherwise is rude. And if you want your partner to get milk on the way home from work, you know that asking is often more effective than demanding "Get milk" even though the latter is what you're actually saying. The same goes for interface design. Asking users if they wanted to scan for previous contacts in Gmail or other Google services would've been less optimal functionality-wise, but would've been the right social ritual for politely interfacing with users.

3) Google found the social equivalent of the uncanny valley. Graphics and AI folks know how eerie it is when something looks almost right but not quite. When Google gave people a list of the people they expected them to know, they were VERY close. This makes sense - they have lots of data about many users. But it wasn't quite perfect. And that made it feel eerie.

4) Google assumed that people wanted different pieces of public content integrated together. Yet, just because something is publicly accessible doesn't mean people want it to be publicized. And just because people talk to certain people through one Google interface doesn't mean that they want to talk to them elsewhere. Collapsing contexts is destabilizing for users.

5) Finally, Google assumed that people who might have issues being public would opt-out. I'm giving them the benefit of the doubt on this one. A more insidious framing would be to say that they wanted to force people into being public because it's more viral and monetizable that way. We know that users accept most defaults so the defaults matter. The defaults also set the tone for the space. When the defaults are outside of people's comfort zones, they flip out, regardless of whether or not they could opt out. It's too easy to make mistakes, too easy to not know.

What the outrage around Google Buzz showed us is that people care deeply about privacy and control. Don't get me wrong - plenty of people will use the service and it will be extremely popular, but this doesn't mean that Google didn't just take a hit in terms of trust. There are plenty of folks out there who are more wary of them now than they were before Buzz.

Part of what makes public explosions like those over Buzz or Facebook so confusing is that people are putting more content into publicly accessible settings. But we really need to examine why they are doing this and what they expect here.


First things first, we need to distinguish between PII and PEI. In technical and privacy circles, we often talk about PII - "Personally Identifiable Information." And we think that when people make PII public, they don't care about privacy. But in practice, people are much more attentive to PEI - "Personally Embarrassing Information." This is what they're brokering, battling over, and trying to make sense of. The difference between PII and PEI has everything to do with the audience at play. When we talk about the problems with PII, we're mostly talking about governments and corporations and strangers manipulating and abusing people. But if you look at what actually worries people, they'll talk about the damage that can be done by people that they know or might get to know, the people for whom they might be embarrassed in front of. Because of this, people often don't worry so much about whether or not something is publicly accessible, but how far it can spread and how searchable it might be by those who might be trying to embarrass them.

There's another twist to what's going on - we've moved into an era of "public by default, private through effort." Let me explain.

If we're having a friendly chat out in the hallway, our conversation might be overheard by any number of people who walk by. Some folks might even join the conversation. Some of what I tell you might stick in your head and you might choose to tell other people but, most likely, 90% of what we talk about will fade from your memory and be shared with no one. This is not to say that I'm a boring interlocutor, but that few conversations are so memorable that they're shared in detail and, even if you wanted to share every detail of the conversation, there's no way that you could reproduce the conversation verbatim without the help of technology. Such conversations, the conversations we're used to, are private by default, public with effort. You can make our conversation public, but it'll take effort. We didn't mean for our conversation to be particularly private which is why we didn't hide out in a private room, but it'll be that way by default.

Now, think about a conversation we might have through social media. Those conversations are often public by default, private through effort. The technology makes it such that engaging with someone in a setting where drop-ins are welcome results in the conversation being publicly accessible. This is because conversations are recorded and rendered asynchronously accessible by default. Instead of choosing to make a conversation public when operating in these environments, we must consciously choose to make a conversation private, either by switching media or by using different features or by speaking in a coded manner that others can't make sense of. We do so when something might be hurtful or not appropriate for others to hear, but this is a very conscious choice.

This change is really significant because it means that most of the boring conversations we had - the "how's the weather?" conversations - are now public by default rather than being private by default. This doesn't mean that we've given up on privacy, it's just that we see no reason to actively make many comments private.

Part of this change stems from the fact that we have always had something to gain from speaking in public - the possibility of a friend, the possibility of a chance encounter, the possibility of serendipity. And so there's no reason to make many things private, just as there was no reason to stop them from being overheard in a "private by default" scenario.


In an environment that is public by default, there are also folks who are actively seeking publicity, actively seeking broad audiences for their message. This, btw, is the huge difference between Facebook status updates and Twitter. Their origin point was similar. Many early adopters of Twitter used the site to update friends about their lives, resulting in the prevalence of "what they ate for breakfast" jokes. But as the service evolved, plenty of people figured out that Twitter was far more public than that. And while there are still plenty of folks who use Twitter to talk with their friends, many more use it to engage in a new form of publicity.

On one hand, you have the celebrities, micro-celebrities, and wanna-be celebrities - all people who are looking to obtain an audience in one way or another. These people want to be able to engage en masse with broad audiences of followers, either to get their message out there or create a safe situation where they can interact with fans. At Le Web, Queen Rania of Jordan pointed out that she could no longer go out to the street and just talk with average people, but Twitter gave her the ability to do so in a way that was safe for her while also being interactive.

You also have plenty of fans - people who are actively following the celebrities, micro-celebrities, and wanna-be celebrities. Many of the teens who use Twitter - and yes, there are some teens who use Twitter - are there because they want to engage with the celebrities. They love the ability to @reply them and hope that they'll get a response.

Mixed into this, you have lots of folks who are enjoying the serendipity of publicity, loving the ability to get memes rolling. If you don't consume the Trending Topics on Twitter, you've missed one of the more delightful aspects of it - collective participation in an effort to create trends, most notably by black users and tweens.

My point in bringing up Twitter is to highlight that there are places where people are actively trying to achieve publicity, seeking attention and visibility and notoriety. They engage with Twitter because this is the dynamic here and they embrace it. But this is precisely why Twitter will not be popular with everyone. Not everyone wants an audience or wants to be part of someone else's audience. And that's AOK because it is what some people want. And for that, it's extremely popular.


In thinking about publicity and privacy, we must talk about privilege. Who has the right to be seen and heard in public? Who is comfortable speaking in public? What are the costs that people face when walking out into public?

At Microsoft, we are extremely lucky. For the most part, we can dress how we please and be as quirky as we'd like. But even here, not everyone feels as though they have the right to speak up when they are discriminated against or when they are harassed. In the scheme of things, though, we have a lot more privilege than most folks and a lot more protections for those who are marginalized. Not everyone is so lucky.

The more privilege we have, the more we take for granted when it comes to accessing public space. We believe that we can challenge authority, that we have the right to be heard and the right to be seen. We believe that we can tell our stories, that our voices matter. We can walk out into the public without fear of losing our jobs, losing our children, losing our rights. We can be "public by default" without thinking too much about the costs of it all. And we can seek publicity if and when we want.

But... Imagine that you're an immigrant whose family came here illegally 30 years ago when you were six months old. You don't speak the native tongue of your ancestors and have never been back to the country in which you were born. You are petrified of being deported. Are you comfortable telling your story in public?

Or... Imagine that you left an abusive relationship (one of the hardest things to do). You're working two jobs to make ends meet for you and your kids. You're exhausted, but your biggest fear is that your ex will find out where you are and hurt you and/or your kids again. How public do you want to be?

These two character sketches aren't made-up people; they're people I met who are trying to make life work. And there are a lot of people like them. Imagine being gay or transgendered in the military or in many other institutions. Imagine being Jewish in some parts of the world, Christian in others, or even Muslim in many parts of this country. Try being a woman in a tech company or a person of color in an all-white setting. Being different is exhausting enough, but being marginalized along any axis can be humiliating and downright deadly.

Sure, it's great to say that everyone SHOULD be comfortable being in public, but that's not the world that we live in. Many people are just trying to get by. And we cannot expect marginalized folks out there to be the ones always fighting for their voice. Forcing people into public can put people at risk, especially those who are already marginalized. The "public by default" environment that we are so proudly creating isn't the great democratizer; for many, it's scary as hell.


While technology complicates how we navigate privacy and publicity, it fundamentally alters visibility. Visibility is an issue of seeing and being seen. In a digital environment, who can see whom and how do they interpret what they see?

Through social media, we have the ability to see into more people's lives than ever before. What we see is very limited, based on the traces that people leave behind. But what do we do with that information? Mostly, we misinterpret what we see.

When MySpace was just gaining visibility beyond its early adopter populations, I received a phone call from a college admissions officer at an Ivy League institution. The school had received an application from a young black man living in South Central in LA. He had written a heart-wrenching essay about how he wanted to leave his gang-ridden community. When the college went to his MySpace profile, they were aghast. His profile was filled with gang symbols and references to gang activities.

The question asked of me: why are kids today lying about their lives when it's possible to see the "truth" online? I laughed. That kid from South Central was not lying to the college admissions officer. He was trying to survive. Every day, he walked into his school in South Central. Surviving in that school and that community requires being a part of the gang culture. He was posting for his classmates and perhaps his family members, not for the college admissions officer. Yet, the college admissions officer saw this profile as public and felt that she could interpret what she saw based on her understanding of the world.

When we're not misinterpreting visible data, we're usually panicking. This is particularly true when it comes to what teenagers do online. We see things and we fail to recognize that they've always been there but were once more invisible.

Take, for example, cyberbullying. By most statistical measures, bullying in the US is no more frequent today than it has been in the past. Teens today continue to report that bullying at school is both more frequent and more emotionally devastating than that which takes place online. So why are we obsessed with cyberbullying? The bullying that takes place online leaves traces. This means that more people can see it - both peers and adults. Your parents may not have known that you were being harassed at school unless you came home with a black eye, but today, if you're looking for it, you may actually see your children being harassed by their peers. Of course, what might look like harassment to you may also just be joking around, so teasing this out is quite important. Through technology, bullying is more visible than ever before. This is both a blessing and a curse. On one hand, there's an obsession with going after the technology rather than analyzing the root problem. But on the other, parents today are much more likely to be engaging their kids in a conversation about bullying than parents in the past. Of course, no one knows how to eradicate bullying, online or offline. And most parents are completely lost, recognizing that adages like "suck it up kid" and "fight back" aren't the solution, but not fully sure what is.

It's easy to be horrified; it's much harder to embrace the visibility in an effort to get to the root of the problem.

Of course, just because content is publicly accessible does not mean it's visible. It all depends on who is looking and with what intention. And, sadly, there are too many cases where people should be looking but aren't. In Colorado, a girl named Tess killed her mother with the help of a few friends. When the TV news picked it up, they talked about it as "girl with MySpace kills mother." This prompted me to go and look at her MySpace; her MySpace and all of its content was entirely public. It was heartbreaking. For months, she had been documenting her mother's alcoholic rages through her public blog postings on MySpace. Detailed accounts of how her mother physically abused her, yelled at her, and psychologically tormented her. Her friends had left comments, offering emotional support. But they were in over their heads, with no support from adults. All of this content took place in a public setting, visible to wide audiences. But without people looking - and especially people who could act on this information - nothing was done.

And visibility doesn't guarantee an ability to act even if one would like to.

I've been spending a lot of time thinking about ChatRoulette lately. Now that the press has heard of it, it is the new site of panic. Created by a 17-year-old Russian boy seeking to connect with other teens, ChatRoulette randomly connects you by video to a stranger who is also using the service. At any point, you can click next, and the person on the other end can click next. And if it gets weird or boring, all you have to do is click next. And on and on you go, connecting to random people around the world in a game of people roulette.

What interests me about this site is not the possibility of running into problematic content, but the opportunity to momentarily dip into people's lives. What you'll find can be both heartwarming and heartbreaking. While you can look in on people's lives, it's often hard to actually connect because it's so easy to be nexted by someone else. This can actually be one of the most frustrating aspects of the site. Consider this comment from a college freshman named Dylan Field:

"One time, I talked to a very angry, young teenage boy who yelled at me for a good 5 minutes. I couldn't press 'next' because I wanted to understand why he was so angry. I never did figure it out, and after he ran out of insults he 'nexted' me. I don't care about the gross images, but that has stuck with me."

Just because we can see into people's lives doesn't mean that we can actually connect with them. And sometimes, not being able to connect with someone who needs help is unbelievably frustrating.

And some issues of visibility go beyond that... We become forced to see aspects of our society that we might not actually like.

Consider what's happening on Twitter as black users speak loud and proud. On the night of the BET Award Ceremonies last summer, all of the Trending Topics were icons of the black community. How did non-black users respond? Some, not so nicely. They talked about how the niggers were taking over, about how this had become an unsafe neighborhood. I know that racism is pervasive in our society, but it always saddens me when I'm forced to see it. As much as I'd love to make the racist language I see go away, I'm much more motivated to use that visibility to combat it.


Everyone in this room is affected by the changes that are underway. Whether you're living with the in-flux social dynamics or creating the tools that enable them, you're affected. Some of you are having your lives transformed by social media, either positively or negatively. But even if you're not, I need you to think about those who are. Think through how privacy, publicity, and visibility don't just affect you but affect people across all walks of life.

And as I conclude this talk, I also have two specific requests for those who like to innovate:

First, let's find a way to create 'eyes on the digital street'. In her seminal work on urban cultures, Jane Jacobs highlighted that the best way to create a healthy community was to encourage people to be aware of their surroundings, ready to help out when people were hurting, etc. She talked about this as 'eyes on the street.' When a kid fell off his bike, someone might have run out to help. Not because they were surveilling the street, but because they were aware and concerned about the health of the community.

We have created a new form of public and our society's public-ness is currently in transition. Publics play a critical role in the health of society. Not only are they central to governance and political endeavors, but they are crucial to business and innovation as well as people's sense of self. Because of this, publics are where intolerance breeds and is countered. It's where people gain authority, but it's also where people collectively do great things. Publics are fragile things and we need to create a public that we're proud of, one where we're looking out for others and ready to step in when things are going wrong.

With that in mind, I have an ask specifically for those who are helping to create these publics. Please be wary of enabling digital paparazzi or, worse, digital witch hunts. While people have always entered public space for a variety of different purposes, their exposure in digital public spaces is more akin to what celebrities experience. They are observed and their every misstep recorded for others' judgment. More than a few celebrities have broken under the pressure and exhaustion of always being watched. And the paparazzi's insistence that they have the right to invade people's lives has caused more than one death.

When we treat average people like celebrities, when we assume that the public has "the right to know" simply because someone has decided to step out in public, we take away people's control, expecting them to just cope. This isn't fair and it won't necessarily produce a healthy society. As Helen Nissenbaum has argued, we need the ability to maintain "contextual integrity." People need the ability to understand the social context they are in and operate accordingly. Whenever we publicize something that is simply publicly accessible, we run the risk of putting people in harm's way. Whenever we force people to operate with an understanding of all people across all space and all time whenever they want to open their mouths, we run the risk of them not being able to function. Or not being able to do so in a socially appropriate way. And we are more likely to do damage to those already marginalized or ostracized in the society we live in.

I fell in love with the Internet because I saw and lived its transformative potential. I got involved in the industry because we are the architects of a new kind of public life. We have the opportunity to do some amazing things, to help shape the future. But with such privilege comes great responsibility.

I cannot give you a formula for dealing with privacy, publicity, and visibility. It doesn't really boil down to a simple equation. It is a stew of complexity and there are no clear answers. Instead, what I'm hoping you take away from this talk is the need to be aware of these issues, to be in conversation with your colleagues about how these shape what you do. I need you to be mindful of these concepts so that you can help shape the future. I've introduced a lot of different ideas in this talk, but let me highlight what I believe are the most crucial to walk out of here remembering:

1) Privacy is not dead. People care just as much about privacy online as they do offline. And they also care about being a part of public life. So they're trying to navigate privacy and participation in public life simultaneously. And they're contending with a shift from "private by default, public with effort" to "public by default, private with effort."

2) Just because something is publicly accessible doesn't mean that people want it to be publicized. Scales of publicity matter to people and they want to have some control over what they put out there. Publicizing their material without their knowledge is a way of taking control away from them; invite them to figure out what should be public and what should be publicized.

3) For all of those thinking about what they have to lose by engaging in public spaces, there are plenty of folks thinking about what they have to gain. This is particularly true for young people who, because of their life stage, are trying to figure out how to participate in public life. This "generation gap" is primarily one of social position, but it's critical that we don't get caught up in our own thoughts about the costs of publicity without remembering those who see the opportunities.

4) Just because you can see data out there doesn't mean it was meant for you. Context matters. On the flipside, we all need to be aware that there are people out there crying out for help. Finding a way to navigate visibility so as to create a better society will be good for all of us.

Privacy, publicity, and visibility are processes, grounded in needs, desires, and goals. Technology is transforming the fundamental architecture upon which all of these processes unfold. As the architects of these technologies, the business people obsessed with these transformations, the policy makers trying to regulate what is happening, and the people shaping the social norms, we must all be conscious of our role in this process. I invite each and every one of you to understand what is transpiring and to reflect on your role in this process.

Thank you!