The Social Media Clarity Podcast


Oct 17, 2016

 

Responsibility.com - Episode 27

Social Responsibility and Social Platform Providers

Description

Marc, Scott, and Randy talk about recent changes at social platform companies as they wrestle with the ethics of their customers causing conflict, such as racism/sexism in AirBNB and Nextdoor.

Transcript

Scott: This week, we're going to talk about a trend that we're seeing, and that is being reported in the media, of large-scale internet companies stepping up and taking some responsibility for the power that they are creating with their networks.

Randy: Specifically the social power.

Welcome to the Social Media Clarity podcast. 15 minutes of concentrated analysis and advice about social media in platform and product design.

Marc: A lot of platforms have tried to be a common carrier. They don't want to interfere with what goes on on their platforms, but in some ways those days are over, in part because what went on on some of those platforms turned out to be so bad that it could no longer be tolerated.

Randy: On Airbnb, people were refusing guests because of their race.

Scott: For a long time, Airbnb was essentially turning a blind eye to it saying we’re just providing the platform, but at some point there was enough response from the community, from people who were reporting on discrimination that was occurring within the given platform that Airbnb made a decision that we don’t really want this discrimination to be a feature of our platform so now we have to design something to eliminate or reduce the kinds of discrimination that are going on.

Randy: When designing systems (I had to do this a few times at Yahoo), when you channel behavior, and specifically say that some behavior that some people participate in is not acceptable when people interact with each other on your network, you will lose some customers. If you say you cannot use race or gender, in most cases, to limit who you rent your Airbnb room to, you're going to lose some customers.

You're making a conscious decision of a type you generally didn't make up until now, which is that you wanted all customers. In the early days of the internet, all that mattered was the numbers: first it was eyeballs, and then, once sites started to host communities, the number of transactions that occurred. That's all you cared about.

Now, we're saying there's actually a quality of transaction that is important to us, and since it's between individuals who do not work for the company, we're now in this middleman position. We actually have some responsibility we walked away from before to say, "No, there are some kinds of transactions that are allowed here and not others."

Marc: Platforms have been a wild west where anything goes, and the values were more about not interfering with the operation of the platform than with the way in which users engage with one another, and that is shifting. We're seeing a maturation of the market. Very large players are recognizing that they have to, or need to, or want to step in, and they are now going to say, "We're going to ban racism from the platform. We're going to ban sexism from the platform. We're going to ban certain kinds of abusive practices from the platform."

Many would say this is long overdue, but it is interesting to see that companies are now stepping up and changing the design of their platforms as well as the terms of service that govern them.

Scott: This is being reported in the media, and there are a couple of articles that we're going to be riffing off. One of them is a New York Times article describing Airbnb as breaking ranks in admitting the power that we've just been talking about. It also talks about Nextdoor, and it pokes a little at Twitter, which hasn't quite come to the table of admitting its power. We're also going to talk about a detailed design piece in Wired that reported on Nextdoor and discussed frictionless markets.

Marc: What's common in many of the design changes is the imposition of additional steps. The addition of friction as a design philosophy seems to run counter to the prevalent one, which is to remove as much friction as possible, to create "one click solutions" to problems. Nextdoor's design actually imposes a few additional steps to try to guide the claims people make about their observations of others maybe doing bad things in their neighborhoods.

Randy: Wired magazine ran an article called "Nextdoor Breaks a Sacred Design Rule to End Racial Profiling" by Margaret Rhodes, which details the walkthroughs. You don't actually have to join Nextdoor to see the changes they made, but they literally do detection and handholding as you write something. The example they give is when you're reporting a crime. People often use racial language to describe a suspected perpetrator, and this walks you through the same process the police would go through if you called them, to help you tease out the real details of what was going on and not let you slip into potentially racial profiling.

Scott: The New York Times talks about Airbnb and how they're taking steps to diminish, and hopefully eliminate, racial profiling and gender profiling when people are renting. Previously they were getting reports of, and then did their own investigation into, race bias against people trying to rent from other folks on Airbnb. They're taking active steps both in guidelines and classes for their hosts and in what they're going to be doing to reduce racial profiling in terms of hosts renting to guests.

Another aspect, one that goes beyond racial profiling, where corporations and organizations with large networks are realizing they have influence, is the fact that YouTube is starting to deny ad revenue to content producers and channel owners if they're swearing during their videos. Motherboard has an article that covers the details of what's going on.

Marc: What's new here is that platforms are directly changing the acceptable use cases that they're going to tolerate on their platforms, and in some cases their values may not agree with all their users'. Their decisions may drive people into an exodus to other platforms that impose slightly different values and standards.

The YouTube example is one that I think is interesting because it directly addresses the kind of content you're allowed to create on YouTube and still have the right to monetize through advertising. That one might be where we disagree most over what constitutes a legitimate imposition by the platform. My understanding is that there is a ban on the use of profanity if you want to have advertising.

This is an example of a corporation shaping the nature of discourse and commerce on its platform in a way that shows the older vision, of common carriers, of open platforms whose users decide how they're going to use them, has ended. This is a big shift, given the power of these platforms, because it could conceivably really, really change our behavior.

Randy: YouTube specifically is interesting because the change is focused on their revenue model. They are getting better at targeting advertising, and they're finding that the people who click through the most on advertising like certain kinds of videos that YouTube is now getting better and better at categorizing. The advertisers are less interested in reaching out to creators who have smaller audiences that are less likely to be interested in their products.

If you leave the social aspect out of it, this is how Google has evolved; a lot of different advertising-based sites, including Facebook, have evolved this way. The difference is that YouTube was different: it was kind of distributing all the advertising evenly across everything. Some people whose content is less lucrative from an advertising point of view were generating more personal revenue through their rev shares, through their videos, than other people.

This is not necessarily YouTube acting socially; they're acting commercially, at least in part. They have the power. Sometimes they're trying to resolve a social issue. Sometimes they're trying to resolve revenue, and it's not always clear that those two match up.

Marc: Where's this all going? Is this the end of the nation state and the rise of the corporation as the organizing principle for all of our societies? Or is this something more specific, in which platforms recognize they have enormous social power and understand that they are in fact communities -- which is to say that they have norms, and that they therefore have boundaries, and that if you follow the norms, you're inside the boundary, and if you don't follow the norms, you're on the outside of the boundary. The interesting thing to me is who gets to define what the norms are.

Scott: Regardless of who thinks they have the power, organizations that control a platform do have certain powers, and one of the key powers they have in helping communities actually form is the power of context -- providing the context around how people are interacting -- and making the tradeoffs between trying to be something for everyone and clarifying who they are and what those groups of people are trying to accomplish.

Randy: Everyone on this podcast has helped people build online communities of one form or another. We've been facing these problems for decades; they just keep coming in new waves. Before this, back in the days of Yahoo, Google, Microsoft, and Quantum Link, those services all had to deal with it on an extra-legal basis.

State attorneys general came and tried to deal with pedophiles in the early internet age. If you remember back a decade or so, that was the big social responsibility problem, and it was basically the death of chat programs, because they were unable to adapt. We're now in a place where we're more dependent on these systems, not just for entertainment, but for transportation, for housing, and for our very livelihoods.

The systems have to adapt, and that involves platform design and social design in the form of things like terms of service and community guidelines. Your podcasters actually provide services in this regard. If you need help sorting through this social responsibility, which is now becoming part of system design on the internet, we can help you.

Scott: Send us email at feedback@socialmediaclarity.net or find us on Twitter @smclarity and Facebook at Social Media Clarity. We’d love to hear from you.