Northeastern University

A friendlier Facebook


Image via iStockphoto.


Online social interactions have created an entirely new way for humans to communicate, and the standard operating procedures don't work. We don't get the immediate feedback we do in face-to-face interactions: cues about the other person's reaction through their expressions and gestures, for example. But we subconsciously rely on those cues to proceed in a functional manner.

I don't remember the first time I heard the term "cyberbullying," but it wasn't long ago, and since then the phenomenon has become somewhat ubiquitous. It's a lot easier to say something mean online than it is in person. And the biggest social network out there knows it. Facebook has become a hotbed not just for sharing cat videos and the contents of your breakfast plate, but also for interpersonal conflict.

Last year, 219 billion photographs were posted to Facebook, according to Arturo Bejar, who recently spoke at Northeastern for the Emotion and Technology conference. "And a lot of them got reported," he said. When given the standard drop-down list of reasons for the report, users flagged these photos as nudity, hate speech, or harassment. But when Bejar's team took a closer look, they found some surprising things.

An adorable picture of a husky puppy might be reported for hate speech. A photo of a fully clothed couple posing at an amusement park might be reported for nudity. When the team looked even closer, they found that most of the time the person doing the reporting was actually pictured in the photo in question, and was usually friends with the person who uploaded it.

Facebook wasn't providing the appropriate resources for people to deal with their online interpersonal issues, so they resorted to the inadequate ones available instead. If, for example, an ex-boyfriend posted a photo of Sally that she didn't want online, she'd have to use the "report" button to get it removed. But the list of reasons for reporting didn't include anything that accounted for her actual feelings. So she'd just say it was a picture of nudity and move on.

Likewise, if a fellow student posted something like, "Sally was crying in school today LOL," she wouldn't have a good way of dealing with it other than the standard protocol for when people "break the rules of a site," said Bejar. "Sally was crying in school today" isn't exactly hate speech, but there was previously nothing better to describe it.

And that's all assuming Sally even clicks "report" in the first place. What if she were worried about being "found out," or about getting the other person in trouble? Maybe she'd never click at all. When it comes to cyberbullying, Bejar's team found, that's often the case.

They took a scientific approach to these issues and came up with a new "flow," which, he showed with some pretty impressive data, is much more effective than it was a year ago. Now, instead of a "report" button, every post on Facebook simply lets you say, "I don't want to see this."

Depending on where you live, how old you are, and your gender, you'll see a few different options appear next. That's because Bejar and his team did some pretty exhaustive analysis of how different demographics responded. Girls were more likely to report content than boys, and when boys did report something, they usually didn't explain why with an emotion word. They'd say "none of the above" instead.

People in different countries have different ideas of what's polite, so the content they report differs a bit, and the options for dealing with it have to differ as well.
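To make the idea concrete, here is a minimal sketch of how a flow like this might tailor its follow-up options to a user's age and locale. All option wording, age thresholds, and country rules below are illustrative assumptions for this post; nothing here reflects Facebook's actual code or policy.

```python
# Hypothetical sketch: tailoring "I don't want to see this" follow-up
# options by demographic. Every string and rule is an assumption.

BASE_OPTIONS = [
    "It's embarrassing",
    "It makes me sad",
    "It's mean",
    "None of the above",
]

def report_options(age, country):
    """Return the follow-up choices shown after 'I don't want to see this'."""
    options = list(BASE_OPTIONS)
    if age < 14:
        # Younger teens might get simpler, feeling-oriented language.
        options[0] = "It made me feel bad"
    if country == "JP":
        # Different cultures flag different things as impolite, so the
        # list could lead with a locale-specific option (illustrative).
        options.insert(0, "It's impolite")
    return options
```

The point of the sketch is only that the same single entry point ("I don't want to see this") can branch into language matched to who is reporting, which is the design Bejar described.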

"The number of things that go through the flow that are minor issues is huge," said Bejar. "The number of things that go through that are very important because they're very sensitive is small. And we need to build something that does a really good job in both situations."

Ultimately, Facebook built an entire bullying prevention hub, complete with downloadable guides helping teens, parents, and educators deal with the problems online and off. After all, what happens on Facebook rarely stays on Facebook. If you’re made fun of by a classmate, you’re going to see that person in school tomorrow. So maybe blocking them isn’t the best solution.

“We have a responsibility to provide people with the tools to navigate these issues,” said Bejar. In most cases it will be as simple as providing someone with the language (which has been matched to their age, gender, and culture) to talk to the person who posted whatever they’re concerned about. But in some cases it’ll be more serious than that. And Facebook isn’t shying away from it.

We can say all we want about the multibillion-user network being the big bad corporation encroaching on our privacy and manipulating our minds with targeted advertising, but the fact is we use it. A lot. Our social interactions have morphed because of it (a friend of mine recently thought she'd met my husband before because she'd seen him on Facebook; she only realized she hadn't when she heard him speak). I, for one, found it really refreshing to learn that the company is putting a lot of time, money, and resources into figuring out how to help facilitate healthy interactions between us all.

