Facebook’s success is based on algorithms. Can they also fix it?



Hours of testimony and thousands of pages of documents from Facebook whistleblower Frances Haugen have renewed scrutiny of the impact Facebook and its algorithms have on teens, democracy, and society at large. The fallout raises the question of how far Facebook, and similar platforms, can or should go in rethinking their use of algorithms to determine which pictures, videos, and news users see.

Haugen, a former Facebook product manager with a background in "algorithmic product management," has focused her criticism on the company's algorithm that aims to show users the content they are most likely to engage with. She has said this is responsible for many of Facebook's problems, including fueling polarization, misinformation, and other toxic content. On "60 Minutes," she said Facebook understands that if it makes the algorithm safer, "people will spend less time on the site, they'll click on less ads, they'll make less money." (Facebook CEO Mark Zuckerberg has disputed the notion that the company prioritizes profit over users' safety and well-being.)
Monika Bickert, Facebook's head of global policy management, told CNN in an interview after Haugen's Senate hearing on Tuesday that claims the company's algorithms are designed to promote inflammatory content are "incorrect," and that Facebook in fact does "the opposite" by demoting so-called clickbait.
At times in her testimony, Haugen seemed to suggest a fundamental rethinking of how the news feed should work in order to address the problems she raised through a trove of internal company documents. "I'm a strong proponent of chronological ranking," she said in testimony before a Senate subcommittee last week, "because I don't think we want computers deciding what we focus on."
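A chronological feed of the kind Haugen describes is, at its core, very simple: instead of a model scoring each post, the feed is just sorted by posting time. A minimal sketch, with field names invented purely for illustration:

```python
# A chronological feed replaces algorithmic scoring with a plain
# sort by post time, newest first. Field names are illustrative only.

posts = [
    {"id": 1, "posted_at": "2021-10-05T09:00:00"},
    {"id": 2, "posted_at": "2021-10-05T11:30:00"},
    {"id": 3, "posted_at": "2021-10-04T22:15:00"},
]

# ISO-8601 timestamps sort correctly as plain strings, so no
# date parsing is needed for this toy example.
feed = sorted(posts, key=lambda p: p["posted_at"], reverse=True)
print([p["id"] for p in feed])  # → [2, 1, 3]
```

The simplicity is the point of the proposal: there is nothing for the system to optimize, and therefore nothing for it to over-optimize.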

But algorithms that pick and choose what we see are core not only to Facebook but to many of the social media platforms that followed in its footsteps. TikTok, for example, would be unrecognizable without the content-recommendation algorithm that powers it. And the larger a platform grows, the greater its need for algorithms to filter and rank content.

Algorithms aren't going away. But there are ways Facebook could improve them, algorithm and artificial intelligence experts told CNN Business. Doing so, however, would require something Facebook has so far appeared reluctant to offer (executive talking points aside): more transparency and control for users.


What’s in the algorithm?

The Facebook you experience today, with a steady stream of algorithmically selected information and ads, is a far cry from the early social network. When Facebook first launched in 2004 as a website for college students, navigating it was simple, if tedious: to see what your friends had posted, you had to visit each of their profiles one by one.
That began to change significantly in 2006, when Facebook introduced the news feed to give users the latest updates from family, friends, and that person they went on a few so-so dates with. From the start, Facebook reportedly used algorithms to filter what users saw in the feed. In a 2015 Time magazine story, Chris Cox, the company's chief product officer, said curation was necessary even then because there was too much information to show each user all of it. Over time, Facebook's algorithms have continued to evolve, and users have grown accustomed to them determining how Facebook's content is presented.

An algorithm is a set of mathematical steps or instructions, particularly for a computer, that tells it how to process certain inputs to produce certain outputs. You can think of it as roughly akin to a recipe, where the ingredients are the inputs and the finished dish is the output. On Facebook and other social media sites, however, you and your actions—the posts you write, the pictures you upload—are the inputs. What the social network shows you—whether it's a post from your best friend or an ad for camping gear—is the output.
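To make the recipe analogy concrete, here is a deliberately toy, hypothetical sketch of how a feed-ranking algorithm might turn those inputs (posts, plus a user's past activity) into an output (an ordered feed). Every field, weight, and number here is invented for illustration and does not reflect Facebook's actual system:

```python
# Toy, hypothetical feed ranking: inputs are candidate posts and a
# user's interaction history; the output is an ordered feed.
# All fields and weights are invented for illustration only.

posts = [
    {"id": 1, "author": "best_friend", "likes": 12, "is_ad": False},
    {"id": 2, "author": "camping_store", "likes": 3, "is_ad": True},
    {"id": 3, "author": "acquaintance", "likes": 50, "is_ad": False},
]

# Input derived from past activity: how often the user interacts
# with each author (an "affinity" signal, 0.0 to 1.0).
affinity = {"best_friend": 0.9, "camping_store": 0.2, "acquaintance": 0.4}

def score(post):
    """Combine engagement signals and user affinity into one number."""
    engagement = post["likes"] * 0.1
    closeness = affinity.get(post["author"], 0.1) * 10
    ad_boost = 2 if post["is_ad"] else 0
    return engagement + closeness + ad_boost

feed = sorted(posts, key=score, reverse=True)
print([p["id"] for p in feed])  # → [1, 3, 2]
```

Note what the weights do: the best friend's modest post outranks the acquaintance's far more popular one, because the scoring function values affinity over raw popularity. Real systems combine hundreds of such signals, which is part of why they are so hard to reason about from the outside.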

At their best, these algorithms can help personalize feeds so users discover new people and content that match their interests, based on past activity. At their worst, as Haugen and others have pointed out, they risk steering people down disturbing rabbit holes that expose them to toxic content and misinformation. Either way, they keep people scrolling longer, which can help Facebook make more money by showing them more ads.

Many algorithms work in concert to create the experience you see on Facebook, Instagram, and other sites. That makes it all the more complicated to untangle what's happening inside such systems, especially at a company as large as Facebook, where multiple teams build different algorithms.

"If some higher power were to go to Facebook and say, 'fix the algorithm in X, Y way,' that would be really hard, because these have become really complex systems with many inputs and many weights, and they're like multiple systems working together," said Hilary Rose, a senior program manager at Harvard's Berkman Klein Center for Internet and Society and manager of its Institute for Rebooting Social Media.

More transparency

There are, however, ways to make these processes clearer and to give users more say in how they work. Margaret Mitchell, head of AI ethics at AI model builder Hugging Face and formerly co-lead of Google's Ethical AI team, thinks this could be done by letting you see details of why you're seeing what you're seeing on a social network—such as which posts, ads, and other content you've viewed and interacted with.

"You could even imagine having some say in it. You may be able to select preferences for the kinds of things you want to be optimized for you," she said, such as how often you want to see content from your immediate family, your high school friends, or baby photos. All of those preferences may change over time. Why not let users control them?
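What Mitchell describes could look something like the following hypothetical sketch, where user-facing preference settings become weights in the ranking itself. The categories and numbers are invented for illustration, not drawn from any real platform:

```python
# Hypothetical sketch of user-controlled feed preferences: each
# setting the user chooses becomes a weight in the ranking.
# Categories and values are invented for illustration only.

# The user's own preferences, e.g. set via sliders in a settings page.
user_prefs = {
    "immediate_family": 1.0,
    "high_school_friends": 0.3,
    "baby_photos": 0.0,   # the user has opted out of this entirely
}

posts = [
    {"id": 1, "category": "immediate_family"},
    {"id": 2, "category": "baby_photos"},
    {"id": 3, "category": "high_school_friends"},
]

def rank(posts, prefs):
    # Posts in categories the user weighted at zero are dropped;
    # the rest are ordered by the user's own weights.
    visible = [p for p in posts if prefs.get(p["category"], 0.5) > 0]
    return sorted(visible, key=lambda p: prefs[p["category"]], reverse=True)

feed = rank(posts, user_prefs)
print([p["id"] for p in feed])  # → [1, 3]
```

The design choice worth noticing is that the weights live with the user, not the platform: the ranking logic is unchanged, but the objective it optimizes is the user's stated preference rather than predicted engagement.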

Transparency is key, she said, because it incentivizes good behavior from social networks.

Another way social networks could move toward greater transparency is to allow independent audits of their algorithmic practices, according to Sasha Costanza-Chock, director of research and design at the Algorithmic Justice League. They envision this including fully independent researchers, investigative journalists, or people inside regulatory bodies—not the social media companies themselves or firms they hire—who have the knowledge, skill, and legal authority to demand access to algorithmic systems in order to ensure laws aren't violated and best practices are followed.

James Mickens, a professor of computer science at Harvard and co-director of the Berkman Klein Center's Institute for Rebooting Social Media, suggests looking at how elections can be audited without revealing private information about voters, such as whom each person voted for, for insights into how algorithms might be audited and reformed. He believes this could inform the design of an auditing system that lets people outside of Facebook provide oversight while protecting sensitive data.

Other measures of success

A big obstacle to meaningful improvement, experts say, is social networks' current emphasis on the importance of engagement, or the amount of time users spend scrolling, clicking, and otherwise interacting with social media posts and ads.

Internal Facebook documents Haugen revealed show the social network is aware that its "core product mechanics, such as virality, recommendations and optimizing for engagement," are an important part of why hate speech and misinformation "flourish" on its platform.

Changing this is tricky, experts said, though several agreed it may involve considering how users feel when using social media, not just how long they use it.

"Engagement is not synonymous with good mental health," Mickens said.

But can algorithms really help fix Facebook's problems? Mickens, at least, hopes the answer is yes. He does think they could be made to better serve the public interest. "The question is: what's going to incentivize these companies to start thinking this way?" he said.

In the past, some might have said it would take pressure from advertisers, whose dollars support these platforms. But in her testimony, Haugen seemed to be betting on a different answer: pressure from Congress.
