Now, hours of testimony and thousands of pages of documents from Facebook whistleblower Frances Haugen have prompted a re-examination of the impact Facebook and its algorithms have on youth, democracy, and society at large. That scrutiny raises the question of how far Facebook, and perhaps similar platforms, can or should go in rethinking their use of algorithms to determine which pictures, videos, and news users see.
But the algorithm that picks and chooses what we see is not only the core of Facebook; it is also the core of many social media platforms that have followed in Facebook's footsteps. TikTok, for example, would be unrecognizable without the content recommendation algorithm that drives it. The larger the platform, the greater its reliance on algorithms to filter and sort content.
The algorithms will not disappear. But there are ways Facebook could improve them, algorithm and artificial intelligence experts told CNN Business. Doing so, however, requires something Facebook has so far seemed reluctant to offer (despite executive talking points): giving users more transparency and control.
What’s in the algorithm?
An algorithm is a set of mathematical steps or instructions, particularly for a computer, that tells it how to process certain inputs to produce certain outputs. You can think of it as roughly similar to a recipe, where the ingredients are the inputs and the finished dish is the output. On Facebook and other social media sites, however, you and your actions—what you write or the pictures you post—are the inputs. What the social network shows you—whether it's a post from your best friend or an advertisement for camping gear—is the output.
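To make the recipe analogy concrete, here is a hypothetical toy ranking function. It is not Facebook's actual system; the post fields and the scoring rule are invented for illustration. It maps inputs (a user's interests plus a set of posts) to an output (a personalized ordering of those posts):

```python
def rank_feed(posts, interests):
    """Toy 'feed algorithm': order posts by interest overlap, newest first on ties.

    This is an invented illustration, not any real platform's ranking system.
    """
    def score(post):
        # Input signals: how many of the post's tags match the user's interests,
        # with the post's timestamp as a recency tiebreaker.
        topic_score = sum(1 for tag in post["tags"] if tag in interests)
        return (topic_score, post["timestamp"])

    # Output: the same posts, reordered for this particular user.
    return sorted(posts, key=score, reverse=True)


posts = [
    {"id": 1, "tags": ["camping"], "timestamp": 100},
    {"id": 2, "tags": ["politics"], "timestamp": 200},
    {"id": 3, "tags": ["camping", "friends"], "timestamp": 150},
]
feed = rank_feed(posts, interests={"camping", "friends"})
print([p["id"] for p in feed])  # → [3, 1, 2]
```

The same inputs fed to a differently weighted scoring function would yield a different feed, which is why so much of the debate centers on how these scores are chosen.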
At their best, these algorithms can help personalize feeds so that users discover new people and content matching their interests, based on previous activity. At their worst, as Haugen and others have pointed out, they risk leading people down disturbing rabbit holes that expose them to toxic content and misinformation. Either way, they keep people scrolling longer, which can help Facebook make more money by showing users more ads.
Many algorithms work together to create the experience you see on Facebook, Instagram, and elsewhere online. That can make it even harder to untangle what is happening inside such systems, especially at a company as large as Facebook, where multiple teams build different algorithms.
"If there were some higher power that could go to Facebook and say, 'fix the algorithm in X, Y way,' it's really hard, because they've become really complex systems with many inputs, many weights, and they're like multiple systems working together," said Hilary Ross, a senior program manager at Harvard University's Berkman Klein Center for Internet and Society, who manages its Institute for Rebooting Social Media.
"You can even imagine having some say in it. You might be able to choose your preferences for the things you want optimized for you," she said, such as how often you want to see content from immediate family members, high school friends, or baby photos. All of those things may change over time. Why not let users control them?
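Ross's idea of user-adjustable preferences could, in a rough sketch, mean exposing the weights of a ranking system directly to users. The code below is a hypothetical illustration of that concept, with made-up category names and weights; it is not any platform's real API:

```python
def rank_with_weights(posts, weights):
    """Order posts by user-chosen category weights (illustrative sketch only)."""
    def score(post):
        # Each post's score is the sum of the weights the user assigned to
        # its content categories; unlisted categories count as zero.
        return sum(weights.get(cat, 0.0) for cat in post["categories"])
    return sorted(posts, key=score, reverse=True)


posts = [
    {"id": "family_photo", "categories": ["family"]},
    {"id": "ad", "categories": ["advertising"]},
    {"id": "hs_friend", "categories": ["high_school_friends"]},
]

# A user who wants lots of family content and fewer ads sets the dials herself:
my_weights = {"family": 1.0, "high_school_friends": 0.5, "advertising": 0.1}
print([p["id"] for p in rank_with_weights(posts, my_weights)])
# → ['family_photo', 'hs_friend', 'ad']
```

Because the weights live outside the ranking function, the user could change them at any time, which matches Ross's point that these preferences shift over a person's life.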
She said that transparency is key because it can encourage good behavior from social networks.
According to Sasha Costanza-Chock, director of research and design at the Algorithmic Justice League, another way social networks could move toward greater transparency is through independent audits of their algorithmic practices. They envision fully independent researchers, investigative journalists, or people inside regulatory agencies–not the social media companies themselves or companies they hire–who have the knowledge, skills, and legal authority to demand access to algorithmic systems and ensure that laws aren't being violated and that best practices are followed.
James Mickens, a professor of computer science at Harvard and co-director of the Berkman Klein Center's Institute for Rebooting Social Media, suggests looking at how elections can be audited without revealing private information about voters (such as whom each person voted for) for insights into how algorithms could be audited and reformed. He believes this could inform an audit system that lets people outside of Facebook provide oversight while sensitive data stays protected.
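One family of techniques in the spirit of Mickens's analogy is differential privacy, in which an auditor sees aggregate statistics with calibrated noise rather than individual records. The article does not name this technique; the sketch below is an assumption about how such an audit primitive might look, using the standard Laplace mechanism:

```python
import random


def laplace_noise(scale):
    """Sample Laplace(0, scale) noise as the difference of two exponentials."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)


def noisy_count(true_count, epsilon=1.0):
    """Report a count with Laplace noise calibrated to epsilon (sensitivity 1).

    An auditor learns roughly how many users saw a piece of flagged content,
    while no single user's record measurably changes the reported number.
    """
    return true_count + laplace_noise(1.0 / epsilon)


# Hypothetical audit query: how many users were shown a flagged post?
print(round(noisy_count(12345, epsilon=0.5)))
```

Smaller values of `epsilon` add more noise and thus stronger privacy, at the cost of a less precise answer; real audit systems would need far more machinery than this, but the core tradeoff is the same.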
Other measures of success
Experts say a big obstacle to meaningful improvement is social networks' current emphasis on engagement, or the time users spend scrolling, clicking, and otherwise interacting with social media posts and advertisements.
Changing this is tricky, experts say, though several agreed it could involve considering how users feel when using social media, not just how long they spend on it.
"Engagement is not a synonym for good mental health," Mickens said.
But can algorithms really help solve Facebook's problems? Mickens, at least, hopes the answer is yes. He does think they can be made to better serve the public interest. "The question is: What will persuade these companies to start thinking this way?" he said.
In the past, some might have said that would require pressure from advertisers, whose money funds these platforms. But in her testimony, Haugen seemed to bet on a different answer: pressure from Congress.