

Exclusive: Facebook Opens Up About False News



NEWS FEED, THE algorithm that powers the core of Facebook, resembles a giant irrigation system for the world’s information. Working properly, it nourishes all the crops that different people like to eat. Sometimes, though, it gets diverted entirely to sugar plantations while the wheat fields and almond trees die. Or it gets polluted because Russian trolls and Macedonian teens toss in LSD tablets and dead raccoons.

For years, the workings of News Feed were rather opaque. The company as a whole was shrouded in secrecy. Little about the algorithms got explained, and employees were fired for speaking out of turn to the press. Now Facebook is everywhere. Mark Zuckerberg has been testifying to the European Parliament via livestream, taking hard questions from reporters, and giving tech support to the Senate. Senior executives are tweeting. The company is running ads during the NBA playoffs.

In that spirit, Facebook is today making three important announcements on false news, to which WIRED got an early and exclusive look. In addition, WIRED was able to sit down for a wide-ranging conversation with eight generally press-shy product managers and engineers who work on News Feed to ask detailed questions about the workings of the canals, dams, and rivers that they manage.

The first new announcement: Facebook will soon issue a request for proposals from academics eager to study false news on the platform. Researchers who are accepted will get data and money; the public will get, ideally, elusive answers to how much false news actually exists and how much it matters. The second announcement is the launch of a public education campaign that will utilize the top of Facebook’s homepage, perhaps the most valuable real estate on the internet. Users will be taught what false news is and how they can stop its spread. Facebook knows it is at war, and it wants to teach the populace how to join its side of the fight. The third announcement—and the one the company seems most excited about—is the release of a nearly 12-minute video called “Facing Facts,” a title that suggests both the topic and the repentant tone.

The film, which is embedded at the bottom of this post, stars the product and engineering managers who are combating false news, and was directed by Morgan Neville, who won an Academy Award for 20 Feet from Stardom. That documentary was about backup singers, and this one essentially is too. It’s a rare look at the people who run News Feed: the nerds you’ve never heard of who run perhaps the most powerful algorithm in the world. In Stardom, Neville told the story through close-up interviews and B-roll of his protagonists shaking their hips on stage. This one is told through close-up interviews and B-roll of his protagonists staring pensively at their screens.

In many ways, News Feed is Facebook: It’s an algorithm composed of thousands of factors that determines whether you see baby pictures, white papers, shitposts, or Russian agitprop. Facebook typically guards information the way the Army guards Fort Knox. This makes any information about it valuable, which makes the film itself valuable. And right from the start, Neville signals that he’s not going to merely scoop out a bowl of peppermint propaganda. The opening music is slightly ominous, leading into the voice of John Dickerson, of CBS News, intoning about the bogus stories that flourished on the platform during the 2016 election. Critical news headlines blare, and Facebook employees, one carrying a skateboard and one a New Yorker tote, move methodically up the stairs into headquarters.

‘Is there a silver bullet? There isn’t.’


The message is clear: Facebook knows it screwed up, and it wants us all to know it knows it screwed up. The company is confessing and asking for redemption. “It was a really difficult and painful thing,” intones Adam Mosseri, who ran News Feed until recently, when he moved over to run product at Instagram. “But I think the scrutiny was fundamentally a helpful thing.”

After the apology, the film moves into exposition. The product and engineering teams explain the importance of fighting false news and some of the complexities of that task. Viewers are taken on a tour of Facebook’s offices, where everyone seems to work hard and where there’s a giant mural of Alan Turing made of dominos. At least nine times during the film, different employees scratch their chins.

Oddly, the most clarifying and energizing moments in “Facing Facts” involve whiteboards. There’s a spot three and a half minutes in when Eduardo Ariño de la Rubia, a data science manager for News Feed, draws a grid with X and Y axes. He’s charismatic and friendly, and he explains that posts on Facebook can be broken into four categories, based on the intent of the author and the truthfulness of the content: innocent and false; innocent and true; devious and false; devious and true. It’s the latter category—including examples of cherry-picked statistics—that might be the most vexing.
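The whiteboard grid lends itself to a tiny sketch. The function below is purely illustrative: the quadrant labels come from the film, but the function name and boolean inputs are invented here, not Facebook's actual taxonomy.

```python
# A minimal sketch of the four-quadrant framing Ariño de la Rubia draws on the
# whiteboard: posts classified by the author's intent and the content's truth.
# Labels follow the film; the function itself is an illustration.

def quadrant(intent_devious: bool, content_false: bool) -> str:
    """Map a post onto the intent/truthfulness grid."""
    if not intent_devious and content_false:
        return "innocent and false"   # an honest mistake, shared in good faith
    if not intent_devious and not content_false:
        return "innocent and true"    # ordinary, unproblematic sharing
    if intent_devious and content_false:
        return "devious and false"    # classic fabricated fake news
    return "devious and true"         # e.g. cherry-picked statistics

print(quadrant(intent_devious=True, content_false=False))  # the vexing case
```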

A few minutes later, Dan Zigmond—author of the book Buddha’s Diet, incidentally—explains the triptych through which troublesome posts are countered: remove, reduce, inform. Terrible things that violate Facebook’s Terms of Service are removed. Clickbait is reduced. If a story appears fishy to fact-checkers, readers are informed. Perhaps they will be shown related stories, or more information on the publisher. It’s like a parent who doesn’t take the cigarettes away but who drops down a booklet on lung cancer and then stops taking them to the drug store. Zigmond’s whiteboard philosophy is also at the core of a Hard Questions blog post Facebook published today.
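Zigmond's triptych can be read as a simple dispatch over signals attached to a post. The sketch below is an assumption-laden toy: the signal names and the 0.8 threshold are invented for illustration, and the real pipeline is far more elaborate.

```python
# An illustrative sketch of the "remove, reduce, inform" triptych.
# Signal names and thresholds are hypothetical, not Facebook's real ones.

def triage(post: dict) -> str:
    if post.get("violates_terms"):            # e.g. content banned outright
        return "remove"                       # taken off the platform entirely
    if post.get("clickbait_score", 0) > 0.8:  # hypothetical model output
        return "reduce"                       # demoted in News Feed ranking
    if post.get("flagged_by_fact_checkers"):
        return "inform"                       # shown with related stories and
                                              # publisher context
    return "allow"

print(triage({"clickbait_score": 0.93}))  # reduce
```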

The central message of the film is that Facebook really does care profoundly about false news. The company was slow to realize the pollution building up in News Feed, but now it is committed to cleaning it up. Not only does Facebook care, it’s got young, dedicated people who are on it. They’re smart, too. John Hegeman, who now runs News Feed, helped build the Vickrey-Clarke-Groves auction system for Facebook advertising, which has turned it into one of the most profitable businesses of all time.

The question for Facebook, though, is no longer whether it cares. The question is whether the problem can be solved. News Feed has been tuned, for years, to maximize our attention and in many ways our outrage. The same features that incentivized publishers to create clickbait are the ones that let false news fly. News Feed has been nourishing the sugar plantations for a decade. Can it really help grow kale, or even apples?

To try to get at this question, on Monday, I visited with the nine stars of the film, who sat around a rectangular table in a Facebook conference room and explained the complexities of their work. (A transcript of the conversation can be read here.) The company has made all sorts of announcements since December 2016 about its fight against false news. It has partnered with fact-checkers, limited the ability of false news sites to make money off their schlock, and created machine-learning systems for combating clickbait. And so I began the interview by asking what had mattered most.

The answer, it seems, is both simple and complex. The simple part is that Facebook has found that just strictly applying its rules—”blocking and tackling,” Hegeman calls it—has knocked many purveyors of false news off the platform. The people who spread malarkey also often set up fake accounts or break basic community standards. It’s like a city police force that cracks down on the drug trade by arresting people for loitering.

In the long run, though, Facebook knows that complex machine-learning systems are the best tool. To truly stop false news, you need to find false news, and you need machines to do that because there aren’t enough humans around. And so Facebook has begun integrating systems—used by Instagram in its efforts to battle meanness—based on human-curated datasets and a machine-learning product called DeepText.

Here’s how it works. Humans, perhaps hundreds of them, go through tens or hundreds of thousands of posts identifying and classifying clickbait—”Facebook left me in a room with nine engineers and you’ll never believe what happened next.” This headline is clickbait; this one is not. Eventually, Facebook unleashes its machine-learning algorithms on the data the humans have sorted. The algorithms learn the word patterns that humans consider clickbait, and they learn to analyze the social connections of the accounts that post it. Eventually, with enough data, enough training, and enough tweaking, the machine-learning system should become as accurate as the people who trained it—and a heck of a lot faster.
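The labeling-then-learning loop described above can be sketched in miniature. The toy bag-of-words scorer below is a stand-in under stated assumptions: it only counts word frequencies per class, whereas the real system also uses social-graph signals and tooling like DeepText; all function names and training headlines here are invented.

```python
# Toy sketch of the pipeline: humans label headlines as clickbait or not,
# then a model learns which word patterns distinguish the two classes.
from collections import Counter

def tokenize(headline: str) -> list:
    return headline.lower().split()

def train(labeled: list) -> tuple:
    """Count word frequencies in clickbait vs. normal headlines."""
    bait, plain = Counter(), Counter()
    for headline, is_bait in labeled:
        (bait if is_bait else plain).update(tokenize(headline))
    return bait, plain

def is_clickbait(headline: str, bait: Counter, plain: Counter) -> bool:
    """Classify by which class's vocabulary the headline matches more."""
    words = tokenize(headline)
    return sum(bait[w] for w in words) > sum(plain[w] for w in words)

# Hypothetical human-labeled training data.
labeled = [
    ("you'll never believe what happened next", True),
    ("this one weird trick doctors hate", True),
    ("senate passes budget resolution", False),
    ("quarterly earnings report released", False),
]
bait, plain = train(labeled)
print(is_clickbait("you'll never believe this trick", bait, plain))  # True
```

With enough labeled examples, the same idea scales: the classifier becomes roughly as accurate as its human trainers, and vastly faster.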

In addition to identifying clickbait, the company is using the system to try to identify false news. This problem is harder: For one thing, it’s not as simple as analyzing a discrete chunk of text, like a headline. For another, as Tessa Lyons, a product manager helping to oversee the project, explained in our interview, truth is harder to define than clickbait. So Facebook has created a database of all the stories flagged by the fact-checking organizations that it has partnered with since late 2016. It then combines this data with other signals, including reader comments, to try to train the model. The system also looks for duplication, because, as Lyons says, “the only thing cheaper than creating fake news is copying fake news.” Facebook does not, I was told in the interview, actually read the content of the article and try to verify it. That is surely a project for another day.
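Lyons's point about copying suggests near-duplicate detection. Below is a minimal sketch using word-shingle Jaccard similarity, a standard technique for spotting copied articles; it is an assumption that Facebook's system is in this family, not a description of its actual implementation, and the sample headlines are invented.

```python
# Near-duplicate detection sketch: overlapping 3-word shingles compared
# with Jaccard similarity. Copied stories share many shingles; unrelated
# stories share almost none.

def shingles(text: str, k: int = 3) -> set:
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: str, b: str) -> float:
    sa, sb = shingles(a), shingles(b)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

original = "president orders the execution of five turkeys pardoned last year"
copy = "president orders the execution of five turkeys pardoned by obama"
unrelated = "local bakery wins award for best sourdough in the county"

print(jaccard(original, copy) > 0.5)        # True: likely a copy
print(jaccard(original, unrelated) < 0.1)   # True: unrelated
```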

Interestingly, the Facebook employees explained, all clickbait and false news is treated the same, no matter the domain. Consider these three stories that have spread on the platform in the past year.

“Morgue employee cremated by mistake while taking a nap.” “President Trump orders the execution of five turkeys pardoned by Obama.” “Trump sends in the feds— Sanctuary City Leaders Arrested.”

The first is harmless; the second involves politics, but it’s mostly harmless. (In fact it’s rather funny.) The third could scare real people and bring protesters into the streets. Facebook could, theoretically, deal with each of these kinds of false news differently. But according to the News Feed employees I spoke with, it does not. All headlines pass through the same system and are evaluated the same way. In fact, all three of these examples seem to have gotten through and started to spread.

Why doesn’t Facebook give political news strict scrutiny? In part, Lyons said, because stopping the trivial stories helps the company stop the important ones. Mosseri added that weighting different categories of misinformation differently might be something that the company considers later. “But with this type of integrity work I think it’s important to get the basics done well, make real strong progress there, and then you can become more sophisticated,” he said.

Behind all this though is the larger question. Is it better to keep adding new systems on top of the core algorithm that powers News Feed? Or might it be better to radically change News Feed?

I pushed Mosseri on this question. News Feed is based on hundreds, or perhaps thousands, of factors, and as anyone who has run a public page knows, the algorithm rewards outrage. A story titled “Donald Trump is a trainwreck on artificial intelligence” will spread on Facebook. A story titled “Donald Trump’s administration begins to study artificial intelligence” will go nowhere. Both stories could be true, and the first headline isn’t clickbait. But it pulls on our emotions. For years, News Feed—like the tabloids—has heavily rewarded this kind of story, in part because the ranking was heavily based on simple factors that correlate with outrage and immediate emotional reactions.

Now, according to Mosseri, the algorithm is starting to take into account more serious factors that correlate with a story’s quality, not just its emotional tug. In our interview, he pointed out that the algorithm now gives less value to “lighter weight interactions like clicks and likes.” In turn, it is putting more priority on “heavier weight things like how long do we think you’re going to watch a video for? Or how long do we think you’re going to read an article for? Or how informative do you think you’d say this article is if we asked you?” News Feed, in a new world, might give more value to a well-read, informative piece about Trump and artificial intelligence, instead of just a screed.
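The reweighting Mosseri describes can be sketched as a scoring function that discounts "lighter" signals in favor of "heavier" ones. The weights, signal names, and sample numbers below are invented for illustration; the real ranking uses thousands of factors.

```python
# Hedged sketch of Mosseri's reweighting: clicks and likes count for little,
# while predicted read time and self-reported informativeness dominate.
# All weights and signal names are hypothetical.

def rank_score(signals: dict) -> float:
    light = signals.get("clicks", 0) + signals.get("likes", 0)
    heavy = (signals.get("predicted_read_seconds", 0)
             + 30 * signals.get("informative_rating", 0))  # survey answer, 0-1
    return 0.1 * light + 1.0 * heavy  # heavier signals dominate the score

outrage_screed = {"clicks": 900, "likes": 400, "predicted_read_seconds": 15}
informative_piece = {"clicks": 200, "likes": 80,
                     "predicted_read_seconds": 180, "informative_rating": 0.9}

print(rank_score(informative_piece) > rank_score(outrage_screed))  # True
```

Under the old click-heavy weighting the screed would win; under the new one, the well-read piece does.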

‘Two billion people around the world are counting on us to fix this.’


Perhaps the most existential question for Facebook is whether the nature of its business inexorably helps the spread of false news. Facebook makes money by selling targeted ads, which means it needs to know how to target people. It gathers as much data as it can about each of its users. This data can, in turn, be used by advertisers to find and target potential fans who will be receptive to their message. That’s useful if an advertiser like Pampers wants to sell diapers only to the parents of newborns. It’s not great if the advertiser is a fake-news purveyor who wants to find gullible people who can spread his message. In a podcast with Bloomberg, Cyrus Massoumi, who created a site called Mr. Conservative, which spread all kinds of false news during the 2016 election, explained his modus operandi. “There’s a user interface and you create ads and then you create an image and advert, so let’s say, for example, an image of Obama. And it will say ‘Like if you think Obama is the worst president ever.’ Or, for Trump, ‘Like if you think Trump should be impeached.’ And then you pay a price for those fans, and then you retain them.”

In response to a question about this, Ariño de la Rubia noted that the company does go after any page it suspects of publishing false news. Massoumi, for example, now says he can’t make any money from the platform. “Is there a silver bullet?” Ariño de la Rubia asked. “There isn’t. It’s adversarial, and misinformation can come from any place that humans touch and humans can touch lots of places.”

Pushed on the related question of the possibility of shutting down political Groups into which users have put themselves, Mosseri noted that it would indeed stop some of the spread of false news. But, he said, “you’re also going to reduce a whole bunch of healthy civic discourse. And now you’re really destroying more value than problems that you’re avoiding.”

Should Facebook be cheered for its efforts? Of course. Transparency is good, and the scrutiny from journalists and academics (or at least most academics) will be good. But to some close analysts of the company, it’s important to note that this is all coming a little late. “We don’t applaud Jack Daniels for putting warning labels about drinking while pregnant. And we don’t cheer GM for putting seat belts and airbags in their cars,” says Ben Scott, a senior adviser to the Open Technology Institute at the New America Foundation. “We’re glad they do, but it goes with the territory of running those kinds of businesses.”

Ultimately, the most important question for Facebook is how well all these changes work. Do the rivers and streams get clean enough that they feel safe to swim in? Facebook knows that it has removed a lot of claptrap from the platform. But what will happen in the American elections this fall? What will happen in the Mexican elections this summer?

Most importantly, what will happen as the problem gets more complex? False news is only going to get more complicated, as it moves from text to images to video to virtual reality to, one day, maybe, computer-brain interfaces. Facebook knows this, which is why the company is working so hard on the problem and talking so much. “Two billion people around the world are counting on us to fix this,” Zigmond said.

 Source: WIRED




Breaking: Witness Radio and Partners to Launch Human Rights Monitoring, Documentation, and Advocacy Project Tomorrow.



By Witness Radio Team.

Witness Radio, in collaboration with Dan Church Aid (DCA) and the National Coalition for Human Rights Defenders (NCHRD), is set to launch the Monitoring, Documentation, and Advocacy for Human Rights in Uganda (MDA-HRU) project tomorrow, 22nd February 2024, at Kabalega Resort Hotel in Hoima District.

The project, funded by the European Union, aims to promote the protection of and respect for human rights, and to enable access to remedy where violations occur, through improved documentation and evidence-based advocacy. It focuses especially on the Mid-Western and Karamoja sub-regions, where private-sector actors are increasingly involved in land-based investments (LBIs).

The three-year project, which commenced in October 2023, focuses its activities in the Mid-Western sub-region, covering Bulisa, Hoima, Masindi, Kiryandongo, Kikuube, Kagadi, Kibale, and Mubende districts, and Karamoja sub-region, covering Moroto, Napak, Nakapiripirit, Amudat, Nabilatuk, Abim, Kaabong, Kotido, and Karenga districts.

The project targets individuals and groups at high risk of human rights violations, including Human Rights Defenders (HRDs) and Land and Environmental Defenders (LEDs). It also engages government duty bearers such as policymakers and implementers in relevant ministries and local governments, recognizing their crucial role in securing land and environmental rights. Additionally, the project involves officials from institutional duty bearers including the Uganda Human Rights Commission (UHRC), Equal Opportunities Commission, and courts, among others.

Representatives from the international community, faith leaders, and business actors are also included in the project’s scope, particularly those involved in land-based investments (LBIs) impacting the environment.

The project was initially launched in Moroto for the Karamoja region on the 19th of this month with the leadership of the National Coalition for Human Rights Defenders (NCHRD).

According to the project implementers, the action is organized into four activity packages: enhancing the capacity and skills of Human Rights Defenders (HRDs) and Land and Environmental Defenders (LEDs) in monitoring, documentation, reporting (MDR), and protection; establishing and reinforcing reporting and documentation mechanisms for advocacy and for demanding corporate and government accountability; providing response and support to HRDs and marginalized communities; and facilitating collaboration and multi-stakeholder engagements that link local and national issues to national and international frameworks and spaces.



Kiryandongo leadership agree to partner with Witness Radio Uganda to end rampant forced land evictions in the district.



By Witness Radio team.

Kiryandongo district leaders have embraced a collaboration with Witness Radio aimed at ending the rampant, violent, and illegal land evictions that have significantly harmed the livelihoods of local communities in the area.

The welcome came at a dialogue organized by Witness Radio Uganda, the country’s leading land and environmental rights watchdog, at the Kiryandongo district headquarters. The meeting was intended to reflect on the plight of land and environmental rights defenders and of local and indigenous communities, and on the role of responsible land-based investments in protecting people and the planet.

The high-level dialogue drew technical officers, policy implementers, religious leaders, leaders of project-affected persons (PAPs), politicians, the media, Civil Society Organizations (CSOs), development partners that support land and environmental rights, and land-based investment (LBI) companies operating in Kiryandongo district. Speaking there, the leaders, led by the District Local Council 5 Chairperson, Ms. Edith Aliguma Adyeri, appreciated Witness Radio’s efforts in organizing the meeting, which aimed to bring stakeholders together to safeguard community land and environmental rights and to address the escalating vice of land grabbing in the area.

During the dialogue, participants shared harrowing accounts of the impacts of land evictions and environmental degradation, including tragic deaths, families torn asunder, young girls forced into marriage, a surge in teenage pregnancies, limited access to education, and significant environmental damage which have profoundly affected the lives of the local population in Kiryandongo.

Participants attending the dialogue.

In recent years, Kiryandongo district has been embroiled in violent land evictions orchestrated to accommodate multinational large-scale agriculture plantations and wealthy individuals, leaving the poor marginalized.

According to various reports, including findings from Witness Radio’s 2020 research Land Grabs at a Gun Point, the forceful land acquisitions in Kiryandongo have significantly impacted the livelihoods of local communities. It is estimated that nearly 40,000 individuals have been displaced from their land to make room for land-based investments in the Kiryandongo district. However, leaders in the district also revealed in the dialogue that women and children are affected most.

The Kiryandongo Deputy Resident District Commissioner, Mr. Jonathan Akweteireho, emphasized that all offices within the Kiryandongo district are actively involved in addressing the prevalent land conflicts. He also extended a welcome to Witness Radio, acknowledging their collaborative efforts in tackling and resolving land and environmental issues in the district.

“Ladies and gentlemen, we all know that the land rights together with environmental rights have been violated in our district, but because we don’t know what our rights are, because we have not directly done what we could to safeguard our rights and now this is the time that Witness Radio has brought us together to safeguard our rights. I want to welcome you in Kiryandongo and be rest assured that we shall give you all the necessary support to help us manage these rampant cases,” Ms. Adyeri said in her remarks during the dialogue meeting.

The team leader at Witness Radio Uganda, Mr. Geoffrey Wokulira Ssebaggala expressed gratitude to the participants for their active involvement in the dialogue and revealed that Witness Radio’s objective is to find a holistic solution to the escalating land disputes in Kiryandongo district serving as an example to other districts.

“We are here to assist Kiryandongo district in attaining peace and stability because it stands as a hotspot for land grabbers in Uganda. Mismanagement of land conflicts in Uganda could potentially lead to a significant internal conflict. Everywhere you turn, voices are lamenting the loss of their land and property. Kiryandongo, abundant with ranches, suffers from a lack of a structured framework, which amplifies these land conflicts. The influx of wealthy investors further complicates the situation,” Mr. Ssebaggala disclosed.

Within the dialogue, Mr. Ssebaggala emphasized the need for the Kiryandongo district council to pass a by-law aimed at curbing land evictions as an initial step in addressing the prevalent land injustices.



Kiryandongo authorities decry rising cases of land disputes



The LC5 chairperson of Kiryandongo, Ms Edith Aliguma Adyeri, has said land disputes have impacted people’s lives, dignity, and children’s education in the district.

Just like other parts of Uganda, conflicts over land in Kiryandongo arise when individuals – who often are blood relatives – compete for use of the same parcel of land or when members of the community lay claim over ownership of unutilised government land.

Ms Adyeri further said land and environmental rights affect people both directly and indirectly, “and we are not hearing it from afar. It is already together with us [here], it has already affected us!”

She was speaking at a meeting which sought to discuss alternative remedies to salvage the appalling land and environmental rights situation in Kiryandongo at the district headquarters on Thursday.

The one-day dialogue was aimed at reflecting on the plight of land and environmental rights defenders, local and indigenous communities and the role of responsible land-based investments in protecting people and the planet.

It was attended by private companies, members of civil society and local government officials and organised by Witness Radio – an advocate for land and environmental rights in Uganda – in partnership with Oxfam, and Kiryandongo District leadership.

“Some people have even died, families are broken up, and brothers are not seeing eye-to-eye because of land rights. Access to justice is equally becoming very difficult because when you hire one lawyer, that lawyer will talk to learned friends, and they agree. They leave you in suspense,” Ms Adyeri said.

According to her, some children have not accessed education because of land and environmental rights.

Mr Jonathan Akweteireho, the deputy Resident District Commissioner of Kiryandongo, said enlightened people especially should be sensitive to the historical injustice of this area.

“We can never handle the Bunyoro land question without thinking about that history. It will be an injustice to the incomers, to the government and to the leaders who don’t understand,” he said.

“We had 38 ranches here which on the guidance of these international organisations, especially the World Bank, the government restructured them, allowing people to settle there, they were never given titles and up to today, there are big problems in all those ranches,” he added.

Mr Jeff Wokulira Ssebaggala, the executive director of Witness Radio, said that a well-functioning land sector supports land users or holders and investors, reduces inefficiencies, and provides mechanisms to resolve land disputes.

Mr David Kyategeka, the secretary to the Kiryandongo District Land Board, said the issue of land rights is very clear, but the major challenge has been sensitising locals to the rights they can expect to enjoy from this very important resource.

