WITNESS RADIO MILESTONES
Exclusive: Facebook Opens Up About False News
NEWS FEED, THE algorithm that powers the core of Facebook, resembles a giant irrigation system for the world’s information. Working properly, it nourishes all the crops that different people like to eat. Sometimes, though, it gets diverted entirely to sugar plantations while the wheat fields and almond trees die. Or it gets polluted because Russian trolls and Macedonian teens toss in LSD tablets and dead raccoons.
For years, the workings of News Feed were rather opaque. The company as a whole was shrouded in secrecy: little about the algorithms was explained, and employees were fired for speaking out of turn to the press. Now Facebook is everywhere. Mark Zuckerberg has been testifying to the European Parliament via livestream, taking hard questions from reporters, and giving tech support to the Senate. Senior executives are tweeting. The company is running ads during the NBA playoffs.
In that spirit, Facebook is today making three important announcements on false news, to which WIRED got an early and exclusive look. In addition, WIRED was able to sit down for a wide-ranging conversation with eight generally press-shy product managers and engineers who work on News Feed to ask detailed questions about the workings of the canals, dams, and rivers that they manage.
The first new announcement: Facebook will soon issue a request for proposals from academics eager to study false news on the platform. Researchers who are accepted will get data and money; the public will get, ideally, elusive answers to how much false news actually exists and how much it matters. The second announcement is the launch of a public education campaign that will utilize the top of Facebook’s homepage, perhaps the most valuable real estate on the internet. Users will be taught what false news is and how they can stop its spread. Facebook knows it is at war, and it wants to teach the populace how to join its side of the fight. The third announcement—and the one the company seems most excited about—is the release of a nearly 12-minute video called “Facing Facts,” a title that suggests both the topic and the repentant tone.
The film, which is embedded at the bottom of this post, stars the product and engineering managers who are combating false news, and was directed by Morgan Neville, who won an Academy Award for 20 Feet from Stardom. That documentary was about backup singers, and this one essentially is too. It’s a rare look at the people who run News Feed: the nerds you’ve never heard of who run perhaps the most powerful algorithm in the world. In Stardom, Neville told the story through close-up interviews and B-roll of his protagonists shaking their hips on stage. This one is told through close-up interviews and B-roll of his protagonists staring pensively at their screens.
In many ways, News Feed is Facebook: It’s an algorithm composed of thousands of factors that determines whether you see baby pictures, white papers, shitposts, or Russian agitprop. Facebook typically guards information about it the way the Army guards Fort Knox. This makes any information about it valuable, which makes the film itself valuable. And right from the start, Neville signals that he’s not going to merely scoop out a bowl of peppermint propaganda. The opening music is slightly ominous, leading into the voice of John Dickerson, of CBS News, intoning about the bogus stories that flourished on the platform during the 2016 election. Critical news headlines blare, and Facebook employees, one carrying a skateboard and one a New Yorker tote, move methodically up the stairs into headquarters.
‘Is there a silver bullet? There isn’t.’
EDUARDO ARIÑO DE LA RUBIA
The message is clear: Facebook knows it screwed up, and it wants us all to know it knows it screwed up. The company is confessing and asking for redemption. “It was a really difficult and painful thing,” intones Adam Mosseri, who ran News Feed until recently, when he moved over to run product at Instagram. “But I think the scrutiny was fundamentally a helpful thing.”
After the apology, the film moves into exposition. The product and engineering teams explain the importance of fighting false news and some of the complexities of that task. Viewers are taken on a tour of Facebook’s offices, where everyone seems to work hard and where there’s a giant mural of Alan Turing made of dominos. At least nine times during the film, different employees scratch their chins.
Oddly, the most clarifying and energizing moments in “Facing Facts” involve whiteboards. There’s a spot three and a half minutes in when Eduardo Ariño de la Rubia, a data science manager for News Feed, draws a grid with X and Y axes. He’s charismatic and friendly, and he explains that posts on Facebook can be broken into four categories, based on the intent of the author and the truthfulness of the content: innocent and false; innocent and true; devious and false; devious and true. It’s the latter category—including examples of cherry-picked statistics—that might be the most vexing.
A few minutes later, Dan Zigmond—author of the book Buddha’s Diet, incidentally—explains the triptych through which troublesome posts are countered: remove, reduce, inform. Terrible things that violate Facebook’s Terms of Service are removed. Clickbait is reduced. If a story appears fishy to fact-checkers, readers are informed. Perhaps they will be shown related stories, or more information on the publisher. It’s like a parent who doesn’t take the cigarettes away but who leaves a booklet about lung cancer on the table and then stops taking the kid to the drug store. Zigmond’s whiteboard philosophy is also at the core of a Hard Questions blog post Facebook published today.
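Zigmond’s remove/reduce/inform triptych amounts to a simple triage policy: check the most severe condition first and apply the first intervention that matches. A minimal sketch in Python, with all field names invented for illustration (Facebook’s actual signals and internals are not public):

```python
from enum import Enum

class Action(Enum):
    REMOVE = "remove"  # violates the Terms of Service
    REDUCE = "reduce"  # clickbait: demote it in the ranking
    INFORM = "inform"  # disputed by fact-checkers: attach context
    LEAVE = "leave"    # no intervention

def triage(post):
    """Route a post to one of the three interventions.

    Checks run in order of severity, so a post that both violates
    the ToS and looks like clickbait is simply removed.
    """
    if post.get("violates_tos"):
        return Action.REMOVE
    if post.get("is_clickbait"):
        return Action.REDUCE
    if post.get("disputed_by_fact_checkers"):
        return Action.INFORM
    return Action.LEAVE

print(triage({"is_clickbait": True}).value)               # reduce
print(triage({"disputed_by_fact_checkers": True}).value)  # inform
```

The ordering encodes the policy: removal trumps demotion, and demotion trumps merely adding context.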
The central message of the film is that Facebook really does care profoundly about false news. The company was slow to realize the pollution building up in News Feed, but now it is committed to cleaning it up. Not only does Facebook care, it’s got young, dedicated people who are on it. They’re smart, too. John Hegeman, who now runs News Feed, helped build the Vickrey-Clark-Groves auction system for Facebook advertising, which has turned it into one of the most profitable businesses of all time.
The question for Facebook, though, is no longer whether it cares. The question is whether the problem can be solved. News Feed has been tuned, for years, to maximize our attention and in many ways our outrage. The same features that incentivized publishers to create clickbait are the ones that let false news fly. News Feed has been nourishing the sugar plantations for a decade. Can it really help grow kale, or even apples?
To try to get at this question, on Monday, I visited with the nine stars of the film, who sat around a rectangular table in a Facebook conference room and explained the complexities of their work. (A transcript of the conversation can be read here.) The company has made all sorts of announcements since December 2016 about its fight against false news. It has partnered with fact-checkers, limited the ability of false news sites to make money off their schlock, and created machine-learning systems for combatting clickbait. And so I began the interview by asking what had mattered most.
The answer, it seems, is both simple and complex. The simple part is that Facebook has found that just strictly applying its rules—”blocking and tackling,” Hegeman calls it—has knocked many purveyors of false news off the platform. The people who spread malarkey also often set up fake accounts or break basic community standards. It’s like a city police force that cracks down on the drug trade by arresting people for loitering.
In the long run, though, Facebook knows that complex machine-learning systems are the best tool. To truly stop false news, you need to find false news, and you need machines to do that because there aren’t enough humans around. And so Facebook has begun integrating systems—used by Instagram in its efforts to battle meanness—based on human-curated datasets and a machine-learning product called DeepText.
Here’s how it works. Humans, perhaps hundreds of them, go through tens or hundreds of thousands of posts identifying and classifying clickbait—”Facebook left me in a room with nine engineers and you’ll never believe what happened next.” This headline is clickbait; this one is not. Eventually, Facebook unleashes its machine-learning algorithms on the data the humans have sorted. The algorithms learn the word patterns that humans consider clickbait, and they learn to analyze the social connections of the accounts that post it. Eventually, with enough data, enough training, and enough tweaking, the machine-learning system should become as accurate as the people who trained it—and a heck of a lot faster.
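The label-then-train loop described above can be sketched end to end. This toy Python version uses a naive Bayes classifier over headline words; the headlines, labels, and function names are all invented for illustration, and Facebook’s production system obviously works on far more data with far richer signals:

```python
import math
from collections import Counter

# A tiny labeled set standing in for the human-curated corpus:
# label 1 = clickbait, label 0 = not clickbait.
TRAIN = [
    ("you'll never believe what happened next", 1),
    ("this one weird trick doctors hate", 1),
    ("10 shocking photos number 7 will amaze you", 1),
    ("senate passes annual budget bill", 0),
    ("local council approves new water project", 0),
    ("quarterly earnings report released by regulator", 0),
]

def train(examples):
    """Learn per-class word counts from human-labeled headlines."""
    counts = {0: Counter(), 1: Counter()}
    totals = Counter()
    for text, label in examples:
        counts[label].update(text.lower().split())
        totals[label] += 1
    return counts, totals

def is_clickbait(headline, counts, totals):
    """Score the headline under each class with add-one smoothing."""
    words = headline.lower().split()
    vocab = set(counts[0]) | set(counts[1])
    scores = {}
    for label in (0, 1):
        n = sum(counts[label].values())
        score = math.log(totals[label] / sum(totals.values()))
        for w in words:
            score += math.log((counts[label][w] + 1) / (n + len(vocab)))
        scores[label] = score
    return scores[1] > scores[0]

counts, totals = train(TRAIN)
print(is_clickbait("you'll never believe this trick", counts, totals))  # True
print(is_clickbait("council passes budget bill", counts, totals))       # False
```

The point of the sketch is the division of labor: humans supply the judgments, and the model generalizes their word patterns to headlines it has never seen.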
In addition to identifying clickbait, the company used the system to try to identify false news. This problem is harder: For one, it’s not as simple as analyzing a discrete chunk of text, like a headline. For another, as Tessa Lyons, a product manager helping to oversee the project, explained in our interview, truth is harder to define than clickbait. So Facebook has created a database of all the stories flagged by the fact-checking organizations that it has partnered with since late 2016. It then combines this data with other signals, including reader comments, to try to train the model. The system also looks for duplication, because, as Lyons says, “the only thing cheaper than creating fake news is copying fake news.” Facebook does not, I was told in the interview, actually read the content of the article and try to verify it. That is surely a project for another day.
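The duplication check Lyons describes can be approximated with word-shingle overlap: a story whose body nearly matches an already-flagged one is probably a copy. This sketch (function names and threshold are invented; Facebook has not disclosed its actual method) uses Jaccard similarity over three-word shingles:

```python
import re

def shingles(text, k=3):
    """Break a normalized article body into overlapping k-word shingles."""
    words = re.sub(r"\W+", " ", text.lower()).split()
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b):
    """Jaccard similarity between the two texts' shingle sets."""
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

def is_copy(story, flagged_stories, threshold=0.8):
    """True if the story nearly duplicates any already-flagged story."""
    return any(similarity(story, f) >= threshold for f in flagged_stories)

flagged = ["the president secretly ordered the arrest of sanctuary city leaders last night"]
print(is_copy("The president secretly ordered the arrest of "
              "sanctuary city leaders last night!", flagged))  # True
```

Normalizing case and punctuation before shingling means trivially edited copies still hash to almost the same shingle set, which is exactly why copying fake news is cheap to catch once the original has been flagged.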
Interestingly, the Facebook employees explained, all clickbait and false news is treated the same, no matter the domain. Consider these three stories that have spread on the platform in the past year.
“Morgue employee cremated by mistake while taking a nap.” “President Trump orders the execution of five turkeys pardoned by Obama.” “Trump sends in the feds— Sanctuary City Leaders Arrested.”
The first is harmless; the second involves politics, but it’s mostly harmless. (In fact it’s rather funny.) The third could scare real people and bring protesters into the streets. Facebook could, theoretically, deal with each of these kinds of false news differently. But according to the News Feed employees I spoke with, it does not. All headlines pass through the same system and are evaluated the same way. In fact, all three of these examples seem to have gotten through and started to spread.
Why doesn’t Facebook give political news strict scrutiny? In part, Lyons said, because stopping the trivial stories helps the company stop the important ones. Mosseri added that weighting different categories of misinformation differently might be something that the company considers later. “But with this type of integrity work I think it’s important to get the basics done well, make real strong progress there, and then you can become more sophisticated,” he said.
Behind all this, though, is the larger question. Is it better to keep adding new systems on top of the core algorithm that powers News Feed? Or might it be better to radically change News Feed?
I pushed Mosseri on this question. News Feed is based on hundreds, or perhaps thousands, of factors, and as anyone who has run a public page knows, the algorithm rewards outrage. A story titled “Donald Trump is a trainwreck on artificial intelligence” will spread on Facebook. A story titled “Donald Trump’s administration begins to study artificial intelligence” will go nowhere. Both stories could be true, and the first headline isn’t clickbait. But it pulls on our emotions. For years, News Feed—like the tabloids—has heavily rewarded this kind of story, in part because the ranking was heavily based on simple factors that correlate with outrage and immediate emotional reactions.
Now, according to Mosseri, the algorithm is starting to take into account more serious factors that correlate with a story’s quality, not just its emotional tug. In our interview, he pointed out that the algorithm now gives less value to “lighter weight interactions like clicks and likes.” In turn, it is putting more priority on “heavier weight things like how long do we think you’re going to watch a video for? Or how long do we think you’re going to read an article for? Or how informative do you think you’d say this article is if we asked you?” News Feed, in a new world, might give more value to a well-read, informative piece about Trump and artificial intelligence, instead of just a screed.
‘Two billion people around the world are counting on us to fix this.’
Perhaps the most existential question for Facebook is whether the nature of its business inexorably helps the spread of false news. Facebook makes money by selling targeted ads, which means it needs to know how to target people. It gathers as much data as it can about each of its users. This data can, in turn, be used by advertisers to find and target potential fans who will be receptive to their message. That’s useful if an advertiser like Pampers wants to sell diapers only to the parents of newborns. It’s not great if the advertiser is a fake-news purveyor who wants to find gullible people who can spread his message. In a podcast with Bloomberg, Cyrus Massoumi, who created a site called Mr. Conservative, which spread all kinds of false news during the 2016 election, explained his modus operandi. “There’s a user interface facebook.com/ads/manager and you create ads and then you create an image and advert, so let’s say, for example, an image of Obama. And it will say ‘Like if you think Obama is the worst president ever.’ Or, for Trump, ‘Like if you think Trump should be impeached.’ And then you pay a price for those fans, and then you retain them.”
In response to a question about this, Ariño de la Rubia noted that the company does go after any page it suspects of publishing false news. Massoumi, for example, now says he can’t make any money from the platform. “Is there a silver bullet?” Ariño de la Rubia asked. “There isn’t. It’s adversarial, and misinformation can come from any place that humans touch and humans can touch lots of places.”
Pushed on the related question of the possibility of shutting down political Groups into which users have put themselves, Mosseri noted that it would indeed stop some of the spread of false news. But, he said, “you’re also going to reduce a whole bunch of healthy civic discourse. And now you’re really destroying more value than problems that you’re avoiding.”
Should Facebook be cheered for its efforts? Of course. Transparency is good, and the scrutiny from journalists and academics (or at least most academics) will be good. But to some close analysts of the company, it’s important to note that this is all coming a little late. “We don’t applaud Jack Daniels for putting warning labels about drinking while pregnant. And we don’t cheer GM for putting seat belts and airbags in their cars,” says Ben Scott, a senior adviser to the Open Technology Institute at the New America Foundation. “We’re glad they do, but it goes with the territory of running those kinds of businesses.”
Ultimately, the most important question for Facebook is how well all these changes work. Do the rivers and streams get clean enough that they feel safe to swim in? Facebook knows that it has removed a lot of claptrap from the platform. But what will happen in the American elections this fall? What will happen in the Mexican elections this summer?
Most importantly, what will happen as the problem gets more complex? False news is only going to get more complicated, as it moves from text to images to video to virtual reality to, one day, maybe, computer-brain interfaces. Facebook knows this, which is why the company is working so hard on the problem and talking so much. “Two billion people around the world are counting on us to fix this,” Zigmond said.
Complaint against unprofessional conduct of the DPC Kiryandongo District for aiding and abetting land grabbing in Kiryandongo District.
Professional Standards Unit, Uganda Police-Kampala.
RE: COMPLAINT AGAINST UNPROFESSIONAL CONDUCT OF THE DPC KIRYANDONGO DISTRICT FOR AIDING AND ABETTING LAND GRABBING IN NYAMUTENDE KITWARA PARISH KIRYANDONGO DISTRICT AND CARRYING OUT ILLEGAL ARRESTS AND DETENTION OF INNOCENT RESIDENTS/ BIBANJA OWNERS FOR PROTESTING AGAINST THE ILLEGAL EVICTION FROM THEIR LAND.
We act for and on behalf of the lawful and bona fide occupants of land described as LRV MAS 2 FOLIO 8 BLOCK 8 PLOT 22 (FORMERLY KNOWN AS RANCH 22).
Our Clients are residents of Nyamutende Village, Kitwara Parish in Kiryandongo District, where they have lived for more than 30 years. Sometime in 2017, they applied for a lease of the said land to Kiryandongo District Land Board through the Directorate of Land Matters, State House.
As they were still awaiting their Application to be processed, they were shocked to discover that the said land had instead been leased to and registered in the names of Isingoma Julius, Mwesige Simon, John Musokota William, Tumusiime Gerald, Wabwire Messener Gabriel, Ocema Richard and Wilson Shikhama, some of whom were not known to the Complainants. A copy of the Search is attached hereto.
Our clients protested the above action and appealed to relevant offices, but were shocked to discover that the above persons had gone ahead and sold the same to a one Maseruka Robert.
Aggrieved by these actions, the Complainants appealed to the RDC, who advised them to institute proceedings against the said persons and assigned them a one Mbabazi Samuel to assist them to that effect. The said Mbabazi accordingly filed Civil Suit No. 46 of 2019 against the said registered proprietors at Masindi High Court, challenging the illegal and fraudulent registration, sale and transfer of the subject land to Maseruka Robert.
While awaiting the progress of the case mentioned hereinabove, the Complainants were surprised to find that the said Mbabazi, instead of assisting them, had entered into a consent settling the said suit on their behalf without their knowledge or consent. A copy of the Consent is attached hereto.
Among the terms of the said Consent Judgment was that the residents would be compensated, without specifying how much, and would in return vacate the land.
As if that was not enough, Maseruka Robert and Mbabazi Samuel are going ahead to execute the said Consent Judgment by forcefully evicting the occupants without compensation. This has prompted the Complainants to challenge the said Consent by applying for its review and setting aside at Masindi High Court, which application is coming up for hearing on 29th March 2023. A copy of the Application is attached hereto.
Sensing the imminent threat of eviction, we also filed an application for interim stay of execution of the said Consent to avoid rendering their application for review nugatory, but unfortunately the same could not be heard on the date it was fixed for hearing (6th February 2023). A copy of the Application is attached hereto.
On Thursday last week, three tractors operated by six workers of a one Mbabazi Samuel [the very person who had been entrusted to represent our Clients to secure their land through Civil Suit No. 46 of 2019] encroached on close to 50 acres of our Clients’ land and started ploughing it, but our Clients protested and chased them away.
We have, however, been shocked to receive information from our Clients that at midnight on Sunday, three police patrols invaded the community and arrested community members Mulenje Jack, Steven Kagyenji, Mulekwa David, Ntambala Geoffrey, Tumukunde Isaac (aged 15), Kanunu Innocent, Mukombozi Frank, Kuzara, and Rwamunyankole Enock, and took them to Kiryandongo Police Station, where they are currently detained.
We strongly protest the illegal arrests and detention of our Clients as this is a carefully orchestrated land grabbing scheme by Maseruka Robert and Mbabazi Samuel who are receiving support from the DPC Kiryandongo.
The purpose of this letter therefore is to request your good office to investigate the misconduct, abuse of office and unprofessionalism of the said DPC Kiryandongo District and his involvement in the land grabbing schemes on the land formerly known as Ranch 22.
Looking forward to your urgent intervention,
CC: The Head, Police Land Protection Unit, Police Headquarters, Naguru
CC: The RDC, Kiryandongo District
CC: The Chairman LCV, Kiryandongo District
CC: The Regional Police Commander, Albertine Region
Witness Radio Uganda wins the best CSO land rights defenders award at the National Land Forum Awards.
By Witness Radio Team
Uganda’s leading land and environmental rights watchdog, Witness Radio, has been awarded the Best CSO Land Rights Defender Award 2022 at the recently concluded National Land Forum Awards, held last week at Mestil Hotel in Kampala.
Witness Radio’s Executive Director, Jeff Wokulira Ssebaggala, attributed the award to the community land and environmental rights defenders who stand up against intimidation and other forms of harassment from land grabbers (economically powerful and politically connected companies and individual investors).
“This is an award for defenders at a community level. They work in very deadly environments filled with harassment, torture, death threats, arrests, trumped-up charges, and kidnaps, among others, to advocate for community land and environmental rights. This is happening at a time when the criminalization and silencing of community land rights defenders are on the increase,” Jeff added.
The award has come at a time when hundreds of Ugandans in different parts of the country are accessing services provided by the organization, ranging from legal services and non-judicial mechanism engagements to empowerment sessions that help them understand their rights and use that knowledge to push back against illegal and forced evictions.
The chairman of the organizing committee of the second National Land Forum, Mr. Jimmy Ochom, noted some progress in legislation on Uganda’s land governance, but cited growing inequalities in landholding that leave the poor more vulnerable.
During the awards, the State Minister for Housing, Hon. Persis Namuganza, revealed that the government had approved a 2018-2040 plan that maps land use in the country.
According to the minister, the plan, which was passed a few weeks ago, identifies land for settlement, game reserves and wildlife, arable farming, and water bodies, among other uses.
The event was organized by Oxfam and partners and provided a platform for the different actors in the land sector to discuss land governance issues, including land rights and land administration, with a view to improved collaboration and cooperation between the actors and better land service delivery for Ugandans, under the theme “Taking Stock of the National Land Policy in Addressing Land Inequality in Uganda.”
Other awards went to organizations and individuals including Mr. Eddie Nsamba-Gayiiya, for his contribution to research on land rights; Justice Centers Uganda, for promoting access to land justice; and Mr. Henry Harrison Irumba, for championing legal reforms.