United States

Congress's Big Tech Stock Stakes Make Regulation Awkward (bloomberg.com) 60

A proposed antitrust bill has cast a spotlight on the immense portfolios of dozens of lawmakers. From a report: At a December press conference, House Speaker Nancy Pelosi was asked her opinion of proposed restrictions on stock trading by members of Congress. Her response was quick and clear: She hated the idea. "We are a free-market economy," Pelosi, whose family's shareholdings exceed $100 million, shot back. "They should be able to participate in that." Growing numbers of legislators from both sides of the aisle disagree. Following a series of recent abuses, at least five bills making their way through Congress would forbid lawmakers from owning individual stocks or force them to move their assets into a blind trust. One would make violators turn over any profits they earn to the U.S. Treasury Department. Another would extend the ban to family members. A third would also encompass top staffers.

[...] The fight over the measure highlights the potential conflicts of interest in lawmakers' shareholdings. A Bloomberg Businessweek examination of financial filings found that at least 18 senators and 77 House members report owning shares of one or more of the companies, and the law could have a significant effect on the value of their portfolios. Pelosi disclosed that her husband has as much as $25.5 million in Apple stock alone. Republican Representative Mike McCaul of Texas reported that his family holds shares of all four tech giants, with a collective value topping $8 million. Last year members of Congress filed more than 4,000 trading disclosures involving more than $315 million of stock and bond transactions, according to Tim Carambat, a researcher who maintains databases of lawmakers' financial trades.

Democrats

Democrats Unveil Bill To Ban Online 'Surveillance Advertising' (theverge.com) 146

Democrats introduced a new bill that would ban nearly all use of digital advertising targeting on ad markets hosted by platforms like Facebook, Google, and other data brokers. From a report: The Banning Surveillance Advertising Act -- sponsored by Reps. Anna Eshoo (D-CA), Jan Schakowsky (D-IL), and Sen. Cory Booker (D-NJ) -- prohibits digital advertisers from targeting any ads to users. It makes some small exceptions, like allowing for "broad" location-based targeting. Contextual advertising, like ads that are specifically matched to online content, would be allowed. "The 'surveillance advertising' business model is premised on the unseemly collection and hoarding of personal data to enable ad targeting," Eshoo, the bill's lead sponsor, said in a Tuesday statement. "This pernicious practice allows online platforms to chase user engagement at great cost to our society, and it fuels disinformation, discrimination, voter suppression, privacy abuses, and so many other harms. The surveillance advertising business model is broken."
Government

USPS Built and Secretly Tested a Blockchain-Based Mobile Voting System Before 2020 (washingtonpost.com) 60

An anonymous reader quotes a report from The Washington Post: The U.S. Postal Service pursued a project to build and secretly test a blockchain-based mobile phone voting system before the 2020 election (Warning: may be paywalled; alternative source), experimenting with a technology that the government's own cybersecurity agency says can't be trusted to securely handle ballots. The system was never deployed in a live election and was abandoned in 2019, Postal Service spokesman David Partenheimer said. That was after cybersecurity researchers at the University of Colorado at Colorado Springs conducted a test of the system during a mock election and found numerous ways that it was vulnerable to hacking.

The project appears to have been conducted without the involvement of federal agencies more closely focused on elections, which were then scrambling to make voting more secure in the wake of Russian interference in the 2016 contest. Those efforts focused primarily on using paper ballots so the voter could verify their vote was recorded accurately and there would be a paper trail for auditors -- something missing from any mobile phone or Internet-based system.

The Postal Service system allowed people to cast votes on an Internet-connected mobile app similar to how they might add items to an online shopping cart or fill out an online survey. The votes were designed to be anonymous and to be recorded in multiple digital locations simultaneously. The idea is that each of those digital records would act as a check to verify the accuracy of the other records. This is essentially the same method that cryptocurrencies such as bitcoin use to ensure transactions are accurately recorded. But the system didn't protect against the numerous ways hackers might fake or corrupt votes, the University of Colorado researchers said. Those include impersonating voters, attacking the blockchain system itself so votes can't be trusted, flooding the system with information so it becomes too overwhelmed to function, and using techniques that undermine voters' privacy and the secrecy of the ballot. The researchers were able to successfully perform all those hacks during a mock election held on campus.
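The record-keeping scheme described above, in which each vote is hash-linked to the one before it and the ledger is replicated so copies can cross-check each other, can be illustrated with a minimal sketch. The function names and data layout here are assumptions for illustration only, not the USPS design:

```python
import hashlib
import json


def block_hash(contents):
    """Deterministic SHA-256 digest of a block's contents."""
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


def append_vote(chain, ballot):
    """Append a ballot, linking it to the previous block's hash."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"ballot": ballot, "prev": prev}
    block["hash"] = block_hash({"ballot": ballot, "prev": prev})
    chain.append(block)


def chain_is_valid(chain):
    """Recompute every hash and link; any tampering breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["prev"] != prev:
            return False
        if block["hash"] != block_hash({"ballot": block["ballot"],
                                        "prev": block["prev"]}):
            return False
        prev = block["hash"]
    return True
```

Altering any recorded ballot changes its hash and breaks every later link, which is what lets replicas detect a corrupted copy. Note that, as the researchers found, this kind of integrity check does nothing to stop voter impersonation, privacy leaks, or flooding attacks before votes ever enter the chain.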
"The Postal Service was awarded a public patent for the concept in August 2020, but had not previously revealed that it built a prototype system or tested it," the report notes.
Facebook

Two US Senators Urge Federal Investigations Into Facebook About Safety - and Ad Reach (cnbc.com) 6

Two leading U.S. Senators "are urging federal regulators to investigate Facebook over allegations the company misled advertisers, investors and the public about public safety and ad reach on its platform," reports CNBC: On Thursday, Senator Warren urged the heads of the Department of Justice and Securities and Exchange Commission to open criminal and civil investigations into Facebook or its executives to determine if they violated U.S. wire fraud and securities laws. A day earlier, Senator Cantwell, chair of the Senate Commerce Committee, encouraged the Federal Trade Commission to investigate whether Facebook, now called Meta, violated the agency's law against unfair or deceptive business practices. Cantwell's letter was made public on Thursday...

In her letter to the FTC, Cantwell focused on Facebook's claims about the safety of its products, in addition to the allegedly inflated ad projections... She suggested the agency investigate Facebook and, depending what the evidence shows, pursue monetary relief for advertisers and disgorgement of allegedly ill-gotten gains.

According to the article, Senator Warren points to a whistleblower's recent allegations that Facebook misled both investors and advertising customers about its ad reach. Warren's letter also raised the possibility that Facebook violated securities law through "breathtakingly illegal conduct by one of the world's largest social media companies," and "wrote that evidence increasingly suggests executives were aware the metric 'was meaningfully and consistently inflated.'"

Bloomberg adds this quote from Senator Cantwell's letter: "A thorough investigation by the Commission and other enforcement agencies is paramount, not only because Facebook and its executives may have violated federal law, but because members of the public and businesses are entitled to know the facts regarding Facebook's conduct as they make their decisions about using the platform."
Social Networks

The Head of Instagram Agrees To Testify as Congress Probes the App's Effects on Young People (nytimes.com) 13

Adam Mosseri, the head of Instagram, has agreed for the first time to testify before Congress, as bipartisan anger mounts over harms to young people from the app. From a report: Mr. Mosseri is expected to appear before a Senate panel during the week of Dec. 6 as part of a series of hearings on protecting children online, said Senator Richard Blumenthal, who will lead the hearing. Mr. Mosseri's appearance follows hearings this year with Antigone Davis, the global head of safety for Meta, the parent company of Instagram and Facebook, and with Frances Haugen, a former employee turned whistle-blower. Ms. Haugen's revelations about the social networking company, particularly those about Facebook and Instagram's research into its effects on some teenagers and young girls, have spurred criticism, inquiries from politicians and investigations from regulators.

In September, Ms. Davis told Congress that the company disputed the premise that Instagram was harmful for teenagers and noted that the leaked research did not have causal data. But after Ms. Haugen's testimony last month, Mr. Blumenthal, a Connecticut Democrat, wrote a letter to Mark Zuckerberg, the chief executive of Meta, suggesting that his company had "provided false or inaccurate testimony to me regarding attempts to internally conceal its research." Mr. Blumenthal asked that Mr. Zuckerberg or Mr. Mosseri testify in front of the consumer protection subcommittee of the Senate's Commerce Committee to set the record straight.

Politics

A Three-Party Alliance is Set To Govern Germany (npr.org) 88

Three German parties have reached a deal to form a new government that will end the era of longtime Chancellor Angela Merkel, according to Olaf Scholz, who is poised to replace her. From a report: Scholz, of the center-left Social Democrats, said he expects that members of the parties will give their blessing to the deal in the next 10 days. At a news conference, Scholz and other leaders gave some indications of how the coalition would govern. Among the first measures agreed: compulsory vaccinations in places where particularly vulnerable people are cared for, with the option of expanding that rule. That comes as Germany is seeing a surge in cases, and the political transition has somewhat hampered the country's response. Scholz also stressed the importance of a sovereign Europe, friendship with France and partnership with the United States as key cornerstones of the government's foreign policy -- continuing a long post-war tradition. The new government will not seek "the lowest common denominator, but the politics of big impacts," Scholz promised. Robert Habeck, co-leader of the environmentalist Green party, meanwhile, said measures planned by the government would put Germany on a path to meet the goals of the 2015 Paris climate accord.
United States

US Joins Global Cybersecurity Partnership (axios.com) 16

The U.S. is now part of an international agreement on cybersecurity that the Trump administration declined to sign up for, Vice President Kamala Harris announced in Paris Wednesday. From a report: 80 countries, along with hundreds of tech companies -- including Microsoft and Google -- nonprofits and universities have signed the Paris Call for Trust and Security in Cyberspace, established in 2018 to create international norms and laws for cybersecurity and warfare. The U.S. support of the voluntary Paris Call reflects the Biden administration's "priority to renew and strengthen America's engagement with the international community on cyber issues," per a White House statement.

It builds on U.S. efforts to improve cybersecurity for citizens and businesses, the statement continued. This includes "rallying G7 countries to hold accountable nations that harbor cyber criminals, supporting the update of NATO cyber policy for the first time in seven years, and the recent counter-ransomware engagement with over 30 countries around the world to accelerate international cooperation to combat cybercrime."

Books

New Book Warns CS Mindset and VC Industry are Ignoring Competing Values (computerhistory.org) 116

So apparently three Stanford professors are offering some tough love to young people in the tech community. Mehran Sahami first worked at Google when it was still a startup (recruited to the company by Sergey Brin). Currently a Stanford CS professor, Sahami explained in 2019 that "I want students who engage in the endeavor of building technology to think more broadly about what are the implications of the things that they're developing — how do they impact other people? I think we'll all be better off."

Now Sahami has teamed up with two more Stanford professors to write a book calling for "a mature reckoning with the realization that the powerful technologies dominating our lives encode within them a set of values that we had no role in choosing and that we often do not even see..."

At a virtual event at Silicon Valley's Computer History Museum, the three professors discussed their new book, System Error: Where Big Tech Went Wrong and How We Can Reboot — and thoughtfully and succinctly distilled their basic argument. "The System Error that we're describing is a function of an optimization mindset that is embedded in computer science, and that's embedded in technology," says political scientist Jeremy Weinstein (one of the book's co-authors). "This mindset basically ignores the competing values that need to be 'refereed' as new products are designed. It's also embedded in the structure of the venture capital industry that's driving the growth of Silicon Valley and the growth of these companies, that prioritizes scale before we even understand anything about the impacts of technology in society. And of course it reflects the path that's been paved for these tech companies to market dominance by a government that's largely been in retreat from exercising any oversight."

Sahami thinks our technological landscape should have a protective infrastructure like the one regulating our roads and highways. "It's not a free-for-all where the ultimate policy is 'If you were worried about driving safely then don't drive.' Instead there are lanes and traffic lights and speed bumps — an entire safe-driving infrastructure which arrived through regulation." Or, as political science professor and co-author Rob Reich tells the site, "Massive system problems should not be framed as choices that can be made by individual consumers."

Sahami also thinks breaking up big tech monopolies would just leave smaller "less equipped" companies to deal with the same problems — but that positive changes in behavior might instead come from government scrutiny. But Reich also wants to see professional ethics (like the kind that are well-established in biomedical fields). "In the book we point the way forward on a number of different fronts about how to accelerate that..."

And he argues that at colleges, just one computing-ethics class isn't enough. "Ethics must be embedded through the entire curriculum."
Social Networks

Trump's Truth App Bans Criticism of Itself - and Also 'Excessive Use of Capital Letters' (msn.com) 225

Time magazine spotted three things in the terms of service for former U.S. president Trump's "Truth Social" site:

- Despite advertising itself as a platform that will "give a voice to all," according to a press release, TRUTH Social's terms of service state that users may not "disparage, tarnish, or otherwise harm, in our opinion, us and/or the Site." In other words, any user who criticizes Trump or the site can be kicked off the platform...

- [W]hile portraying itself as a refuge for free speech and the "first major rival to 'Big Tech,'" TRUTH Social's terms of service make it clear that the platform not only intends to moderate content — just as Twitter and Facebook do — but reserves the right to remove users for any reason it deems necessary. The terms go on to say that if TRUTH Social decides to terminate or suspend your account, the platform may also sue you — something that Twitter and Facebook's terms don't say. "In addition to terminating or suspending your account, we reserve the right to take appropriate legal action, including without limitation pursuing civil, criminal, and injunctive redress," TRUTH Social's terms state...

- Maybe most notably, the site's list of prohibited activities includes the "excessive use of capital letters," an idiosyncrasy that Trump became known for on Twitter and that no other major social network specifically bans. TRUTH Social's terms also contain some sections written in all-caps.

The terms also specify explicitly that the site considers itself "not responsible" for the accuracy/reliability of what's posted on the site. Yet the Washington Post reports the newly-formed "Trump Media & Technology Group" has already applied for trademark rights for the terms "truthing," "post a truth," and "retruth."

Meanwhile, the Software Freedom Conservancy believes the end of the site's public test launch was directly tied to a recently-discovered violation of a Conservancy license. "Once caught in the act, Trump's Group scrambled and took the site down."

One of the license's authors emphasizes that the license "purposefully treats everyone equally (even people we don't like or agree with), but they must operate under the same rules of the copyleft licenses that apply to everyone else... To comply with this important FOSS license, Trump's Group needs to immediately make that Corresponding Source available to all who used the site today while it was live. If they fail to do this within 30 days, their rights and permissions in the software are automatically and permanently terminated. That's how AGPLv3's cure provision works — no exceptions — even if you're a real estate mogul, reality television star, or even a former POTUS."
Facebook

Facebook Accused of Tolerating Dangerous and Criminal Behavior to Preserve Profitability (fortune.com) 196

A new whistleblower affidavit submitted by a former Facebook employee "alleges that the company prizes growth and profits over combating hate speech, misinformation and other threats to the public," reports the Washington Post: The SEC affidavit goes on to allege that Facebook officials routinely undermined efforts to fight misinformation, hate speech and other problematic content out of fear of angering then-President Donald Trump and his political allies, or out of concern about potentially dampening the user growth key to Facebook's multi-billion-dollar profits...

Friday's filing is the latest in a series since 2017 spearheaded by former journalist Gretchen Peters and a group she leads, the Alliance to Counter Crime Online. Taken together, the filings argue that Facebook has failed to adequately address dangerous and criminal behavior on its platforms, including Instagram, WhatsApp and Messenger... "Zuckerberg and other Facebook executives repeatedly claimed high rates of success in restricting illicit and toxic content — to lawmakers, regulators and investors — when in fact they knew the firm could not remove this content and remain profitable," Peters said in a statement.

Friday's filing, which was accompanied by a second affidavit from Peters based on interviews she conducted with other former company employees, argues that top leaders at Facebook, including chief executive Mark Zuckerberg and Chief Operating Officer Sheryl Sandberg, are aware of the severity of problems within the company but have failed to report them in SEC filings available to investors... Section 230 of the Communications Decency Act, which some lawmakers are pushing to reform, gives broad immunity to Internet companies for content that users post on their platforms. That is a barrier to some kinds of legal scrutiny but not necessarily to an investigation by the SEC, which has wide-ranging enforcement powers.

There appears to be a convenient case study available. Facebook "had set up safeguards that were aimed at combating misinformation and other forms of platform abuse" in the run-up to America's 2020 election, "but it dismantled many of them by mid-December," Bloomberg reported Friday, citing a new package of redacted documents provided to Congress by whistleblower Frances Haugen.

And in addition, "In early December, Facebook disbanded a 300-person squad known as Civic Integrity, which had the job of monitoring misuse of the platform around elections... even as efforts to delegitimize the election intensified." Meanwhile, Stop the Steal groups were "amplifying and normalizing misinformation and violent hate in a way that delegitimized a free and fair election," Facebook's internal analysis concluded.
But there's more in that company after-action report, adds the Washington Post: The documents also provide ample support that the company's internal research over several years had identified ways to diminish the spread of political polarization, conspiracy theories and incitements to violence but that in many instances, executives had declined to implement those steps...

The documents and interviews with former employees make clear that Facebook has deep, highly precise knowledge about how its users are affected by what appears on its sites. Facebook relentlessly measures an astonishing array of data points, including the frequency, reach and sources of falsehoods and hateful content and often implements measures to suppress both. The company exhaustively studies potential policy changes for their impacts on user growth and other factors key to corporate profits, such as engagement, the extent of sharing and other reactions.

The article adds that at Facebook, even the public relations and political impacts "are carefully weighed — to the point that potentially flattering and unflattering news headlines about the company are sketched out for review."
Republicans

Donald Trump To Launch Social Media Platform Called Truth Social (theguardian.com) 387

An anonymous reader quotes a report from The Guardian: Donald Trump has announced plans to launch a social media platform called TRUTH Social that will be rolled out early next year. The former president, who was banned from Facebook and Twitter earlier this year, says his goal is to rival the tech companies that have denied him the megaphone that was paramount to his rise. "I'm excited to soon begin sharing my thoughts on TRUTH social and to fight back against big tech," Trump said in a statement. Trump announced the news in a press release on Wednesday, saying the platform will be open to "invited users" for a beta launch in November, with plans to make it available to the broader public in the beginning of next year. Truth Social will be a product of a new venture called the Trump Media & Technology Group, which was created through a merger with Digital World Acquisition Corp. The group said it seeks to become a publicly listed company. Users can sign up to be put on a waiting list or pre-order the app via the App Store.
Politics

Andrew Yang Suggests Power May Affect Politicians' Brain Neurons (politico.com) 157

Today tech entrepreneur-turned-politician Andrew Yang candidly reflected on the pitfalls of power that he'd learned about during his 2020 run for president. "In national politics, it turns out, you're not as much the CEO as you are yourself the product... [E]veryone in my orbit started treating me like I might be a presidential contender. I was getting a crash course in how we treat the very powerful — and it was weird.

"But it was more than just a head rush. There are psychological consequences to being treated this way for months on end." The historian Henry Adams described power as "a sort of tumor that ends by killing the victim's sympathies." This may sound like hyperbole, but it has been borne out by years of lab and field experiments. Dacher Keltner, a psychology professor at UC Berkeley, has been studying the influence of power on individuals. He puts people in positions of power relative to each other in different settings. He has consistently found that power, over time, makes one more impulsive, more reckless and less able to see things from others' points of view. It also leads one to be rude, more likely to cheat on one's spouse, less attentive to other people, and less interested in the experiences of others. Does that sound familiar? It turns out that power actually gives you brain damage.

This even shows up in brain scans. Sukhvinder Obhi, a neuroscientist at McMaster University in Ontario, recently examined the brain patterns of the powerful and the not so powerful in a transcranial-magnetic-stimulation machine. He found that those with power are impaired in a specific neural process — mirroring — that leads to empathy... Perhaps most distressing is that in lab settings the powerful can't address this shortcoming even if told to try. Subjects in one study were told that their mirroring impulse was the issue and to make a conscious effort to relate to the experiences of others. They still couldn't do it. Effort and awareness made no difference in their abilities...

On the campaign trail, I could clearly see how politicians become susceptible to growing so out of touch. You spend time with dozens of people whose schedules and actions revolve around you. Everyone asks you what you think. You function on appearance; appearance becomes your role. Empathy becomes optional or even unhelpful. Leadership becomes the appearance of leadership.

The process through which we choose leaders neutralizes and reduces the capacities we want most in them. It's cumulative as well; the longer you are in it, the more extreme the effects are likely to be over time.

Facebook

Facebook Whistleblower Speaks, Shares Documents on Deliberate Lies and Disregard of Misinformation, Contacts US Regulators (cbsnews.com) 151

An Iowa data scientist with a computer engineering degree and a Harvard MBA has come forward as the whistleblower leaking damaging information about Facebook to the Wall Street Journal — and that's just the beginning. She has now also filed at least eight complaints with America's Securities and Exchange Commission, "which has broad oversight over financial markets and has the power to bring charges against companies suspected of misleading investors," reports the Washington Post. To buttress the complaints, the whistleblower secretly copied "tens of thousands" of pages of internal Facebook research, according to a report tonight on the CBS News show 60 Minutes, which summarizes her ultimate conclusion: "that the company is lying to the public about making significant progress against hate, violence and misinformation.

"One study she found from this year says, 'We estimate that we may action as little as 3 to 5% of hate, and about 0.6% of violence and incitement on Facebook. Despite being the best in the world at it.'" Another internal Facebook document admits point-blank that "We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world."

60 Minutes points out that Facebook "has 2.8 billion users, which is 60% of all internet-connected people on Earth."

[Whistleblower Frances] Haugen told us the root of Facebook's problem is in a change that it made in 2018 to its algorithms — the programming that decides what you see on your Facebook news feed... "One of the consequences of how Facebook is picking out that content today is it is optimizing for content that gets engagement, or reaction. But its own research is showing that content that is hateful, that is divisive, that is polarizing, it's easier to inspire people to anger than it is to other emotions... Facebook has realized that if they change the algorithm to be safer, people will spend less time on the site, they'll click on less ads, they'll make less money."
60 Minutes reports that Facebook was even contacted by "major political parties across Europe," according to leaked internal documents which say the parties specifically complained that a change to Facebook's algorithm "has forced them to skew negative in their communications on Facebook... leading them into more extreme policy positions." (Or, as 60 Minutes puts it, "The European political parties were essentially saying to Facebook the way you've written your algorithm is changing the way we lead our countries.") The whistleblower sees their position as: "You are forcing us to take positions that we don't like, that we know are bad for society. We know if we don't take those positions, we won't win in the marketplace of social media."

Haugen says Facebook understood the danger to the 2020 election, so it turned on safety systems to reduce misinformation — but many of those changes, she says, were temporary. "And as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritize growth over safety. And that really feels like a betrayal of democracy to me." Facebook says some of the safety systems remained. But after the election, Facebook was used by some to organize the January 6th insurrection...

After the attack, Facebook employees raged on an internal message board copied by Haugen. "...Haven't we had enough time to figure out how to manage discourse without enabling violence?"

The whistleblower will now appear Tuesday before a U.S. Senate Commerce consumer protection subcommittee — and has already shared some of their documents with Congressional offices probing Facebook, according to the Washington Post. "It's important because Big Tech is at an inflection point," the whistleblower's lawyer tells the newspaper. They argue that ultimately Big Tech "touches every aspect of our lives — whether it's individuals personally or democratic institutions globally. With such far reaching consequences, transparency is critical to oversight.

"And lawful whistleblowing is a critical component of oversight and holding companies accountable."
Democrats

Senate Democrats Call on FTC To Fix Data Privacy 'Crisis' (theverge.com) 33

Senate Democrats are calling on the Federal Trade Commission to write new rules to protect consumer data privacy in a new letter to the agency authored on Monday. From a report: The letter, led by Sen. Richard Blumenthal (D-CT) and signed by eight other Democratic senators, was sent to FTC Chair Lina Khan Monday, calling on the agency to "begin a rulemaking process" on privacy. Specifically, the senators are requesting that the FTC pen new rules addressing privacy, civil rights, and the collection of consumer data. "Consumer privacy has become a consumer crisis," the lawmakers wrote. "Tech companies have routinely broken their promises to consumers and neglected their legal obligations, only to receive wrist-slap punishments after long delay, providing little relief to consumers, and with minimal deterrent effect."
Facebook

WSJ: Facebook's 2018 Algorithm Change 'Rewarded Outrage'. Zuck Resisted Fixes (livemint.com) 54

This week the Wall Street Journal reported that a 2018 algorithm change at Facebook "rewarded outrage," according to Facebook's own internal memos. But the Journal says the memos showed "that CEO Mark Zuckerberg resisted proposed fixes," and that the memos "offer an unparalleled look at how much Facebook knows about the flaws in its platform and how it often lacks the will or the ability to address them." In the fall of 2018, Jonah Peretti, chief executive of online publisher BuzzFeed, emailed a top official at Facebook Inc. The most divisive content that publishers produced was going viral on the platform, he said, creating an incentive to produce more of it... Mr. Peretti blamed a major overhaul Facebook had given to its News Feed algorithm earlier that year to boost "meaningful social interactions," or MSI, between friends and family, according to internal Facebook documents reviewed by The Wall Street Journal that quote the email...

Facebook's chief executive, Mark Zuckerberg, said the aim of the algorithm change was to strengthen bonds between users and to improve their well-being. Facebook would encourage people to interact more with friends and family and spend less time passively consuming professionally produced content, which research suggested was harmful to their mental health. Within the company, though, staffers warned the change was having the opposite effect, the documents show. It was making Facebook's platform an angrier place. Company researchers discovered that publishers and political parties were reorienting their posts toward outrage and sensationalism. That tactic produced high levels of comments and reactions that translated into success on Facebook. "Our approach has had unhealthy side effects on important slices of public content, such as politics and news," wrote a team of data scientists, flagging Mr. Peretti's complaints, in a memo reviewed by the Journal... They concluded that the new algorithm's heavy weighting of reshared material in its News Feed made the angry voices louder. "Misinformation, toxicity, and violent content are inordinately prevalent among reshares," researchers noted in internal memos.

Some political parties in Europe told Facebook the algorithm had made them shift their policy positions so they resonated more on the platform, according to the documents. "Many parties, including those that have shifted to the negative, worry about the long term effects on democracy," read one internal Facebook report, which didn't name specific parties...

Mr. Zuckerberg resisted some of the proposed fixes, the documents show, because he was worried they might hurt the company's other objective — making users engage more with Facebook.

Google

Google and Apple, Under Pressure From Russia, Remove Voting App (nytimes.com) 60

Apple and Google on Friday removed from Russia an app meant to coordinate protest voting in this weekend's Russian elections, a blow to the opponents of President Vladimir V. Putin and a display of Silicon Valley's limits when it comes to resisting crackdowns on dissent around the world. From a report: The decisions came after Russian authorities, which claim the app is illegal, threatened to prosecute local employees of Apple and Google -- a sharp escalation in the Kremlin's campaign to rein in the country's largely uncensored internet. A person familiar with Google's decision said the authorities had named specific individuals who would face prosecution, prompting it to remove the app.

The person declined to be identified for fear of angering the Russian government. Google has more than 100 employees in the country. Apple did not respond to phone calls, emails or text messages seeking comment. The app was created and promoted by allies of the opposition leader Aleksei A. Navalny, who were hoping to use it to consolidate the opposition vote in each of Russia's 225 electoral districts. It disappeared from the two technology platforms just as voting got underway in the three-day parliamentary election, in which Mr. Putin's United Russia party -- in a carefully stage-managed system -- holds a commanding advantage.

Mr. Navalny's team reacted with outrage to the decision, suggesting the companies had made a damaging concession to the Russians. "Removing the Navalny app from stores is a shameful act of political censorship," an aide to Mr. Navalny, Ivan Zhdanov, said on Twitter. "Russia's authoritarian government and propaganda will be thrilled." The decisions also drew harsh condemnation from free-speech activists in the West. "The companies are in a really difficult position but they have put themselves there," David Kaye, a former United Nations official responsible for investigating freedom of expression issues, said in an interview. "They are de facto carrying out an element of Russian repression. Whether it's justifiable or not, it's complicity and the companies need to explain it."

The Internet

New Texas Law Tries Making it Illegal for Social Media Sites to Ban Users Over Political Viewpoints (bbc.com) 469

The U.S. state of Texas "has made it illegal for social media platforms to ban users 'based on their political viewpoints'," reports the BBC: Prominent Republican politicians have accused Facebook, Twitter and others of censoring conservative views... The social networks have all denied stifling conservative views. However, they do enforce terms of service which prohibit content such as incitement to violence and co-ordinated disinformation. "Social media websites have become our modern-day public square," said Texas governor Greg Abbott, after signing the bill into law on Thursday. "They are a place for healthy public debate where information should be able to flow freely...."

The new law states social media platforms with more than 50 million users cannot ban people based on their political viewpoints. Facebook, Twitter and Google's YouTube are within its scope...

The law is due to come into force in December, but may face legal challenges.

"Critics say the law does not respect the constitutional right of private businesses to decide what sort of content is allowed on their platforms," the BBC adds, with the president of NetChoice trade association arguing that the bill "would put the Texas government in charge of content policies."
Facebook

Facebook Said To Consider Forming An Election Commission (nytimes.com) 75

Facebook has approached academics and policy experts about forming a commission to advise it on global election-related matters, said five people with knowledge of the discussions, a move that would allow the social network to shift some of its political decision-making to an advisory body. The New York Times reports: The proposed commission could decide on matters such as the viability of political ads and what to do about election-related misinformation, said the people, who spoke on the condition of anonymity because the discussions were confidential. Facebook is expected to announce the commission this fall in preparation for the 2022 midterm elections, they said, though the effort is preliminary and could still fall apart. Outsourcing election matters to a panel of experts could help Facebook sidestep criticism of bias by political groups, two of the people said. The company has been blasted in recent years by conservatives, who have accused Facebook of suppressing their voices, as well as by civil rights groups and Democrats for allowing political misinformation to fester and spread online. Mark Zuckerberg, Facebook's chief executive, does not want to be seen as the sole decision maker on political content, two of the people said.

If an election commission is formed, it would emulate the step Facebook took in 2018 when it created what it calls the Oversight Board, a collection of journalism, legal and policy experts who adjudicate whether the company was correct to remove certain posts from its platforms. Facebook has pushed some content decisions to the Oversight Board for review, allowing it to show that it does not make determinations on its own. Facebook, which has positioned the Oversight Board as independent, appointed the people on the panel and pays them through a trust.

Internal conversations around an election commission date back to at least a few months ago, said three people with knowledge of the matter. An election commission would differ from the Oversight Board in one key way, the people said. While the Oversight Board waits for Facebook to remove a post or an account and then reviews that action, the election commission would proactively provide guidance without the company having made an earlier call, they said.

Democrats

Senate Democrats To Introduce Legislation That Would Tax Energy Companies Responsible For Major Greenhouse Gas Emissions (thehill.com) 207

Zack Budryk writes via The Hill: The Polluters Pay Climate Fund Act, sponsored by Sen. Chris Van Hollen (D-Md.), would require between 25 and 30 of the U.S. corporations responsible for the most greenhouse gas pollution to pay $300 billion into a fund over 10 years. The legislation would require companies to pay into the fund if they were responsible for at least 0.05 percent of global carbon dioxide and methane emissions between 2000 and 2019, based on data from the Treasury Department and Environmental Protection Agency. In a document shared with The Hill, Van Hollen's office estimated major companies such as Shell, ExxonMobil and Chevron would be taxed $5 billion to $6 billion annually under the bill. The Democratic senator pointed to other policies that could accompany the measure, such as carbon pricing and a clean-energy standard.

The exact uses of the money in the fund have not yet been determined, Van Hollen said, adding there would be a public comment period. Possible uses include building more climate-resilient infrastructure, particularly in disadvantaged communities and communities of color. [...] After years of opposition, major institutions and trade groups like the American Petroleum Institute and the U.S. Chamber of Commerce have come out in favor of a tax on carbon emissions in recent months. However, Van Hollen's proposal would go further than that, specifically targeting major players like Exxon Mobil and Chevron.
Further reading: Democrats Seek $500 Billion in Climate Damages From Big Polluting Companies (The New York Times)
Social Networks

'Disinformation for Hire' is Becoming a Booming Industry (nytimes.com) 148

On Sunday the BBC reported that YouTube influencers were offered money to spread vaccine misinformation.

But according to the New York Times, that's just the tip of the iceberg. "The scheme appears to be part of a secretive industry that security analysts and American officials say is exploding in scale: disinformation for hire." Private firms, straddling traditional marketing and the shadow world of geopolitical influence operations, are selling services once conducted principally by intelligence agencies. They sow discord, meddle in elections, seed false narratives and push viral conspiracies, mostly on social media. And they offer clients something precious: deniability. "Disinfo-for-hire actors being employed by government or government-adjacent actors is growing and serious," said Graham Brookie, director of the Atlantic Council's Digital Forensic Research Lab, calling it "a boom industry."

Similar campaigns have been recently found promoting India's ruling party, Egyptian foreign policy aims and political figures in Bolivia and Venezuela. Mr. Brookie's organization tracked one operating amid a mayoral race in Serra, a small city in Brazil. An ideologically promiscuous Ukrainian firm boosted several competing political parties. In the Central African Republic, two separate operations flooded social media with dueling pro-French and pro-Russian disinformation. Both powers are vying for influence in the country. A wave of anti-American posts in Iraq, seemingly organic, were tracked to a public relations company that was separately accused of faking anti-government sentiment in Israel.

Most trace to back-alley firms whose legitimate services resemble those of a bottom-rate marketer or email spammer... For-hire disinformation, though only sometimes effective, is growing more sophisticated as practitioners iterate and learn. Experts say it is becoming more common in every part of the world, outpacing operations conducted directly by governments. The result is an accelerating rise in polarizing conspiracies, phony citizen groups and fabricated public sentiment, deteriorating our shared reality beyond even the depths of recent years... Commercial firms conducted for-hire disinformation in at least 48 countries last year — nearly double from the year before, according to an Oxford University study. The researchers identified 65 companies offering such services...

Platforms have stepped up efforts to root out coordinated disinformation. Analysts especially credit Facebook, which publishes detailed reports on campaigns it disrupts. Still, some argue that social media companies also play a role in worsening the threat. Engagement-boosting algorithms and design elements, research finds, often privilege divisive and conspiratorial content.

The article also notes "a generation" of populist political leaders around the world who have risen "in part through social media manipulation.

"Once in office, many institutionalize those methods as tools of governance and foreign relations."
