Category Archives: Internet and public issues

dealing with the big tech platforms

We can hold several ideas in our minds, even though they’re in tension, and try to work through to a better solution.

On one hand …

  • Any platform for discussion and communication needs rules. It won’t work if it’s wide open.
  • A privately owned platform is free to make up its own rules, and even to enforce them at will (except as governed by contracts that it has freely entered). A private actor is not bound to permit speech it dislikes or to use due process to regulate speech. It enjoys freedom of the press.
  • Donald Trump was doing great damage on Twitter and Facebook. It’s good that he’s gone.

Yet …

  • It is highly problematic that a few companies own vastly influential global platforms for communication without being accountable to any public. The First Amendment is a dead letter if the public sphere is a small set of forums owned by private companies.
  • Twitter’s reasons for banning Trump seem pretty arbitrary. The company refers to how Trump’s tweets were “received” by unnamed “followers” and invokes the broad “context” of his comments. But speakers don’t control the reception of their words or the contexts of their speech. A well-designed public forum would have rules, but probably not these rules.
  • If a US-based company can ban a political leader in any given country (including any competitive democracy), then democratic governance is threatened.
  • Facebook, Twitter, and Google profit from news consumption while denying profits to the companies that provide shoe-leather reporting. Fewer than half as many people are employed as journalists today as were 10 years ago. This is at the heart of the current, very interesting battle between the Australian government and the big tech companies.
  • These companies deploy algorithms and other design features to maximize people’s time on their platforms, which encourages addictiveness, outrageous content, and filter bubbles and polarization.

Regulation is certainly one option, but it must overcome these challenges:

  • Private communications companies have genuine free speech rights.
  • Forcing a powerful company to make really good choices is hard; externally imposed rules can be ignored or distorted.
  • The fact that there are 193 countries creates major coordination problems. (I wouldn’t mind if a patchwork of inconsistent rules hurt the big companies–I think these firms do more net harm than good. But it’s not clear that the resulting mix of rules would be good for the various countries themselves.)
  • The major companies are very powerful and may be able to defeat attempts to regulate them. For instance, they are simply threatening to withdraw from Australia.
  • There is a high potential for regulatory capture: major incumbent businesses influencing the regulators and even using complicated regulatory regimes to create barriers to entry for new competitors. Imagine, for example, that laws required content moderation. Who would be able to hire enough moderators to compete with Facebook?

Antitrust is worth considering. If the big companies were broken up, there might be more competitors. But you must believe very strongly in the advantages of a competitive marketplace to assume that the results would be better rather than worse than the status quo. Metcalfe’s Law also tends to concentrate communication networks, whatever we do with antitrust.
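To see why (a rough arithmetic sketch that is not spelled out in the original post): Metcalfe’s Law holds that a network’s value grows roughly with the square of its number of users,

$$V(n) \propto n^2, \qquad V\!\left(\tfrac{n}{2}\right) + V\!\left(\tfrac{n}{2}\right) \propto \tfrac{n^2}{2}.$$

So splitting one network of n users into two networks of n/2 eliminates roughly half of the total value, which pushes users and advertisers back toward a single dominant platform even after a breakup.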

Another approach is to try to build new platforms with better rules and designs. The economic challenge–not having enough capital to compete with Google and Facebook–could be addressed. Governments could fund these platforms, on the model of the BBC. I think the bigger problem is that the platforms would have to draw lots of avid users, or else they would be irrelevant. They would have to be attractive without being addictive, compelling without hyping sensational content, trustworthy yet also free and diverse.

Those are tough design challenges–but surely worth trying.

See also: why didn’t the internet save democracy?; the online world looks dark; democracy in the digital age; what sustains free speech?; a civic approach to free speech, etc.

Rewiring Democracy

Matt Leighninger and Quixada Moore-Vissing have published “Rewiring Democracy: Subconscious Technologies, Conscious Engagement, and the Future of Politics” (Public Agenda 2018).

I would pick out this major contrast from the complex, 68-page document:

  • On one hand, technologies are being used ubiquitously to influence individuals and the political world without our conscious awareness. Examples include tools that allow organizations to predict what individuals want without having to ask them, techniques for microtargeting messages, and methods of surveillance.
  • On the other hand, people are deliberately inventing and using new tools for civic purposes, i.e., for free and intentional self-governance. Examples include tools for collecting contributions of money or time and techniques for circulating information in geographical communities.

Much depends on which force prevails, and that depends on us.

The report ends with three-page case studies of civic innovations. Public Agenda is also publishing those examples separately, starting with a nice piece on the changing role of tech in social movements. It explores how contemporary social movements share photos and collaboratively produce maps, among other developments.

See also: democracy in the digital age; the new manipulative politics: behavioral economics, microtargeting, and the choice confronting Organizing for Action; qualms about Behavioral Economics; when society becomes fully transparent to the state

Defending the Truth: An Activist’s Guide to Fighting Foreign Disinformation Warfare

(Dayton, OH) I recommend Maciej Bartkowski’s Defending the Truth: An Activist’s Guide to Fighting Foreign Disinformation Warfare from the International Center on Nonviolent Conflict. It’s free, concise, practical, and inspiring.

Some examples of advice:

Establish local networks that can be rapidly activated to verify accuracy of shared information or legitimacy of online personas that call for certain actions in the community.

Educate, drill, and practice. … Teach how to identify a deep fake and conspiracy theories and ways to react to them.

Be aware of anonymous interlocutors who attempt to draw you to causes that seemingly align with your own activism goals. Ask them to reveal their identities first before committing to anything. … Do your homework by vetting your potential partners. Perform due diligence by asking the following questions: Who are these anonymous personas asking me to join an online protest group or a live street protest? Do they know anything about my community? Who do they represent? …

Insist on a degree of self-control in community interactions. Civility does not preclude a conflict, but conflict must always be carried out through disciplined, nonviolent means.

Declare your commitment to truth and verifiable facts, including making public and honest corrections if you inadvertently shared inaccurate information or joined actions set up by fake personas. Praise those who adhere to truth or publicly retract untruthful information that they might have previously shared.

Stress the importance of truth in community as a matter of inviolable human rights. There are no human rights without state institutions being truthful to citizens. There is no public truth without respect for human rights.

why didn’t the internet save democracy?

I don’t always like this format, but Dylan Matthews’ short interviews with Clay Shirky, Jeff Jarvis, David Weinberger, and Alec Ross add up to a useful overview of the question that Matthews poses to all four: “The internet was supposed to save democracy. … What went wrong?”

The only interviewee who really objects to the framing is Ross, who asserts that his predictions were always value-neutral. He didn’t predict that the good guys would win, only that the weak would chasten the strong. So when Putin’s Russia took Obama’s America down a peg, that fulfilled his prophecy (Russia being weaker).

Some highlights, for me:

Clay Shirky:

I underestimated two things, and both of them make pessimism more warranted. The first is the near-total victory of the “social graph” as the ideal organizational form for social media, to the point that we now use “social media” to mean “media that links you to your friends’ friends,” rather than the broader 2000s use of “media that supports group interaction.”

The second thing I underestimated was the explosive improvement in the effectiveness of behavioral economics and its real-world consequences of making advertising work as advertised.

Taken together, these forces have marginalized the earlier model of the public sphere characterized by voluntary association (which is to say a public sphere that followed [Jürgen] Habermas’s conception), rather than as a more loosely knit fabric for viral ideas to flow through.

Shirky adds that he wrote (in 2008) much more about Meetup than Facebook, when both were still startups. Now Facebook rules the world while Meetup is marginal, although Meetup would better embody a Habermasian theory of the public sphere. (See my post Habermas and critical theory: a primer but also saving Habermas from the deliberative democrats.)

Jarvis:

I was rather a dogmatist about the value of openness. I still value openness. But as Twitter, Blogger, and Medium co-founder Ev Williams said at [South by Southwest] recently, he and we did not account for the extent of the bad behavior that would follow. These companies accounted and compensated for dark-hat SEO, spam, and other economically motivated behavior. They did not see the extent of the actions of political bad actors and trolls who would destroy for the sake of destruction.

Weinberger:

It’s a tragedy that while the web connects pages via an open protocol, the connections among people are managed by closed, for-profit corporations. A lot of our political problems come from that: The interests of those corporations and of its users and citizens are not always aligned.

Weinberger wants to emphasize the positive, as well, and to remind us that “applications can be adjusted so that they serve us better.”

See also the online world looks dark (2017) and democracy in the digital age.

the online world looks dark

(Chicago) I’m at the #ObamaSummit, much of which can be followed online.

In the opening plenary, several speakers (including President Obama) noted the drawbacks of social media: psychological isolation, manipulation by powerful companies and governments, fake news, balkanization, and deep incivility.

I remember when discussions of civic tech were generally optimistic: people saw the Internet and social media as creative and democratic forces.

I went to the specialized breakout session with “civic media” entrepreneurs and asked them whether they shared the dark picture painted by the plenary speakers. Each gave an interesting and nuanced answer, but in short, they said Yes. The reason they build and use digital tools is basically to combat the larger trends in social media, which, for the most part, they see as harmful. Even Adrian Reyna of United We Dream, a leader of one of the best social movements that have used online tools, emphasized that relying on civic tech can disempower people and alienate communities.

This is no reason to give up on improving the civic impact of digital media. The work remains as important as ever. It’s just that the atmosphere now feels very sober; the heady days of cyber-optimism have passed, at least for people concerned about politics and civic culture.

[See also democracy in the digital age and four questions about social media and politics]