josh

@josh@sciences.social

Associate Professor at UMass Amherst studying the civic impacts of media distribution. I work in our Journalism Department and also co-edit the Distribution Matters series for The MIT Press. Opinions mine.

Photos: Beth Wallace, Timothy Neesam

Keywords:
#Commodon #Communication #MediaStudies #AcademicChatter #MediaDistribution #ScienceJournalism #Emacs #Zettelkasten #Linux #ica23

josh, to random

I'm late to this (blame the end of the semester), but if you're thinking about the tensions that NBC faces as it tries to house very different news brands under one roof, and how that plays in the age of social media… (1/2)

https://www.nytimes.com/2024/05/15/business/media/nbc-msnbc-trump-biden.html

AlexSanterne, to academia

Modern #academia is perverse.

@academicchatter

josh,

@fabiocosta0305 @buermann @AlexSanterne @academicchatter I really love this article — it discusses how 18th and 19th century prejudices created the backdrop for much of contemporary biology, and the things we're still unlearning in order to reach truer understandings of nature: https://orionmagazine.org/article/what-slime-knows/

josh, to academicchatter

Closed out the paperwork for the Spring term yesterday and, with my first post-semester deadlines approaching, I'm realizing that I seriously under-budgeted the amount of time I would need to stare into near space.

cc @academicchatter

josh, to communicationscholars

Listened to The Daily yesterday, which featured Cade Metz's reporting that OpenAI and its competitors have scraped the entire English-language Internet as input for their LLMs. It's almost as if it should be a limit problem from a @stevenstrogatz book: What happens to the sum of copyright infringement penalties as the number of infractions approaches infinity?

https://www.nytimes.com/2024/04/16/podcasts/the-daily/ai-data.html

cc @communicationscholars

natematias, to random

@josh is dropping some awesome insight on the causes of technical accidents/disasters at this National Academies workshop.

Josh has summarized the issues in this really great video on costs and incentives for advertising, news, and disinformation.

https://vimeo.com/930828854

josh,

@natematias Since you've been so kind as to mention this presentation, I should note that I've also published an essay on this topic:

"Normal Accidents in the Digital Age: How Programmatic Advertising Became a Disaster"

Here's the author preprint, but I also have a PDF of the finished essay that I can give out on request to anyone who's interested:

https://umass-my.sharepoint.com/:b:/g/personal/jabraun_umass_edu/EapFDwihCOZAqyoCIUX8JigBgN98pYMH_VvI7ItMF2NL7w?e=ErEXH8&download=1

@commscholar has an excellent essay in the same volume (edited by Matthew P. McAllister and Emily West).

josh,

@cyberlyra @natematias @commscholar This is a spot on critique! They're treating harms to the public good as acceptable risks rather than critical issues. I think on another level we're in agreement here, though, in the sense that I'm arguing that we've learned from Perrow and others that there are ways to take these problems seriously if companies were so inclined (or were forced by regulators to do so).

josh,

@cyberlyra @natematias @commscholar I agree and I rake them over the coals in my first article on adtech. I think both things can be true at the same time, though, and agree I didn't emphasize this in the video: They created a system with a horrendous risk profile because they focused on profit to the exclusion of harms. There's still plenty of culpability here.

josh,

@natematias @asociologist @cyberlyra @commscholar I think the schism that normal accident theory precipitated is instructive. Perrow's main recommendation was that we walk away from nuclear power and other systems where the stakes were unacceptably high and, in his view, the risks couldn't be managed effectively. …

josh,

@natematias @asociologist @cyberlyra @commscholar Instead of heeding this as a warning, some took it as a challenge. And so you get high-reliability theory, which insists that the proper organizational culture can handle lots of inherent risk. Perrow was largely critical of this stance — as @asociologist's note above illustrates. But Perrow ends up engaging with their ideas and offering up half-measures, like compartmentalization, intended to contain accidents he saw as inevitable.

josh,

@natematias @asociologist @cyberlyra @commscholar In short, as I understand it, the literature contains a spectrum of stances, including highly critical ones — abandon, and for God's sake don't create, high-stakes systems that are prone to normal accidents. And it also includes research that insists you can manage these problems if you do things right. And it's certainly possible to play these perspectives off one another in bad faith.

josh,

@natematias @asociologist @cyberlyra @commscholar Lastly, it's worth noting that even high reliability theory — the perspective most sanguine toward the possibility of effectively managing tightly coupled, interactively complex systems — insists that doing so necessitates making profit a secondary concern to safety and reliability, and relies on an organizational culture relentlessly focused on quality control. All of which is…well, not what's going on in adtech.

josh, to communicationscholars

This National Academies workshop on "Evolving Technological, Legal and Social Solutions to Counter Disinformation in Social Media" starts today at noon (US EDT) and will be streamed to the Web. I'll be participating and will present in a panel tomorrow on changing the incentive structures around the propagation of disinformation.

Also presenting, some other friends of the mammoth, @natematias, @commscholar, and @NathalieSmuha

https://www.nationalacademies.org/event/41384_04-2024_evolving-technological-legal-and-social-solutions-to-counter-disinformation-in-social-media-a-workshop

cc @communicationscholars

josh,

We had to sign Very Official Copyright releases on our multimedia materials, so I drew my own images to avoid any accidental infringement. Please enjoy my terrible artwork.

https://vimeo.com/930828854

cc @communicationscholars

josh, to communicationscholars

Pleased to be taking part in this National Academies workshop on April 10–11, organized by Joan Donovan and Saul Perlmutter: "Evolving Technological, Legal, and Social Solutions to Counter Disinformation in Social Media"

The event is on Zoom and registration is open!

https://mailchi.mp/nas/disinformation_workshop?e=61d54e25a1

@communicationscholars

josh, to TikTok

Amid all the #TikTok craziness this week, I thought I'd plug Aynne Kokas' book, "Trafficking Data," which is one of the most thoughtful takes I've read on the concerns around — and convoluted politics of — Americans using Chinese apps. It's an indictment of tech industry regulation in both the U.S. and China. 1/11

https://global.oup.com/academic/product/trafficking-data-9780197620502?cc=us&lang=en#

josh,

My summary will be an oversimplification — Aynne's the international policy expert, not me. But, essentially, China's laws around national security, its strict domestic controls on free speech, and its approach to managed capitalism give the Chinese government the authority to access pretty much any data they'd like, provided it's housed on, or processed by, Chinese servers. And they're free to peruse the records of Chinese companies. 2/11

josh,

Other countries may have privacy laws that protect the data of their citizens, but once the apps and devices those citizens are using send their data across the border for storage or processing — to China or elsewhere — those laws no longer apply. 3/11

josh,

It should be noted the U.S. takes full advantage of this — the National Security Agency loves that so much of the world's Internet traffic gets routed through the United States and is notorious for the scope and extent of the data it captures on foreigners, to say nothing of U.S. citizens communicating with folks abroad. No doubt lawmakers concerned about China's capture of Americans' data are paranoid in part because they know just how much of this sort of thing we're up to over here. 4/11

josh,

At any rate, as Aynne's book chronicles, a ton of the world's most popular apps and hardware are produced by Chinese companies and/or connect to Chinese servers, where Chinese authorities have the legal right to access the data. 5/11

josh,

In fairness, the fact that China has given itself the legal authority to comb through lots of expropriated American data doesn't necessarily mean they're using this authority effectively. We may be imagining a dystopian database drawing from thousands of apps and devices, queried by super spies, but it's just as likely that our stuff gets stored on a morass of different corporate servers that don't talk to each other and that bureaucratic incompetence prevents any effective use of it. 6/11

josh,

Still, the potential for strategic use of American citizens' data by Chinese authorities is there. As Kokas lays out, it's a real possibility and something that could certainly be to China's advantage. 7/11

josh,

Meanwhile, the U.S. government has more or less refused to pass effective privacy regulation, seeing the surveillance-capitalism business model of Silicon Valley as an economic juggernaut that, at least until recently, needed to be encouraged rather than reined in. It was thus American companies, with the support of American regulators, who laid the groundwork of the massive market for personal data China now participates in with apps like TikTok. 8/11

josh,

If American regulators wanted to do something really effective here, they'd focus less on banning individual apps and instead actually pass meaningful cross-cutting privacy legislation that also applied to U.S. companies. This is Kokas's argument. As she points out, Facebook would love it if the government would do the dirty work of banning its competition, while leaving its own abusive data practices unchecked. 9/11

josh,

Meaningful #privacy legislation that predicated the ability to do business in the U.S. on limited capture and responsible use of user data would address potential harms by both foreign and domestic companies and stand less risk of spilling over into fear mongering or anti-Asian sentiment. 10/11

josh,

Meanwhile, Kokas's book looks beyond social media, to examine broadly the types of data to which Chinese authorities have access. If you're going to get worked up about potential threats posed by China to U.S. national security, it's probably more significant that they process data from our smart agricultural equipment, our ports, and our medical assays. But, sure, let's all focus on TikTok. ¯\_(ツ)_/¯ 11/11
