If you want to understand how startups and venture capital work, go to 26:24 in the recording of my talk “Excuse Me, Your Unicorn Keeps Shitting In My Back Yard, Can He Please Not?” from 2016:
@ben They’re not yours, they’re theirs. Jeff Atwood thanks you for your free labour. (I’m kidding, he doesn’t. Feel grateful he even allowed you to contribute in the first place, serf.)
Speaking of Jeff Atwood, isn’t he the guy helping fund Mastodon now? 🤔
Update: been told Jeff left Stack Exchange a while ago, so please substitute whichever Silicon Valley tech bro is currently running it.
I like how social media and news sites call it your "feed". Makes me feel like a farm animal getting fattened up for slaughter so I can be divided up and sold piecemeal to advertisers.
Can someone please ask the CEO of Mastodon gGmbH to defederate with Threads? I used to know Eugen who ran the instance and could have asked him myself but I don’t know the CEO of Mastodon gGmbH.
Just got registered with a local GP in Ireland¹ and their first email states that all their staff are “formally trained in GDPR procedures and any information you give will be used in the strictest confidence.”
Their email address is @gmail.com
🤦‍♂️
¹ This, in and of itself, is a bloody miracle these days, apparently, and only happened because it’s a new practice that’s just opening up.
Actually, let me use this as an example of how everything has gone wrong with web development in the last decade or so.
Dan Abramov is a very brilliant guy who is part of Facebook's React team. He has been the most prominent name on the team working on React for years. And now they are pushing for changes in React that would make it consume streams of data that update the UI before the entire data request has completed, instead of just requesting the data and then 'painting' the UI once the reply arrives (see the sketch below).
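For context, a minimal sketch of the two models being described, written against today's React (the `use` hook only shipped later, in React 19); the `/api/user` endpoint and the component names are made up:

```tsx
import { Suspense, use, useEffect, useState } from "react";

type User = { name: string };

// The old model: request the data, then paint once, when the reply arrives.
function ProfileFetchThenRender() {
  const [user, setUser] = useState<User | null>(null);
  useEffect(() => {
    fetch("/api/user") // hypothetical endpoint
      .then((res) => res.json())
      .then(setUser);
  }, []);
  return user ? <h1>{user.name}</h1> : <p>Loading…</p>;
}

// The streaming model: a <Suspense> boundary lets React paint this part of
// the page as soon as its slice of the data resolves, before the rest of
// the request has completed (e.g. with streaming server rendering).
function UserName({ userPromise }: { userPromise: Promise<User> }) {
  const user = use(userPromise); // suspends until the promise resolves
  return <h1>{user.name}</h1>;
}

function ProfileStreaming({ userPromise }: { userPromise: Promise<User> }) {
  return (
    <Suspense fallback={<p>Loading…</p>}>
      <UserName userPromise={userPromise} />
    </Suspense>
  );
}
```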
This is nuts. This is a micro-optimization. 95% of users won't ever notice, and those who do (people on extremely bad connections) would be much better off if the site wasn't using React at all. At the same time, I'm sure half the websites in the world that currently use React will jump to implement this, making their code more complex and brittle, sucking their productivity down and, in the long term, making things worse for their users. All for no gain in their products, not even a short-term one.
Then why do these kinds of things keep happening? Because Facebook is too big. And somehow they ended up in control of the most popular web-app framework, the one used by most sites nowadays.
The current state of the JavaScript ecosystem is what happens when you get companies with hundreds or thousands of engineers to build sites that 15 years ago would have been built by a tenth of that number of people. What you get is a lot of people working on a product that's actually mature already, whose job ends up being to chase that extra 1%, that last micro-optimization that could make your site better in a very narrow set of cases. And they don't care about the complexity, because they are part of an engineering organization with literally thousands of hands to throw at any problem. Setting up your code bundler now takes hundreds of lines of configuration that need constant maintenance to achieve just a 5% improvement over gzipped plain JavaScript? No big deal, they have six people working full-time on that. React switching to a different programming paradigm every two versions? Nice, now the 900 devs working on the web version have something to do for a few months.
But then small-to-medium teams adopt these tools. And suddenly you have a team of 5, 20, or 50 devs having to do the same work the Facebook web team does, without having any of the problems Facebook has to solve.
What's worse: a big share of the current JavaScript ecosystem exists just to solve problems introduced by its previous iterations. Think about it from a user's perspective: does the web work any better? Do Netflix, Facebook, Twitter, Tumblr, etc., load faster or perform better than they did ten years ago? On the contrary: most of us have more powerful computers and phones, and significantly faster internet connections, yet sites are at best as fast as they were ten years ago. In most cases they are even slower.
And from the engineer's perspective it's no better: web development is significantly harder, more complex, and slower nowadays than it was ten years ago. Things that were trivial are now complex. Things that were complex still are. Product-wise, we are not doing anything more complex than what we were doing in the early-to-mid 2010s. But somehow now everything is harder, involves more code, and is orders of magnitude more complex. And it's not even making the web a better experience.
We made this mess. We made the web worse for everyone. We made our jobs harder for ourselves. It's so stupid.
The W3C – the standards body of surveillance capitalism – on privacy.
If you had any “privacy principles” to speak of, what would Adobe, Alibaba, Amazon, Baidu, Bloomberg, Google, Huawei, IAB, IBM, Meta, Microsoft, Oracle, Samsung, Bilibili, SoftBank, Tencent, Yahoo!, Zoom, etc., be doing on your members list?
Oh, by the way, since the optimisations I made to the bundle size, it now takes ~3.61 seconds to download and install Kitten¹ on my machine and ~1 second to update it (the initial install takes longer because Kitten downloads its own Node.js runtime, and that takes the bulk of the time).
Every second you don’t spend waiting for something is a second you can spend doing something else 🤓
“The subtlety of it is pretty insidious. Like some kind of distributed long con, played out over and over, in the midst of so many millions of other simultaneous ones.”
Are you a privacy professional? Would you like to work with companies like Google and Facebook to help them continue to violate our privacy? The W3C has a job for you.
Pays well, by the way (violating human rights always does).
W3C is seeking a full-time staff member to lead our Privacy standardization efforts.
The position is for remote work from anywhere in the world.
Requirements include: extensive knowledge of privacy technologies and methodologies, including authentication, identity management, and cryptography, and familiarity with core web technologies, such as HTML, HTTP, Web APIs, and scripting. #hiring #webprivacy See more at: https://www.w3.org/news/2024/hiring-privacy-lead/
Saying “put a robots.txt file on your site if you don’t want your work to be abused by corporations for profit” is like saying “wear a t-shirt listing all the people you don’t want to have sex with if you don’t want them to have sex with you.”
To the utter befuddlement of techbro douchebags everywhere, turns out that’s not how consent works.
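To make the analogy concrete, here’s roughly what that t-shirt looks like: a sketch of a robots.txt asking a few known AI crawlers (GPTBot is OpenAI’s, CCBot is Common Crawl’s, Google-Extended is Google’s AI-training opt-out token) to keep out. Honouring it is entirely voluntary on the crawler’s part:

```
# robots.txt: the t-shirt. Nothing on the wire enforces any of this;
# a crawler that ignores it suffers no consequences whatsoever.
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```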
@RyunoKi just reminded me of this piece I wrote two years ago. It seems rather timely what with the news about Harvard and Meta, and, beyond tech, Cop28, etc.