
cwagner, in The Web Is Fucked


  • argv_minus_one,

    “Fuck you for wanting a sterile web where everything is boring”

    …said no one ever.



  • ShittyKopper,

    “hacker” “news” is a big fan of anything that inflicts pain and misery on anyone who’s not exactly like them (men working in high-paying VC-funded tech startups that will inevitably go out of business or sell out to some giant and cash out a big fat check)


    Out of curiosity, I have always thought text-only web pages would have been way more accessible back when RSS was still a thing than the blinking, ad-ridden pages you get nowadays.

    You tell me that wasn’t a thing?


    The article acknowledges this in the conclusion (emphasis mine):

    I’m done. There you have it. That’s my opinion about how ____ed the web is. Look, we will never get the web of old back. Let’s be honest, it wasn’t perfect either. The web of today is more accessible, more dynamic and pretty much a cornerstone of our society.

    Accessibility wasn’t the main topic discussed in the article. It was mostly pointing out that the current web is too centralised.


    Yeah, then sadly, they missed the boat on web 3.0, which is decentralized, resilient, static, and doesn’t require blockchain.

    @0x1C3B00DA@lemmy.ml avatar

    Accessibility wasn’t the main topic discussed in the article

    That’s part of the problem. All these rants about the glory of Web 1.0 are ignoring the fact that Web 1.0 wasn’t usable for anybody with accessibility issues and the modern web is better for them. A tiny acknowledgement at the bottom of their rant shows how they value accessibility lower than all of their other concerns.

    @TheBat@lemmy.world avatar

    I don’t think accessibility is meant in terms of disabled people.

    I understood it as accessible in terms of technical knowledge. Anyone can whip out their phone and access the internet… or at least use an app which needs internet.

    Eternal September is another term for it.

    @0x1C3B00DA@lemmy.ml avatar

    Accessibility almost always refers to disabled people, especially in web development. I’ve never heard anyone in the industry refer to accessibility in any other way, without explicitly making that clear.

    If they meant the reading you took from it, that’s even worse and my point is even more pertinent.

    @TheBat@lemmy.world avatar

    If they meant the reading you took from it, that’s even worse and my point is even more pertinent.

    Why? The internet is a powerful tool and there are plenty of morons using it without knowing anything about it.

    @0x1C3B00DA@lemmy.ml avatar

    My original point was that the main idea of the article downplays the accessibility gains of the modern web. Your reading was that the author meant a different definition of accessibility, not A11y, which would mean the author didn’t just downplay it, they completely ignored it. The author is complaining that the modern web is awful while ignoring the huge gains for people who need these accessibility features, and how awful web 1.0 was for them.

    @TheBat@lemmy.world avatar

    I think the author used both meanings at different times.

    First time they mention interesting website designs at the cost of accessibility.

    But the second time they mean how low the technical barrier is to access the modern (and bland) web and how it tries to cater to the lowest common denominator.

    Isoprenoid, (edited )

    The article wasn’t really about Web 1.0 as much as it was about the time that Web 1.0 was around. The author could remove “Web 1.0” and replace it with “late 1990s to early 2000s Internet”.

    That’s part of the problem.

    No, that’s just the angle that the article wanted to take. Just because it ignores an aspect of something doesn’t mean that its position is moot.

    Are you asking for every article ever to have a section discussing accessibility? I’d rather we let the author speak their mind, and focus on what they want to say.

    @0x1C3B00DA@lemmy.ml avatar

    Are you asking for every article ever to have a section discussing accessibility?

    No. I’m asking that when they complain about how the modern web is “fucked” and web 1.0 was better, they don’t try to act like that is an absolute, since that’s an opinion that is not widely applicable.

    No, thats just the angle that the article wanted to take. Just because it ignores an aspect of something doesn’t mean that its position is moot.

    Ignoring part of a topic makes your argument weaker.


    they don’t try to act like that is an absolute

    Again, to write an article means to cut out things that don’t matter to the core argument. You’re asking for the writer to complete a thesis.

    Ignoring part of a topic makes your argument weaker.

    And again, this is an opinion piece, not a well developed thesis. What you are asking for is both unreasonable and impractical when writing an opinion piece.

    wrath-sedan avatar

    I’m living somewhere now where many of the local websites are terribly dated, and while the initial nostalgia factor was nice, the lack of functionality/accessibility is seriously a problem. Not to say you can’t make a functional/accessible site with old web standards, but some things changed for a reason.

    scarrtt, in Is Google having a total breakdown right now???

    Advertising spend has dried up and the shareholders are looking for growth. I don’t mind paying for YouTube Premium but I get it much cheaper than that because I connect via VPN when I pay for it. I use Romania but other people recommend Turkey or Argentina or any of the other countries that are having issues with their currencies.


    Yeah, but supporting Google & their practices still might not be the best thing … If possible, I avoid Google services


    Yep that’s the best way in my opinion. People recommend alternative Youtube frontends all the time but there’s none that works cross-platform, unfortunately.

    The discounted rate through other countries is the best way to get Premium perks without giving Google a shitload of money they certainly don’t deserve.

    I registered through Turkey. Last month’s bill was 1.04€.


    I’ve wondered what their figures are looking like now that alternative platforms are gaining popularity. Possible shrinkage of users? I know that a lot of content creators are publishing on multiple platforms now. When that happens a company typically starts to milk existing customers for more money to maintain shareholder value (short term).


    Ad spend probably dried up because they’re fucking over the advertisers.

    I work with Google ads every day, it’s my job.

    For the last 7 years, Google has been constantly trying to fuck over its customers, pushing changes that make them spend money on irrelevant searches, pushing automation and telling them it’ll be better (it never is).

    Confidence is just at an all-time low, and they keep finding more ways to make it worse. I literally have to warn my customers not to listen to the Google reps, because they’re third-party sales companies hired to push these new tactics that end up costing them more money with less return.

    You’d think they’d try and impress customers with great sales figures, but instead they seem to just be pushing short term profits and fucking everyone over, making the platform a minefield


    Wow, I didn’t realize that it’s been going on for that long… Stone Soup, I suppose.

    Max_P, in Do we need Live Reload (Watch) in bundlers?
    @Max_P@lemmy.max-p.me avatar

    Incremental builds are much faster, as they often only need to rebuild the specific part that changed. Just re-running the build in VSCode won’t help you if the build takes like 5 minutes, but it’s still instant with watch mode.

    Hot reload also has some advantages such as not having to reload the page and lose state: it can just update the CSS live so you can quickly iterate on your styles even if you’re deep into the navigation at page 10 of a modal window.

    We don’t need live reload/watch, but it sure is convenient.
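    The CSS part of that is worth illustrating. This is a hedged sketch of what a dev server’s hot-reload client does for stylesheets (the function name is made up; in real tooling, a websocket message from the dev server triggers it):

    ```javascript
    // Hot-reload sketch for CSS only: re-request every stylesheet with a
    // cache-busting query string instead of reloading the whole page.
    // Browser-only code; nothing here runs at load time.
    function reloadStylesheets() {
      for (const link of document.querySelectorAll('link[rel="stylesheet"]')) {
        const url = new URL(link.href);                   // resolve the current href
        url.searchParams.set("t", Date.now().toString()); // bust the cache
        link.href = url.toString();                       // swap in place
      }
    }
    ```

    Because only the link elements are touched, the page’s DOM and JavaScript state survive the swap.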

    @superb@lemmy.blahaj.zone avatar

    Are incremental builds only available when you’re hot reloading?

    @Max_P@lemmy.max-p.me avatar

    Depends entirely on the bundler. They all have a watch mode, not all of them do hot reload. Hot reload is cool but full of gotchas, it’s a hack but worst case you reload the page anyway. Some probably cache on disk, I think webpack can.

    But if you think about it, you either want a clean build for production, or rebuild quickly during development. Watch mode does exactly what you’d do anyway, re-run a build. And IDEs can watch its output to give you error messages.

    It’s much easier to implement: just emit the code again when the file changes, and let it go through the pipeline. It’s already there in memory all figured out, no need to figure out a serialization format to put on disk and load back up.

    But to circle back to the original question of why use watch mode when you can just rebuild it when saving a file: you’re reinventing the wheel, and watch mode scales better and will always be faster. Yes, some people need watch mode.
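    That “watch mode just re-runs the build for you” idea can be sketched with Node’s own file watcher. Here runBuild is a hypothetical placeholder for whatever build you’d run by hand; a real bundler additionally keeps the parsed module graph in memory between runs, which is where the incremental speedup comes from.

    ```javascript
    // Minimal watch-mode sketch on top of Node's fs.watch (the same primitive
    // many bundlers' watch modes build on). Debouncing matters because editors
    // often emit several change events for a single save.
    const fs = require("fs");

    function debounce(fn, ms) {
      let timer = null;
      return (...args) => {
        clearTimeout(timer);
        timer = setTimeout(() => fn(...args), ms);
      };
    }

    function watchAndBuild(dir, runBuild) {
      runBuild(); // initial full build
      const rebuild = debounce(runBuild, 100);
      // recursive: true works on macOS/Windows, and on Linux with Node >= 20
      return fs.watch(dir, { recursive: true }, () => rebuild());
    }
    ```

    Usage would be something like watchAndBuild("src", () => runMyBuild()), exactly the loop you’d otherwise drive by hand.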

    @superb@lemmy.blahaj.zone avatar

    I’m not a JS user, I was just surprised that bundlers were so bad


    Why could a build take 5 minutes? Is it due to the size of your project or the language the bundler is written in (JavaScript being a slower option over Rust)?

    @Max_P@lemmy.max-p.me avatar

    The scale of things. Large projects take longer to compile and bundle because they’re made out of thousands of files and hundreds of thousands of lines of code.

    Yeah, your hello world program will go just as fast without a bundler, or with a simple bundler. It’s when you go big, and use tons of libraries and stuff, that it slows down and the advantages become clearer.

    That’s especially true when using big libraries like React and dozens of plugins for it, frameworks like Next.js, SASS for CSS or CSS in JS, all the JSX, all the TypeScript. It adds up especially if Babel is also involved to transpile for older browser support.

    5 minutes is a bit of an extreme use case, but the point is after the first build, live reload would still refresh your code at the same speed. So working on one page you don’t need to constantly rebuild the other hundreds of them, just the one you changed. If you target mobile platforms with Cordova or React Native then you also add a whole Android/iOS build + install + restart of the app. The same live reload would work even in that context, making the speed advantage even more apparent.

    These things are designed for large enterprise projects, not for your small helloworld sized hobbyist programs.

    Maddier1993, in You would think it would be easier for Canadians to get sponsorship

    For decades, outsourcing was happening to India and other Asian countries to cut costs. This happened even when interest rates on loans were low.

    Now that interest rates are higher, the cost cutting and outsourcing only increase.

    My own feelings are that we really should figure out if we want to work for such lowballers who cut us off like parasites at the first sign of trouble.

    It would be prudent for workers to seek work in smaller, private companies. While they have the risk of going bust… they’re hopefully not laying off people in the same breath as announcing record growth.

    slazer2au, in 6 Best Embedded Databases for 2024
    1. SQLite
    2. DuckDB
    3. RocksDB
    4. Chroma
    5. Kùzu
    6. Faiss
    Dendr0, (edited ) in Is owning websites for private individuals become forbidden by our governments ?

    Most of that only applies if you're running your website as a public-facing entity, i.e. open to the general public to browse/use or if you're running it as part of a business.

    If it's a non-commercial, private-use site you don't need any of those "requirements".

    -edit- Obviously depending on jurisdiction.

    Vincent, in Federated Credential Management (FedCM) API - Web APIs | MDN

    To make this work well with the Fediverse, you’d need to be able to specify your own server (e.g. programming.dev), which is under discussion at github.com/fedidcg/FedCM/issues/240.
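    For reference, the relying-party side of FedCM looks roughly like this today (a sketch following the shape documented on MDN; the configURL and clientId values are invented, and browser support is still limited):

    ```javascript
    // Relying-party side of a FedCM sign-in. Today the site hard-codes this
    // provider list, which is what the issue above wants to change.
    // Browser-only; only a function definition, so nothing fires at load time.
    async function signInWithFedCM() {
      const credential = await navigator.credentials.get({
        identity: {
          providers: [
            {
              configURL: "https://idp.example/fedcm.json", // IdP's FedCM config (assumed URL)
              clientId: "my-client-id",                    // issued by the IdP (made up)
            },
          ],
        },
      });
      // The returned token is then verified server-side against the IdP
      return credential.token;
    }
    ```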

    @ericjmorey@programming.dev avatar

    To make it work in a way that preserves privacy as a value held by many current users of federated social services, yes.

    But it seems like it can be implemented as is in any federated service and improve security for 3rd party frontend apps.

    Maybe I’m missing something essential, but holding out for a perfect implementation which may not be broadly adopted might be a mistake on a developer’s part if they want to provide value to the service they’re developing for.


    Oh, I’m not calling for anyone to hold out (well, maybe except until this is widely supported across browsers), just encouraging folks to participate in the experimentation going on in that thread.

    Ember_Rising, in XMLHttpRequest (XHR) Vs Fetch API

    Also, you can use Wretch for a really neat builder pattern to make requests.


    walter_wiggles, in XMLHttpRequest (XHR) Vs Fetch API

    I think of XHR as more of a low-level API. Most of the time though you don’t need access to those low-level details.

    The fetch API is a bit higher level and nicer to work with.

    Max_P, (edited ) in XMLHttpRequest (XHR) Vs Fetch API
    @Max_P@lemmy.max-p.me avatar

    XHR is absolutely ancient. Like, I used it on Internet Explorer 6 era websites. Using a 3x3 table with images in all 8 outer cells to make rounded corners.

    It still works but is so old it can’t really be updated. It’s entirely callback driven, so no async/await. It even has a synchronous mode that just hangs the browser.

    The Fetch API was designed to modernize what XHR does, and it does so well. Now a simple GET request is pretty much just await fetch(…), whereas the XHR version is probably 20-40 lines of code, and most people end up using a library or writing their own little wrapper for it. Fetch supports promises natively, and you can stream very large files without loading them all into memory. There’s nothing XHR can do that fetch can’t do, and fetch usually does it better too.

    For most use cases the performance will be the same, since network IO is orders of magnitude slower than JavaScript execution. But the better API probably does lead to better performance, by not doing unnecessary things and by processing data as it arrives instead of all in one go when the download is finished.

    It’s a modern replacement because it’s literally what it was designed to be. Try both and it’ll be abundantly clear why.
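    The boilerplate gap is easiest to see side by side. A hedged sketch (the function names are made up; XMLHttpRequest is supplied by the browser, and only function definitions appear here, so nothing fires at load time):

    ```javascript
    // Old style: callback-driven XMLHttpRequest, wrapped in a Promise by hand.
    function xhrGetJson(url) {
      return new Promise((resolve, reject) => {
        const xhr = new XMLHttpRequest(); // provided by the browser
        xhr.open("GET", url); // async by default; passing false here blocks the UI
        xhr.onload = () =>
          xhr.status >= 200 && xhr.status < 300
            ? resolve(JSON.parse(xhr.responseText))
            : reject(new Error(`HTTP ${xhr.status}`));
        xhr.onerror = () => reject(new Error("network error"));
        xhr.send();
      });
    }

    // Modern style: fetch is promise-based, so it composes with async/await.
    async function fetchGetJson(url) {
      const res = await fetch(url);
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      return res.json(); // res.body is also a ReadableStream for large payloads
    }
    ```

    The fetch version is a fraction of the code, and the streaming body is what lets you process large downloads without holding them entirely in memory.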


    Thanks @Max_P, weird because I can use XHR as async… see open(method, url, async) at developer.mozilla.org/en-US/docs/Web/API/…/open


    Sure, it lets you use it asynchronously, but it predates and is not really compatible with JavaScript’s async/Promise API.

    mozz, in Anyone able to help with my website being slow to certain sections?
    @mozz@mbin.grits.dev avatar

    Can you show a screenshot of the waterfall display while loading with a fresh cache / without the fresh cache?

    @mac@infosec.pub avatar

    I looked at the waterfall and I realised I had some design images in there at 2500+ resolution that were taking 5s to load and that was interfering with the loading of the JS.

    Thanks, I’ll make sure to check this tab first before posting next time.

    @mozz@mbin.grits.dev avatar

    All good 👍🏻

    onlinepersona, (edited ) in Understand the Next Phase of Web Development - Steve Sanderson - NDC London 2024

    From Microsoft? Oh no… is this a blazor sales pitch?

    Edit: not really. Of course he had to use it, but it wasn’t too prominent. Didn’t finish the video but server-side rendering seems to be the “future”. I dunno… that doesn’t give me a warm tingly feeling.

    Anti Commercial-AI license

    LPThinker, (edited )

    More specifically, he argued (and the recent and upcoming releases of most major frameworks agree) that rendering most content on the server with islands of client-side interactivity is the future.

    That’s not necessarily a huge revelation, but the big difference from what people have been doing with PHP for decades is the level of integration and simplicity in mixing server-side and client-side code seamlessly so that a dev can choose the appropriate thing in each context and not have to go through a lot of effort when requirements change or scaling becomes an issue. I would say that this represents a new level of maturity in the “modern” web frameworks where devs can choose the right technology for every problem to serve their users best.
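    As a toy illustration of the islands idea (all names invented; real frameworks like Astro or Fresh generate this split for you): the page is plain server-rendered HTML, and only the counter element ships any client-side script.

    ```javascript
    // Islands sketch: renderPage() produces static HTML on the server; the
    // only JavaScript sent to the client is scoped to one "island".
    function renderPage() {
      return `<!doctype html>
    <main>
      <h1>Mostly static content, rendered on the server</h1>
      <p>No framework runtime is needed for any of this text.</p>

      <!-- the one interactive island on the page -->
      <div id="counter-island">
        <button id="inc">Clicked 0 times</button>
      </div>
      <script>
        // Tiny, island-scoped script instead of a whole-page bundle
        let n = 0;
        const btn = document.getElementById("inc");
        btn.addEventListener("click", () => {
          btn.textContent = "Clicked " + ++n + " times";
        });
      </script>
    </main>`;
    }
    ```

    The framework’s job is making that split automatic, so the boundary between server and client code stays a one-line decision rather than an architectural rewrite.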

    some_guy, in Understand the Next Phase of Web Development - Steve Sanderson - NDC London 2024

    I’ll save you some time: the web will be made shittier still. I didn’t watch. That’s just what I’ve observed to be true.


    I think you would be pleasantly surprised by the direction web dev is moving if you gave it a chance.

    For example, I suspect that you think one of the ways the web has gotten shittier is because sites are too bloated and JavaScript frameworks are too heavy and slow.

    One of the key takeaways is that, across almost all frameworks and stacks, web dev is moving back to doing as much work on the server-side as possible, while retaining the minimum necessary interactivity via Islands of Interactivity with much lighter JavaScript than what was pushed for the last decade.


    This is welcome news. Thanks for giving me a bit of optimism.

    NostraDavid, (edited ) in Back to Basics in Web Apps
    @NostraDavid@programming.dev avatar

    I find it refreshing to write, not generate, HTML and CSS, and then sprinkle some JavaScript for interactivity.

    I’ve found hugo to be rather amazing in generating static HTML and CSS (converting either HTML or Markdown templates into regular HTML).

    I started out my personal website as:

    PS: Have you ever seen The Net (1995)?

    PPS: The HTML is pretty much all semantic HTML, instead of Twitter’s div>div>div>div>div
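    For anyone who hasn’t run into the term, the contrast looks roughly like this (a minimal made-up illustration, not taken from the site in question):

    ```html
    <!-- div soup: the structure is invisible to crawlers and screen readers -->
    <div class="nav"><div class="item"><div>Home</div></div></div>

    <!-- semantic equivalent: the elements themselves carry the meaning -->
    <nav><ul><li><a href="/">Home</a></li></ul></nav>
    ```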



    Ah yes, div soup.

    By the way, have you tried different generators and compared them, or tried only Hugo, out of curiosity?

    @NostraDavid@programming.dev avatar

    Only Hugo; I didn’t want to try anything JS based and hugo is faaaaaast in its generation. Sub 1 second fast. It’s so nice.


    Cool! Very nice. Do you need it to be that fast or is it just nice because faster is better?

    Aijan, (edited )

    Author here. My blog is also generated with Hugo, and it’s great. I just prefer not to generate HTML and CSS from JavaScript unless it’s necessary.

    Sorry, I haven’t seen that movie. Thanks for the recommendation though.

    @NostraDavid@programming.dev avatar

    Too bad you haven’t seen it - my site has a little easter egg from that movie :3

    xoggy, in Back to Basics in Web Apps
    @xoggy@programming.dev avatar

    1996 was the beginning of the chaos. We went from a document that the user’s browser parsed and styled to a free-for-all of website designs glued together on construction paper. Accessibility took a step back. You now needed a graphical desktop to view the web. Page content was no longer machine readable and became less portable. Now, in a world of dynamic page content generated by JavaScript, you need to run a full JavaScript engine like V8 in your crawlers just to index page content.
