jbqueru

@jbqueru@fosstodon.org

I'm JBQ / Djaybee, Husband, Immigrant, Veteran, Highly Sensitive Person #HSP. He/Him. I write about tech and other things. I'm fluent in French and English.

I like: #skiing, #hiking, #biking, #games, #photography, visual #astronomy. #PixelArt, #painting, #knitting, #weaving, #crochet. #bead weaving, #CrossStitch and #BlackWork embroidery.

I am in the year-long process of moving from Spokane, WA, USA to Preveza, Greece.


jbqueru, to random

My wife @eugenialoli has been working on installing Linux on various old computers for which many other options are no longer supported.

She's been finding that machines with 2GB of RAM or 16GB of storage tend to struggle, whether installing the OS, booting, installing common apps, or running those apps.

2GB of RAM is an incredibly large amount. As is 16GB of storage.

WTF are we software people doing as an industry that makes us consume so many resources?

jbqueru,

For reference, Windows XP's official requirements were 64MB of RAM and 1.5GB of storage. Even giving ourselves a 4x margin at 256MB/6GB, that's a very far cry from 2GB/16GB.

Windows XP was honestly a vastly usable OS, with vastly usable applications running on top.

What are we doing that now requires so much more?!?

jbqueru,

Let's talk orders of magnitude.

My workstation has 128GB of main RAM. Enough for some very heavy applications.

Shrink by 1000x: 128MB would run Windows XP or Mac OS X of the same era, or early iOS or Android. Rich graphical environment, one application + accessories.

1000x further: 128kB is the top end of many 8-bit machines. Text-mode single-tasking, simple documents.

Another 1000x: 128B is the Atari 2600 (+4kB of ROM for the code). Working set for simple games.

jbqueru,

@myfear I wish we still had the kind of foresight that IBM and Microsoft had at the time. Which computers today ship with the ability to expand the RAM to 10x the factory amount, simply by plugging in an expansion card, without any software modifications?

jbqueru,

@twallutis @eugenialoli I ran Multiplan in 128kB indeed. I'm not sure it's laziness, I've worked on one of the heavier web pages out there (one that can hit 1GB of RAM) and nobody around me was lazy. Wrong priorities, probably, indirectly because capitalism.

djm, to random

Here's my 2c on the xz incident.

This is the nearest of near-misses. Anyone who suggests this was any kind of success is a fool. No system caught this, it was luck and individual heroics. That's not acceptable when unauthorised access to ~every server on the internet is on the table. We need to find a way to do better.

1/n

jbqueru,

@djm Looking at bits of the attack: it was indeed sophisticated, but it also remained somewhat sloppy in parts of its implementation, and a retrospective from the attacker's point of view will find obvious flaws that can easily be corrected for the next round, such that the same luck won't protect us any more.

jbqueru, to random

At times, I tell myself that I'm not qualified to think about the xz situation.

Also, I remind myself that I've architected a major build system (and been scared by its complexity and security implications), that I've been tech lead for a major Open Source initiative (and got burned out in the process), and that I've had a production server breached when an attacker exploited an ssh flaw and exfiltrated my sudo-enabled password.

Sadly, I've been there...

jbqueru, to random

2 days ago: xz backdoor.
Today: Easter Eggs.
Tomorrow: April Fools'.

Not a good time to trust software.

timbray, to random

1/2 Looking at one of the #xz writeups, this struck my eye: “The release tarballs upstream publishes don't have the same code that GitHub has. This is common in C projects so that downstream consumers don't need to remember how to run autotools and autoconf.” Ah, GNU AutoHell, I remember it well. Tl;dr: With AutoHell, even if you're building for a 19-bit Multics variant from 1988, it’s got your back. Except it’s just too hard to understand and use, hence the above.

#infosec

jbqueru,

@timbray I'm also seeing surprisingly many lessons in there.

A partial list from me: Security through obscurity doesn't work. Everything is a security boundary, even build scripts and test files. Everything should be buildable from source and human-readable, even test files.

An indirect one that I remember sharply for having had to deal with it a lot in a past job: not everyone in an Open Source community has the same goals and priorities.

jbqueru,

@timbray Indirectly, Ken Thompson's paper comes to mind, as does a friend's PhD thesis on formally proving that the output of a compiler matched its input.

jbqueru,

@timbray I sadly agree, and that's an issue. Ultimately, if such files aren't human-readable, we can't trust that the output of a build matches its source files... and that's what happened with xz.

jbqueru,

@lauren @timbray I believe that this is a pivotal moment, that there will be a before and an after. I agree that the issue here is mostly a human issue, and that technical solutions can't fix the whole thing. At the same time, we need to realize that neither humans nor technology can be fully trusted, and that a strong solution will have to strengthen both sides.

jbqueru,

@klausfiend @lauren @timbray Deep inside, it's the complaining that worries me (and that hurt me personally in the past).

Open Source might indeed imply "you can legally maintain the software you rely on without having to ask anything from the original authors and maintainers", but that comes with a counterpart: "this is provided as-is and you might be forced to maintain it yourself, so be prepared for that possibility."

We all rely on more software than we can maintain, and that's a major issue.

jbqueru,

@kkeller @mark @lauren @timbray As I understand it: now, because 5.6 was starting to see adoption in the test channels of major distros. I haven't read through the code enough to figure out why it didn't or couldn't get caught earlier. I'd rather not speculate further, at least not in public.

jbqueru,

@robryk @timbray You and I are reaching the same conclusion: chances are, building and testing should be separate phases, separate enough that test files aren't available while building.

That should be especially true for large binary-only test files that can't be reviewed and have to be blindly trusted - being binary-only, they have no reason to be processed through a build pipeline.

jbqueru,

@kkeller @lauren @mark @timbray From a regulatory point of view, I believe that the EU's Cyber Resilience Act (CRA) will assign liability, which might end up falling onto such distro maintainers (or whichever is the upstream-most commercial non-person legal entity, IANAL).

jbqueru,

@tknarr @lauren @timbray I think both of these can be true at the same time:
- It might be easier to infiltrate proprietary software than Open Source software.
- Open Source communities need to evolve their threat models and adapt accordingly.

And, in a scary Venn diagram kind of way, proprietary software companies probably also need to evolve their threat models, but might have a lot less ability to adapt accordingly.

jbqueru,

@mark @kkeller @lauren @timbray For stuff that is both free-as-in-code and free-as-in-beer, I expect things to remain mostly unchanged (IANAL). Once you take away either of those two, and you're not talking about a personal project or personal contributions, things will indeed probably change, and there'll likely be some chilling effect (IANAL).

I'd be much more a fan of the CRA if it didn't contain a 5-year planned obsolescence - it only requires security updates for 5 years at most.

jbqueru, (edited )

@tknarr @lauren @timbray I expect that we'll see a deprecation of a lot of gibberish languages: when those proliferate in a project, it becomes harder to defend the project (you need expertise in all of them at the same time) than to attack it (you only need to know the oddities of a single one, like how a line with only a period in it terminates the script, such that nothing after it gets executed).

(1/2)

jbqueru,

@tknarr @lauren @timbray Generally speaking, I think in 4 dimensions: Processes and Practices (which I cluster under People), and Technology and Tools (which I cluster under Technical). Having awareness of those four gives me a framework for the various options to approach a problem.

Not accepting binary blobs is a Practice thing, generating binary data is a Tools issue, eliminating gibberish is a Technology issue, etc...

(2/2)

saagar, to random

The security community is going through the five stages of grief right now with the xz thing and I think a lot of people are coping with “there are technical measures that could have prevented this”. To move on, it is important to understand that this is not true in the slightest

jbqueru,

@saagar Yup. Chances are, the time between submitting the ifunc change and the actual exploit was spent trying to make the exploit work at all against the technical measures that existed. If different measures had been in place, that work would potentially have been harder, or at least different, but it would still have continued until the exploit worked.

eugenialoli, to linux

When I used to write for @osnews in the early 2000s, I was a proponent of BSD-like licenses, where a user could decide not to open source their changes back. I still believe in that freedom, but I don't believe in all people anymore: corporations now have reached total , so now I'm a believer in the AGPL, the most extreme version of the GPL3. Everything should be as open as possible, and remain so. No exceptions.

jbqueru,

@eugenialoli Well, you've convinced me to switch my licensing from Apache 2.0 to AGPL 3.0+. My current code might be inconsequential, but that choice will guide me in the future.

jbqueru, to random

I stayed up until 1am last night fighting with interrupts on the Amstrad CPC. That level of coding is so much fun, it's like a huge puzzle. But it's eating into my sleep!

I think I won this round of the fight: I got to change the timing of interrupts to my liking by abusing a mechanism that tries to keep interrupts from happening too quickly back-to-back (it either delays or skips them).

grimalkina, to random

I'm reading about this xz backdoor story from the outside, as a person who is still learning much about the technical ins and outs; but as a psychologist, it is just overwhelming to imagine a maintainer in this position and all of the feelings of pressure, skill-based identity, and social isolation that must be involved.

Imho psychology has a duty to show up for technology practitioners and work for them, just like we see and work for the well-being of emergency workers and healthcare providers.

jbqueru,

@grimalkina Thanks for this. I've been an Open Source maintainer for a major project, I had my project attacked, and I burned out in the process from community abuse, so I sadly speak with a tiny bit of experience, along with my overall experience as a software engineer and technical executive.

(1/n)

jbqueru, to random

Do not meddle in the affairs of Amstrad CPC interrupts, for they are subtle and quick to anger.

The mechanism is pretty smart, actually. The interrupts are running at 300Hz, in sync with the screen refresh (H+V). There's a mechanism to prevent interrupts from firing too close to one another in case one gets delayed, and a mechanism to get them back in sync with the vertical refresh.

I'm trying to abuse both of those mechanisms together, but so far I seem to be losing the battle.
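
For the curious, here is a rough C model of that mechanism as I understand it. The 52-HSYNC period, the bit-5 clear on acknowledge, and the resync two HSYNCs after VSYNC are the commonly documented Gate Array behaviour; the function names and the toy frame loop are just mine for illustration, so treat this as a sketch rather than a cycle-accurate emulation.

/* Sketch of the CPC Gate Array interrupt counter, as I understand it. */
#include <stdbool.h>
#include <stdio.h>

static unsigned counter = 0;   /* 6-bit HSYNC counter inside the Gate Array */

/* Called on every HSYNC. vsync_plus_2 marks the 2nd HSYNC after VSYNC
   starts. Returns true when an interrupt is requested to the Z80. */
static bool gate_array_hsync(bool vsync_plus_2)
{
    bool irq = false;

    counter++;
    if (counter == 52) {       /* 52 HSYNCs at ~15.6kHz -> ~300Hz interrupts */
        counter = 0;
        irq = true;
    }
    if (vsync_plus_2) {        /* resync with the vertical refresh: */
        if (counter >= 32)     /* fire only if far enough from the last one, */
            irq = true;        /* otherwise this one is skipped */
        counter = 0;
    }
    return irq;
}

/* Called when the Z80 acknowledges an interrupt: clearing bit 5 guarantees
   at least 32 more HSYNCs before the next interrupt can fire. */
static void gate_array_int_ack(void)
{
    counter &= 0x1F;
}

int main(void)
{
    /* Toy 312-line frame, pretending VSYNC+2 lands on line 2. */
    for (unsigned line = 0; line < 312; line++) {
        if (gate_array_hsync(line == 2)) {
            printf("interrupt at HSYNC %u\n", line);
            gate_array_int_ack();
        }
    }
    return 0;
}

Playing with when the acknowledge happens (or delaying it with interrupts disabled) is what lets the timing be nudged around - which is exactly the abuse I'm attempting.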
