@bagder have you considered enabling compression by default in curl/libcurl? Given the large number of bots and other automation on the net that make use of it, seems like having that on by default could have a sizable impact on the amount of global network traffic. #curl #libcurl #http #compression #webperf
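For context, opting in already takes a single flag today. A minimal sketch (example.com is just a placeholder URL, and the fallback echo only covers offline environments):

```shell
# curl already supports this as an opt-in: --compressed sends an
# Accept-Encoding header listing every algorithm the libcurl build
# supports (gzip, br, zstd, ...) and transparently decompresses the body.
curl --compressed -sS --max-time 10 -o /dev/null \
  -w 'downloaded %{size_download} bytes on the wire\n' \
  https://example.com/ || echo 'no network available'
```

In libcurl the same opt-in is `curl_easy_setopt(handle, CURLOPT_ACCEPT_ENCODING, "")`, where the empty string means "every encoding this build supports".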
Lighthouse is a fantastic tool, but we fixate too much on the Performance score. It's a diagnostic tool under very specific conditions.
The Chrome User Experience Report (CrUX) data as surfaced in PageSpeed Insights and Google Search Console is what users ACTUALLY experienced, and what is used for Core Web Vitals. https://www.youtube.com/watch?v=AiAxHYcEDiM
@tunetheweb Lighthouse managed to get a lot of attention, coming from Google, and it's still an effort to help people understand that the focus is CrUX/CWV for what Google includes in ranking and what real page views are experiencing.
Just spent 20 frustrating minutes figuring out how to set up a new S3 bucket that could be used to publicly serve static files, despite having devoted an unreasonable amount of time in the past to solving that exact same problem when I built https://s3-credentials.readthedocs.io/
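For anyone hitting the same wall: the two moving parts are disabling the bucket's Block Public Access settings and then attaching a read-only policy along these lines (my-bucket is a placeholder name; this is a sketch, not a complete recipe):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AllowPublicRead",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-bucket/*"
    }
  ]
}
```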
@simon I tend to assume that anything AWS is likely going to be possible, but will take me way longer to figure out than I would guess. I now understand why there are AWS-only specialists, it is a complex beast.