ctietze,
@ctietze@mastodon.social avatar

A question for more experienced developers than me:

We have this setup where all requests go through a PHP script for authorization (think: cookie) checking before serving files.

That's fine with HTML, but less ideal for 5MB PDFs.

I'm trying to search for ways to use PHP to allow/deny access, but otherwise let the web server (Apache) do its job.

Is there such a facility to rewrite requests for auth, but then go on to serve the static files?

Crell,
@Crell@phpc.social avatar

@ctietze I wrote about some of the options and their relative costs a long, long time ago: https://www.garfieldtech.com/blog/readfile-memory (It's on a very old version of PHP, but I don't think the basic feature has changed in 20 years.)

Basically, authenticate and then use readfile(). You'll be fine. Or authenticate and then use XSendfile (Apache), or equivalent (nginx).
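The "authenticate, then readfile()" pattern can be sketched like this (the `$isAuthorized` flag and file path are stand-ins for your real cookie/session check and storage location):

```php
<?php
// Sketch of the "authenticate, then readfile()" pattern.
// $isAuthorized stands in for whatever your PHP auth script decides.
function serve_pdf(string $path, bool $isAuthorized): void
{
    if (!$isAuthorized || !is_file($path)) {
        http_response_code(403);
        echo 'Forbidden';
        return;
    }
    header('Content-Type: application/pdf');
    header('Content-Length: ' . (string) filesize($path));
    header('Content-Disposition: inline; filename="' . basename($path) . '"');
    readfile($path); // streams the file to the client without buffering it all in PHP
}
```

The key point is that the protected file lives outside the public docroot, so the only way to reach it is through this gateway.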

Symfony has a response object that does this for you, as do good PSR-7 implementations.
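In Symfony that object is `BinaryFileResponse`; a minimal sketch (the file path is illustrative), which can also hand delivery off to Apache when mod_xsendfile is installed:

```php
<?php
use Symfony\Component\HttpFoundation\BinaryFileResponse;
use Symfony\Component\HttpFoundation\ResponseHeaderBag;

// If an X-Sendfile-capable server module is present, Symfony emits
// an X-Sendfile header instead of streaming the bytes through PHP.
BinaryFileResponse::trustXSendfileTypeHeader();

$response = new BinaryFileResponse('/var/www/protected/slides.pdf');
$response->setContentDisposition(
    ResponseHeaderBag::DISPOSITION_INLINE,
    'slides.pdf'
);
$response->send();
```

Without the server module, `BinaryFileResponse` falls back to streaming the file itself, so the same code works either way.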

ctietze,
@ctietze@mastodon.social avatar

@Crell thank you, this analysis is great! I'm putting the gist of these findings into my Zettelkasten for reference. Haven't thought about memory mapping here at all!

ctietze,
@ctietze@mastodon.social avatar

@Crell Found a typo in the table :)

readflie → readfile

Crell,
@Crell@phpc.social avatar

@ctietze 12 YEARS and you're the first person to spot that... Humans really are bad at proofreading. 🙂

Fixed now, thanks.

ollieread,
@ollieread@phpc.social avatar

@ctietze not sure about Apache, but if switching to Nginx is an option, it supports auth subrequests. I don't know if there's a way to do it with Apache.

https://docs.nginx.com/nginx/admin-guide/security-controls/configuring-subrequest-authentication/#
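The nginx auth_request setup looks roughly like this (socket path, script path, and location names are assumptions); the PHP endpoint only has to answer 2xx to allow or 401/403 to deny, and nginx serves the file itself:

```nginx
# Every request under /downloads/ is first checked by the PHP auth
# endpoint via an internal subrequest; on 2xx nginx serves the file.
location /downloads/ {
    auth_request /auth;
    root /var/www/protected;
}

location = /auth {
    internal;
    fastcgi_pass unix:/run/php/php-fpm.sock;
    fastcgi_param SCRIPT_FILENAME /var/www/app/auth.php;
    include fastcgi_params;
    # The auth check only needs headers, not the request body.
    fastcgi_pass_request_body off;
    fastcgi_param CONTENT_LENGTH "";
}
```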

ctietze,
@ctietze@mastodon.social avatar

@ollieread that's cool

Alister,
@Alister@mastodon.cloud avatar

@ctietze The normal answer would be mod_xsendfile, which is not specifically listed as working with Apache v2.4, but a quick search turns up reports of it working fine. There is also a similar module for Nginx (https://www.nginx.com/resources/wiki/start/topics/examples/x-accel/)
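A mod_xsendfile setup looks roughly like this (paths and hostname are illustrative):

```apache
# Load the third-party mod_xsendfile module.
LoadModule xsendfile_module modules/mod_xsendfile.so

<VirtualHost *:80>
    ServerName example.com
    XSendFile on
    # Restrict which directory Apache may serve via X-Sendfile.
    XSendFilePath /var/www/protected
</VirtualHost>
```

After the auth check, the PHP side then reduces to `header('X-Sendfile: /var/www/protected/slides.pdf');` plus a Content-Type header; Apache does all the file I/O.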

Another very useful alternative is using a separate CDN to serve the files with token authorisation - you add some parameters to the URL, and it will check and allow/deny the download.
I had good results with both Google-Cloud-storage and BunnyCDN doing that.

Alister,
@Alister@mastodon.cloud avatar

@ctietze It would depend what the files are - if they were pre-created and you might want to download them again (or you otherwise want to store them long term), then uploading the files to the CDN is a great solution.

If they are dynamically created one-offs, then a local XSendFile solution is probably better - or just taking the hit with sending them to the user with https://www.php.net/manual/en/function.readfile.php

jamesholden,
@jamesholden@mas.to avatar

@ctietze If the PDFs are in S3, you could have your PHP generate a signed URL and redirect the user to it, which will authenticate them directly with AWS to get the file.
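With the AWS SDK for PHP, the redirect-to-a-presigned-URL approach is a few lines (bucket name, key, and region are hypothetical):

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'region'  => 'eu-central-1', // assumed region
    'version' => 'latest',
]);

// Build a presigned GET request that expires after 10 minutes.
$command = $s3->getCommand('GetObject', [
    'Bucket' => 'my-private-bucket',
    'Key'    => 'slides/workshop.pdf',
]);
$request = $s3->createPresignedRequest($command, '+10 minutes');

// After your own auth check passes, redirect the client to S3.
header('Location: ' . (string) $request->getUri(), true, 302);
```

The signed URL carries its own expiring credentials, so S3 authenticates the download without your server ever touching the bytes.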

ghorwood,
@ghorwood@mastodon.social avatar

@jamesholden @ctietze i do this all the time in laravel and it’s pretty smooth.

https://dev.to/gbhorwood/laravel-storing-stuff-in-private-s3-buckets-39kh
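In Laravel the same idea is wrapped in `temporaryUrl()` on the storage facade (the object key is illustrative):

```php
<?php

use Illuminate\Support\Facades\Storage;

// In a controller action, after your authorization check:
// redirect to a signed S3 URL that expires in 10 minutes.
return redirect()->away(
    Storage::disk('s3')->temporaryUrl(
        'slides/workshop.pdf',
        now()->addMinutes(10)
    )
);
```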

ctietze,
@ctietze@mastodon.social avatar

@ghorwood @jamesholden That's really cool, thanks, you two. We only read-protect 2 workshop slide decks at the moment, so a CDN isn't needed, but I can see where this would come in handy for our forums!

omich,
@omich@mastodon.social avatar

@ctietze Header Redirect - is that still a thing in #php?

ctietze,
@ctietze@mastodon.social avatar

@omich you're not that old, HTTP still exists :)

(yes)

deanatoire,
@deanatoire@mastodon.social avatar

@ctietze PHP can be quite efficient in terms of execution and memory when serving a static file after an authorisation check.
The Symfony HTTP component has a static file response that does that very efficiently.
Not sure about Apache authorisation and redirection, but I'm fairly sure that's possible as well.

joby,
@joby@hachyderm.io avatar

@deanatoire @ctietze Yeah, if the implementation just uses readfile(), or fopen() with fpassthru(), or similar I wouldn't worry much unless you've clearly identified performance right there specifically as a problem.
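The fopen()/fpassthru() variant mentioned above is about as short as readfile() and keeps peak memory flat regardless of file size:

```php
<?php
// Open the file and hand the stream straight to the output buffer.
// Like readfile(), this does not load the whole file into memory.
function stream_file(string $path): bool
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        return false;
    }
    fpassthru($handle); // copy the rest of the stream to output
    fclose($handle);
    return true;
}
```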

There are probably some shenanigans that could be pulled with HTTP authentication, but I probably wouldn't.

ctietze,
@ctietze@mastodon.social avatar

@joby @deanatoire Thank you!

@Crell chimed in with this article: https://www.garfieldtech.com/blog/readfile-memory

That confirmed your responses and provided a couple of arguments, FYI

joby,
@joby@hachyderm.io avatar

@ctietze @deanatoire @Crell Nice. Yeah, I've got a few sites I run that while admittedly not very high traffic, do involve routinely serving hundreds of megabytes at a time through readfile() (same deal, big PDFs that need access control) and it's been totally fine even on a pretty modest server.

ctietze,
@ctietze@mastodon.social avatar

@joby @deanatoire @Crell That's super reassuring!

I guess you maybe lose the ability to pause and resume the download for huge files? (Haven't investigated yet.)

Crell,
@Crell@phpc.social avatar

@ctietze @joby @deanatoire Pretty sure yes. If you wanted to do that, there are some complex and rarely used HTTP headers that you'd have to handle manually, then work on the streams yourself. That would definitely be slower, but assuming my benchmarks are still vaguely correct, it wouldn't be orders of magnitude slower.

Unless that's a common issue for your use case, it's probably not worth the trouble.
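For the curious, a deliberately minimal single-range handler (no multipart ranges, barely any validation) shows what "handle the headers manually, then work on the streams" means in practice:

```php
<?php
// Minimal single-range handler: honours "Range: bytes=START-END"
// so clients can resume a download. Not production-hardened.
function serve_with_ranges(string $path): void
{
    $size  = filesize($path);
    $start = 0;
    $end   = $size - 1;

    header('Accept-Ranges: bytes');

    if (isset($_SERVER['HTTP_RANGE'])
        && preg_match('/bytes=(\d*)-(\d*)/', $_SERVER['HTTP_RANGE'], $m)) {
        if ($m[1] !== '') {
            $start = (int) $m[1];
        }
        if ($m[2] !== '') {
            $end = (int) $m[2];
        }
        http_response_code(206);
        header("Content-Range: bytes $start-$end/$size");
    }

    header('Content-Length: ' . ($end - $start + 1));

    $handle = fopen($path, 'rb');
    fseek($handle, $start);
    $remaining = $end - $start + 1;
    while ($remaining > 0 && !feof($handle)) {
        $chunk = fread($handle, (int) min(8192, $remaining));
        echo $chunk;
        $remaining -= strlen($chunk);
    }
    fclose($handle);
}
```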
