From: "Jan G.B." on 29 Mar 2010 08:13 Top posting sucks, so I'll answer the post somewhere down there. <SCNR> 2010/3/29 Devendra Jadhav <devendra.in(a)gmail.com> > Then you can do file_get_contents within PHP. or any file handling > mechanism. > >> On Mon, Mar 29, 2010 at 1:00 AM, ebhakt <im(a)ebhakt.com> wrote: > >>> Hi > >>> i am writing a web application in php > >>> this webapp primarily focuses on file uploads and downloads > >>> the uploaded files will be saved in a folder which is not in document > >>> root > >>> and my query is how will i be able to provide download to such files > not > >>> located in document root via php > >>> > Try something like that <?php $content = file_get_contents($filename); $etag = md5($content); header('Last-Modified: '.gmdate('D, d M Y H:i:s', filemtime($filename)).' GMT'); header('ETag: '.$etag); header('Accept-Ranges: bytes'); header('Content-Length: '.strlen($content)); header('Cache-Control: '.$cache_value); // you decide header('Content-type: '.$should_be_set); echo $content; exit; ?> Depending on the $filesize, you should use something else than file_get_contents() (for example fopen/fread). file_get_contents on a huge file will exhaust your webservers RAM. Regards
From: Nathan Rixham on 29 Mar 2010 08:40

Jan G.B. wrote:
> Top posting sucks, so I'll answer the post somewhere down there.
> <SCNR>
>
> 2010/3/29 Devendra Jadhav <devendra.in(a)gmail.com>
>
>> Then you can do file_get_contents within PHP, or any file handling
>> mechanism.
>>>> On Mon, Mar 29, 2010 at 1:00 AM, ebhakt <im(a)ebhakt.com> wrote:
>>>>> Hi
>>>>> i am writing a web application in php
>>>>> this webapp primarily focuses on file uploads and downloads
>>>>> the uploaded files will be saved in a folder which is not in document
>>>>> root
>>>>> and my query is how will i be able to provide download to such files
>>>>> not located in document root via php
>>>>>
>
> Try something like this:
> <?php
> $content = file_get_contents($filename);
> $etag = md5($content);
> header('Last-Modified: '.gmdate('D, d M Y H:i:s', filemtime($filename)).' GMT');
> header('ETag: '.$etag);
> header('Accept-Ranges: bytes');
> header('Content-Length: '.strlen($content));
> header('Cache-Control: '.$cache_value); // you decide
> header('Content-type: '.$should_be_set);
> echo $content;
> exit;
> ?>
>
> Depending on the file size, you should use something other than
> file_get_contents() (for example fopen/fread); file_get_contents() on a
> huge file will exhaust your web server's RAM.

Yup, so you can map the <Directory /path/to> in the web server config, then
"allow from" only localhost + yourdomain. This means you can request the
file like a URL: do a HEAD request to get the ETag etc., then return a
304 Not Modified if you received a matching ETag or Last-Modified (thus you
only call file_get_contents() when it's really needed).

I'd advise against sending "Accept-Ranges: bytes" if you don't actually
accept byte ranges (i.e. you aren't going to serve partial chunks of the
file).

If the downloads only need to be secured, you could leave PHP out
altogether: expose the directory via a <Location> so that it is web
accessible, and set it up to ask for auth using htpasswd, a custom script,
LDAP or whatever.

And if you don't need security, then why have PHP involved at all? Simply
symlink to the directory, or expose it via HTTP, and be done with the
problem in a minute or two.

Regards!
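As a rough sketch of the 304 idea: here the ETag is derived from mtime and size, an assumption made so the file never has to be read for a cache hit (the example earlier in the thread hashes the full contents instead); the path is again hypothetical:

<?php
// Sketch of a conditional GET: answer 304 from metadata alone and only
// read the file when the client's cached copy is stale.
$filename = '/var/uploads/example.zip';
$mtime = filemtime($filename);
$etag  = '"' . md5($mtime . '-' . filesize($filename)) . '"';

header('ETag: ' . $etag);
header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $mtime) . ' GMT');

$inm = isset($_SERVER['HTTP_IF_NONE_MATCH']) ? trim($_SERVER['HTTP_IF_NONE_MATCH']) : null;
$ims = isset($_SERVER['HTTP_IF_MODIFIED_SINCE']) ? strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) : false;

if ($inm === $etag || ($ims !== false && $ims >= $mtime)) {
    header('HTTP/1.1 304 Not Modified');
    exit; // client's copy is still good; send no body
}

header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($filename));
readfile($filename); // reads and outputs the file without building one big string
exit;
?>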
From: "Jan G.B." on 29 Mar 2010 08:53 2010/3/29 Nathan Rixham <nrixham(a)gmail.com> > Jan G.B. wrote: > > Top posting sucks, so I'll answer the post somewhere down there. > > <SCNR> > > > > 2010/3/29 Devendra Jadhav <devendra.in(a)gmail.com> > > > >> Then you can do file_get_contents within PHP. or any file handling > >> mechanism. > >>>> On Mon, Mar 29, 2010 at 1:00 AM, ebhakt <im(a)ebhakt.com> wrote: > >>>>> Hi > >>>>> i am writing a web application in php > >>>>> this webapp primarily focuses on file uploads and downloads > >>>>> the uploaded files will be saved in a folder which is not in document > >>>>> root > >>>>> and my query is how will i be able to provide download to such files > >> not > >>>>> located in document root via php > >>>>> > > > > Try something like that > > <?php > > $content = file_get_contents($filename); > > $etag = md5($content); > > header('Last-Modified: '.gmdate('D, d M Y H:i:s', > > filemtime($filename)).' GMT'); > > header('ETag: '.$etag); > > header('Accept-Ranges: bytes'); > > header('Content-Length: '.strlen($content)); > > header('Cache-Control: '.$cache_value); // you decide > > header('Content-type: '.$should_be_set); > > echo $content; > > exit; > > ?> > > > > Depending on the $filesize, you should use something else than > > file_get_contents() (for example fopen/fread). file_get_contents on a > huge > > file will exhaust your webservers RAM. > > Yup, so you can map the <Directory /path/to> in web server config; then > "allow from" only from localhost + yourdomain. This means you can then > request it like an url and do a head request to get the etag etc then > return a 304 not modified if you received a matching etag Last-Modified > etc; (thus meaning you only file_get_contents when really really needed). > > I'd advise against saying you Accept-Ranges bytes if you don't accept > byte ranges (ie you aren't going to send little bits of the file). > > If you need the downloads to be secure only; then you could easily > negate php all together and simply expose the directory via a location > so that it is web accessible and set it up to ask for "auth" using > htpasswd; a custom script, ldap or whatever. > > And if you don't need security then why have php involved at all? simply > symlink to the directory or expose it via http and be done with the > problem in a minute or two. > > Regards! > In my opinion, serving user-content on a productive server is wicked sick. You don't want your visitors to upload malicous files that may trigger some modules as mod_php in apache. So it makes sense to store user-uploads outside of a docroot and with no symlink or whatsover. One more thing added: your RAM will be exhausted even if you open that 600mb file just once. Apaches memory handling is a bit weird: if *one* apache process is using 200mb RAM on *one* impression because your application uses that much, then that process will not release the memory while it's serving another 1000 requests for `clear.gif` which is maybe 850b in size. So better forget that file_get_contents)( when the filesize can be huge. :-) Regards
From: Anshul Agrawal on 29 Mar 2010 08:53

On Mon, Mar 29, 2010 at 6:10 PM, Nathan Rixham <nrixham(a)gmail.com> wrote:
> [snip: quoting of the file_get_contents() example]
>
> Yup, so you can map the <Directory /path/to> in the web server config,
> then "allow from" only localhost + yourdomain. [...]
>
> And if you don't need security, then why have PHP involved at all?
> Simply symlink to the directory, or expose it via HTTP, and be done
> with the problem in a minute or two.
>
> Regards!

Also look at readfile() and fpassthru() when dealing with large files.

Moreover, if you have control over the web server, you can use PHP only for
authenticating the download request and offload the file delivery to the
web server itself (Apache, nginx, lighttpd) by sending an "X-Sendfile"
header in the response.

Best,
Anshul
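A sketch of that approach, assuming Apache with mod_xsendfile enabled (nginx uses an X-Accel-Redirect header instead); the path and the user_is_allowed() check are stand-ins for whatever the application really does:

<?php
// Sketch: PHP only decides whether the download is allowed; the web
// server does the actual delivery via mod_xsendfile.
session_start();

function user_is_allowed()
{
    // Placeholder check; substitute the application's real authorisation.
    return isset($_SESSION['user_id']);
}

$filename = '/var/uploads/example.zip'; // hypothetical path outside docroot

if (!user_is_allowed() || !is_file($filename)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="example.zip"');
header('X-Sendfile: ' . $filename); // Apache streams the file; PHP sends no body
exit;
?>

The PHP process finishes immediately, so large downloads never tie up application memory.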
From: Nathan Rixham on 29 Mar 2010 16:41

Jan G.B. wrote:
> 2010/3/29 Nathan Rixham <nrixham(a)gmail.com>
> > [snip: earlier quoting]
>
> In my opinion, serving user-uploaded content directly from a production
> server is asking for trouble. You don't want your visitors uploading
> malicious files that may trigger modules such as mod_php in Apache. So
> it makes sense to store user uploads outside of the docroot, with no
> symlink or anything of the sort pointing at them.

Even the simplest of server configurations will ensure safety here: just
use .htaccess to "SetHandler default-handler", which treats everything as
static content and serves it right up.

> One more thing: your RAM will be exhausted even if you open that 600 MB
> file just once. Apache's memory handling is a bit odd: if *one* Apache
> process uses 200 MB of RAM on *one* request because your application
> needs that much, that process will not release the memory while it
> serves another 1000 requests for `clear.gif`, which is maybe 850 bytes
> in size.

Again, everything depends on how you have your server configured; you can
easily tell Apache to kill each child after one request, or use a whole
host of other configs. But ultimately, if you can avoid opening that file
in PHP, then do: serving it statically as above is the cleanest, quickest
way to do it (other than using S3 or similar).

Regards!
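For completeness, an illustrative .htaccess for such a download directory along the lines described above; the htpasswd path is an assumption, and the directory would need AllowOverride FileInfo and AuthConfig for these directives to apply:

# Treat everything in this directory as static content, regardless of
# extension, so an uploaded "something.php" is never executed.
SetHandler default-handler

# Optional basic auth, as suggested earlier in the thread.
AuthType Basic
AuthName "Protected downloads"
AuthUserFile /etc/apache2/.htpasswd
Require valid-user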