From: chris h on 4 Oct 2010 14:39

I'm currently working on a project that requires the parsing of Excel files. Basically the user uploads an Excel file, and then a script needs to save a row in a Postgres database for each row in the Excel file. The issue we are having is that when we task PHPExcel with parsing an Excel file with, say, 27k rows, it explodes with a memory error. I've read up on the PHPExcel forums and we've tried cell caching as well as ReadDataOnly; they do not seem to be sufficient.

Does anyone here know of a way to do this? Surely there is a way to parse a large Excel file with PHP. This is also NOT an on-demand service. That is, when someone uploads a file they get a task_id which allows them to check the status of their Excel file. So the solution does not need to be a fast one!

Thanks,
Chris.
From: Per Jessen on 4 Oct 2010 14:46

chris h wrote:
> I'm currently working on a project that requires the parsing of Excel
> files. Basically the user uploads an Excel file, and then a script
> needs to save a row in a Postgres database for each row in the Excel
> file. The issue we are having is that when we task PHPExcel with
> parsing an Excel file with, say, 27k rows, it explodes with a memory
> error. I've read up on the PHPExcel forums and we've tried cell
> caching as well as ReadDataOnly; they do not seem to be sufficient.
>
> Does anyone here know of a way to do this? Surely there is a way to
> parse a large Excel file with PHP.

If your Excel file is or can be transformed to XML, I would just use XSLT. No PHP needed.

--
Per Jessen, Zürich (19.1°C)
From: Marc Guay on 4 Oct 2010 14:47

I use this: http://code.google.com/p/php-csv-parser/

No idea if it's any better than your current solution. I presume you've tried extending PHP's memory limit?
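For the CSV case specifically, no parser library is strictly needed: PHP's built-in `fgetcsv()` reads one row at a time, so memory stays flat no matter how large the file is. A minimal sketch; `$insertRow` is a placeholder callback standing in for the actual Postgres insert (e.g. via `pg_query_params()` or PDO):

```php
<?php
// Stream a large CSV row by row; only one row is ever held in memory.
// $insertRow is a hypothetical callback that persists a single row.
function importCsv($path, $insertRow)
{
    $fh = fopen($path, 'r');
    if ($fh === false) {
        throw new RuntimeException("Cannot open $path");
    }
    $count = 0;
    while (($row = fgetcsv($fh)) !== false) {
        $insertRow($row);   // one DB insert per spreadsheet row
        $count++;
    }
    fclose($fh);
    return $count;
}
```

Wrapping the inserts in a single transaction (or batching them) would also speed up the 27k-row import considerably, though speed was said not to matter here.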
From: shiplu on 4 Oct 2010 15:01

On Tue, Oct 5, 2010 at 12:39 AM, chris h <chris404(a)gmail.com> wrote:
> I'm currently working on a project that requires the parsing of Excel files.
> Basically the user uploads an Excel file, and then a script needs to save a
> row in a Postgres database for each row in the Excel file. The issue we are
> having is that when we task PHPExcel with parsing an Excel file with, say,
> 27k rows, it explodes with a memory error. [...]

1. Remove any variable that contains a big object if it's not necessary.
2. Use unset() when applicable.
3. Read chunk by chunk.
4. Profile it to find the exact place where you are wasting memory. Optimizing that little portion of code can improve memory performance.

--
Shiplu Mokadd.im
My talks, http://talk.cmyweb.net
Follow me, http://twitter.com/shiplu
SUST Programmers, http://groups.google.com/group/p2psust
Innovation distinguishes bet ... ... (ask Steve Jobs the rest)
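Point 3 ("read chunk by chunk") is the approach PHPExcel's own documentation recommends for large workbooks: a read filter passed to the reader so that each `load()` only materializes a window of rows. A sketch, assuming PHPExcel 1.7.x; the file name, total row count, and the Postgres insert step are placeholders, and the interface stub exists only so the sketch parses without the library installed:

```php
<?php
// Stub so this sketch parses without PHPExcel on the include path;
// the real library ships this interface.
if (!interface_exists('PHPExcel_Reader_IReadFilter')) {
    interface PHPExcel_Reader_IReadFilter
    {
        public function readCell($column, $row, $worksheetName = '');
    }
}

// Read filter that admits only a sliding window of rows per load().
class ChunkReadFilter implements PHPExcel_Reader_IReadFilter
{
    private $startRow = 0;
    private $endRow   = 0;

    public function setRows($startRow, $chunkSize)
    {
        $this->startRow = $startRow;
        $this->endRow   = $startRow + $chunkSize;
    }

    public function readCell($column, $row, $worksheetName = '')
    {
        // Admit the heading row plus the current window only.
        return $row == 1 || ($row >= $this->startRow && $row < $this->endRow);
    }
}

// Driver (requires PHPExcel; row count and chunk size are illustrative).
function importInChunks($inputFile, $totalRows, $chunkSize = 2000)
{
    $filter = new ChunkReadFilter();
    $reader = PHPExcel_IOFactory::createReader('Excel5'); // 'Excel2007' for .xlsx
    $reader->setReadDataOnly(true);
    $reader->setReadFilter($filter);

    for ($startRow = 2; $startRow <= $totalRows; $startRow += $chunkSize) {
        $filter->setRows($startRow, $chunkSize);
        $objPHPExcel = $reader->load($inputFile); // loads only this window
        // ... walk the loaded rows and INSERT them into Postgres here ...
        $objPHPExcel->disconnectWorksheets();     // release before next chunk
        unset($objPHPExcel);
    }
}
```

Since the task already hands back a task_id for status polling, each chunk is also a natural point to update progress in the database.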
From: chris h on 4 Oct 2010 15:09

Thanks Jessen/Marc, though the user-provided format can be xls, xlsx, or csv, so I need a solution that supports all three formats.

Thanks for the ideas shiplu, I'll get with the team and see if there's anything there we aren't trying.

Chris.

On Mon, Oct 4, 2010 at 3:01 PM, shiplu <shiplu.net(a)gmail.com> wrote:
> 1. Remove any variable that contains a big object if it's not necessary.
> 2. Use unset() when applicable.
> 3. Read chunk by chunk.
> 4. Profile it to find the exact place where you are wasting memory.
>    Optimizing that little portion of code can improve memory performance.
> [...]
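Supporting all three upload formats does not force one code path: a small dispatcher can route csv files to a cheap streaming parser and only hand the binary xls/xlsx formats to PHPExcel. A sketch; the strategy names are purely illustrative labels, not real functions from the thread:

```php
<?php
// Hypothetical dispatcher: pick a parsing strategy by file extension.
// The returned labels are illustrative; each would map to a concrete
// importer (fgetcsv streaming for CSV, a chunked PHPExcel read for xls/xlsx).
function chooseParser($filename)
{
    $ext = strtolower(pathinfo($filename, PATHINFO_EXTENSION));
    switch ($ext) {
        case 'csv':
            return 'csv-stream';        // row-by-row, constant memory
        case 'xls':
        case 'xlsx':
            return 'phpexcel-chunked';  // windowed loads via a read filter
        default:
            throw new InvalidArgumentException("Unsupported upload: .$ext");
    }
}
```

Trusting the user-supplied extension is a simplification; sniffing the file's magic bytes (or PHPExcel's own `PHPExcel_IOFactory::identify()`) would be more robust against misnamed uploads.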