
Performance issues with big files #30

Open
GoogleCodeExporter opened this issue Apr 21, 2016 · 3 comments

Comments

@GoogleCodeExporter

I've got a problem when using reader2 to parse big Excel files (from about 5 MB
file size upwards).
It seems to take more than 75 MB of RAM to build the full structure of all
rows and columns, which is more than the memory limit of most typical PHP
configurations (mine is 64 MB). I was surprised, because the first reader
(from Vadim Tkachenko) takes about 55 MB of RAM to parse the same file,
which is also a lot, but still within the limit.

So my first idea was to add an option to skip some sheets and skip unneeded
cell info (links, colors, etc.). I tried it myself, but it saved me no more
than 5 MB of RAM.

The second idea is to go the same way as with XML parsing, where besides
loading the full DOM there are streaming readers that fetch only what we need.
So when we work with a big file like mine, we would first parse the document
and get only the metadata: the number of sheets, columns and rows. Then we
would call something like $reader->getCell(sheet, col, row) and parse the data
from that row. That way only pointers to the needed bits are kept in memory,
not the whole document.

I don't have time to work on it now, but if I come up with something I'll post it here.

Thanks,
Nick

Original issue reported on code.google.com by [email protected] on 10 Jun 2009 at 2:14
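
A minimal sketch of the on-demand access idea above, assuming the input is a
SpreadsheetML (.xlsx) package that can be streamed with PHP's built-in
XMLReader. The function name getCellValue() and its arguments are
hypothetical, and shared-string/style lookups are left out; the binary .xls
format that reader2 actually parses would need a record-offset index instead,
but the access pattern (read only the requested cell, keep nothing else in
memory) is the same.

<?php
// Hypothetical sketch: fetch a single cell from a large .xlsx worksheet
// without building the whole row/column structure in memory.
// Assumes the zip and xmlreader extensions are available.

function getCellValue($xlsxPath, $sheetIndex, $cellRef)
{
    // Each worksheet is an XML file inside the package, e.g. xl/worksheets/sheet1.xml
    $sheetUri = sprintf('zip://%s#xl/worksheets/sheet%d.xml', $xlsxPath, $sheetIndex);

    $reader = new XMLReader();
    if (!$reader->open($sheetUri)) {
        return null;
    }

    $value = null;
    while ($reader->read()) {
        // Stream past everything until the one <c r="..."> element we want
        if ($reader->nodeType === XMLReader::ELEMENT
                && $reader->localName === 'c'
                && $reader->getAttribute('r') === $cellRef) {
            // Read only this cell's <v> child; shared-string and style
            // lookups are omitted to keep the sketch short
            while ($reader->read()) {
                if ($reader->nodeType === XMLReader::ELEMENT && $reader->localName === 'v') {
                    $value = $reader->readString();
                    break 2;
                }
                if ($reader->nodeType === XMLReader::END_ELEMENT && $reader->localName === 'c') {
                    break 2; // cell exists but has no cached value
                }
            }
        }
    }
    $reader->close();
    return $value;
}

// Usage: only the requested cell is materialised, so memory stays roughly
// flat no matter how many rows the sheet contains.
echo getCellValue('big.xlsx', 1, 'C42'), PHP_EOL;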

@GoogleCodeExporter

I have the same issue. A file with 6k rows and 7 columns can jump over 128 MB
of RAM, and then my PHP (CLI) script fails because the memory limit is reached.

An option to enable/disable formatting parsing would maybe help?

Can anybody look at this issue, please?

Original comment by [email protected] on 18 Jan 2012 at 3:33
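
For what it's worth, the Spreadsheet_Excel_Reader class shipped with
php-excel-reader 2.x already has a switch close to what is asked for here: the
second constructor argument controls whether extended formatting info (colors,
borders, fonts, ...) is stored. A hedged example, assuming that constructor
and the rowcount()/val() accessors; check the parameter order against the copy
of excel_reader2.php actually in use.

<?php
// Assumed API of php-excel-reader 2.x: passing false as the second
// constructor argument skips storing extended formatting info, which
// lowers memory use for value-only imports.
require_once 'excel_reader2.php';

$data = new Spreadsheet_Excel_Reader('big.xls', false); // false = values only

echo $data->rowcount(), " rows\n";
echo $data->val(1, 1), "\n"; // value of row 1, column 1 on the first sheet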

@GoogleCodeExporter

I have removed the formatting metadata from the class for those who don't need
it. This reduces memory usage when you have large files - see the attached file.


Original comment by [email protected] on 18 Jan 2012 at 4:00


@GoogleCodeExporter

Increase the following values in the .htaccess file:
php_value post_max_size 200M
php_value upload_max_filesize 200M
php_value memory_limit 512M
php_value max_execution_time 180

Original comment by [email protected] on 5 Oct 2012 at 1:22
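
Note that .htaccess overrides like the above only take effect when PHP runs
under Apache with AllowOverride enabled; they do nothing for the CLI case
mentioned earlier. For a command-line script, only some of these settings can
be changed at runtime; post_max_size and upload_max_filesize are per-directory
settings and do not apply to CLI at all. A minimal equivalent:

<?php
// Runtime equivalents for a CLI import script. Raising limits only buys
// headroom; reader2 still builds the whole workbook in memory.
ini_set('memory_limit', '512M');  // same as php_value memory_limit 512M
set_time_limit(180);              // same as php_value max_execution_time 180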
