I've got a problem when using reader2 to parse large Excel files (from about 5 MB file size).
It seems to take more than 75 MB of RAM to build the full structure of all
rows and columns. That exceeds most common PHP memory limits
(mine is 64 MB). I was surprised, because the first reader
(from Vadim Tkachenko) takes about 55 MB of RAM to parse the same file.
That is also a lot, but still within the limit.
So my first idea was to add the ability to skip some sheets and skip
unneeded cell info (links, colors, etc.). I tried it myself, but it saved
me no more than 5 MB of RAM.
The second idea is to go the same way as XML parsers: besides loading the
full DOM, there is a bunch of lightweight streaming readers that fetch only what we need.
So the idea is, when we work with a big file like mine, we first parse the document
and get only metadata: the number of sheets, columns, and rows. Then we
call something like $reader->getCell(sheet, col, row) and parse the data for
just that cell.
That way, memory holds only pointers to the bits we need, not the whole document.
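The streaming idea above can be sketched with a pull parser. This is a hypothetical illustration in Python (PHP's XMLReader offers the same streaming model); the worksheet XML, the `count_rows` and `get_cell` helpers, and the cell references are all invented stand-ins, not part of reader2. The key point is that each `<row>` element is discarded as soon as it has been processed, so peak memory is bounded by one row instead of the whole sheet DOM:

```python
import io
import xml.etree.ElementTree as ET

# Tiny hypothetical stand-in for a worksheet's <sheetData> section.
SHEET_XML = b"""<sheetData>
  <row r="1"><c r="A1"><v>id</v></c><c r="B1"><v>name</v></c></row>
  <row r="2"><c r="A2"><v>1</v></c><c r="B2"><v>Nick</v></c></row>
  <row r="3"><c r="A3"><v>2</v></c><c r="B3"><v>Vadim</v></c></row>
</sheetData>"""

def count_rows(xml_bytes):
    """First pass: collect only metadata (here, just the row count)."""
    n = 0
    for _event, elem in ET.iterparse(io.BytesIO(xml_bytes), events=("end",)):
        if elem.tag == "row":
            n += 1
            elem.clear()  # free the row immediately; memory stays flat
    return n

def get_cell(xml_bytes, want_ref):
    """Second pass: stream through the sheet and return one cell's value."""
    for _event, elem in ET.iterparse(io.BytesIO(xml_bytes), events=("end",)):
        if elem.tag == "c" and elem.get("r") == want_ref:
            v = elem.find("v")
            return v.text if v is not None else None
        if elem.tag == "row":
            elem.clear()  # rows before the target are dropped as we go
    return None

print(count_rows(SHEET_XML))      # 3
print(get_cell(SHEET_XML, "B2"))  # Nick
```

The trade-off is that random access like getCell() re-scans the file per lookup, so this fits the "big file, few cells" case; for a full export you would still iterate once and process rows as they stream by.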
I don't have time to work on this now, but if I come up with something, I'll post it here.
Thanks,
Nick
Original issue reported on code.google.com by [email protected] on 10 Jun 2009 at 2:14
I have the same issue. 6k rows / 7 columns can jump to over 128 MB of RAM, and then my PHP
(CLI) script fails because the memory limit is reached.
Maybe an option to enable/disable parsing of formatting would help?
Can anybody look into this issue, please?
Increase the following values in your .htaccess file:
php_value post_max_size 200M
php_value upload_max_filesize 200M
php_value memory_limit 512M
php_value max_execution_time 180