I have a large set of data that I need to pull from MySQL, probably 100,000+ rows. I have already optimized the processing with indexing and other techniques. The problem I am facing now is a memory overflow: whenever I try to pull more than 100,000 rows, I get a memory size error. I have set the memory limit to 512M, but the data may grow even larger, so I cannot keep increasing the memory every time. Is there a better way of handling this? I am using CakePHP and I need all the data at once for my system.
-
The requirement that you need all data outside of the database at the same time seems flawed. Maybe you can push some computations to the database and retrieve just a smaller set of data. – Sirko, Oct 28, 2014 at 8:00
-
Please define: 'pull from mysql'. – KIKO Software, Oct 28, 2014 at 8:01
-
Possible duplicate of Returning a lot of rows in CakePHP & MySQL – Reeno, Oct 28, 2014 at 8:42
-
Also maybe this helps: stackoverflow.com/questions/5865417/… – Reeno, Oct 28, 2014 at 8:42
-
Retrieving so many records is perhaps required only if you are synchronizing your whole database with some remote client or redundant server; in such cases you should re-define your synchronization strategy. – user2009750, Oct 28, 2014 at 13:02
1 Answer
You can't escape the fact that memory is limited. Period. The requirement is questionable, but fine: you'll have to process the data in chunks that fit into the available memory, and either send each chunk to the client or append it to the file you're writing.
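As a rough sketch of that chunked loop (assuming CakePHP 2.x and a hypothetical model called `Record` — adapt the model name and chunk size to your app):

```php
<?php
// Process the table in fixed-size chunks so memory use stays bounded,
// instead of one find('all') that loads every row at once.
$limit = 1000; // rows per chunk; tune to fit your memory limit
$page  = 1;

do {
    $rows = $this->Record->find('all', array(
        'limit' => $limit,
        'page'  => $page,
    ));

    foreach ($rows as $row) {
        // Handle one row here: send it to the client, write it to a
        // file, etc. It goes out of scope before the next chunk loads.
    }

    $page++;
    // A short last chunk means we've reached the end of the table.
} while (count($rows) === $limit);
```

Each iteration only ever holds `$limit` rows in memory, so the total row count no longer matters.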
This is basically AJAX pagination or "endless loading", except that you don't replace the page but append the next page to the previous one in the DOM tree until you reach the last page. Have fun enjoying a probably very slow-responding site once all records are loaded and millions of elements exist in the DOM tree. ;)
Another way would be to create the view in a background job (shell, cron task) and then just send the pre-generated file to the client like a static page. The shell would have to paginate the data as well and append each page to the file, to work around the memory limitation.
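The background-job variant might look like this as a CakePHP 2.x shell (the shell name, model name, and output path are illustrative, not from your app):

```php
<?php
// app/Console/Command/ExportShell.php
// Run via: Console/cake export
// Pre-generates a CSV in chunks; serve TMP/export.csv statically afterwards.
class ExportShell extends AppShell {

    public $uses = array('Record'); // hypothetical model name

    public function main() {
        $fh    = fopen(TMP . 'export.csv', 'w');
        $limit = 1000;
        $page  = 1;

        do {
            $rows = $this->Record->find('all', array(
                'limit' => $limit,
                'page'  => $page,
            ));

            // Append this chunk to the file, then discard it,
            // so the shell never holds more than one chunk in memory.
            foreach ($rows as $row) {
                fputcsv($fh, $row['Record']);
            }

            $page++;
        } while (count($rows) === $limit);

        fclose($fh);
    }
}
```

Because the file is written incrementally, the job's memory footprint stays flat no matter how many rows the table holds.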