Hi,
The $xpdo->getCollection method returns an entire collection of objects. That’s fine when there aren’t too many objects to retrieve, but with a large result set PHP can run out of memory. The workarounds I’ve found are:
1) increase the PHP memory limit - not a good approach, since raising it can compromise the stability of the server;
2) break the data into chunks by executing a query that returns only the ids I need, then loop through them, passing each to $xpdo->getObject to retrieve the objects I want - not great either, because we’re essentially duplicating queries to the database and adding extra load to it.
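To make approach 2 concrete, here’s a rough sketch of the two-pass pattern. It uses plain PDO with an in-memory SQLite table so it runs standalone; the `person` table, the column names, and the chunk size are all made up for illustration and are not part of the xPDO API:

```php
<?php
// Sketch of approach 2: fetch only the ids first, then load full rows
// in fixed-size chunks. Plain PDO + SQLite stand in for xPDO here.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)');
$insert = $pdo->prepare('INSERT INTO person (name) VALUES (?)');
foreach (['Ann', 'Bob', 'Cal', 'Dee', 'Eve'] as $name) {
    $insert->execute([$name]);
}

// First query: ids only, which is cheap on memory.
$ids = $pdo->query('SELECT id FROM person ORDER BY id')
           ->fetchAll(PDO::FETCH_COLUMN);

// Second pass: load the actual rows a chunk at a time.
$chunkSize = 2;
$names = [];
foreach (array_chunk($ids, $chunkSize) as $chunk) {
    $placeholders = implode(',', array_fill(0, count($chunk), '?'));
    $stmt = $pdo->prepare(
        "SELECT * FROM person WHERE id IN ($placeholders) ORDER BY id"
    );
    $stmt->execute($chunk);
    while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
        $names[] = $row['name']; // stand-in for calling $person->eat()
    }
}
```

Only $chunkSize rows ever sit in memory at once, but as the post says, you pay for it with one extra query per chunk plus the initial id query.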
So I was wondering whether, instead of doing:
$people = $xpdo->getCollection('Person');
foreach ($people as $person) {
    $person->eat();
}
it might be better to implement some kind of memory management, accessed like:
$people = $xpdo->getBufferedCollection('Person');
while ($person = $people->getNextObject()) {
    $person->eat();
}
The getNextObject method would refill an internal buffer each time it runs dry, so we wouldn’t have to duplicate calls to the database, and we’d avoid the memory problems a plain getCollection call has on large datasets.
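In case it helps the discussion, here’s a toy sketch of how such a buffered collection could behave. The class and method names just mirror the proposal above; nothing here is existing xPDO API, and it uses plain PDO with SQLite (plus LIMIT/OFFSET paging, which assumes the table isn’t changing mid-iteration) so the whole thing runs standalone:

```php
<?php
// Hypothetical BufferedCollection: refills a fixed-size buffer with a
// LIMIT/OFFSET query whenever getNextObject() drains it, so at most
// $bufferSize rows are held in memory at a time.
class BufferedCollection
{
    private PDO $pdo;
    private string $table;   // trusted constant here, never user input
    private int $bufferSize;
    private int $offset = 0;
    private array $buffer = [];

    public function __construct(PDO $pdo, string $table, int $bufferSize = 100)
    {
        $this->pdo = $pdo;
        $this->table = $table;
        $this->bufferSize = $bufferSize;
    }

    public function getNextObject(): ?array
    {
        if (empty($this->buffer)) {
            // Refill: one query per buffer-load, never the whole table.
            $stmt = $this->pdo->prepare(
                "SELECT * FROM {$this->table} ORDER BY id LIMIT ? OFFSET ?"
            );
            $stmt->bindValue(1, $this->bufferSize, PDO::PARAM_INT);
            $stmt->bindValue(2, $this->offset, PDO::PARAM_INT);
            $stmt->execute();
            $this->buffer = $stmt->fetchAll(PDO::FETCH_ASSOC);
            $this->offset += count($this->buffer);
            if (empty($this->buffer)) {
                return null; // result set exhausted
            }
        }
        return array_shift($this->buffer);
    }
}

// Demo with an in-memory table and a deliberately tiny buffer.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE person (id INTEGER PRIMARY KEY, name TEXT)');
$insert = $pdo->prepare('INSERT INTO person (name) VALUES (?)');
foreach (['Ann', 'Bob', 'Cal', 'Dee', 'Eve'] as $name) {
    $insert->execute([$name]);
}

$people = new BufferedCollection($pdo, 'person', 2);
$names = [];
while ($person = $people->getNextObject()) {
    $names[] = $person['name']; // stand-in for $person->eat()
}
```

A real implementation inside xPDO would presumably hand back hydrated objects rather than raw rows, and might hold one open statement and fetch from it incrementally instead of re-querying with OFFSET, but the buffering idea is the same.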
Does this sound like a good idea? Is it in the works? Is it already implemented somehow in xpdo and I’ve missed it? Is there a better alternative?