ORM Query results: Arrays vs Result handle wrapped in Iterator interface

Okay, here's one for the pros:

For a couple of years now, I've been working on my own PHP ORM/ActiveRecord implementation, which I named Pork.dbObject.

It's loosely based on the 'make your own site with Rails in 5 minutes' movie we all saw a couple of years ago. You can do things like:

$clients = dbObject::Search("Client", array("ID > 500"));

$client = new Client(218); // fetch the row with id 218 from the client table

$projects = $client->Find('Project');

This fetches one or more rows from the database, wraps each in a dbObject, and returns them all in a single array, or returns false if there are no results.

All of this has been working perfectly in dozens of sites and backends, but now my colleague is using it to build a huge log parser, and that's where the memory-usage problems start.

The queries he runs can return over 20,000 rows, maybe even more, which is of course not something you want to wrap in object wrappers all at once and return as a single array.

The obvious solution would be to return an object that implements the Iterator interface instead of an array. It wouldn't fetch all the records from the result set up front; it would just hold the result resource for the generated database query and call mysql_fetch_* internally as you traverse the object as if it were an array.
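As a minimal sketch of that idea (the class name `dbObjectIterator` is hypothetical, and PDO with an in-memory SQLite database stands in for the real MySQL connection so the example is self-contained), a lazy wrapper might look like this:

```php
<?php
// Hypothetical lazy result-set wrapper: rows are fetched one at a time
// as the object is traversed, never materialized into an array up front.
class dbObjectIterator implements Iterator
{
    private PDOStatement $stmt;
    private $row = false;
    private int $key = -1;

    public function __construct(PDOStatement $stmt)
    {
        $this->stmt = $stmt;
    }

    public function rewind(): void
    {
        // Forward-only cursor: we can only "rewind" before the first fetch.
        if ($this->key === -1) {
            $this->next();
        }
    }

    public function next(): void
    {
        $this->row = $this->stmt->fetch(PDO::FETCH_ASSOC);
        $this->key++;
    }

    public function valid(): bool
    {
        return $this->row !== false; // fetch() returns false past the last row
    }

    public function current(): mixed
    {
        return $this->row; // the real ORM would wrap this row in a dbObject
    }

    public function key(): mixed
    {
        return $this->key;
    }
}

// Usage, with in-memory SQLite purely for illustration:
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE client (id INTEGER PRIMARY KEY, name TEXT)');
$pdo->exec("INSERT INTO client (name) VALUES ('alice'), ('bob')");

$names = [];
foreach (new dbObjectIterator($pdo->query('SELECT * FROM client')) as $row) {
    $names[] = $row['name']; // each row is fetched only at this point
}
```

A foreach over such an object then costs one row of memory at a time instead of 20,000.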

Now we get to my real question: can I just do this without any problems? Can databases handle multiple open result sets, mixing them and keeping them in memory for a while?

For example: fetch 20 objects, loop over them, and let each of those 20 fetch 5 others, which in turn each fetch 3 others. That would create a loop in which a number of different result handles are kept in memory at once.

I know I can't serialize one of these objects, but will I be able to implement this without problems in PHP 5, or will the database interfaces give me trouble?


It depends on which database you're using and on your database configuration.

For MySQL you need to make sure you use buffered queries. With PDO you set it like this:

$myPdo->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, true);

This means that all the result data is sent to the client and buffered there (which is not the same as fetching it all into PHP), so the connection is free to run further queries while you iterate.

The other (probably worse) alternative is to open a new database connection whenever you detect that a query is being run while another result set is still open.

The plain mysql_query() function uses buffered queries by default, so multiple open result sets will work there.
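To illustrate the nested-fetch pattern from the question (an outer result set staying open while each row triggers further queries): with PDO against MySQL this is exactly what buffered queries make safe. The sketch below uses an in-memory SQLite database and made-up `client`/`project` tables purely so it is self-contained; SQLite happens to allow multiple open statements without any buffering attribute.

```php
<?php
// Outer result set stays open while inner queries run against the same
// connection. With PDO + MySQL, set PDO::MYSQL_ATTR_USE_BUFFERED_QUERY
// to true first; here in-memory SQLite is used so the sketch runs anywhere.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE client (id INTEGER PRIMARY KEY)');
$pdo->exec('CREATE TABLE project (id INTEGER PRIMARY KEY, client_id INTEGER)');
$pdo->exec('INSERT INTO client (id) VALUES (1), (2)');
$pdo->exec('INSERT INTO project (client_id) VALUES (1), (1), (2)');

$projectCount = 0;
$clients = $pdo->query('SELECT id FROM client'); // stays open below
foreach ($clients as $client) {
    // This inner query runs while the outer statement is still unfinished.
    $projects = $pdo->prepare('SELECT id FROM project WHERE client_id = ?');
    $projects->execute([$client['id']]);
    $projectCount += count($projects->fetchAll());
}
```

With an unbuffered MySQL connection, the inner execute() would fail with a "commands out of sync" style error instead.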
