[mod_python] Perfect solution for split recordset by N number of records per page

Sean Davis sdavis2 at mail.nih.gov
Thu Sep 28 07:16:23 EDT 2006


On Thursday 28 September 2006 06:51, Mike Looijmans wrote:
> > You probably need to think about tuning your database a bit to see if you
> > can speed things up.  Make sure that you have the correct indices in
> > place and that you are using limit/offset as best you can.  This is
> > really a database problem as much as a web app design issue.  In other
> > words, if your SQL query to get a "page" of data takes 3 seconds, this is
> > a database issue, which I suppose it is.
>
> It's also a UI issue. Who would want a user to browse through 10000 records
> in pages of 100 records? The UI should allow some method of limiting or
> summarizing the results.

Good point, Mike.

Of course, but if it takes 10 seconds (for example) to pull up 100 records for
the UI (to render one page of the 100 pages), that IS a database issue.  I
(probably incorrectly) inferred that the OP was having problems using the
paging strategy with large resultsets.  That, to me, implies that the
database needs to be tuned.
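For what it's worth, the usual limit/offset approach looks roughly like the
sketch below.  The table, column names, and the sqlite3 connection are just
placeholders for illustration (any DB-API module would do); the point is that
the ORDER BY column should be indexed so that fetching one page stays cheap
even when the full resultset is large.

    # Minimal limit/offset paging sketch; "records" table and its columns
    # are hypothetical, and sqlite3 stands in for whatever DB-API driver
    # the OP is actually using.
    import sqlite3

    PAGE_SIZE = 100

    def fetch_page(conn, page):
        """Return one page of rows, ordered by an indexed column."""
        offset = (page - 1) * PAGE_SIZE
        cur = conn.cursor()
        # An index on "created" keeps the ORDER BY from scanning the table.
        cur.execute(
            "SELECT id, title, created FROM records "
            "ORDER BY created LIMIT ? OFFSET ?",
            (PAGE_SIZE, offset),
        )
        return cur.fetchall()

    if __name__ == "__main__":
        conn = sqlite3.connect("records.db")
        for row in fetch_page(conn, page=3):   # rows 201-300
            print(row)

If even a single indexed page query is slow, that points back at the database
(or the query plan) rather than at the paging code.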

As for the UI and user experience, I think that is often specific to the data
being presented.  Sometimes an interactive graphical summary can be useful,
sometimes simple text reports with links, and sometimes the best way IS to
present 100 pages of 100 records each, sorted in some clever way, with
appropriate links to move around (e.g., Google).
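Building the "1 2 3 ... next" navigation is the easy part; something along
these lines would do, with the base URL and query-string parameter name made
up purely for illustration:

    # Hypothetical helper that renders a window of page links around the
    # current page; everything here is a placeholder, not mod_python API.
    def page_links(base_url, current, total_pages, window=5):
        start = max(1, current - window)
        end = min(total_pages, current + window)
        parts = []
        for n in range(start, end + 1):
            if n == current:
                parts.append(str(n))
            else:
                parts.append('<a href="%s?page=%d">%d</a>' % (base_url, n, n))
        return " ".join(parts)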

Sean

