[mod_python] Graphs in a webpage

Gregory Trubetskoy grisha at modpython.org
Mon Nov 13 22:07:38 EST 2000


I used gd with nsapy on a windows machine in 1997 and it was very, very
fast (considering how slow computers were back then). Yahoo uses it on
their quotes.yahoo.com, mrtg uses it; of all the tools out there, it's
probably the fastest.

The slowest things are repetitive operations that involve memory
allocations, e.g. if you're building some large list from the ascii files
one byte at a time and your ascii file is 90K, it will be slow. You will
see a big improvement if you build the list only once and cache it inside
a module somewhere.
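To illustrate the pattern (a rough sketch; the file name and the
byte-at-a-time loop are just made-up examples):

    # slow: one append per byte of a 90K file, lots of tiny allocations
    data = []
    f = open("/var/data/quotes.txt")
    c = f.read(1)
    while c:
        data.append(c)
        c = f.read(1)
    f.close()

    # much better: one read(), one string object
    f = open("/var/data/quotes.txt")
    data = f.read()
    f.close()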

If your content is dynamic and cannot be cached, then try to avoid things
like readlines() on the whole file and then copying it into some list. A
much faster solution is to read the file as you need the data, avoiding
all unnecessary mallocs.
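For example (again just a sketch, with made-up names):

    # wasteful: readlines() allocates a list of the whole file,
    # then we build a second list on top of it
    lines = open("/var/data/quotes.txt").readlines()
    values = []
    for l in lines:
        values.append(float(l))

    # better: read and convert one line at a time, no intermediate list
    values = []
    f = open("/var/data/quotes.txt")
    for line in f:
        values.append(float(line))
    f.close()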

A database may make things faster - it very much depends on what you do.
For example, if you process the data in some way, i.e. sort or select
records, etc., then a database will most definitely give you an
improvement, especially if it runs on a separate server.

If you're just using it as flat file storage, then it will be about as
fast as a file, perhaps a little slower.

The mod_python interpreter lives as long as the apache process (or, on
win32, the thread) lives. The simplest way to cache data is to use the
Python "import" mechanism - make a module that reads all your ascii files,
then import it from a script. Python only runs the import the first time
and ignores all subsequent imports - perfect for caching.
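Something along these lines (the module and file names are made up, and
the handler is just a minimal sketch of the idea):

    # plotdata.py - this code runs only once per interpreter,
    # the first time the module is imported
    points = []
    f = open("/var/data/quotes.txt")
    for line in f:
        points.append(float(line))
    f.close()

    # grph.py - the handler just imports the module; the parsing cost
    # is paid only on the first request served by this apache child
    from mod_python import apache
    import plotdata

    def handler(req):
        req.content_type = "text/plain"
        req.write("cached %d points\n" % len(plotdata.points))
        return apache.OK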

HTH

--
  Gregory (Grisha) Trubetskoy
       grisha at modpython.org

On Tue, 14 Nov 2000, Alexis Iglauer wrote:

> This may be the wrong forum for this, but......
> 
> I am writing an app which takes data out of text files and makes graphs.
> The user interactively specifies the graphs in a browser, with everything
> being done in Python as a handler.  The resulting page can have 10+ graphs
> on it.
> 
> Problem is, my solution is sloooooow.  I am using gdchart to generate the
> graphs on the fly (<img src="grph.py?color=0x0000ff" alt="xxx">) and plain
> vanilla python for the rest.
> 
> I am not sure whether reading the ASCII files is slow and whether sticking
> them into a database will be worth the effort (ideas, anyone?).  I get some
> advantage when running apache single-threaded by checking whether my data is
> already loaded, but as soon as I run apache normally this advantage
> disappears.
> 
> I think the bottleneck is creating the graph, and in trying to solve that I
> have a few questions regarding how mod_python works.  How long does an
> instance of mod_python exist?  When is an instance destroyed?  What data can
> be passed from one instance to the next?  Also, does anyone have any ideas
> on how to efficiently generate graphs from a reasonably large number of
> datapoints?
> 
> Thanks in advance
> Alexis
> 
> _______________________________________________
> Mod_python mailing list
> Mod_python at modpython.org
> http://www.modpython.org/mailman/listinfo/mod_python
> 



