[Zope] Serve large files efficiently from a pipe

Roman Suzi rnd at onego.ru
Tue Nov 8 15:34:54 EST 2005


On Tue, 8 Nov 2005, Paul Winkler wrote:

> On 11/8/05, Roman Suzi <rnd at onego.ru> wrote:
> > I don't know if writing to the response one chunk at a time is the proper
>> solution? Will Zope store the response body or send it right away? I am not
>> sure that it is the latter...
>
> The classic way to do streaming data in Zope is like so:
>
> RESPONSE.setHeader('Content-length', N)
> for chunk in some_iterable_yielding_data:
>    RESPONSE.write(chunk)
>
> ... where N is the size in bytes of the data you are going to write.

OK... Then streaming from a Unix pipe can't be done in Zope...
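The sticking point above can be illustrated outside Zope. A minimal sketch of the "classic" chunked-write loop, reading from a child process rather than from ZODB; `RESPONSE.write` is stood in for by any callable that accepts bytes, and `subprocess` is used as the modern equivalent of `popen2`. Note that the total size is only known after the pipe is exhausted, so the Content-Length header cannot be set before the loop starts:

```python
import subprocess

CHUNK_SIZE = 8192  # hypothetical chunk size, tune as needed


def stream_pipe(cmd, write):
    """Read a child process's stdout in chunks and hand each chunk to
    `write` (in Zope this would be RESPONSE.write).  Returns the total
    number of bytes written -- which is only known once the pipe has
    been fully drained, hence the Content-Length problem."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
    total = 0
    try:
        while True:
            chunk = proc.stdout.read(CHUNK_SIZE)
            if not chunk:
                break
            write(chunk)
            total += len(chunk)
    finally:
        proc.stdout.close()
        proc.wait()
    return total
```

Usage with a list standing in for the response object:

```python
chunks = []
n = stream_pipe(["echo", "hello"], chunks.append)
```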

> However, you can also publish any method that returns an
> implementation of IStreamIterator as shown in
> http://svn.zope.org/Zope/trunk/lib/python/ZPublisher/Iterators.py?view=markup
>
> The advantage is that this has less overhead, as once your method has
> returned, the Zope app and ZODB are no longer involved, and your
> iterator and ZPublisher do all the work.
> This can become quite significant when you are streaming large data,
> more than a few hundred kB. However, your iterator *must* be able to
> do its work without a ZODB connection.
>
> You haven't said where the data you want to stream is going to come from,

Sorry if it was not clear: the data comes from a pipe (that is, from another
application, invoked via popen2)...

> so I can't guess whether the IStreamIterator technique will work for you...

  - I do not have a named file... only a file handle. But maybe a named pipe
will do the trick... However, the missing Content-Length (and unfortunately,
the resulting content length is not known in advance) is the show-stopper.

> --
> http://www.slinkp.com
>

Sincerely yours, Roman Suzi
-- 
rnd at onego.ru =\= My AI powered by GNU/Linux RedHat 7.3


More information about the Zope mailing list