[Zope] File uploads causing memory to be used up

Tony McDonald tony.mcdonald@ncl.ac.uk
Tue, 15 Aug 2000 15:09:27 +0100


Hi all,
I have an external method that uploads RTF files (and only RTF files) 
into a unix directory. It works fine, except that it doesn't seem to 
release the memory it uses while uploading the file.

The method is:

# module-level imports for the external method; guess_content_type is the
# Zope helper (OFS.content_types), savedir and opsdir are set elsewhere
import os
import string
import DocumentTemplate
from OFS.content_types import guess_content_type

def process_file(self, file, location, REQUEST):
    if file.read(1) == '':
        return "No file. Something wrong?"
    file.seek(0)

    # the browser may send a full Windows path; keep only the last component
    split = string.split(file.filename, '\\')
    filename = string.lower(split[-1])

    # save the file into the correct directory
    oname = os.path.join(savedir + location, filename)
    f = open(oname, "wb")
    body = file.read()
    f.write(body)
    f.close()
    if type(body) is not type(''):
        body = body.data
    content_type, enc = guess_content_type(oname, body, None)
    filesize = os.stat(oname)[6]    # needed by both return paths

    if content_type == 'application/rtf':
        # it's OK, we can go ahead and process this file;
        # we just need to strip off the .rtf extension first
        guideid = string.replace(filename, '.rtf', '')

        # copy the file from savedir to opsdir
        os.system("cp -f %s %s" % (oname, opsdir))

        # now cd to the operations directory and do some stuff
        os.chdir(opsdir)

        # actually do the grinding...
        os.system("DoAGuide %s" % guideid)

        # return the output of the log file
        logfilename = os.path.join(opsdir, "%s.log" % guideid)
        f = open(logfilename, "r")
        logoutput = f.read()
        f.close()

        # get the template we're going to use for output
        template_str = self.done.read_raw()
        template = DocumentTemplate.HTML(template_str)
        theresult = template(self, filename=filename,
                             contenttype=content_type, filesize=filesize,
                             guideid=guideid, logoutput=logoutput)
        return theresult

    return ("File saved %s, size (bytes) %s. content-type [%s]"
            % (filename, filesize, content_type))
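
As far as I can tell, the only places the whole file sits in memory at 
once are the body = file.read() / f.write(body) pair and the template 
output. Something like the following chunked copy (an untested sketch; 
CHUNK and save_upload are names I've made up, and I'm assuming the 
FileUpload object supports read(size) like an ordinary file) would at 
least avoid the first of those, though I'd still like to understand why 
the memory isn't given back:

CHUNK = 64 * 1024   # copy the upload 64k at a time

def save_upload(upload, oname):
    # write the upload to oname without holding the whole body in memory;
    # return the first chunk so guess_content_type still has data to sniff
    out = open(oname, "wb")
    first = ''
    while 1:
        data = upload.read(CHUNK)
        if not data:
            break
        if not first:
            first = data
        out.write(data)
    out.close()
    return first

process_file could then pass the returned first chunk to 
guess_content_type(oname, first, None) instead of the whole body.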

The files are roughly 2-3 megs in size, and each time one is uploaded 
the memory used by the process (as reported by 'top') grows by about 
that amount.
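
To put numbers on it per request, I could ask ps from inside the method 
(another rough sketch; the "-o rss= -p" options are what work on our 
unix and may need tweaking elsewhere):

import os, string

def report_rss(label):
    # resident set size (in kB) of this Zope process, as ps reports it
    pipe = os.popen("ps -o rss= -p %d" % os.getpid())
    rss = pipe.read()
    pipe.close()
    return "%s: rss = %s kB" % (label, string.strip(rss))

Calling this before and after the file.read() and the template call 
should show which step the growth comes from.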

*Any* ideas or thoughts would be greatly appreciated. We've had our 
server brought down twice by a Python process (i.e. Zope) that was 
taking up 1.1 gigs of RAM.

Tone
------
Dr Tony McDonald,  FMCC, Networked Learning Environments Project 
http://nle.ncl.ac.uk/
The Medical School, Newcastle University Tel: +44 191 222 5116
Fingerprint: 3450 876D FA41 B926 D3DD  F8C3 F2D0 C3B9 8B38 18A2