[Zope-CMF] Re: CMFTestCase: Best way to create the CMF site?

Tres Seaver tseaver at palladion.com
Thu Oct 6 11:47:21 EDT 2005



Geoff Davis wrote:
> On Wed, 05 Oct 2005 12:34:25 +0200, Lennart Regebro wrote:
> 
> 
>>Any test including PortalTestCase should really not be seen as a unit
>>test, but as a functional test. ;) If we could put some effort into
>>making a minimal dummy-portal that can be deleted and recreated very
>>quickly, then that would be very interesting. I would assume that would
>>involve a lot of work, though...
> 
> 
> CMFTestCase creates a minimal portal that can be deleted and recreated
> relatively quickly.  But it actually is even smarter than that: it uses
> the transaction machinery in such a way that it only has to create and
> delete the test site once.  See my reply to Chris for an explanation.
> 
> The nice thing about CMFTestCase is that it creates an actual CMF site,
> not some dummy site whose functionality may or may not be equivalent to
> that of a real CMF site.  I think there is a place for using really
> stripped-down dummy components.  However, widespread use of dummy
> components comes with some real headaches:
> 
> * As you note, dummy components take a lot of time to write.

Not necessarily.  They *do* require some knowledge of the API of the
thing they are fronting for, as well as a sense of what the calling test
needs.

> * Dummy components create the need for new tests to ensure that the dummy
> components' functionality really does match that of the components they
> are replacing.  Do we have such tests in the CMF?  I'm not sure we do.

I don't think we need to test the tests.  The point of the dummies is to
emulate the published API (the interface) of the tool / content they are
replacing.  Often, they won't actually *do* the work required, and may
in fact have extra stuff in them to make testing the *caller* easier.
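To make that concrete, here is a rough sketch of what I mean (the names
are invented for illustration; this is not lifted from the actual CMFCore
fixtures): the dummy fakes a fragment of a published API with canned
answers, so the *caller's* test stays small:

  import unittest

  class DummyMembershipTool:
      """Emulate just enough of the membership API for the caller's test."""
      def __init__(self, authenticated=True):
          self._authenticated = authenticated

      def isAnonymousUser(self):
          return not self._authenticated

      def getAuthenticatedMember(self):
          return 'test_user'   # canned answer; no real user folder involved

  def greet_visitor(membership):
      """The 'caller' under test; it sees only the published API."""
      if membership.isAnonymousUser():
          return 'Welcome, guest'
      return 'Welcome, %s' % membership.getAuthenticatedMember()

  class GreetingTests(unittest.TestCase):
      def test_anonymous(self):
          self.assertEqual(greet_visitor(DummyMembershipTool(False)),
                           'Welcome, guest')

      def test_member(self):
          self.assertEqual(greet_visitor(DummyMembershipTool(True)),
                           'Welcome, test_user')

The dummy never touches acquisition, user folders, or the ZODB, which is
exactly why the caller's test stays fast and focused.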

> * Dummy components create the need for additional documentation.  The
> absence of such documentation creates barriers to test writing and, as a
> result, to the contribution of code to the CMF.

Nope.  Dummy components do *not* need documentation.  Their purpose
should be clear from their use / naming, and their API is supposed to be
the same as that of the (already documented, we assume) component they
replace.  The price of maintenance (occasionally having to extend / fix
the jig) is a necessary trade-off.

> At some point I think we have to trust the stack.

I do not believe that "trusting the stack" makes sense when trying to
test a component of the stack.  If you are writing tests for an
application (or higher layer) which *uses* the stack, then you can
safely trust it.  For instance, I'm willing to use OFS.SimpleItem and
OFS.Folder when building out a test jig, because they belong to a lower
layer of the stack, and have their own tests.
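For example, a jig built on those trusted pieces might look roughly like
this (it assumes a Zope environment on the path; DummyTool and the ids are
invented for the example, not taken from the CMFCore tests):

  import unittest
  from OFS.Folder import Folder
  from OFS.SimpleItem import SimpleItem

  class DummyTool(SimpleItem):
      """A stand-in tool; SimpleItem supplies the id / acquisition plumbing."""
      def __init__(self, id):
          self.id = id

  class JigTests(unittest.TestCase):

      def setUp(self):
          # A plain OFS Folder plays the role of the "site" container.
          self.site = Folder('site')
          self.site._setObject('portal_dummy', DummyTool('portal_dummy'))

      def test_tool_is_reachable(self):
          tool = getattr(self.site, 'portal_dummy')
          self.assertEqual(tool.getId(), 'portal_dummy')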

>  After all, we don't go
> around writing dummy versions of python modules such as httplib.  CMFCore
> should be able to assume Zope; CMFDefault modules should be able to assume
> CMFCore components; products built on CMFDefault should be able to assume
> it, etc.

Such assumptions don't create unwanted dependencies, true.  They may or
may not make for useful tests:

  - If the "trusted" component has no side effects which might affect
    this or later tests;

  - If the "trusted" component does not make unwarrented assumptions
    about the state of the system;

  - If the test being written does not need to "instrument" the
    component in order to write a better / clearer / more comprehensive
    test of its target component (the sketch after this list shows one
    form such instrumentation can take).
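
Here is a rough illustration of that last point (every class and method
name below is made up; only the shape of the technique matters): the
"trusted" component is subclassed so that it records calls, letting the
test assert how the target component used it.

  import unittest

  class CatalogTool:
      """Stands in for the trusted, already-tested lower-layer component."""
      def catalog_object(self, obj, uid):
          pass    # the real indexing work is irrelevant to the caller's test

  class TargetTool:
      """The component actually under test; it delegates to the catalog."""
      def __init__(self, catalog):
          self._catalog = catalog

      def add(self, obj, uid):
          self._catalog.catalog_object(obj, uid)

  class InstrumentedCatalog(CatalogTool):
      """The trusted component, instrumented to record calls."""
      def __init__(self):
          self.calls = []

      def catalog_object(self, obj, uid):
          self.calls.append((obj, uid))

  class TargetToolTests(unittest.TestCase):

      def test_add_indexes_the_object(self):
          catalog = InstrumentedCatalog()
          tool = TargetTool(catalog)
          tool.add('some object', '/path/to/it')
          self.assertEqual(catalog.calls, [('some object', '/path/to/it')])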

> I think the speed issue is a red herring.  I just timed Plone's tests
> (almost all of which use PloneTestCase) and CMFCore's tests (all of which
> use stripped down dummy components).  The results:
> 
> Plone tests: 0.14 sec/test
> CMFCore tests: 0.09 sec/test
> 
> The dummy components really aren't saving much time.  If you spent the
> same amount of work on customer projects that you would spent writing,
> documenting, and maintaining a set of good dummy components, I am sure you
> could buy a very, very fast computer that would run the tests in no time.

Timing *may* be a red herring;  the issue is likely worse for folks
trying to run tests on machines with less-than-blazing CPUs.  There is a
classic back-and-forth in the test-driven development community
(documented by Beck and others) in which people write more and more
tests, until the run-time for the entire suite becomes so painful that
people begin avoiding running them all;  the team then has to stop and
profile / refactor the tests themselves, in order to remove that burden.
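
As a crude sketch of what "profiling the tests themselves" can mean,
something along these lines (plain standard-library unittest; not how the
CMF or Plone test runners actually report timings) is enough to surface
the slow tests:

  import time
  import unittest

  class TimingResult(unittest.TestResult):
      """Record wall-clock time per test so the slow ones stand out."""

      def __init__(self):
          unittest.TestResult.__init__(self)
          self.timings = []

      def startTest(self, test):
          unittest.TestResult.startTest(self, test)
          self._started = time.time()

      def stopTest(self, test):
          self.timings.append((time.time() - self._started, str(test)))
          unittest.TestResult.stopTest(self, test)

  def report_slowest(suite, limit=10):
      result = TimingResult()
      suite.run(result)
      for elapsed, name in sorted(result.timings, reverse=True)[:limit]:
          print('%8.3fs  %s' % (elapsed, name))

Feed it whatever suite the product's tests already build (the conventional
test_suite() function, for instance) and the worst offenders show up in a
single run.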

Here are timings for the stock CMF components and Plone on my box:

  Product         # tests   Wall-clock (s)
  ---------       -------   --------------
  CMFCore             382           28.775
  CMFDefault          164            2.980
  CMFActionIcons       11            0.002
  CMFCalendar          23            1.636
  CMFTopic             58            1.898
  CMFSetup            341            2.028
  DCWorkflow           10            0.025

My guess is that CMFCore's tests are ripe for such a refactoring (there
is a noticeable lag of a couple of seconds several times during the test
run, for instance).

False dependencies, poor separation of concerns, and poor test coverage
are real issues with using a "functional" test jig where a unit test jig
would be more appropriate.


Tres.
--
===================================================================
Tres Seaver          +1 202-558-7113          tseaver at palladion.com
Palladion Software   "Excellence by Design"    http://palladion.com


