How complex was the Pownce schema? (number of tables, columns in the most complex table, number of relationships, frequency of custom data types, etc.) How complex were the usage patterns?
I ask because I'm not interested in performance. I'm primarily interested in having effective domain objects for maintainable use from Python and a clean schema for friction-free use from my database shell. Declarative ORMs seem to straddle that gap to the detriment of both endpoints.
Most of the time, all you need for optimizing an ORM is a declarative "batch retrieve" framework, where getting one object can also trigger the en masse retrieval of collections of other objects. (In my experience this reduces the number of SQL queries by 100X.) I've written one in Smalltalk that lets you declare the sub-graph of the object model to be batch retrieved as just a list of accessor methods. Before that, I pair programmed with another Smalltalk ORM author while he was fixing another framework. And at my previous job, guess what they had -- another Smalltalk declarative batch retrieve framework! (The last one was written for TopLink.)
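To make the idea concrete, here is a minimal Python sketch of the pattern being described: the caller declares which relations to prefetch, and each declared relation is loaded with a single IN-style batch query rather than one query per parent object (the classic N+1 problem). All names here (`fetch_authors`, `prefetch`, the toy tables) are hypothetical illustrations, not any real framework's API.

```python
# Hypothetical sketch of a declarative batch-retrieve framework.
# Toy in-memory "tables" stand in for SQL; a real framework would
# issue actual queries at each counted round-trip.
from collections import defaultdict

AUTHORS = {1: {"id": 1, "name": "Ada"}, 2: {"id": 2, "name": "Grace"}}
POSTS = [
    {"id": 10, "author_id": 1, "title": "Notes"},
    {"id": 11, "author_id": 1, "title": "More notes"},
    {"id": 12, "author_id": 2, "title": "Compilers"},
]

QUERIES_ISSUED = 0  # counts simulated database round-trips


def select_posts_for_authors(author_ids):
    """One query: SELECT * FROM posts WHERE author_id IN (...)."""
    global QUERIES_ISSUED
    QUERIES_ISSUED += 1
    return [p for p in POSTS if p["author_id"] in author_ids]


def fetch_authors(prefetch=()):
    """Fetch all authors; 'prefetch' declares the relations (by
    accessor name) to batch-load alongside them."""
    global QUERIES_ISSUED
    QUERIES_ISSUED += 1  # one query for the authors themselves
    authors = [dict(a) for a in AUTHORS.values()]
    if "posts" in prefetch:
        # One extra query for ALL authors' posts, then group in memory,
        # instead of one posts query per author.
        by_author = defaultdict(list)
        for post in select_posts_for_authors({a["id"] for a in authors}):
            by_author[post["author_id"]].append(post)
        for a in authors:
            a["posts"] = by_author[a["id"]]
    return authors


authors = fetch_authors(prefetch=["posts"])
# Two simulated queries total, no matter how many authors exist;
# the naive per-object approach would issue 1 + len(authors).
```

The declaration is just a list of accessor names, mirroring the Smalltalk approach described above: the query count stays constant as the object graph grows.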
Right. You'll notice that my article doesn't mention performance at all. I never claimed that ORMs are slow, or that any slowness caused by the abstraction couldn't be easily overcome.