On Thu, Jul 30, 2009 at 7:50 AM, Alex Gaynor<alex.gay...@gmail.com> wrote:
>
> Hey all,
>
> In the process of my work on adding multi-db support to Django it has
> become necessary to alter the signatures of Field.db_type and
> Field.get_db_prep_(save, value, lookup).

Putting in my 2c as Alex's mentor on this project:

In the process of adding support for multiple databases (and thus
multiple connections), Alex has identified the need to add a
'connection' argument to a couple of internal Field functions (db_type
and get_db_prep_*). Historically, these functions have used the global
`connection` variable; in a multi-db world, the connection needs to be
configurable.

These functions exist to allow users to define the behavior of custom
fields. They are documented API, so we can't simply change their
signatures without breaking existing custom fields.

Alex has proposed using the same names for this family of functions,
but adding a `connection` argument. For example:
get_db_prep_value(self, value) would become get_db_prep_value(self,
value, connection).

To accommodate the new function signature, instead of calling
get_db_prep_value() directly, an introspective wrapper function would
be used. i.e.,

def call_db_prep_value(field, connection, value):
    # Introspect the field's method to see whether it accepts the
    # new 'connection' argument.
    import inspect
    argnames = inspect.getargspec(field.get_db_prep_value)[0]
    if 'connection' in argnames:
        return field.get_db_prep_value(value, connection)
    else:
        return field.get_db_prep_value(value)

This means that legacy custom fields adhering to the old prototype
will continue to work; over time, we can introduce deprecation
warnings etc to encourage migration to the new-style prototype.

In order to accommodate any user-space invocation of
get_db_prep_value(value), the new prototype would need to provide a
default value for the connection parameter, defaulting to the default
connection. This answers Alex's second question - I don't see how we
can avoid having a default value for this argument.
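For illustration, the defaulted signature might look like this (a
sketch only; MyField and DEFAULT_CONNECTION are invented stand-ins for
a custom field and Django's global default connection):

```python
# DEFAULT_CONNECTION stands in for Django's global default connection.
DEFAULT_CONNECTION = object()

class MyField(object):
    def get_db_prep_value(self, value, connection=None):
        # Old-style callers omit 'connection'; fall back to the default.
        if connection is None:
            connection = DEFAULT_CONNECTION
        # Backend-specific conversion would happen here, using 'connection'.
        return value, connection
```

An old-style call like field.get_db_prep_value(value) then keeps
working, at the cost of silently assuming the default connection.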

I see a few weaknesses to this approach.

 * Firstly, there is the fact that introspection is involved. Alex
assures me that the overhead isn't large (he's quoted 20us per call),
but there is some overhead. Regardless, introspection isn't a
particularly Pythonic thing to do. I suspect it may also be possible
to do this using exceptions (i.e., fall back to the old-style call
when invoking the new-style signature raises a TypeError - ask
forgiveness, not permission), so there might be a workaround for this
objection.
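The exception-based fallback might look like this (a sketch; OldField
and NewField are invented examples of the two prototypes):

```python
def call_db_prep_value(field, connection, value):
    # Try the new two-argument form first; on TypeError, assume the
    # field only implements the legacy one-argument form.
    try:
        return field.get_db_prep_value(value, connection)
    except TypeError:
        return field.get_db_prep_value(value)

class OldField(object):
    def get_db_prep_value(self, value):
        return 'old:%s' % value

class NewField(object):
    def get_db_prep_value(self, value, connection):
        return 'new:%s:%s' % (value, connection)
```

One caveat: a genuine TypeError raised inside a new-style
implementation would be misread as a legacy signature, so a production
version would want to check the signature (or the traceback) before
falling back.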

 * Secondly, it rubs me the wrong way to add overhead to every
normal usage of a core API. To my mind, the default usage should be a
clear path. Historical usage may need to incur an overhead, but
correct usage shouldn't pay a penalty. Again, the exception approach
may help here.

 * Thirdly, there is the issue of appropriate defaults. Under this
proposal, you would be able to use an old custom Field in a multi-db
setup, even when the default connection isn't appropriate to the task.
For example, if your default connection was Postgres, but connection2
was MySQL, invoking call_db_prep_value(field, connection2, value) on
an old custom field would yield a result generated by the Postgres
backend.
This strikes me as a class of error that will be hard to diagnose, and
should generate a warning.
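To make that failure mode concrete, a toy sketch (all names invented;
the quote() methods stand in for backend-specific value preparation):

```python
class PostgresConnection(object):
    def quote(self, value):
        return 'postgres(%s)' % value

class MySQLConnection(object):
    def quote(self, value):
        return 'mysql(%s)' % value

default_connection = PostgresConnection()   # the global default
connection2 = MySQLConnection()             # a second, non-default backend

class OldCustomField(object):
    # Old-style custom field: implicitly tied to the global default.
    def get_db_prep_value(self, value):
        return default_connection.quote(value)

def call_db_prep_value(field, connection, value):
    # Fallback wrapper: the legacy one-argument form ignores
    # whichever connection the caller asked for.
    try:
        return field.get_db_prep_value(value, connection)
    except TypeError:
        return field.get_db_prep_value(value)

result = call_db_prep_value(OldCustomField(), connection2, 42)
```

Here result is 'postgres(42)' even though the caller asked for the
MySQL connection - and nothing ever complains.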

 * Lastly, there is the fact that when the old-style functions are
deprecated, we need to do a run of housekeeping through the entire
codebase, removing the use of the shortcut that we (both Django's
developers, and any external users of these functions) will have grown
accustomed to using over time.

I would like to suggest a different approach. Instead of reusing the
same method name, let's introduce a completely new set of function
names - new_get_db_prep_value, for the sake of argument; a better
bikeshed colour is required. We would then modify Django to invoke the
new API as if the old one never existed.

Django's built-in fields would all implement new_get_db_prep_value().
New CustomField subclasses would also be expected to define
new_get_db_prep_value().

For backwards compatibility, the base Field class would have an
implementation that deferred to the old-style name - but only when
appropriate:

def new_get_db_prep_value(self, connection, value):
    # Deferring is only safe for the global connection, whose
    # behavior the old-style method implicitly assumed.
    if connection is global_connection:
        return self.get_db_prep_value(value)
    else:
        raise IncompatibleFeatureException

As a result of this approach:

 - There is no overhead for the 'normal' case (all built-in Django
fields and new-style custom Fields)

 - Old custom fields would continue to work, but only when they will
work reliably. An exception is raised when they won't work.

 - Existing user-space code calling get_db_prep_value will continue to
work, but will behave unpredictably in a multi-db setting.

 - The old-style get_db_prep_value function can be formally deprecated
through normal approaches.
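For completeness, that formal deprecation could follow the usual
warnings-module pattern (a sketch; the message text and the use of
PendingDeprecationWarning as the first stage are my assumption):

```python
import warnings

class Field(object):
    def get_db_prep_value(self, value):
        # Legacy entry point: warn, then behave exactly as before.
        warnings.warn(
            "get_db_prep_value() without a connection is deprecated; "
            "implement new_get_db_prep_value() instead.",
            PendingDeprecationWarning, stacklevel=2)
        return value
```

A release or two later the warning would be promoted to
DeprecationWarning, and finally the method removed.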

Of course, Alex and I may have missed an obvious alternative solution
- or a use case that isn't solved by either approach. Feedback and
alternative approaches are most welcome.

Yours,
Russ Magee %-)
