
It's a self-explaining question, but here we go. I'm creating a business app in Django, and I didn't want to "spread" the logic across both the app AND the database, but on the other hand, I didn't want to let the database handle this task by itself either (which would be possible through the use of triggers).

So I wanted to "reproduce" the behavior of database triggers, but inside the model class in Django (I'm currently using Django 1.4).

After some research, I figured out that with single objects, I could override the save and delete methods of the models.Model class, inserting "before" and "after" hooks so they would be executed before and after the parent's save/delete. Like this:

    from django.db.transaction import commit_on_success

    class MyModel(models.Model):

        def __before(self):
            pass

        def __after(self):
            pass

        @commit_on_success  # the decorator is only to ensure that everything occurs inside the same transaction
        def save(self, *args, **kwargs):
            self.__before()
            super(MyModel, self).save(*args, **kwargs)
            self.__after()

The BIG problem is with bulk operations. Django doesn't trigger the models' save/delete when running update()/delete() on a QuerySet. Instead, it uses the QuerySet's own methods. And to make it a little worse, it doesn't fire any signals either.
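To see why, it helps to remember what update() compiles down to. A framework-free sketch using sqlite3 directly (the table and column names here are just illustrative): the whole operation is a single UPDATE statement, so no per-row Python code, and therefore no save() override, hook, or signal, ever runs:

```python
import sqlite3

# Roughly what QuerySet.update() does under the hood: one UPDATE statement
# for the whole set, so no per-row save()/hook code ever executes.
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE item (id INTEGER PRIMARY KEY, status TEXT)')
conn.executemany('INSERT INTO item (status) VALUES (?)',
                 [('new',), ('new',), ('new',)])
conn.execute("UPDATE item SET status = 'done'")  # analogous to qs.update(status='done')
rows = conn.execute('SELECT status FROM item').fetchall()
```

This is also why update() is so much faster than looping over instances: the database does all the work in one round trip.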

Edit: Just to be a little more specific: the model loading inside the view is dynamic, so it's impossible to define a "model-specific" way. In that case, I should create an abstract class and handle it there.

My last attempt was to create a custom manager and, in it, override the update method, looping over the models inside the queryset and triggering the save() of each one (taking into consideration the implementation above, or the signals system). It works, but results in a database "overload" (imagine a 10k-row queryset being updated).
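The shape of that manager, stripped of Django entirely (HookedRow and update_with_hooks are illustrative stand-ins, not real API), shows where the overload comes from: each row gets its hooks, but at the cost of one save, i.e. one UPDATE statement, per row:

```python
# Framework-free sketch of the "loop and save" manager idea.
class HookedRow(object):
    def __init__(self, **fields):
        self.__dict__.update(fields)
        self.log = []
        self.saves = 0

    def _before(self):
        self.log.append('before')

    def _after(self):
        self.log.append('after')

    def save(self):
        self._before()
        self.saves += 1  # stands in for one UPDATE statement per row
        self._after()

def update_with_hooks(rows, **kwargs):
    # What the custom manager's update() did: per-row saves instead of a
    # single bulk UPDATE -- correct hooks, but 10k rows means 10k statements.
    for row in rows:
        for name, value in kwargs.items():
            setattr(row, name, value)
        row.save()

rows = [HookedRow(status='new') for _ in range(3)]
update_with_hooks(rows, status='done')
```

Each row's hooks run in the right order, but the database cost grows linearly with the queryset size.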

2 Comments
  • What exactly are you trying to do in the pre/post save triggers? Depending on what you want, some methods might work and others won't. Commented May 19, 2014 at 19:44
  • It's hard to say, since each model MAY have a different behavior. A "SaleItem" could update the value of its "Sale" record, and the "Sale" record being updated could in turn update the "BillingStatus", and so on. Again, each and every model MAY have a pre/post behavior. Commented May 19, 2014 at 19:48

2 Answers


First, instead of overriding save to add __before and __after methods, you can use the built-in pre_save, post_save, pre_delete, and post_delete signals. https://docs.djangoproject.com/en/1.4/topics/signals/

from django.db.models.signals import post_save

class YourModel(models.Model):
    pass

def after_save_your_model(sender, instance, **kwargs):
    pass

# register the signal
post_save.connect(after_save_your_model, sender=YourModel, dispatch_uid=__file__)

pre_delete and post_delete will get triggered when you call delete() on a queryset.

For bulk updating, however, you'll have to manually call the trigger function yourself. You can wrap it all in a transaction as well.

To call the proper trigger function if you're using dynamic models, you can inspect the model's ContentType. For example:

from django.contrib.contenttypes.models import ContentType
from django.db.models import get_model

def view(request, app, model_name, method):
    ...
    model = get_model(app, model_name)
    content_type = ContentType.objects.get_for_model(model)
    if content_type == ContentType.objects.get_for_model(YourModel):
        after_save_your_model(model)
    elif content_type == ContentType.objects.get_for_model(AnotherModel):
        another_trigger_function(model)

9 Comments

The problem with "manually call" is that I'll never know which model I'm handling, because the model handling in the view is dynamic. And again, bulk update/insert doesn't trigger signals.
Could you clarify what you mean by "the model handling in the view is dynamic"? Would this work for updates: to manually call your signal, you would have to do: queryset.update(field=value); for model in queryset: after_save_your_model(model)
And you can do something similar for bulk inserts, but only for post_save.
Basically, every request made to the pattern r'^(?P<app>(.*?))/(?P<model>(.*?))/(?P<method>(.*?))/' goes to the same view method, and inside the view I load the model using get_model(app, model). And your suggestion is not REALLY like a DB trigger, because it wouldn't work if the data of row id=2 depends on the data of row id=1.
No need for extra database queries with the content types. Instead, you can connect the different functions to the signal with different senders specified, or if you have more than one sender for a signal, pass sender=None in connect and check against the actual model class. Content types are only necessary for generic relations on a database level.

With a few caveats, you can override the queryset's update method to fire the signals, while still using an SQL UPDATE statement:

from django.db.models.query import QuerySet
from django.db.models.signals import pre_save, post_save
from django.db.transaction import commit_on_success

class CustomQuerySet(QuerySet):
    @commit_on_success
    def update(self, **kwargs):
        for instance in self:
            pre_save.send(sender=instance.__class__, instance=instance, raw=False,
                          using=self.db, update_fields=kwargs.keys())
        # use self instead of self.all() if you want to reload all data
        # from the db for the post_save signal
        result = super(CustomQuerySet, self.all()).update(**kwargs)
        for instance in self:
            post_save.send(sender=instance.__class__, instance=instance, created=False,
                           raw=False, using=self.db, update_fields=kwargs.keys())
        return result

    update.alters_data = True

I clone the current queryset (using self.all()), because the update method will clear the cache of the queryset object.

There are a few issues that may or may not break your code. First of all, it introduces a race condition: the pre_save signal's receivers act on data that may no longer be accurate by the time you update the database.

There may also be some serious performance issues with large querysets. Unlike the update method, all models will have to be loaded into memory, and then the signals still need to be executed. Especially if the signals themselves have to interact with the database, performance can be unacceptably slow. And unlike the regular pre_save signal, changing the model instance will not automatically cause the database to be updated, as the model instance is not used to save the new data.

There are probably some more issues that will cause a problem in a few edge cases.

Anyway, if you can handle these issues without having some serious problems, I think this is the best way to do this. It produces as little overhead as possible while still loading the models into memory, which is pretty much required to correctly execute the various signals.
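For completeness, to make a model actually use this queryset, you'd route it through a custom manager; a minimal sketch along those lines (using Django 1.4's get_query_set spelling, renamed get_queryset in 1.6; TriggerManager is an illustrative name):

```python
from django.db import models

# Sketch: route the default manager through CustomQuerySet so that
# MyModel.objects.filter(...).update(...) fires the signals above.
class TriggerManager(models.Manager):
    use_for_related_fields = True

    def get_query_set(self):  # get_queryset() from Django 1.6 onwards
        return CustomQuerySet(self.model, using=self._db)

class MyModel(models.Model):
    objects = TriggerManager()
```

With use_for_related_fields set, related managers (e.g. sale.items) go through the same queryset class as well.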

2 Comments

I don't think it'll get any closer than this. The only REAL problem that I see is that there isn't a synchronous chain between each record's pre/save/post. The way you described, it triggers all the pres, then saves, and then all the posts.
The update method executes a single SQL UPDATE statement. Without saving each instance individually, there is no way to achieve a synchronous chain of pre/save/post for each row. Not in Python, anyway.
