I am importing data with django-import-export, but because I use ForeignKeyWidget there are many database calls, making the import very slow for only a few hundred rows (checked with django-debug-toolbar).
The Bulk imports documentation page mentions the following:
"If you use ForeignKeyWidget then this can affect performance, because it reads from the database for each row. If this is an issue then create a subclass which caches get_queryset() results rather than reading for each invocation."
I believe caching the get_queryset() results could help me, but I have no idea how to implement the caching. Could you help me with some example code?
I tried the following, but I still see the same number of database calls:
class CachedForeignKeyWidget(ForeignKeyWidget):
    def __init__(self, model, field="pk", use_natural_foreign_keys=False, **kwargs):
        self.cached_queryset = model.objects.all()
        super().__init__(model, field, use_natural_foreign_keys, **kwargs)

    def get_queryset(self, value, row, *args, **kwargs):
        return self.cached_queryset
Caching the queryset object alone does not help: the widget's clean() still calls .get() on that queryset for every row, and QuerySet.get() always issues a fresh database query. Instead, call get_queryset() once, store the results internally in a dict keyed by the lookup field, and resolve subsequent lookups from the dict.
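Here is a minimal sketch of that dict-caching pattern. The ForeignKeyWidget class below is a stub standing in for import_export.widgets.ForeignKeyWidget so the snippet is self-contained; in a real project, subclass the import_export class instead. It assumes the lookup field is unique and that the whole table fits in memory.

```python
class ForeignKeyWidget:
    """Stub standing in for import_export.widgets.ForeignKeyWidget.

    The real widget's clean() calls self.get_queryset(...).get(...)
    for every row, which is the per-row database hit to avoid.
    """
    def __init__(self, model, field="pk", **kwargs):
        self.model = model
        self.field = field

    def get_queryset(self, value, row, *args, **kwargs):
        return self.model.objects.all()


class CachedForeignKeyWidget(ForeignKeyWidget):
    """Evaluate the queryset once, then resolve rows from a dict."""

    def __init__(self, model, field="pk", **kwargs):
        super().__init__(model, field, **kwargs)
        self._cache = None  # built lazily on the first clean()

    def clean(self, value, row=None, **kwargs):
        if value in (None, ""):
            return None
        if self._cache is None:
            # One query materializes every row into a dict keyed by
            # the lookup field (assumed unique).
            self._cache = {
                getattr(obj, self.field): obj
                for obj in self.get_queryset(value, row)
            }
        return self._cache[value]  # raises KeyError for unknown values
```

With this in place, the first clean() call performs a single query and every later row is resolved in memory, so the number of queries no longer grows with the number of imported rows. Note that the cache does not see rows added to the table after it is built.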