I was asked “How do you get data into a Django application?”.
Apart from the “regular” manual way through the website, I can see five different options:
- Load fixtures
- Write directly into the database
- Through a management command
- Through your API
- From an external API
We’ll discuss each of them in the following sections, and there’s a summary table at the end.
Load fixtures
Fixtures are Django’s built-in way to write data into its database. They’re usually used to get starting data or testing data into a database, but you could easily write some JSON or YAML in some other tool and then load that data into your Django app as a fixture. However, this would normally be a manual process, not something you’d automate (I mean, you could, but the mechanism isn’t really made for that).
As with the direct database access, this circumvents all logic in Django and you need to keep track of all auxiliary data and processing yourself.
Most often, this technique is used when moving data from one Django installation to another (this is also possible with direct database access, possibly even as an intra-database transfer, but the fixture process is a bit more offline and portable). Simply run dumpdata on one end, copy the resulting file, and run loaddata on the other end. This works nicely most of the time, except if you have already used sequences in the database or created your content types through migrations. Those sometimes clash and you’ll have to edit the transfer file.
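As a sketch, a transfer between two installations might look like this (the app name and file name are placeholders):

```shell
# On the source installation: dump one app's data to a JSON file.
python manage.py dumpdata myapp --indent 2 > myapp_fixture.json

# Copy the file to the target machine, then load it there.
python manage.py loaddata myapp_fixture.json
```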
Write directly into the database
Another way to get data into your Django application is by interfacing directly with your database. The tables for your models are easy to find: they’re called appname_modelname, and their structure can easily be deduced and inspected. You could use direct access with a DB API driver, e.g. Psycopg or sqlite3, or you could use a wrapper like SQLAlchemy.
This method can be quite fast and versatile, since you’re talking directly to a database, with all upsides and downsides that includes. Note, however, that you completely circumvent all Django logic with this; if you have any signals or need to do things with your models, or compute fields or whatever, you’ll have to keep track of that in your data importer tool.
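As a minimal sketch of this approach, assume a SQLite-backed project with an app `blog` and a model `Post`, so the table is `blog_post` (both names are made up for this example):

```python
import sqlite3

# In a real project you'd connect to the project's database file,
# e.g. sqlite3.connect("db.sqlite3"); we use an in-memory DB here.
conn = sqlite3.connect(":memory:")

# For this sketch we create the table ourselves; in a real project,
# Django's migrations would already have created it.
conn.execute(
    "CREATE TABLE IF NOT EXISTS blog_post ("
    "id INTEGER PRIMARY KEY AUTOINCREMENT, title TEXT, body TEXT)"
)

# Insert rows directly -- no Django signals, validation, or save() logic runs.
rows = [("First post", "Hello"), ("Second post", "World")]
conn.executemany("INSERT INTO blog_post (title, body) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM blog_post").fetchone()[0]
print(count)  # → 2
```

Note that the comment in the code is the whole caveat in miniature: the rows appear in the table, but nothing Django-side ever saw them.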
Through a management command
One of the most convenient and simple ways is to write and run a Django management command. This is easy to do and gives you access to the Django ORM with all your application settings (i.e. if you have more than one database, or offline processing, or …). Now you can get your data in whichever way you want and simply create Django ORM objects with it.
One neat trick if you have lots of data is to run bulk inserts:

YourModel.objects.bulk_create([YourModel(**data_item) for data_item in data_items])
The great advantage here is that you have full access to Django and all its facilities. The downside is that you have to run this on the Django server. So this will likely be part of a two-sided tooling: one side prepares the data in whatever way (e.g. an external API that you query), the other side sits in the management command and inserts it into the database.
I would say this is the most common way to do bulk data inserts.
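To make the “prepare data, then insert” split concrete, here is a small sketch of the preparation half in plain Python (the CSV layout, the `rows_to_items` helper, and `YourModel` are all hypothetical); the commented lines show where the ORM insert would go inside the command’s `handle()` method:

```python
import csv
import io

def rows_to_items(csv_text):
    """Parse CSV text into dicts ready for YourModel.objects.bulk_create()."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [dict(row) for row in reader]

# Inside a management command's handle() you would then run (sketch):
#   YourModel.objects.bulk_create(
#       [YourModel(**item) for item in rows_to_items(csv_text)]
#   )

csv_text = "title,body\nFirst,Hello\nSecond,World\n"
items = rows_to_items(csv_text)
print(items)  # → [{'title': 'First', 'body': 'Hello'}, {'title': 'Second', 'body': 'World'}]
```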
Through your API
Naturally, if you have an API, you can easily use it to push data into your system. The main stumbling block you’ll encounter here is authentication, and there are several ways around it. You could re-use a session ID by logging in with a browser and inspecting the cookies. Or you could do an actual log-in from your client, but that entails handling or disabling CSRF protection, neither of which is super nice. Or you could use dedicated API tokens, for example the ones that Django REST Framework optionally provides.
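As a sketch of the token approach, assuming a DRF-style endpoint and its `Token` authentication scheme, a client request could be built like this (the URL, payload fields, and token value are placeholders):

```python
import json
import urllib.request

# Build a POST request carrying a DRF-style token in the Authorization header.
# URL and token are placeholders -- adapt to your deployment.
payload = json.dumps({"title": "First post", "body": "Hello"}).encode()
request = urllib.request.Request(
    "https://example.com/api/items/",
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Authorization": "Token 0123456789abcdef",
    },
    method="POST",
)

# Actually sending it would be: urllib.request.urlopen(request)
# (not executed here, since the endpoint is a placeholder).
print(request.get_method(), request.get_header("Authorization"))
```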
APIs, even internal ones, are a great way to push data from external sources into a Django system, in a distributed and networked way. If you have data sources that produce data at intervals, or if you have different types of sources, or if you simply have lots of sources, an API is usually the best way to get that data into your Django application.
From an external API
Finally, your Django application can also be an API consumer. This can be done in special views or in a background process. That is, you could have internally accessible views that you trigger manually that execute the API connection, but make sure that they don’t time out. Or you put the calls into a background process, with Celery or Django Background Tasks or any other such library. You can then trigger those calls from views without incurring the delay in the view, or you call them periodically, or from some other source.
One pattern that comes up often is “webhooks”: URLs that get requested from some other system when an event happens (one example is GitHub, which can send a POST request whenever an event occurs). You can then handle that event, which will likely result in calling an external API. Since the trigger should finish quickly, you’ll want to do your API access in the background.
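As a framework-agnostic sketch, a webhook handler usually does little more than validate the payload and hand it to a background queue (the event names and the `enqueue` callable here are hypothetical):

```python
import json

def handle_webhook(body, enqueue):
    """Parse a webhook payload and schedule background work for known events.

    `enqueue` stands in for your task queue's API (e.g. a Celery task's
    .delay()); the event names are made up for this sketch.
    """
    event = json.loads(body)
    if event.get("type") == "item.updated":
        # Defer the slow external-API call so the webhook returns quickly.
        enqueue("refresh_item", event["item_id"])
        return 202  # Accepted: work is queued, not done
    return 204  # No Content: an event type we don't care about

queued = []
status = handle_webhook(
    json.dumps({"type": "item.updated", "item_id": 42}),
    lambda name, arg: queued.append((name, arg)),
)
print(status, queued)  # → 202 [('refresh_item', 42)]
```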
This technique is most useful if your application isn’t the leading system, or if it has to aggregate data from several queryable sources.
| Method | Pros | Cons |
|---|---|---|
| Fixtures | easy to do | quite manual; a bit finicky with sequences and content types |
| Database | fast; super-flexible | manual data processing; must handle DB changes yourself |
| Management command | easy to do; full Django ORM | must be on server; usually not frequent (i.e. usually not automated) |
| API | outside access; distributed sources; simple to do frequent inserts/updates | API must be maintained; requires client logic |
| External APIs | can query many sources; background process possible | need to conform to external API; access must be scheduled |
Choose which suits your situation best. Or talk to me and I’ll discuss it with you.