I have an architectural question: we have a Django project made up of multiple apps. There is a core app that holds the main models used by the other apps. Then we have a couple of apps for the user-facing APIs. Lastly, we have some internal apps and tools, used by developers only, that are accessible in the Admin UI as extended features.
Our deployment process is very monolithic: we use Kubernetes and deploy the whole project as a unit. Even if we only changed an internal app, getting that change to production means building a new Docker image and deploying a new release with an incremented version tag.
I'm not a big fan of this, because a change in the internal tools shouldn't create a new release of the user-facing applications.
I have been wondering if there is a way to split those deployments (maybe turn them into a microservice architecture?) so we could deploy the user-facing applications separately from the internal tools. I know I could build separate images, tags and so on for parts of the project, but I'm not sure how they would communicate with each other, since internal_app_1 depends on the models of the core app, and potentially on the settings.py and manage.py files as well.
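For example, the internal app's views import the core models directly; a sketch (the model and view names here are just placeholders):

# internal_app_1/views.py -- illustrative only; names are placeholders
from django.http import JsonResponse

from core.models import Customer  # hard dependency on the shared core app


def customer_report(request):
    # Any change to core.models can break this view, which is why the
    # internal tools cannot be built and shipped independently today.
    return JsonResponse({'count': Customer.objects.count()})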
Also, in Kubernetes, separating the applications would mean separate Deployments with two servers running, i.e. two Django projects isolated from each other but using the same database.
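Concretely, both projects' settings.py would presumably carry the same database configuration; a minimal sketch, assuming Postgres and that the credentials come in through environment variables:

# settings.py -- sketch: both deployments point at one shared database
from os import environ

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': environ.get('DB_NAME', 'app'),
        'USER': environ.get('DB_USER', 'app'),
        'PASSWORD': environ.get('DB_PASSWORD', ''),
        'HOST': environ.get('DB_HOST', 'localhost'),
        'PORT': environ.get('DB_PORT', '5432'),
    }
}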
Has anyone worked with something similar, or can anyone suggest an alternative?
Below is a tree example of how our project is structured at the moment:
├── core
| ├── models.py
| ├── views.py
| └── urls.py
├── userapi_1
| ├── views.py
| └── urls.py
├── userapi_2
| ├── views.py
| └── urls.py
├── internal_app_1
| ├── templates
| | └── ...
| ├── models.py
| ├── views.py
| └── urls.py
├── manage.py
├── settings.py
└── Dockerfiles
    ├── Dockerfile.core
    └── Dockerfile.internal_app_1
Django and microservices? Yeah, maybe somewhere in a parallel universe.
The one thing I can recommend is to build two identical services, say django_container_internal and django_container_production. That way you will be able to release the internal tools without stopping production.
If you want to prevent the internal deployment from exposing the production functionality, you can deactivate the production URLs with an environment variable. A Django project usually has a common config/urls.py that aggregates all the URL endpoints and looks something like this:
from django.urls import include, path

urlpatterns = [
    path('core/api/v1/', include('core.urls')),
    path('internal/api/v1/', include('internal_app_1.urls')),
    path('user/api/v1/', include('userapi_1.urls')),
    # ...
]
For example, you could add an IS_INTERNAL_TOOLS environment variable and update urls.py like this:
from os import environ

from django.urls import include, path

urlpatterns = [
    path('core/api/v1/', include('core.urls')),
    # ...
]

# Expose either the internal tools or the user-facing API, never both.
if environ.get('IS_INTERNAL_TOOLS', 'false').lower() in ('true', '1', 'yes'):
    urlpatterns.append(path('internal/api/v1/', include('internal_app_1.urls')))
else:
    urlpatterns.append(path('user/api/v1/', include('userapi_1.urls')))
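Alternatively (just a sketch, assuming you split the URLconf into two hypothetical modules, config/urls_internal.py and config/urls_public.py), you can switch Django's ROOT_URLCONF setting per deployment instead of mutating urlpatterns:

# settings.py -- sketch: pick the whole URLconf per deployment
from os import environ

if environ.get('IS_INTERNAL_TOOLS', 'false').lower() in ('true', '1', 'yes'):
    ROOT_URLCONF = 'config.urls_internal'  # internal tools only
else:
    ROOT_URLCONF = 'config.urls_public'  # user-facing APIs only

Either way, both containers can run the exact same image; the only difference between the two Kubernetes Deployments is the value of IS_INTERNAL_TOOLS.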
Pros:
- Requires almost no changes to the existing codebase; both services are built from the same image.
- The internal tools can be released without stopping or redeploying production.
Cons:
- Both the internal and production parts still depend heavily on the common core app, so it is impossible to deploy an updated core separately.