I have a post_save signal attached to a model which loads a substantial amount of information from an external API (as a JSON response) when an instance is created. Unfortunately this takes long enough that Heroku times out the request before it ever has a chance to finish.
After doing some research, it seems the best way to handle this is as a background process.
What would be the best way to run a post_save signal handler asynchronously?
After reading about Celery, it seems too heavyweight for a single task. Are there other trusted approaches?
from django.db import models
from django.db.models.signals import post_save
import urllib2
import json

class Foobar(models.Model):
    # ... fields ... #

def foobar_post_save(sender, instance, created, *args, **kwargs):
    """Load info from external API

    data
    ----
    sender - The model class. (Foobar)
    instance - The actual instance being saved.
    created - Boolean; True if a new record was created.
    *args, **kwargs - Capture the unneeded `raw` and `using` (1.3) arguments.
    """
    if created:
        url_to_open = <api_url>
        resp = urllib2.urlopen(url_to_open)
        data = json.loads(resp.read())
        # ... load data ... #

post_save.connect(foobar_post_save, sender=Foobar)
I ended up solving this with RQ, which has nice documentation for Heroku.
I implemented it like so:
from django.db import models
from django.db.models.signals import post_save
import urllib2
import json
from rq import Queue
from worker import conn

q = Queue(connection=conn)

class Foobar(models.Model):
    # ... fields ... #

    def load_data(self):
        url_to_open = <api_url>
        resp = urllib2.urlopen(url_to_open)
        data = json.loads(resp.read())
        # ... load data ... #

def foobar_post_save(sender, instance, created, *args, **kwargs):
    """Load info from external API

    data
    ----
    sender - The model class. (Foobar)
    instance - The actual instance being saved.
    created - Boolean; True if a new record was created.
    *args, **kwargs - Capture the unneeded `raw` and `using` (1.3) arguments.
    """
    if created:
        q.enqueue(instance.load_data, timeout=600)

post_save.connect(foobar_post_save, sender=Foobar)
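The `conn` object above is imported from a `worker.py` module. A minimal sketch of that file, following the pattern from Heroku's RQ guide (the `REDISTOGO_URL` variable name and the `default` queue name are assumptions taken from that guide; adjust them to your Redis add-on and queue setup):

```python
# worker.py -- RQ worker bootstrap (sketch based on Heroku's RQ guide)
import os

import redis
from rq import Worker, Queue, Connection

listen = ['default']  # names of the queues this worker should watch

# Fall back to a local Redis instance when the Heroku env var is absent
redis_url = os.getenv('REDISTOGO_URL', 'redis://localhost:6379')
conn = redis.from_url(redis_url)

if __name__ == '__main__':
    with Connection(conn):
        worker = Worker(map(Queue, listen))
        worker.work()  # blocks, processing enqueued jobs as they arrive
```

On Heroku this runs as a separate worker dyno (e.g. `worker: python worker.py` in the Procfile), while the web dyno only enqueues jobs, so the request returns immediately.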
I'm not 100% sure this is exactly what you want, but I've done something similar, so I think it will do what you need.
import multiprocessing

proc = multiprocessing.Process(
    target=foobar_post_save,           # the function to be run
    args=(sender, instance, created),  # the arguments to pass to the function
    name="foobar post save")           # name is not strictly necessary
proc.start()   # starts the process
proc.join(60)  # wait until it is done; the optional time limit is in seconds
For more on this, see the Python 3 multiprocessing module docs.