I can see quite a few different options for doing this and would like some feedback on the most efficient or 'best practice' method.
I get a Django Queryset with filter()
c_layer_points = models.layer_points.objects.filter(location_id=c_location.pk,season_id=c_season.pk,line_path_id=c_line_path.pk,radar_id=c_radar.pk,layer_id__in=c_layer_pks,gps_time__gte=start_gps,gps_time__lte=stop_gps)
This queryset could be very large (hundreds of thousands of rows).
Now what needs to happen is a conversion to lists and encoding to JSON.
Options (that I've seen in my searches):
Example:
gps_time = [lp.gps_time for lp in c_layer_points];
twtt = [lp.twtt for lp in c_layer_points];
In the end I would like to encode as JSON something like this format:
{'gps_time': [list of all gps times], 'twtt': [list of all twtt]}
Any hints on the best way to do this would be great. Thanks!
You might not be able to get the required format from the ORM. However, you can efficiently do something like this:
c_layer_points = models.layer_points.objects.filter(
location_id=c_location.pk,
season_id=c_season.pk,
line_path_id=c_line_path.pk,
radar_id=c_radar.pk,
layer_id__in=c_layer_pks,
gps_time__gte=start_gps,
gps_time__lte=stop_gps
).values_list('gps_time', 'twtt')
and now split the tuples into two lists using tuple unpacking (note that in Python 3, zip() returns an iterator, so unpack it directly rather than indexing into it):
gps_time, twtt = zip(*c_layer_points)
result = dict(gps_time=list(gps_time), twtt=list(twtt))
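Putting it together, the resulting dict can be serialized with the standard json module. A minimal sketch, with a hard-coded rows list standing in for the (gps_time, twtt) tuples that values_list() would return, since there is no database here:

```python
import json

# Stand-in for c_layer_points.values_list('gps_time', 'twtt'):
# each row is a (gps_time, twtt) tuple.
rows = [(1001.5, 2.1e-6), (1002.5, 2.3e-6), (1003.5, 2.2e-6)]

# zip(*rows) transposes rows into columns; in Python 3 it returns
# an iterator, so unpack it directly into the two column tuples.
gps_time, twtt = zip(*rows)

payload = {'gps_time': list(gps_time), 'twtt': list(twtt)}
encoded = json.dumps(payload)
```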
I suggest you iterate through the queryset and build the JSON dictionary element by element.
Normally, Django's QuerySets are lazy: they aren't evaluated (and loaded into memory) until you access them. If you build the entire list up front: gps_time = [lp.gps_time for lp in c_layer_points]
you will have all of those objects in memory at once (hundreds of thousands of them). You'll be better off with a simple iteration:
for item in c_layer_points:
    # convert item to JSON and add it to the JSON dict
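A minimal sketch of that loop, assuming rows of (gps_time, twtt) tuples; the rows list here is a stand-in for the queryset (with a real Django queryset, .iterator() can additionally avoid caching every row at once):

```python
import json

# Stand-in for iterating c_layer_points.values_list('gps_time', 'twtt');
# on a real queryset, .iterator() streams rows instead of caching them.
rows = [(1001.5, 2.1e-6), (1002.5, 2.3e-6)]

result = {'gps_time': [], 'twtt': []}
for gps_time, twtt in rows:
    # append each row's values to the matching list in the JSON dict
    result['gps_time'].append(gps_time)
    result['twtt'].append(twtt)

encoded = json.dumps(result)
```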
As an aside, you don't need the ; character at the end of lines in Python :)
Hope this helps!