I have been using Parse to retrieve data for a list view. Unfortunately, requests are limited to 100 results by default and 1000 at most, and I have well over that 1000 maximum in my class. I found a link on the web which shows a way to do it on iOS, but how would you do it on Android? Web Link
I am currently adding all the data to an ArrayList in a loop until all items (100) have been retrieved, then adding them to the list.
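For reference, this is roughly what my current single request looks like (a minimal sketch; "Objects" stands in for my actual class name). setLimit() tops out at 1000, so one find() can never return the whole class:

ParseQuery<ParseObject> query = ParseQuery.getQuery("Objects");
query.setLimit(1000); // 100 is the default; 1000 is the hard maximum per request
query.findInBackground(new FindCallback<ParseObject>() {
    @Override
    public void done(List<ParseObject> objects, ParseException e) {
        if (e == null) {
            // at most 1000 objects arrive here, regardless of the class size
        }
    }
});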
I have figured out how to achieve my goal:
Declare Global Variable
private static List<ParseObject> allObjects = new ArrayList<ParseObject>();
Create Query
final ParseQuery parseQuery = new ParseQuery("Objects");
parseQuery.setLimit(1000);
parseQuery.findInBackground(getAllObjects());
Callback for Query
int skip = 0;

FindCallback getAllObjects() {
    return new FindCallback() {
        public void done(List<ParseObject> objects, ParseException e) {
            if (e == null) {
                allObjects.addAll(objects);
                int limit = 1000;
                if (objects.size() == limit) {
                    // A full page came back, so there may be more:
                    // skip past what we already have and query again.
                    skip = skip + limit;
                    ParseQuery query = new ParseQuery("Objects");
                    query.setSkip(skip);
                    query.setLimit(limit);
                    query.findInBackground(getAllObjects());
                }
                // We have a full PokeDex
                else {
                    // USE FULL DATA AS INTENDED
                }
            }
        }
    };
}
Here is a JavaScript version without promises.
These are the global variables (collections are not required, just a bad habit of mine):
///create a collection of cool things and instantiate it (globally)
var CoolCollection = Parse.Collection.extend({
model: CoolThing
}), coolCollection = new CoolCollection();
This is the "looping" function that gets your results:
//recursive call, initial loopCount is 0 (we haven't looped yet)
function getAllRecords(loopCount) {
    ///set your record limit
    var limit = 1000;
    ///create your eggstra-special query
    new Parse.Query(CoolThing)
        .limit(limit)
        .skip(limit * loopCount) //<-important
        .find({
            success: function (results) {
                if (results.length > 0) {
                    //we do stuff in here like "add items to a collection of cool things"
                    for (var j = 0; j < results.length; j++) {
                        coolCollection.add(results[j]);
                    }
                    loopCount++; //<--increment our loop because we are not done
                    getAllRecords(loopCount); //<--recurse
                } else {
                    //our query has run out of steam, this else{} will be called one time only
                    coolCollection.each(function (coolThing) {
                        //do something awesome with each of your cool things
                    });
                }
            },
            error: function (error) {
                //badness with the find
            }
        });
}
This is how you call it (or you could do it other ways):
getAllRecords(0);
JAVA
So after 5 years and 4 months, the above answer from @SquiresSquire needed some changes to make it work for me, and I would like to share them with you.
private static List<ParseObject> allObjects = new ArrayList<ParseObject>();
private static int skip = 0;

ParseQuery<ParseObject> parseQuery = new ParseQuery<ParseObject>("CLASSNAME");
parseQuery.setLimit(1000);
parseQuery.findInBackground(getAllObjects());

FindCallback<ParseObject> getAllObjects() {
    return new FindCallback<ParseObject>() {
        @Override
        public void done(List<ParseObject> objects, ParseException e) {
            if (e == null) {
                allObjects.addAll(objects);
                int limit = 1000;
                if (objects.size() == limit) {
                    // Full page: there may be more objects, so skip ahead and query again.
                    skip = skip + limit;
                    ParseQuery<ParseObject> query = new ParseQuery<ParseObject>("CLASSNAME");
                    query.setSkip(skip);
                    query.setLimit(limit);
                    query.findInBackground(getAllObjects());
                }
                // We have a full PokeDex
                else {
                    // USE FULL DATA AS INTENDED
                }
            }
        }
    };
}
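For completeness, here is one way the chain could be kicked off and reset between runs (a sketch only; loadEverything() is a hypothetical helper, not part of the answer above):

// Hypothetical helper: reset the accumulated state before starting a new fetch,
// since allObjects and skip are static and would otherwise keep stale results.
void loadEverything() {
    allObjects.clear();
    skip = 0;
    ParseQuery<ParseObject> query = new ParseQuery<ParseObject>("CLASSNAME");
    query.setLimit(1000);
    query.findInBackground(getAllObjects());
}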
In C# I use this recursion:
// "list" accumulates every row fetched so far (declared once at class level)
private static readonly List<ParseObject> list = new List<ParseObject>();

private static async Task GetAll(int count = 0, int limit = 1000)
{
    // Stop when the previous page came back short (fewer than "limit" rows).
    if (count * limit != list.Count) return;
    var res = await ParseObject.GetQuery("Row").Limit(limit).Skip(list.Count).FindAsync();
    res.ToList().ForEach(x => list.Add(x));
    await GetAll(++count);
}
JS version:
function getAll(list) {
    new Parse.Query(Row)
        .limit(1000)
        .skip(list.length)
        .find()
        .then(function (result) {
            list = list.concat(result);
            if (result.length != 1000) {
                //do here something with the list...
                return;
            }
            getAll(list);
        });
}
Usage: GetAll() in C#, and getAll([]) in JS.
I store all rows from the class Row in the list. On each request I fetch up to 1000 rows and skip the number of rows already in the list. The recursion stops when the number of rows fetched so far differs from the expected count, i.e. when a page comes back with fewer than 1000 rows.
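For the Android case in the original question, the same skip-by-count pattern could look roughly like this (a sketch only; the "Row" class name is borrowed from the snippets above, and onAllRowsLoaded() is a hypothetical callback):

private static final int PAGE_SIZE = 1000;
private final List<ParseObject> rows = new ArrayList<ParseObject>();

void getAll() {
    ParseQuery<ParseObject> query = ParseQuery.getQuery("Row");
    query.setLimit(PAGE_SIZE);
    query.setSkip(rows.size()); // skip the rows we already collected
    query.findInBackground(new FindCallback<ParseObject>() {
        @Override
        public void done(List<ParseObject> objects, ParseException e) {
            if (e != null) {
                return; // handle the error as needed
            }
            rows.addAll(objects);
            if (objects.size() == PAGE_SIZE) {
                getAll(); // full page: there may be more rows
            } else {
                onAllRowsLoaded(rows); // hypothetical: a short page means we are done
            }
        }
    });
}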