I want to fetch the data in chunks: in the first attempt, records 1 to 50, and in the second attempt, records 51 to 100. I looked at the Laravel documentation for chunk(), but there is no provision for a custom offset.
I don't think you need to build anything custom here. The standard chunk() should work for you:
Model::chunk(50, function ($many) {
    foreach ($many as $one) {
        // ...
    }
});
Update
If you want to send a page number (like 1, 2, 3) from outside to Laravel, you can use skip() and take():
Model::skip(($page - 1) * 50)->take(50)->get();
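For reference, the page-to-offset arithmetic behind that skip()/take() call can be sketched in plain PHP (pageOffset is a hypothetical helper, assuming 50 records per page and 1-based page numbers):

```php
<?php
// Hypothetical helper: converts a 1-based page number into the record
// offset that skip() expects, assuming $perPage records per page.
function pageOffset(int $page, int $perPage = 50): int
{
    return ($page - 1) * $perPage;
}

// Page 1 starts at offset 0 (records 1-50),
// page 2 at offset 50 (records 51-100).
```

With such a helper, the query above becomes Model::skip(pageOffset($page))->take(50)->get().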
I had a similar issue. I needed to run a big query with lots of joins on a 192k-record table. This is what I did:
The query was something like this:
DB::connection('mysql')
    ->table('table1 as A')
    ->join('table2 as B', 'A.Provincia', '=', 'B.IdProvincia')
    ->select(
        'A.NoExpediente',
        ...
        'A.FechaCita'
    )
    ->orderBy('NoExpediente')
    // here goes the offset; chunk() also requires a chunk size
    // as its first argument, e.g. 1000 rows per chunk
    ->chunk(1000, function ($records) use (&$offset) {
        foreach ($records as $record) {
            if ($offset > 0) {
                $offset--;
                continue;
            }
            // do something with $record
        }
    });
I know the solution is not ideal, because the skipped records are still brought from the db anyway, but it saved me a lot of processing, since // do something with $record never runs for the skipped ones.
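The skip-counter trick inside that chunk() callback can be isolated in plain PHP (processChunk is a hypothetical helper, not part of the original answer; $offset persists across chunks via use (&$offset)):

```php
<?php
// Sketch of the in-callback skipping logic: records before the offset
// are discarded, the rest are collected into $processed.
function processChunk(array $records, int &$offset, array &$processed): void
{
    foreach ($records as $record) {
        if ($offset > 0) {
            $offset--;          // still inside the skipped range
            continue;
        }
        $processed[] = $record; // stand-in for "do something with $record"
    }
}

$offset = 3;
$processed = [];
processChunk([1, 2], $offset, $processed);    // both skipped, offset drops to 1
processChunk([3, 4, 5], $offset, $processed); // 3 skipped, 4 and 5 processed
// $processed is now [4, 5]
```

Because $offset is passed by reference, the countdown carries over from one chunk to the next, exactly as with use (&$offset) in the closure.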