 

Deleting a huge number of files at root

By mistake I saved a huge number of files (>100k) at the root of my Google Drive. I can't delete them easily through the web interface.

I'm currently deleting them with the file list API and the file delete API; it works, but it is really slow.

Also, I haven't found a way to count how many files remain at the root, to estimate how long this will take.

Does anyone have a faster way to delete the files, or a way to count them?

Florian asked Nov 19 '25 23:11


2 Answers

The Google Drive REST API provides a batch method for making multiple requests at once, which completes faster than issuing your requests one at a time. You can see the full reference for making batch requests here, including code snippets for supported languages. You can use the Python snippet below to make a batch delete request to the Drive REST API:

# Assumes `service` is an authorized google-api-python-client Drive v3 service
listOfFileIDs = []

def callback(request_id, response, exception):
    # Called once per request in the batch; `exception` is set if it failed
    if exception is not None:
        print(exception)

batch = service.new_batch_http_request(callback=callback)

for file_id in listOfFileIDs:
    batch.add(service.files().delete(fileId=file_id))

batch.execute()

You should populate listOfFileIDs by making a Files: list request to get a list of the IDs of the files you want to delete.
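As a sketch of that listing step, assuming `service` is an authorized google-api-python-client Drive v3 service object (the helper name is mine, not part of the library), you could page through files.list and collect the IDs:

```python
# Sketch: gather the IDs of every non-trashed file sitting directly in
# the Drive root, following nextPageToken until the listing is exhausted.
# Assumes `service` is an authorized Drive v3 service object.

def list_root_file_ids(service):
    """Return the IDs of all non-trashed files whose parent is the root."""
    file_ids = []
    page_token = None
    while True:
        response = service.files().list(
            q="'root' in parents and trashed = false",
            fields="nextPageToken, files(id)",
            pageSize=1000,          # largest page size files.list accepts
            pageToken=page_token,
        ).execute()
        file_ids.extend(f["id"] for f in response.get("files", []))
        page_token = response.get("nextPageToken")
        if page_token is None:
            return file_ids
```

As a side effect, `len(list_root_file_ids(service))` would also answer the counting part of the question.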

Things to remember:

  • You can only make up to 100 calls per batch, so you will still need to send multiple batch requests to the server
  • Because of this limit, listOfFileIDs should hold at most 100 IDs per batch request
  • Batch requests are quicker and use less of your quota than individual requests
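The chunking implied by the 100-call limit could be sketched like this (again assuming an authorized `service` object; `delete_in_batches` is a hypothetical helper, not a library function):

```python
# Sketch: delete file IDs in chunks of 100 to respect the per-batch limit.
# Assumes `service` is an authorized Drive v3 service object.

BATCH_LIMIT = 100  # maximum number of calls allowed in one batch request

def delete_in_batches(service, file_ids, callback=None):
    """Issue one batch request per chunk of at most BATCH_LIMIT deletes."""
    for start in range(0, len(file_ids), BATCH_LIMIT):
        batch = service.new_batch_http_request(callback=callback)
        for file_id in file_ids[start:start + BATCH_LIMIT]:
            batch.add(service.files().delete(fileId=file_id))
        batch.execute()
```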

You can keep track of your project's requests in the Developers Console. The app quota has a courtesy limit of 1,000 requests per 100 seconds per user, along with a few other quota limits which you can see on the quota tab for your application. You may also need to throttle your request rate using exponential backoff if you keep receiving 403: Rate limit exceeded or 403: User rate limit exceeded.
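A minimal backoff wrapper might look like the following; the helper and its `is_rate_limit_error` predicate are illustrative assumptions, not part of the client library:

```python
# Sketch: retry a request with exponential backoff plus random jitter.
# `request` is anything with an execute() method (e.g. a Drive API request);
# `is_rate_limit_error` decides whether an exception warrants a retry.
import random
import time

def execute_with_backoff(request, is_rate_limit_error, max_retries=5, sleep=time.sleep):
    """Retry request.execute(), waiting 2**attempt seconds between tries."""
    for attempt in range(max_retries):
        try:
            return request.execute()
        except Exception as err:
            if is_rate_limit_error(err) and attempt < max_retries - 1:
                sleep(2 ** attempt + random.random())  # 1s, 2s, 4s, ... + jitter
            else:
                raise
```

With the real client library, the predicate would typically check for an `HttpError` with status 403.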

I hope this is helpful to you answered Nov 22 '25 20:11


This is probably going to be your only option. Yes, the API is slow, but it's a free API that you are not paying for.

I would do a files.list first with a q parameter of something like this:

'root' in parents and mimeType != 'application/vnd.google-apps.folder'

This will return all of the files in your root directory. Then you can just loop through them, deleting one at a time.
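That list-then-delete loop could be sketched as follows (assuming an authorized Drive v3 `service` object; it re-lists each pass instead of following page tokens, since deletions shift pagination):

```python
# Sketch: repeatedly list the root's files and delete them one at a time,
# using the q parameter shown above. Assumes `service` is an authorized
# Drive v3 service object.

QUERY = "'root' in parents and mimeType != 'application/vnd.google-apps.folder'"

def delete_root_files(service):
    """Delete every non-folder file in the root; return how many were removed."""
    deleted = 0
    while True:
        # Re-list on each pass: deleting invalidates page tokens' positions.
        response = service.files().list(
            q=QUERY, fields="files(id)", pageSize=1000
        ).execute()
        files = response.get("files", [])
        if not files:
            return deleted
        for f in files:
            service.files().delete(fileId=f["id"]).execute()
            deleted += 1
```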

DaImTo answered Nov 22 '25 20:11


