I've noticed that lately some databases and entire websites aren't backing up correctly via the configured cron job with the built-in Google Drive integration.
The job recognizes the database or the website, but it doesn't finish successfully; it fails with an error like the one in this example:
|-Start backup[2022-01-03 02:30:07]
|-Database size: 720.00 KB
|-Database character set: utf8
|-Partition / available disk space is: 20.85 GB, available Inode is: 2675935
|-Start exporting database: 2022-01-03 02:30:07
|-Compression completed, took 0.05 seconds, compressed package size: 10.80 KB
|-Uploading to Google Drive, please wait ...
|-Error：File upload error: <HttpError 403 when requesting https://www.googleapis.com/drive/v3/files?pageSize=10&q=name%3D%27bt_backup%27+and+mimeType%3D%27application%2Fvnd.google-apps.folder%27&fields=nextPageToken%2C+files%28id%2C+name%29&alt=json returned "Rate Limit Exceeded">
From what I've read, this looks like a Google API rate limit ("Rate Limit Exceeded"). Is there any way to raise that limit, or some configuration to work around it?
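In case it helps narrow things down: from what I understand, the usual client-side workaround for a 403 "Rate Limit Exceeded" is to retry the request with exponential backoff instead of raising the quota. A minimal sketch of the idea, with assumptions: `upload` is a placeholder for the real Google Drive upload call, and `RuntimeError` stands in for the `HttpError` that googleapiclient actually raises.

```python
import random
import time


def upload_with_backoff(upload, max_retries=5, base=1.0):
    """Call `upload()`; on a rate-limit error, wait with exponential
    backoff plus jitter and retry, up to `max_retries` attempts."""
    for attempt in range(max_retries):
        try:
            return upload()
        except RuntimeError:  # stand-in for googleapiclient's HttpError 403
            if attempt == max_retries - 1:
                raise  # out of retries, surface the error
            # wait base*1, base*2, base*4, ... seconds, plus random jitter
            time.sleep(base * (2 ** attempt) + random.random() * base)


# Simulated upload: fails twice with a rate-limit error, then succeeds.
calls = {"n": 0}

def fake_upload():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("Rate Limit Exceeded")
    return "ok"

print(upload_with_backoff(fake_upload, base=0.01))  # prints "ok"
```

In the real backup script the `except` clause would catch `googleapiclient.errors.HttpError` and check for status 403 before retrying, but the retry shape is the same.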