Pull (+100k) data from Google sheets | Community
RaulEr
New Participant
July 24, 2023
Solved

Pull (+100k) data from Google sheets

  • July 24, 2023
  • 1 reply
  • 1602 views

Hi community!

 

I have a Google Sheets document with data from different sources that will enhance my database. I tried using Zapier, but it breaks when there are more than 10,000 rows. Zapier support literally told me: "It’s likely that much data in a GSheet will cause issues/errors in the Zaps."

 

So, does anyone know of a solution for this scenario?

 

-Raul

This post is no longer active and is closed to new replies. Need help? Start a new post to ask your question.
Best answer by Darshil_Shah1

 

1 reply

Darshil_Shah1 (Accepted solution)
Community Manager
July 24, 2023

AFAICT, Google Sheets isn't optimized for large volumes of data in the first place, and on top of that, reading that much data from a sheet via Zapier would likely cause errors. IMHO you could decrease the batch size by running the Zap more often (i.e., increase the frequency: if you're running this batch job once a day, try 2-3 times a day). That can keep you in a safe zone, but as you can imagine, it doesn't guarantee you'll never hit 100k rows in a single run. Alternatively, you could use the Google Sheets API (via the custom API request action) to loop through smaller pieces of the data until you've processed the entire file.
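The chunked-read approach described above can be sketched in Python against the Sheets API `values.get` REST endpoint. This is a minimal sketch, assuming a publicly readable sheet accessible with an API key (private sheets need OAuth instead); the spreadsheet ID, sheet name, and chunk size are placeholders, and `chunk_ranges`/`fetch_all` are hypothetical helper names, not part of any library.

```python
import json
import urllib.parse
import urllib.request


def chunk_ranges(sheet, total_rows, chunk_size, column="A"):
    """Yield A1-notation ranges covering `total_rows` rows in chunks.

    E.g. 25,000 rows in chunks of 10,000 gives three ranges:
    Sheet1!A1:A10000, Sheet1!A10001:A20000, Sheet1!A20001:A25000.
    """
    for start in range(1, total_rows + 1, chunk_size):
        end = min(start + chunk_size - 1, total_rows)
        yield f"{sheet}!{column}{start}:{column}{end}"


def fetch_all(spreadsheet_id, api_key, sheet="Sheet1",
              total_rows=100_000, chunk_size=10_000):
    """Pull every row by issuing one values.get request per chunk
    instead of a single huge read. All parameters are placeholders."""
    base = (f"https://sheets.googleapis.com/v4/spreadsheets/"
            f"{spreadsheet_id}/values")
    rows = []
    for rng in chunk_ranges(sheet, total_rows, chunk_size):
        url = f"{base}/{urllib.parse.quote(rng)}?key={api_key}"
        with urllib.request.urlopen(url) as resp:
            data = json.load(resp)
        # Missing "values" means the range was empty.
        rows.extend(data.get("values", []))
    return rows
```

Keeping each request to ~10k rows stays under the size that reportedly caused trouble in Zapier, and the same range-per-chunk idea carries over to Zapier's custom API request action.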

 

RaulEr (Author)
New Participant
July 24, 2023

Thanks @darshil_shah1, I'll try the Google Sheets API. These files will always load a single column of 75k to 100k records on a daily basis.