jason jason - 4 years ago
Python Question

Google sheets: splitting a task into many sub tasks to parallel update

In Google Sheets, I have a Python task that copies a formula from one template and pastes it into all other sheets, selecting each sheet by key. The list is about 1000 keys long, and it updates pretty slowly.

Right now, I manually chop up the list and run the code 10 times at once, so I have 10 parallel instances updating simultaneously. Is there a way to avoid doing this manually? Here is the code:

def copy_formula(self, source_key, target_key, sheetname, ranges):
    # Read the formula cells from the template sheet
    sh = self.gsheets.gc.open_by_key(source_key)
    wks = sh.worksheet(sheetname)
    cell_list = wks.range(ranges)

    # Open the same range in the target sheet
    sh1 = self.gsheets.gc.open_by_key(target_key)
    wks1 = sh1.worksheet(sheetname)
    cell_list1 = wks1.range(ranges)

    # Copy the raw formulas, then push the whole range in one call
    for i in range(len(cell_list1)):
        cell_list1[i].value = cell_list[i].input_value

    wks1.update_cells(cell_list1)


(`gsheets` here is a wrapper around the gspread library.)

Answer Source

I think you probably want to check out the multiprocessing module, as discussed here, but I would caution that simultaneous calls to Google can be tricky. Even using the newer Google Sheets API v4 (which gspread may not take advantage of yet), there is at least one documented concurrency issue when appending data.

Having said that, Google clearly does support concurrent editing in the web UI (including by multiple users), so I'd be surprised if you couldn't figure something out (possibly with the help of gsheets, a newer Python wrapper around the newer Google API).
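The manual chop-into-10 workflow can be automated with a process pool. The sketch below is an outline under stated assumptions, not a drop-in solution: `copy_formula`, `source_key`, `sheetname`, and the authentication step are stand-ins for the question's own code, and each worker should open its own gspread client rather than share one across processes.

```python
# Sketch: split the target keys into N chunks and update each chunk in a
# separate process. The real work (copy_formula, gspread auth) is stubbed
# out; wire in your own client inside `worker`.
from multiprocessing import Pool

def chunk(keys, n):
    """Split `keys` into at most `n` contiguous, near-equal slices."""
    size, rem = divmod(len(keys), n)
    slices, start = [], 0
    for i in range(n):
        end = start + size + (1 if i < rem else 0)
        if start < end:
            slices.append(keys[start:end])
        start = end
    return slices

def worker(keys):
    # Each process should create its own connection here, e.g.
    # gc = gspread.authorize(credentials)  # assumption: your auth flow
    for key in keys:
        pass  # copy_formula(source_key, key, sheetname, ranges)
    return len(keys)  # report how many sheets this worker handled

if __name__ == "__main__":
    target_keys = ["key-%d" % i for i in range(20)]  # stand-in for ~1000 keys
    with Pool(processes=4) as pool:
        done = pool.map(worker, chunk(target_keys, 4))
    print(sum(done))  # total sheets updated
```

One caveat: the Sheets API enforces per-user rate quotas, so 10 truly parallel clients may start returning quota errors where 10 manually staggered runs did not; adding retry/backoff in `worker` is prudent.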
