Scenario with a large number of records to process times out after 40 minutes.
I have a scenario that is meant to take in a CSV file (picked up from SharePoint as the starting trigger). Based on the data in this spreadsheet, it is meant to create or update records in Salesforce.
The CSVs can have up to 100,000 rows of data. I was expecting that processing all this data could take some time, e.g. 5 hours or so. However, the scenario always stops working and crashes after 40 minutes.
Each row in the spreadsheet takes approximately 1-2 seconds to process. I have tried breaking the CSVs down into smaller batches, but even at 1,000 rows it is hit and miss whether the scenario completes or times out.
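For context, this is roughly how I have been pre-splitting the files before upload. This is a minimal sketch using Python's standard `csv` module; the function names (`split_csv`, `_write_chunk`) and the 1,000-row chunk size are just my own choices, not anything from the platform:

```python
import csv


def _write_chunk(path, index, header, rows):
    """Write one chunk to <original name>_part<index>.csv alongside the source file."""
    out = f"{path.rsplit('.', 1)[0]}_part{index}.csv"
    with open(out, "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow(header)   # repeat the header so each part is a valid CSV
        writer.writerows(rows)


def split_csv(path, rows_per_chunk=1000):
    """Split a large CSV into smaller files of at most rows_per_chunk data rows each."""
    with open(path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)     # keep the header row out of the chunk count
        chunk, index = [], 0
        for row in reader:
            chunk.append(row)
            if len(chunk) == rows_per_chunk:
                _write_chunk(path, index, header, chunk)
                chunk, index = [], index + 1
        if chunk:                 # flush the final partial chunk
            _write_chunk(path, index, header, chunk)
```

Even with files split like this, each individual part is still hit and miss.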
What can I do to ensure the scenario completes all executions for a large amount of data?