Instead of scheduling each daily script separately, schedule a single chain-scripts task that triggers them all.
With this option, you control the execution order of your scripts, and it is much easier to understand what happens.
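The idea behind chaining can be sketched in plain Python: run each script in a fixed order and abort on the first failure, so later steps never run on incomplete data. The script names below are hypothetical placeholders, not part of the platform.

```python
import subprocess
import sys

# Hypothetical scripts, listed in the order they must run.
SCRIPTS = ["extract.py", "transform.py", "load.py"]

def run_chain(scripts):
    """Run each script in order; stop at the first failure so the
    execution order is guaranteed and errors don't cascade."""
    for script in scripts:
        result = subprocess.run([sys.executable, script])
        if result.returncode != 0:
            print(f"{script} failed (exit code {result.returncode}); aborting chain")
            return False
    return True
```

Scheduling one task that calls `run_chain(SCRIPTS)` replaces N independent schedules whose relative timing you would otherwise have to guess.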
You should be warned when something goes wrong in your scheduled automations.
For each scheduled automation, in its Execution tab, check the 'Send email on failure' field and provide an email address (or several, to cover the times when you're out of office).
You can also receive a Slack message when a script fails. To do that, enable the Slack integration in the Admin menu.
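If you ever need a failure notification outside the built-in integration, Slack's standard incoming webhooks can do the same job. This is a minimal sketch, not the platform's mechanism; the webhook URL is a placeholder you would replace with the one Slack generates for your workspace.

```python
import json
import urllib.request

# Placeholder incoming-webhook URL; Slack generates a real one
# when you create an incoming webhook for a channel.
WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"

def build_failure_payload(script_name, error):
    """Build the JSON payload Slack expects for a simple text message."""
    return {"text": f":x: Script `{script_name}` failed: {error}"}

def notify_failure(script_name, error):
    """POST the failure message to the Slack incoming webhook."""
    req = urllib.request.Request(
        WEBHOOK_URL,
        data=json.dumps(build_failure_payload(script_name, error)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```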
For each Python script, in its Usages tab, declare the input and output datasources used by the script.
The script will then be referenced in the 'Usages' tab of those datasources, and in the dataflow as well.
So when you don't remember how a given datasource is filled, this tab immediately shows which Python script fills it. If you leave these fields empty, the script won't appear in the datasource's Usages tab, and you'll have to read through all your Python scripts to find the one that fills it, which can be very painful!
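The value of this metadata is that it makes lineage queryable instead of something you reconstruct by reading code. A toy sketch of the same idea, with hypothetical script and datasource names:

```python
# Hypothetical lineage declarations, mirroring what the Usages tab
# records: each script lists the datasources it reads and writes.
USAGES = {
    "transform.py": {"inputs": ["raw_sales"], "outputs": ["clean_sales"]},
    "report.py": {"inputs": ["clean_sales"], "outputs": ["monthly_report"]},
}

def scripts_filling(datasource):
    """Answer 'which script fills this datasource?' from the declarations,
    instead of grepping through every script."""
    return [name for name, usage in USAGES.items()
            if datasource in usage["outputs"]]
```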