Best practices and guidelines
In my previous experience, we would always keep the heavy data loads and data processing outside of Airflow (e.g., in AWS Batch or Spark) and orchestrate those jobs via Airflow purely as a workflow engine.
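For concreteness, a minimal sketch of the kind of task I mean, using boto3 to hand the heavy lifting to AWS Batch while the orchestrator only submits and waits (job name, queue, and definition here are hypothetical placeholders):

```python
import time
import boto3


def trigger_heavy_job(records_prefix: str) -> str:
    """Submit the heavy processing to AWS Batch and wait for it,
    keeping the data crunching itself out of the orchestrator."""
    batch = boto3.client("batch")
    resp = batch.submit_job(
        jobName="nightly-records-load",       # hypothetical job name
        jobQueue="heavy-processing-queue",    # hypothetical queue
        jobDefinition="spark-etl:1",          # hypothetical job definition
        parameters={"inputPrefix": records_prefix},
    )
    job_id = resp["jobId"]

    # The orchestrator task just polls until the external job finishes.
    while True:
        job = batch.describe_jobs(jobs=[job_id])["jobs"][0]
        if job["status"] in ("SUCCEEDED", "FAILED"):
            return job["status"]
        time.sleep(30)
```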
Is it the same practice with Windmill.dev? Are there any docs or guidelines that explain best practices, and what to do and what not to do, at enterprise scale? E.g., processing millions of records per hour.