concurrency limits with key ???
Re-scheduled job to 2025-09-08 08:04:30.392325 UTC due to concurrency limits with key dataworks/script/f/Orion/read_kafka and limit 1 in the last 60 seconds (min_started_at: 2025-09-08 08:01:27.392325 UTC, avg script duration: Some(0), number of time windows full: 2)
Re-scheduled job to 2025-09-08 08:05:31.393106 UTC due to concurrency limits with key dataworks/script/f/Orion/read_kafka and limit 1 in the last 60 seconds (min_started_at: 2025-09-08 08:04:30.393106 UTC, avg script duration: Some(0), number of time windows full: 0)
Re-scheduled job to 2025-09-08 08:08:34.393783 UTC due to concurrency limits with key dataworks/script/f/Orion/read_kafka and limit 1 in the last 60 seconds (min_started_at: 2025-09-08 08:05:31.393783 UTC, avg script duration: Some(0), number of time windows full: 2)
Re-scheduled job to 2025-09-08 08:09:35.397793 UTC due to concurrency limits with key dataworks/script/f/Orion/read_kafka and limit 1 in the last 60 seconds (min_started_at: 2025-09-08 08:08:34.397793 UTC, avg script duration: Some(0), number of time windows full: 0)
Re-scheduled job to 2025-09-08 08:12:38.409289 UTC due to concurrency limits with key dataworks/script/f/Orion/read_kafka and limit 1 in the last 60 seconds (min_started_at: 2025-09-08 08:09:35.409289 UTC, avg script duration: Some(0), number of time windows full: 2)
=====================
I'm facing this issue now. Could it be because I have another script with the same name running on a schedule?
7 Replies
Hi, yes — you can list the jobs with that concurrency key on the Runs page and filter by concurrency key
I'm not the one who created this flow, so how can I fix this? Should I increase the concurrency limit?
Possibly, if you need more jobs running at the same time than the current limit allows
Where can I find this concurrency limit setting?
Concurrency limits | Windmill
The Concurrency limits feature allows you to define concurrency limits for scripts, flows and inline scripts within flows. Its primary goal is to prevent exceeding the API Limit of the targeted API, eliminating the need for complex workarounds using worker groups.
I saw this doc, but I can't find anything similar for my case
Or maybe I can change the script's concurrency key if I want to keep the concurrency limit?
They are in the runtime settings of scripts/flows
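For reference, if the script is synced to a git repo with the Windmill CLI, the runtime settings usually also show up in the script's metadata YAML next to the code. A minimal sketch of what the settings behind the log above might look like (field names assumed from Windmill's script metadata format; the path and values mirror the log, which reports limit 1 in a 60-second window for key dataworks/script/f/Orion/read_kafka):

```yaml
# Hypothetical f/Orion/read_kafka.script.yaml (metadata synced by the Windmill CLI)
summary: Read from Kafka
# At most 1 run of this script per 60-second window
concurrent_limit: 1
concurrency_time_window_s: 60
# Optional custom key; scripts sharing the same key share the limit.
# Omitting it defaults to a key derived from the script path, so another
# script would NOT normally collide unless it sets the same custom key.
concurrency_key: dataworks/script/f/Orion/read_kafka
```

Raising `concurrent_limit`, widening the time window, or giving the script its own `concurrency_key` would each change the rescheduling behavior seen in the log; which one is right depends on the rate limit of the target system.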