Lüku
Lüku · 3w ago

Error after Big For-Loop

Hi everybody! I have a flow with a for-loop that works great when the loop gets 50-100 items. But if I let it run with 2000 iterations, which takes more than half an hour to complete, I get the following error when exiting the loop:

```
Canceled: Job canceled: Flow 01987a41-3a20-f043-8a10-f54cf4f071d9 was cancelled because it was hanging in between 2 steps.
```

If I dig deeper, I find the following error:

```
ERROR src/monitor.rs:619: Error inserting log file: Database(PgDatabaseError { severity: Error, code: "23505", message: "duplicate key value violates unique constraint "log_file_pkey"", detail: Some("Key (hostname, log_ts)=(1bf8a802fb14, 2025-08-05 14:06:00) already exists."), hint: None, position: None, where: None, schema: Some("public"), table: Some("log_file"), column: None, data_type: None, constraint: Some("log_file_pkey"), file: Some("nbtinsert.c"), line: Some(666), routine: Some("_bt_check_unique") })
```

I ran it multiple times using both the parallel and non-parallel option. I have no idea where to look next. Thanks for the help! Lukas

docker-compose setup; v1.483.1
4 Replies
rubenf
rubenf · 3w ago
The error you've linked is absolutely unrelated. The log message of the cancellation error should have some details about when the last ping was; you should look at the worker logs around that time for any clue, or check whether a worker restarted, etc.
Lüku
Lüku (OP) · 3w ago
Oh, my fault. I copied the first error from the first try and the second error from the next run. I'll check the corresponding logs.
rubenf
rubenf · 3w ago
Also worth looking at the size of the result of each iteration: how heavy is the return of each iteration?
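A hedged sketch of what this suggestion amounts to: keep each iteration's return small, since large per-iteration results accumulate in the flow state. Assuming the loop body is a Python script, the `process_item` function, its field names, and the size threshold below are all hypothetical illustrations, not Windmill APIs:

```python
import json

def process_item(item: dict) -> dict:
    # Hypothetical heavy work: stands in for whatever the real
    # loop body computes; "raw" simulates a large payload.
    return {"id": item["id"], "status": "ok", "raw": "x" * 1_000_000}

def main(item: dict) -> dict:
    result = process_item(item)
    # Return only a small summary instead of the full result;
    # persist or upload the heavy part elsewhere if it is needed.
    return {
        "id": result["id"],
        "status": result["status"],
        "raw_bytes": len(result["raw"]),
    }

# Rough way to check how heavy a return actually is once serialized:
size = len(json.dumps(main({"id": 1})))
print(size, "bytes (approx, serialized)")
```

Measuring `len(json.dumps(...))` on the return value of one iteration gives a quick estimate of how much data 2000 iterations would add up to.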
Lüku
Lüku (OP) · 3w ago
I see nothing else in the logs. Size is a good idea; I'll try it without returns. Result size it was. Thanks for the help!
