"Bare metal" workers
Is it possible to run workers outside of Docker or Kubernetes that connect to the Windmill server?
Submodule not working with Python
pip install beem
from beemapi.exceptions import NumRetriesReached
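For reference, a sketch of the usual cause and fix, with two assumptions: that Windmill infers packages from the top-level import name (so beemapi would be looked up on PyPI instead of beem), and that a #requirements: comment block is still accepted to override that inference:

#requirements:
#beem

from beemapi.exceptions import NumRetriesReached

def main():
    # beemapi ships inside the `beem` distribution on PyPI, so the pin above
    # points the installer at the right package name.
    return NumRetriesReached.__name__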
Install Python module from GitHub
I have a module that exists on PyPI, but we have a fixed version on GitHub.
Is there a way to use this with Windmill? Normally, I would run pip install git+https://github.com/x/x.git to make sure we grab the right one.
Since you are auto-installing packages, how do you tell it which one to get?
...
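A hedged sketch, under the assumption that an explicit #requirements: comment override is available and is handed to pip (pip itself does accept git+ URLs); the module name x below is just the placeholder from the question:

#requirements:
#git+https://github.com/x/x.git

import x  # hypothetical: whatever module the pinned GitHub repo provides

def main():
    # With the pin above, the auto-installer should not fall back to the
    # PyPI release of the same name.
    return x.__name__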
Generating tokens for apps?
I have three different "apps" that will be using the APIs I'm building in our WM instance, two web apps and one server app. I want each of them to use their own API token and for the WM Runs view to show which app triggered the run. But right now tokens are tied to users. If I want to do what I describe - is my only option to create a per-app "user" in WM and create the token in that user account?
sleep (seconds) -> sleepUntil (specific date)
Is there a way to wait to execute something until a specific date without needing to calculate sleep in seconds or use cron?
Something similar to what inngest does with sleepUntil (https://www.inngest.com/docs/reference/functions/step-sleep-until)....
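Until something like sleepUntil exists, a minimal sketch of the calculation the question wants to avoid writing by hand, kept as a plain Python step; wiring its output into the flow's sleep setting is the assumption here:

from datetime import datetime, timezone

def main(wake_at: str = "2024-07-01T09:00:00+00:00"):
    # Parse the target timestamp and work out how long is left until then.
    target = datetime.fromisoformat(wake_at)
    remaining = (target - datetime.now(timezone.utc)).total_seconds()
    # Clamp to zero so a date in the past does not produce a negative sleep.
    return {"sleep_seconds": max(0, int(remaining))}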
How can PowerShell return a JSON array?
Is there a way for PowerShell to return a JSON array?
I have written a script that gets a JSON array from a RESTful API, but when I write Write-Output $result (the JSON array), the return value of the PowerShell script is only "]" 😦...
Has somebody already used Elasticsearch as a destination?
I want to collect data from some RESTful API sources into Elasticsearch, but an Elasticsearch destination is not implemented.
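In the absence of a built-in destination, a rough sketch of doing it from a plain Python script, assuming the official elasticsearch client (8.x keyword style) and a reachable cluster; URLs, auth, and index name are placeholders:

import requests
from elasticsearch import Elasticsearch

def main(api_url: str, es_url: str, index: str = "api-data"):
    # Pull the records from the RESTful source.
    records = requests.get(api_url, timeout=30).json()

    # Push them into Elasticsearch one by one (bulk helpers would be the
    # next step for real volumes).
    es = Elasticsearch(es_url)
    for record in records:
        es.index(index=index, document=record)
    return {"indexed": len(records)}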
Problem with sleep
Hey guys,
We are facing an issue with sleep in a flow. The scenario: we have a step in the flow which needs to sleep for 6 hours and then execute the next step. The problem is that after the sleep time completes, the flow shuts down. See the example in the screenshot: it shows the error "Job Cancelled" and that the HTTP connection was broken by sanket. But we are not touching the flow while it's sleeping.
My observation is that it always crashes after completing the sleep time....
Scripts missing from flow after update
I've just updated for the first time in a while, and now when I try to add a workspace script to a flow I can only see two of my five folders.
The rest of my folders and their contents are still visible in the workspace home, and the scripts that had been added to the flow before I updated are still there. When I open any of the "missing" Python scripts I'm also asked to save or discard changes despite making no alterations....
OpenAI API key
OpenAI says it now uses project-based keys instead of user keys. When I create a project-based key with all permissions, Windmill says "incorrect API key provided". Am I creating this key in the wrong place in OpenAI?
Staying in sync
I've read through the docs on git sync and wmill cli sync. I have wmill syncing working. When I make changes via the wmill UI, if I want them in my local copy of the workspace I run wmill sync pull. If I make changes to the local workspace, I run wmill sync push, approve the diff and the changes go back to the wmill workspace storage. If I look at the history of a changed file, I see the history of changes. Now I want this workspace to also be synced to a git repo and I have some questions:
If I add git sync to the workspace, do I also need to clone the repo to work locally or do I just use wmill cli for local work and push changes back to the workspace?
For git sync, what if changes happen to the repo outside of wmill, how do I get wmill to pull the latest code from the repo?...
Multiple azure_blob resources
Is it possible to have more than one azure_blob resource per workspace that can work with the S3 File Uploader app component?
The goal is to have the uploaded files in a different container than the one the first resource points to.
I created a second azure_blob resource and configured the workspace S3 storage settings for it to be used as secondary storage. When using the S3 File Uploader in an app it uses the first azure_blob resource by default and I see no way to change it....
clear worker queue
Is there a way to purge the queue? We have hundreds of stuck tasks and cancel all does not do anything
Google Auth Tokens Exposed?
I am working on a project that exposes some API endpoints but would like to secure it on the server-side. Does Windmill Cloud App expose a way to fetch OAuth tokens/create SSO that I could pass to the API endpoint for user verification?
Bun installer appears to fail during script launch.
I just finished setting up a self-hosted install of WM. I can run a Deno TypeScript test without problems, but if I run a Bun test, the memory just keeps building until eventually I see this:
ExecutionErr: error during execution of the script:
process terminated by signal: Some(
9,...
Python imports
I am trying to import Azure's DocumentIntelligenceClient with the statement:
from azure.ai.documentintelligence import DocumentIntelligenceClient
but keep getting the error: ModuleNotFoundError: No module named 'azure'...
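For reference, a hedged sketch of the usual fix: the azure.ai.documentintelligence import path ships in the PyPI distribution azure-ai-documentintelligence, which import-based resolution may not infer from the bare azure namespace, so pin it explicitly (the #requirements: override is an assumption about the deployment):

#requirements:
#azure-ai-documentintelligence

from azure.ai.documentintelligence import DocumentIntelligenceClient
from azure.core.credentials import AzureKeyCredential

def main(endpoint: str, key: str):
    # The explicit pin above installs the distribution that actually
    # provides azure.ai.documentintelligence.
    client = DocumentIntelligenceClient(endpoint=endpoint, credential=AzureKeyCredential(key))
    return str(client)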
Triggering a script using a synchronous webhook from another script within a self-hosted Docker container
Hi all!
Windmill and Docker semi-noob here, running into an issue for the last 2 hours that I can't seem to figure out. I am trying to call a script from another script within the same Docker container. I am able to get it to work async, but when I try to do it synchronously, it seems to trigger the script, but then it runs indefinitely and never returns anything. Nor can I see any logs of relevance. I'm using http://windmill_server:8000 as the base of the route, and I feel like the issue is related to them both being on the same Docker network.
If anyone has any insight on how to use synchronous webhooks to trigger scripts I'd be immensely appreciative. Thank you!...
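A minimal sketch of a synchronous call between two scripts on the same Docker network, with several assumptions flagged in the comments: the workspace name and target path are placeholders, run_wait_result is taken to be the synchronous webhook route, and WM_TOKEN is the token Windmill injects into running jobs:

import os
import requests

def main():
    # Token injected by Windmill into every job; it authenticates the call.
    token = os.environ["WM_TOKEN"]
    # Hypothetical workspace and script path; run_wait_result is the
    # synchronous variant of the webhook, /jobs/run/p/... the async one.
    url = "http://windmill_server:8000/api/w/your_workspace/jobs/run_wait_result/p/u/someuser/target_script"
    resp = requests.post(
        url,
        headers={"Authorization": f"Bearer {token}"},
        json={"some_arg": "value"},
        timeout=600,
    )
    resp.raise_for_status()
    return resp.json()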