Currently, we are using Deno as a forked process rather than as an integrated library. The Deno project doesn't provide Deno as a lib, so I forked it: https://crates.io/crates/deno_cli_lib_windmill I hope to achieve ultimate performance through this scheme: transpiling the TypeScript to JS at the time the script is saved (which is impossible currently since Deno doesn't expose that as a primitive), and pre-warming V8 isolates in a pool so they don't have to be loaded at execution time. Furthermore, we will gain a lot by avoiding a system fork. Will update this thread as I make progress. Goal: naive Deno execution under 10ms
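The pre-warming idea above can be sketched as a plain object pool: pay the expensive runtime initialization ahead of time, then hand out already-warm runtimes at execution time. This is a minimal, self-contained illustration using std channels only; `WarmRuntime` is a hypothetical stand-in for a real V8 isolate / `deno_core` runtime, not the actual fork's API.

```rust
use std::sync::mpsc;

// Hypothetical stand-in for an initialized V8 isolate; in the real
// project this would wrap something like a deno_core JsRuntime.
struct WarmRuntime {
    id: usize,
}

impl WarmRuntime {
    fn new(id: usize) -> Self {
        // Expensive initialization (snapshot loading, etc.) happens
        // here, ahead of any request.
        WarmRuntime { id }
    }

    fn run(&self, script: &str) -> String {
        // Placeholder for actual script execution.
        format!("runtime {} executed: {}", self.id, script)
    }
}

// Pre-warm a fixed number of runtimes into a channel acting as a pool.
fn warm_pool(size: usize) -> (mpsc::Sender<WarmRuntime>, mpsc::Receiver<WarmRuntime>) {
    let (tx, rx) = mpsc::channel();
    for id in 0..size {
        tx.send(WarmRuntime::new(id)).unwrap();
    }
    (tx, rx)
}

fn main() {
    let (tx, rx) = warm_pool(4);
    // At execution time, take a warm runtime instead of paying startup cost.
    let rt = rx.recv().unwrap();
    println!("{}", rt.run("console.log('hi')"));
    // Return it to the pool for reuse.
    tx.send(rt).unwrap();
}
```

The channel doubles as the pool's queue: `recv` blocks when all runtimes are checked out, and sending a runtime back makes it available again.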
Is it not possible to have the "main worker" live in the server and invoke each Deno script? So you don't have to invoke `deno run`?
From a user perspective it sure is nice if things are standardized, so you know they are battle-tested and you are able to move scripts between services.
What I had in mind is a deploy function: it takes a set of scripts and a snapshot of the resources/variables, and produces a bundle that you can deploy anywhere. And we'd include a service to deploy it alongside Windmill using Cloudflare Workers or this
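The deploy function described above might look like this, purely as a sketch of the shape of the idea; every type and field name here is hypothetical, not an existing Windmill API:

```rust
use std::collections::HashMap;

// Hypothetical input: a script plus its source code.
struct Script {
    path: String,
    code: String,
}

// Hypothetical output: a self-contained bundle, deployable anywhere.
struct Bundle {
    entries: Vec<String>,
    env: HashMap<String, String>,
}

// Takes a set of scripts and a snapshot of resources/variables,
// and produces a bundle, as described in the message above.
fn deploy(scripts: Vec<Script>, resources: HashMap<String, String>) -> Bundle {
    Bundle {
        entries: scripts.into_iter().map(|s| s.path).collect(),
        // The snapshot is baked into the bundle so it no longer
        // depends on the Windmill server at runtime.
        env: resources,
    }
}

fn main() {
    let scripts = vec![Script {
        path: "main.ts".into(),
        code: "export function main() {}".into(),
    }];
    let bundle = deploy(scripts, HashMap::new());
    println!("bundle entries: {:?}", bundle.entries);
}
```

Baking the resource/variable snapshot into the bundle is what makes it deployable to a target like Cloudflare Workers without a live connection back to Windmill.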
So I think you'd get the full loop: you prototype within Windmill and can use it straight up to a certain point, and once you're at millions of req/s you can still keep Windmill as the source of truth to bundle and deploy stuff. You wouldn't even need to use the service; the CLI can probably do the bundling.