Trevor Sullivan•16mo ago

Suggestion: Deploy Windmill with self-signed TLS certificate by default?

When I deployed Windmill using Docker Compose, it didn't have a self-signed TLS certificate. This would be a good "semi-secure by default" option, that could be disabled if desired. When I self-host applications, I like to know that the network layer is encrypted, even if it is by an untrusted certificate. It mitigates at least the most basic of man-in-the-middle attacks.
17 Replies
rubenf•16mo ago
We thought about that, but most people use external TLS termination, so we should write a guide on how to run a second container that does TLS termination rather than embed it into the docker-compose
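A rough sketch of what such a TLS-terminating sidecar could look like, assuming the Windmill server is reachable inside the compose network as `windmill_server` on port 8000 (both names are assumptions, adjust them to the actual compose file):

```yaml
# Hypothetical docker-compose fragment: a Caddy sidecar that terminates
# TLS with a self-signed (internal CA) certificate and forwards plain
# HTTP to the Windmill server container.
services:
  caddy:
    image: caddy:2
    ports:
      - "443:443"
    # --internal-certs makes Caddy issue the certificate from its own
    # locally generated CA instead of requesting one via ACME.
    command: >
      caddy reverse-proxy
      --from https://localhost
      --to http://windmill_server:8000
      --internal-certs
```

Browsers on other machines would still show an untrusted-certificate warning, since the cert chains to Caddy's local CA rather than a public one.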
Trevor Sullivan•16mo ago
I use Cloudflare Tunnels to encrypt my public endpoint. It would still be nice to encrypt local connections though. I noticed you already have a Caddy container, so maybe a self-signed certificate could just be added to that.
rubenf•16mo ago
Why would you want to use TLS within your own VPC?
Trevor Sullivan•16mo ago
In case someone with a malicious device were able to connect to my wifi network and sniff traffic.
Trevor Sullivan•16mo ago
Looks like Caddy already has self-signed certificates, by default. Maybe it's just a matter of exposing TCP 443 from the Caddy container?
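Caddy's `tls internal` directive does roughly this: it issues the site certificate from Caddy's own locally generated CA instead of an ACME authority. A sketch of the Caddyfile side (the site address and upstream name are placeholders, not Windmill's actual config):

```
# Caddyfile sketch: serve HTTPS using a certificate from Caddy's local CA.
# "windmill.local" and "windmill_server:8000" are placeholder names.
windmill.local {
    tls internal
    reverse_proxy windmill_server:8000
}
```

On top of that, TCP 443 would indeed need to be published from the Caddy container in docker-compose (e.g. `- "443:443"` under `ports:`).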
rubenf•16mo ago
Ah yeah, I was thinking more of the cloud case. It shouldn't be the default, but also Caddy only does it for the localhost host
Trevor Sullivan•16mo ago
Connecting to unencrypted HTTP endpoints, even on my local network, just feels "dirty" 😆
rubenf•16mo ago
I mean, there is a high cost to TLS, so it definitely won't be the default, but it would be nice to have a guide on how to do it with the current docker-compose
Trevor Sullivan•16mo ago
My recommendation would be to enable it by default, but I understand why you don't want to. I think the Caddy "local" thing you're talking about is automatically adding the self-signed certificate to the local trust store. I think it still enables a self-signed certificate by default on external interfaces. Unless I'm interpreting the docs wrong.
rubenf•16mo ago
It's not as easy as that when caddy is within docker-compose
Trevor Sullivan•16mo ago
Are you referring to high cost for CPU cycles / network performance?
rubenf•16mo ago
establishing the connection: the TLS handshake adds extra round trips
Trevor Sullivan•16mo ago
The thing is, I don't care if the self-signed cert is trusted by my local system or not. I still want to connect to the endpoint, and click through the browser warning, or use an HTTP client from code that has the option to ignore certificate checks. I don't run Windmill locally anyway. It's running inside a Linux VM on one of my Linux bare metal hosts.
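For the "HTTP client that ignores certificate checks" part, a minimal Python sketch using only the standard library (the hostname in the usage comment is a placeholder, not a real endpoint):

```python
import ssl
import urllib.request


def insecure_context() -> ssl.SSLContext:
    """TLS context that skips certificate verification.

    Acceptable for talking to a self-signed homelab endpoint;
    never appropriate for production traffic you don't control.
    """
    ctx = ssl.create_default_context()
    # check_hostname must be disabled before verify_mode can be CERT_NONE
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE
    return ctx


# Usage (hostname is a placeholder for your Windmill instance):
# urllib.request.urlopen("https://windmill.local/api/version",
#                        context=insecure_context())
```

This keeps the connection encrypted on the wire while accepting whatever certificate the server presents, which matches the "click through the warning" workflow described above.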
rubenf•16mo ago
I think a guide would be nice, and everyone can choose what they prefer
Trevor Sullivan•16mo ago
I use Windows on the client side 😆
CookieMonster•16mo ago
Just ran into this myself. @pcgeek86, have you tried using the Zero Trust WARP client? That way you get full end-to-end encryption from your local box to your private endpoint within your cloud VPC. You can still have your public hostname and just put it behind Zero Trust Access (L7). The client allows you to split traffic to make sure it goes through the tunnel and not out to the internet and then back to the CDN. Ruben just added a feature that allows you to use the wmill sync functions locally too, by setting CF Service Auth Token headers, since the fetch API can't follow OAuth grant flows to authenticate with the Windmill API.
zsnmwy•16mo ago
GitHub - SteveLTN/https-portal: A fully automated HTTPS server powered by Nginx, Let's Encrypt and Docker.