Windmill

Nacki · 11mo ago · 2 replies

Using o3-mini for chat/code gives error

When the AI assistant is configured to use OpenAI's o3-mini model, the following error appears when trying to use AI features in the editor:

Failed to send request: 400 Unsupported parameter: 'max_tokens' is not supported with this model. Use 'max_completion_tokens' instead.
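For context on what the error means on the API side (not Windmill's own code): OpenAI's reasoning models such as o3-mini reject the legacy `max_tokens` parameter and expect `max_completion_tokens` instead, so a client has to pick the parameter based on the model family. A minimal illustrative sketch, assuming the caller builds the request body itself (the function and prefix check are hypothetical, not Windmill internals):

```python
def build_chat_request(model: str, messages: list, limit: int) -> dict:
    """Build a Chat Completions request body, choosing the token-limit
    parameter the model accepts."""
    request = {"model": model, "messages": messages}
    # Reasoning models (o1/o3 family) only accept 'max_completion_tokens';
    # older chat models still use 'max_tokens'.
    if model.startswith(("o1", "o3")):
        request["max_completion_tokens"] = limit
    else:
        request["max_tokens"] = limit
    return request

req = build_chat_request("o3-mini", [{"role": "user", "content": "hi"}], 256)
# 'req' now carries 'max_completion_tokens' rather than 'max_tokens',
# which avoids the 400 error shown above.
```

A client that always sends `max_tokens`, as in the error, triggers the 400 response for o3-mini regardless of the value.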
Recent Announcements
henri-c

Weekly keynote to tell you about our latest updates https://discord.com/channels/930051556043276338/1278977038430240813 https://youtube.com/live/2dGd9TdT8xs?feature=share

henri-c · 4d ago

Pyra

### HTTP tracing (EE)

Capture HTTP requests made by job scripts as observability spans.

Features:
- View HTTP request traces (method, URL, status, timing) in the job details UI
- Auto-instrumentation for Native TypeScript, MITM proxy for other languages
- Integrates with external OpenTelemetry collectors

Changelog: https://www.windmill.dev/changelog/http-tracing
Docs: https://www.windmill.dev/docs/advanced/instance_settings#http-tracing

Additionally, job memory metrics are now fully OSS!

Pyra · 2w ago

henri-c

First keynote of the year here https://discord.com/channels/930051556043276338/1278977038430240813 🙂

henri-c · 4w ago

Similar Threads

Internal server error when using ai code generator
sigcore / help
3y ago
Shared logic import in deno gives lsp error , though it actually works
ym1198 / help
3y ago
`Use pre-made failure script` gives empty search result for Workspace scripts filter
ym1198 / help
3y ago
VS code extension returns an error when pushing changes
atoo / help
2y ago