Stefan-neo · 8mo ago

GraphQL schema length causes token limit to be exceeded

Hi, I'm trying to write GraphQL queries via GPT-4 code generation but run into an issue caused by the size of the GraphQL schema (from Linear), which is automatically used as context in the prompt. The schema bloats the prompt to 138,134 tokens, thereby exceeding the GPT-4 12k token limit. Any ideas on how the context size can be adjusted?
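For reference, this is a minimal sketch of how the token count can be checked, assuming Python with the tiktoken package and a locally downloaded copy of the schema file (filename is illustrative):

```python
# Minimal sketch: measure how many tokens the Linear GraphQL schema
# would add to a GPT-4 prompt. Assumes tiktoken is installed and the
# schema has been saved locally as schema.graphql.
import tiktoken

with open("schema.graphql", "r", encoding="utf-8") as f:
    schema = f.read()

enc = tiktoken.encoding_for_model("gpt-4")
tokens = enc.encode(schema)
print(f"schema size: {len(schema)} chars, {len(tokens)} tokens")
```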
4 Replies
rubenf · 8mo ago
We will take a look tomorrow, @Hugo.
Stefan-neo · 8mo ago
Alright, thanks a lot for looking into it, Ruben.
Edit: For context, here's the Linear GraphQL schema: https://github.com/linear/linear/blob/master/packages/sdk/src/schema.graphql
Stefan-neo · 8mo ago
Hi @rubenf, just a brief poke as a follow-up. Can I help out by filing an issue in https://github.com/windmill-labs/windmill/issues?
Edit: Just found commit https://github.com/windmill-labs/windmill/commit/4557e7beb40de12b3d26708968f855ecbf49e7b5, thanks! Looking forward to testing on v1.292.4; the current cloud plan seems to be on EE v1.292.0-2-gb3e53de94.
Hugo C. · 8mo ago
Sorry for the lack of response. We now truncate the schema to approximately 100k tokens, with a warning shown in the AI popup menu.
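For anyone hitting the limit on an older version, here is a rough sketch of the same idea in Python with tiktoken. This is a hypothetical helper for illustration only, not the actual Windmill implementation:

```python
# Rough sketch: cut a large schema off at a fixed token budget before it
# is injected into the prompt. Hypothetical helper, not Windmill's code.
import tiktoken

def truncate_to_tokens(text: str, budget: int = 100_000, model: str = "gpt-4") -> str:
    enc = tiktoken.encoding_for_model(model)
    tokens = enc.encode(text)
    if len(tokens) <= budget:
        return text
    # Keep only the first `budget` tokens and decode back to text.
    return enc.decode(tokens[:budget])
```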