GraphQL schema length causes token limit to be exceeded
Hi, I'm trying to generate GraphQL queries with GPT-4, but I'm running into an issue caused by the size of the GraphQL schema (from Linear) that is automatically included as context in the prompt. The schema bloats the prompt to 138,134 tokens, far exceeding the 12k token limit.
Any ideas how the context size can be adjusted?
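In case it helps others hitting the same wall: one workaround is to prune the schema down to only the types a given query actually needs before it goes into the prompt. Below is a rough, hypothetical sketch (it assumes a simple SDL with flat `type`/`input`/`enum`/`interface` blocks, no directives or extensions, so it would need hardening for a real schema like Linear's):

```python
import re

def prune_schema(sdl: str, keep: set[str]) -> str:
    """Keep only the SDL type definitions reachable from the names in `keep`.

    A rough heuristic: regex-match top-level type blocks, then transitively
    follow every capitalized word in a kept block as a candidate type name.
    """
    # Capture (full block text, type name) for blocks like: type Issue { ... }
    pattern = r"((?:type|input|enum|interface)\s+(\w+)\s*\{[^}]*\})"
    blocks = {name: block for block, name in re.findall(pattern, sdl)}

    reachable: set[str] = set()
    frontier = set(keep)
    while frontier:
        name = frontier.pop()
        if name in reachable or name not in blocks:
            continue  # already handled, or a scalar like String/ID
        reachable.add(name)
        # Every capitalized word in the block is a candidate type reference.
        frontier.update(re.findall(r"\b([A-Z]\w*)\b", blocks[name]))

    return "\n\n".join(blocks[n] for n in blocks if n in reachable)
```

For example, pruning to just the types reachable from `Issue` would drop unrelated types and could cut the schema (and the prompt) dramatically; another option is to skip schema context entirely and only paste in the type definitions relevant to the query you're asking for.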