I've been using the `ai-inference` action in one of my repos for about a month now. I use the same scenario repeatedly in my demos to show its capabilities:
- Introduce a simple type error that will break CI builds.
- Create a PR to trigger CI.
- Wait for the failure to trigger my workflow that runs `ai-inference`.
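For context, the demo workflow looks roughly like this. This is a hypothetical sketch, not my exact file: the trigger, input names (`model`, `prompt`), and the `models: read` permission follow the `actions/ai-inference` README, but your setup may differ.

```yaml
# Hypothetical sketch of the demo workflow described above.
name: Explain CI failure
on:
  workflow_run:
    workflows: ["CI"]
    types: [completed]

jobs:
  explain:
    # Only run when the CI workflow actually failed
    if: ${{ github.event.workflow_run.conclusion == 'failure' }}
    runs-on: ubuntu-latest
    permissions:
      models: read  # required for GitHub Models inference
    steps:
      - name: Ask the model about the failure
        uses: actions/ai-inference@v1
        with:
          model: gpt-4.1
          prompt: |
            Explain why this TypeScript build failed.
```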
Although this has worked in the past (using the same type error), I'm now getting the following error from `ai-inference`:
413 Request body too large for gpt-4.1 model. Max size: 16000 tokens.
Note that I've made sure I'm using the latest version of the Action, and I updated to `gpt-5` to see if that would fix the problem, but that model seems to have an even lower max size (4000 tokens).
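My current workaround idea is to truncate whatever I feed into the prompt before it hits the limit. A minimal sketch, assuming a rough ~4 characters per token heuristic (the real tokenizer will differ, and the 16000-token limit is just the number from the error above):

```python
# Hypothetical helper: cap prompt text so the request stays under the
# model's token limit. chars_per_token=4 is a rough heuristic, not an
# exact tokenizer; trim the budget further if requests still fail.
def truncate_for_model(text: str, max_tokens: int = 16000,
                       chars_per_token: int = 4) -> str:
    max_chars = max_tokens * chars_per_token
    return text[:max_chars]

# Simulated oversized CI log (57 chars per line x 5000 lines = 285000 chars)
log = "error: Type 'string' is not assignable to type 'number'.\n" * 5000
prompt = truncate_for_model(log)
print(len(prompt))  # 64000
```

This at least avoids the 413, though it can cut off the part of the log that actually matters, so it's a stopgap rather than a fix.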
Is anyone able to help me understand why this failure is happening, and which model(s) may fix it?
florinacheron