
error: 413 Request body too large for gpt-4.1 model. #128

@ldennington

Description

I've been using the ai-inference action in one of my repos for about a month now. I use the same scenario repeatedly in my demos to show its capabilities (a rough sketch of the workflow follows the list):

  1. Introduce a simple type error that will break CI builds.
  2. Create a PR to trigger CI.
  3. Wait for the failure to trigger my workflow that runs ai-inference.
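For context, here is a minimal sketch of the kind of workflow I'm describing. The workflow name, prompt text, and the exact input/output names (`model`, `prompt`, the `response` output) are illustrative here rather than copied verbatim from my repo:

```yaml
# Minimal sketch of a workflow that runs ai-inference after a CI failure.
# Names, trigger filters, and inputs below are illustrative assumptions.
name: AI failure triage

on:
  workflow_run:
    workflows: ["CI"]            # the CI workflow broken by the type error
    types: [completed]

jobs:
  explain-failure:
    # Only run when the CI workflow concluded with a failure.
    if: ${{ github.event.workflow_run.conclusion == 'failure' }}
    runs-on: ubuntu-latest
    permissions:
      contents: read
      models: read               # needed for GitHub Models inference
    steps:
      - id: inference
        uses: actions/ai-inference@v1
        with:
          model: gpt-4.1         # the model now returning the 413 error
          prompt: |
            The CI run on ${{ github.event.workflow_run.head_branch }} failed.
            Explain the most likely cause of the failure.

      - name: Show the model's response
        run: echo "${{ steps.inference.outputs.response }}"
```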

Although this scenario has worked in the past (using the same type error), I'm now getting the following error from ai-inference:

413 Request body too large for gpt-4.1 model. Max size: 16000 tokens.

Note that I've made sure I'm using the latest version of the action, and I also tried switching to gpt-5 to see if that would fix the problem, but that model seems to have an even lower limit (4000 tokens).

Is anyone able to help me understand why this failure is happening, or which model(s) might avoid it?
