fix(LiteLlm): add fallback user message to prevent empty content error with include_contents='none' #3255
base: main
Conversation
…messages empty due to include_contents='none'
Summary of Changes: Hello @ishanrajsingh, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request resolves an issue where the LiteLlm integration sends an empty content payload to the provider when include_contents='none', causing a BadRequestError; the fix adds a fallback user message so the request always contains non-empty content.
Response from ADK Triaging Agent: Hello @ishanrajsingh, thank you for creating this PR! To help reviewers review your PR more efficiently, could you please provide the following:
Thanks!
Code Review
This pull request addresses an issue where the LiteLlm integration raises an error when include_contents='none' is set, because the content payload sent to the provider is empty. The fix adds a fallback user message to ensure a non-empty content array is always sent to the LLM provider. I have identified a potential issue with the placement of the fallback logic that could lead to unexpected behavior.
handle empty file content
GitHub Issue: #3242

Testing Plan: I verified the fix by running a scenario where an LlmAgent uses LiteLlm with include_contents='none'. The test confirms that the LLM request no longer sends empty content, preventing the BadRequestError from the LLM provider. Additionally, I ran the existing LiteLlm unit tests to ensure no regressions occur. For thoroughness, a new test can be added simulating include_contents='none' to confirm fallback content insertion.
/gemini review
Code Review
This pull request addresses a critical issue where the LiteLlm integration fails when include_contents='none' due to an empty content payload. The fix introduces a fallback user message to ensure a non-empty content array is always sent to the LLM provider. I have added a review comment to suggest a minor improvement to the fallback message.
…messages empty due to include_contents='none'
/gemini review
Code Review
This pull request addresses a critical issue where LiteLlm integration with include_contents='none' results in an empty content payload, causing a BadRequestError. The fix introduces a fallback user message when no messages are included, ensuring a non-empty content array is always sent. This review focuses on the correctness and maintainability of the implemented solution, ensuring it effectively prevents the error and adheres to best practices.
Co-authored-by: gemini-code-assist[bot] <176961590+gemini-code-assist[bot]@users.noreply.github.com>
This fix addresses an issue in the LiteLlm integration where setting include_contents='none' causes the content payload sent to the LLM provider to be empty, resulting in a BadRequestError. The patch adds a minimal fallback user message in the generate_content_async method when no messages are included, ensuring a non-empty content array is always sent. This prevents errors from providers that require non-empty input and improves compatibility when using LiteLlm with the Agent Development Kit.
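For illustration, a minimal sketch of the fallback idea follows. It is not the exact patch: the helper name, the fallback text, and the message shape are assumptions chosen to show the mechanism.

```python
# Illustrative sketch only; the helper name and fallback text are assumptions,
# not the code in this PR.
_FALLBACK_USER_MESSAGE = {
    "role": "user",
    "content": "Handle the requests as specified in the System Instruction.",
}


def _ensure_non_empty_messages(messages: list[dict]) -> list[dict]:
  """Appends a minimal user message when only system content (or nothing) is present."""
  if not any(m.get("role") != "system" for m in messages):
    # With include_contents='none' the agent forwards no conversation history,
    # so the provider would otherwise receive an empty (or system-only)
    # message list and reject the request with a BadRequestError.
    return messages + [_FALLBACK_USER_MESSAGE]
  return messages
```

In generate_content_async, a check like this would run after the LlmRequest contents are converted to LiteLLM chat messages and before the litellm call, so the provider always receives at least one user message.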
GitHub Issue:
This PR addresses the bug reported in issue #3242 titled "LiteLlm + include_contents='none' results in empty content sent to the provider -> Error". The issue is already created and linked here: #3242
Testing Plan:
I verified the fix by running a scenario where an LlmAgent uses LiteLlm with include_contents='none'.
The test confirms that the LLM request no longer sends empty content, preventing the BadRequestError from the LLM provider.
Additionally, I ran existing LiteLlm unit tests to ensure no regressions occur.
For thoroughness, a new test can be added simulating include_contents='none' to confirm fallback content insertion.
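For reference, here is a minimal sketch of the verification scenario described above; the agent name, model name, and instruction are illustrative, and running it end to end requires provider credentials and a Runner.

```python
# Minimal reproduction/verification sketch. Sending a message to this agent
# previously triggered the provider's BadRequestError, because no conversation
# history is forwarded when include_contents='none'.
from google.adk.agents import LlmAgent
from google.adk.models.lite_llm import LiteLlm

agent = LlmAgent(
    name="stateless_agent",
    model=LiteLlm(model="openai/gpt-4o-mini"),
    instruction="Answer using only the information in this instruction.",
    include_contents="none",  # no history is sent to the model
)
```

A unit test built around this scenario could mock the litellm call and assert that the messages it receives contain at least one non-system entry.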