@aothelal commented Oct 1, 2025

What this does

This change introduces a way to handle generation parameters that differ from one provider to another. This PR also includes a fix for issue #386, which is specifically about how Gemini handles max tokens.

This fixes the issue for Gemini, but other providers may be affected as well. I'm open to checking the rest if the maintainers are good with the idea.
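To illustrate the idea, here is a minimal sketch of mapping a shared max-tokens option onto each provider's expected request shape. The module and method names are hypothetical, not RubyLLM's actual internals; the one provider-specific detail assumed here is that the Gemini API expects the limit under generationConfig.maxOutputTokens, whereas OpenAI-style APIs take a top-level max_tokens field.

```ruby
# Hypothetical sketch of provider-specific generation params.
# Names (GenerationParams, apply) are illustrative only.
module GenerationParams
  # Gemini nests the token limit under generationConfig.maxOutputTokens;
  # most OpenAI-style providers accept a top-level max_tokens field.
  def self.apply(provider, payload, max_tokens:)
    case provider
    when :gemini
      payload[:generationConfig] ||= {}
      payload[:generationConfig][:maxOutputTokens] = max_tokens
    else
      payload[:max_tokens] = max_tokens
    end
    payload
  end
end

GenerationParams.apply(:gemini, {}, max_tokens: 256)
# => { generationConfig: { maxOutputTokens: 256 } }
```

The point of centralizing this is that callers keep passing one option, and only the per-provider mapping layer knows each API's field name and nesting.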

Type of change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Performance improvement

Scope check

  • I read the Contributing Guide
  • This aligns with RubyLLM's focus on LLM communication
  • This isn't application-specific logic that belongs in user code
  • This benefits most users, not just my specific use case

Quality check

  • I ran overcommit --install and all hooks pass
  • I tested my changes thoroughly
    • For provider changes: Re-recorded VCR cassettes with bundle exec rake vcr:record[provider_name]
    • All tests pass: bundle exec rspec
  • I updated documentation if needed
  • I didn't modify auto-generated files manually (models.json, aliases.json)

API changes

  • Breaking change
  • New public methods/classes
  • Changed method signatures
  • No API changes

Related issues

Fixes #386: [BUG] Gemini provider sends incorrect max_tokens parameter
