
Testing on latency optimized inference feature is prevented from the parameter check on ChatBedrockConverse._converse_params #307

Open
haandol opened this issue Dec 18, 2024 · 1 comment

Comments


haandol commented Dec 18, 2024

boto3 supports the `performanceConfigLatency` parameter on the converse API, according to the [docs](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/bedrock-runtime/client/converse.html),
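For context, a sketch of how latency-optimized inference can be requested through boto3 directly (the model ID is an example; per the linked Converse API docs the request carries a `performanceConfig` structure with a `latency` field). The boto3 lines are commented out since they need AWS credentials and model access:

```python
# import boto3
# client = boto3.client("bedrock-runtime", region_name="us-east-1")

# Request payload for the Converse API; "optimized" asks for
# latency-optimized inference per the linked documentation.
request = {
    "modelId": "anthropic.claude-3-5-haiku-20241022-v1:0",  # example model id
    "messages": [{"role": "user", "content": [{"text": "Hello"}]}],
    "performanceConfig": {"latency": "optimized"},
}

# response = client.converse(**request)
```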

but `ChatBedrockConverse._converse_params` does not accept that parameter, so passing it raises an error:

ChatBedrockConverse._converse_params() got an unexpected keyword argument 'performanceConfigLatency'
haandol changed the title from "ChatBedrockConverse occurs error on kwargs, performanceConfigLatency." to "Testing on latency optimized inference feature is prevented by the parameter check on ChatBedrockConverse._converse_params", then to "Testing on latency optimized inference feature is prevented from the parameter check on ChatBedrockConverse._converse_params" on Dec 18, 2024.
haandol closed this as completed, then reopened it, on Dec 18, 2024.

carl-krikorian commented Jan 2, 2025

Any updates on this? Is there a way to run latency-optimized inference with langchain_aws?
EDIT: I found #315. Can we get this merged? Let me know if I can help with that.
