VinF Hybrid Inference #1: Update the API #8874
erikeldridge wants to merge 17 commits into vertexai-hybridinference-integration
Conversation
  "lib": [
-   "ESNext"
+   "ESNext",
+   "dom"
Without this, adding the @types/dom-chromium-ai dependency causes the postsubmit script to fail. I'll try reproing the issue on a stand-alone branch.
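For context, the relevant compiler options after this change would look roughly like the fragment below (a sketch; the repo's actual `tsconfig.json` contains more settings):

```json
{
  "compilerOptions": {
    "lib": [
      "ESNext",
      "dom"
    ]
  }
}
```

Adding `"dom"` makes the DOM lib types available, which the `@types/dom-chromium-ai` package builds on.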
  ]
  }
  /**
   * Toggles hybrid inference.
Could you add the @public tag here and in other JSDoc comments for public APIs?
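For reference, a JSDoc block with the `@public` tag might look like this (the `HybridParams` shape below is illustrative; only the `mode` field is named elsewhere in this PR):

```typescript
/**
 * Toggles hybrid inference.
 *
 * @public
 */
export interface HybridParams {
  /** Indicates whether hybrid inference is intended (field name from the PR description). */
  mode: string;
}

// Example value conforming to the interface.
const params: HybridParams = { mode: "prefer_on_device" };
```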
| /** | ||
| * Defines the name of the default in-cloud model to use for hybrid inference. | ||
| */ | ||
| static DEFAULT_HYBRID_IN_CLOUD_MODEL = 'gemini-2.0-flash-lite'; |
Could we avoid defining a default model? If this model string is deprecated in the future, and a user is using an old version of the SDK, would using this default result in an error?
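One way to address this concern, sketched below under assumed names (`inCloudModel`, `resolveInCloudModel`, and the remote-default parameter are all hypothetical): leave the in-cloud model optional on the params object and resolve it at call time from a caller- or server-supplied value, so an old SDK version never bakes in a model string that may later be deprecated.

```typescript
interface HybridParams {
  mode: string;
  inCloudModel?: string; // optional; no compile-time default constant in the SDK
}

// Fall back to a value supplied at runtime (e.g. from remote config)
// instead of a constant that can go stale in old SDK versions.
function resolveInCloudModel(params: HybridParams, remoteDefault: string): string {
  return params.inCloudModel ?? remoteDefault;
}

console.log(resolveInCloudModel({ mode: "prefer_in_cloud" }, "gemini-2.0-flash-lite"));
// → "gemini-2.0-flash-lite"
```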
This change updates the `getGenerativeModel` getter to accept `ModelParams` or `HybridParams`. The `HybridParams.mode` field indicates whether hybrid inference is intended.

This change just updates the inputs. All existing tests pass, so the diff should be relatively easy to understand. The next change will use the inputs.
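A minimal sketch of what the updated input handling might look like (the parameter shapes, return values, and the `isHybridParams` guard are assumptions for illustration; only `HybridParams.mode` is named in the PR description):

```typescript
interface ModelParams {
  model: string;
}

interface HybridParams {
  mode: string;            // presence of `mode` signals hybrid inference
  inCloudParams?: ModelParams;
}

// Narrow the union by checking for the `mode` discriminant.
function isHybridParams(p: ModelParams | HybridParams): p is HybridParams {
  return "mode" in p;
}

// Stand-in for the real getter: returns a label describing which path was taken.
function getGenerativeModel(params: ModelParams | HybridParams): string {
  if (isHybridParams(params)) {
    return `hybrid:${params.mode}`;
  }
  return `cloud:${params.model}`;
}

console.log(getGenerativeModel({ model: "gemini-2.0-flash" })); // → "cloud:gemini-2.0-flash"
console.log(getGenerativeModel({ mode: "prefer_on_device" })); // → "hybrid:prefer_on_device"
```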