feat: add ml.llm.Claude3TextGenerator model #901
Conversation
        _GEMINI_1P5_PRO_FLASH_PREVIEW_ENDPOINT,
    )

    _CLAUDE_3_SONNET_ENDPOINT = "claude-3-sonnet"
Could you add all BQML-supported models?
Done. Leaving tests to be added.
Sorry, I thought you were going to add tests in this PR itself. Are you going to send another PR for this?
Yes, I need to set up the connection for other regions first.
        (https://cloud.google.com/products#product-launch-stages).

        Args:
            model_name (str, Default to "claude-3-sonnet"):
We use "Default to" in all the APIs... Keeping it for now.
    @log_adapter.class_logger
    class Claude3TextGenerator(base.BaseEstimator):
        """Claude3 text generator LLM model.
It looks like the "Consumer Procurement Entitlement Manager" Identity and Access Management (IAM) role is an additional requirement (https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/use-partner-models#set-permissions). We should document this in the class docstring, and after the release, in the reference docs: https://cloud.google.com/bigquery/docs/use-bigquery-dataframes#remote-models
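One way the requested docstring note might read — the exact wording and the `.. note::` directive are suggestions, and the role ID shown in the comment is taken from the linked Vertex AI page, not from this PR:

```python
# Hypothetical sketch of the docstring addition requested in the review;
# the surrounding decorator and base class are omitted for brevity.
class Claude3TextGenerator:
    """Claude3 text generator LLM model.

    .. note::
        In addition to the BigQuery connection setup, using Anthropic partner
        models requires granting the connection's service account the
        "Consumer Procurement Entitlement Manager" Identity and Access
        Management (IAM) role. See
        https://cloud.google.com/vertex-ai/generative-ai/docs/partner-models/use-partner-models#set-permissions
    """
```

Putting the requirement in the docstring surfaces it in the generated reference docs, addressing the second half of the comment.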
    @pytest.fixture(scope="session")
    def session_us_east5() -> Generator[bigframes.Session, None, None]:
I forgot to remove this in this PR, but it will be used in the follow-up PR that adds the tests.
Fixes b/359901494 🦕