Get help from the marimo community

Updated 3 months ago

Google Vertex AI

At a glance

Community members discuss how to enable the Claude AI assistant from Google Vertex AI in their workflow. They explore options such as using the CLINE VSCode extension, integrating with LiteLLM to manage different AI providers, and checking whether Vertex AI's Claude is compatible with the OpenAI API format that the Gemini AI assistant uses in the marimo notebook environment. They also mention issues with hot reloading changes made in the Python file back into the marimo notebook. There is no explicitly marked answer, but members offer suggestions and guidance on how Vertex AI's Claude might be integrated into the workflow.

Useful resources
Is there a way for Claude from Google Vertex AI to be enabled?

Not sure if I can get an API key for this, since it's authenticated through a CLI and browser sign-in for it to work in the CLINE VSCode extension.
7 comments
Do you mean for it to be available as an option for AI assist (like how support exists for OpenAI, Google Gemini and Claude)?

If you want to learn about AI support in marimo, I would refer you to this comment on GitHub - https://github.com/marimo-team/marimo/issues/2773#issuecomment-2453905633; it might help you understand which of the options you'd want Vertex AI included in.
Mhm, I'll look into it. For now I'm thinking about a workaround: making the cells in marimo, but opening the actual .py file in VSCode with my Copilot and CLINE with Claude from Vertex. But I've noticed that simply refreshing the notebook after making a change in the .py file and saving doesn't trigger any hot reload, not even a page refresh, only a kernel restart. Any ideas?
Also, there are a bunch of other (to some extent) free APIs, and then there is LiteLLM, which can translate API calls from one schema to another. I think it could be integrated with marimo by running in a separate terminal on the same venv and transposing API calls from one schema to the other. But this probably doesn't apply to the Google Vertex AI hosted Claude, as that one is authenticated at the machine level or something.
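To make the "transpose API calls from one schema to another" idea concrete, here is a minimal sketch of the kind of translation a tool like LiteLLM performs: mapping an Anthropic-style request body (top-level `system` field) into the OpenAI ChatCompletion shape (system prompt as the first message). The function and model name are illustrative, not part of any library's API.

```python
def anthropic_to_openai(payload: dict) -> dict:
    """Hypothetical converter: Anthropic-style request -> OpenAI chat format."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as the first message with role "system".
    if "system" in payload:
        messages.append({"role": "system", "content": payload["system"]})
    messages.extend(payload.get("messages", []))
    return {
        "model": payload["model"],
        "messages": messages,
        "max_tokens": payload.get("max_tokens", 1024),
    }

request = {
    "model": "claude-3-5-sonnet",  # illustrative model name
    "system": "You are concise.",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}],
}
converted = anthropic_to_openai(request)
print(converted["messages"][0]["role"])  # system
```

A real gateway also has to translate responses, streaming chunks, and error formats in the other direction, which is why a dedicated tool is attractive here.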

---

LiteLLM is a versatile tool designed to streamline interactions with Large Language Models (LLMs) by providing a unified interface and efficient management features.

Key Features:

Unified Interface: LiteLLM standardizes API calls across over 100 LLM providers, including OpenAI, Azure, Hugging Face, and Anthropic, by mapping them to the OpenAI ChatCompletion format.

Load Balancing and Fallbacks: It manages load balancing and implements fallback mechanisms to ensure reliable and efficient LLM usage.

Cost Tracking and Budgeting: LiteLLM offers tools to monitor and control spending, allowing users to set budgets and track expenses across different projects and API keys.

Proxy Server (LLM Gateway): It provides a proxy server that acts as a central service to access multiple LLMs, facilitating load balancing, cost tracking, and customizable logging and guardrails per project.

Python SDK: For developers, LiteLLM offers a Python SDK to integrate LLM functionalities into applications, supporting features like streaming responses and asynchronous operations.
I haven't looked too deeply into LiteLLM beyond their Model Providers page; might be worth looking into sometime. Especially since it could serve as a single source/point of contact for integrating various model providers for AI support.
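As one illustration of the "single point of contact" idea, a LiteLLM proxy is configured with a `config.yaml` listing the models it fronts. The sketch below follows LiteLLM's documented proxy config shape, but the model ID, project, and region are placeholders I have not verified against Vertex AI:

```yaml
model_list:
  - model_name: claude-vertex          # alias that clients would request
    litellm_params:
      model: vertex_ai/claude-3-5-sonnet@20240620  # placeholder model ID
      vertex_project: my-gcp-project               # placeholder GCP project
      vertex_location: us-east5                    # placeholder region
```

Clients would then speak plain OpenAI ChatCompletion to the proxy, and the proxy would handle Vertex's machine-level authentication on its side.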
Thanks for the summary.
The summary is from OpenAI search.

But any idea about hot reloading a marimo notebook on a change in the Python file? I had to restart the kernel for changes to show in the notebook after CLINE (VSCode extension) edited the .py notebook file. Changes from the notebook to the .py file seem instant, but the other way around the kernel reload is a bit of a pain.
I don't believe changing code directly in the .py file is the intended workflow. The file is meant to be generated (changes show instantly) as you write code in the notebook. If you really want AI support, there is a lot of it in marimo: autocomplete, AI assist (supporting popular providers), and a chat interface.
If you really want the CLINE-based extension, or Vertex AI hosted Claude support in the marimo notebook, you could raise an issue for the same.

However, there was a recent update where Gemini is now compatible with the OpenAI API format; could you check whether Vertex offers the same? If it does, you could make the changes suggested in the docs here - https://docs.marimo.io/guides/editor_features/ai_completion.html#using-other-ai-providers
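For reference, the approach in those docs is to point marimo's OpenAI-compatible AI assist at a different endpoint via a `base_url` setting in the config file. A sketch, assuming the config keys described in the linked docs (the key and model name below are placeholders; the URL is Google's published OpenAI-compatibility endpoint for Gemini):

```toml
[ai.open_ai]
api_key = "sk-placeholder"   # placeholder key
model = "gemini-1.5-pro"     # placeholder model name
base_url = "https://generativelanguage.googleapis.com/v1beta/openai/"
```

If Vertex AI exposes a comparable OpenAI-compatible endpoint, swapping in its URL here would be the analogous change; the open question in this thread is whether its machine-level auth fits the API-key model at all.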