Allow for additional AI config options in the .toml file

Hello! My organization uses an internal network for some of our business processes, and many pages are secured with SSL/TLS certificates. We have an OpenAI-compatible API that I'm trying to configure; however, it requires a specific Certificate Authority (CA) bundle to verify the connection.

I wasn't sure if it would be difficult to add support for additional arguments/config keywords that get passed to the connection client, including custom CA bundles, ssl_verify arguments, etc.


As it stands right now, I think marimo uses the OpenAI Python package, which itself uses httpx under the hood to build the connection. httpx does support a specific argument for a CA bundle! As is, without being able to specify that bundle, I get an error:

httpcore.ConnectError: [SSL: CERTIFICATE_VERIFY_FAILED] certificate verify failed, unable to get local issuer certificate. (_ssl.c:1007)

My proposed solution would be to allow additional arguments in the config TOML, such as ssl_verify and caBundlePath, which would be passed to the appropriate OpenAI client creation code.
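Something like this, say (the [ai.open_ai] section already exists in marimo's config; ssl_verify and ca_bundle_path are just my proposed keys, not real options yet):

```toml
# ~/.config/marimo/marimo.toml
[ai.open_ai]
api_key = "sk-..."
base_url = "https://llm.internal.example.com/v1"   # placeholder URL
# proposed new keys:
ssl_verify = true
ca_bundle_path = "/etc/ssl/certs/internal-ca-bundle.pem"
```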
Did a bit more digging. It looks like marimo/_ai/llm/_impl.py, lines 133-137 (https://github.com/marimo-team/marimo/blob/b845f5d3e33b43a2d9850eb81d63dbebed5f26b1/marimo/_ai/llm/_impl.py#L134), is where the OpenAI client is created! Not sure if it's possible to add something like "ssl_verify=self.ssl_verify" and "verify=self.caBundlePath" to those lines, or if we'd need to pass an httpx.Client object with those parameters.
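For what it's worth, a rough sketch of how those proposed keys could map onto httpx's single verify argument (the OpenAI client does accept a pre-built client via http_client=httpx.Client(...); the key names here are just my proposal):

```python
def client_ssl_kwargs(ssl_verify=True, ca_bundle_path=None):
    """Map the proposed config keys onto httpx's `verify` argument.

    httpx accepts verify=False (disable checking, insecure),
    verify="/path/to/ca.pem" (custom CA bundle), or verify=True
    (the default certifi bundle).
    """
    if not ssl_verify:
        return {"verify": False}
    if ca_bundle_path:
        return {"verify": ca_bundle_path}
    return {}

# The result would then feed something like:
#   http_client = httpx.Client(**client_ssl_kwargs(...))
#   client = openai.OpenAI(api_key=..., http_client=http_client)
```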
hey Ryan, that sounds fine to add
are you running this locally or in WebAssembly?
Locally, using marimo edit --no-token in a venv!
I'm guessing if it were WebAssembly, Pyodide's conversion of the Python requests to JS XHR requests would probably work, based on a few "apps" I've cobbled together, although that starts running into CORS, which is a whole other thing!
yea you will definitely hit other CORS issues
we have 2 AI entry points:
  1. AI features built into the editor (e.g. the AI "generate cell" and chat sidebar)
  2. AI widgets (mo.ui.chat, and mo.ai.llm)
i think you'll want to change (1) to unblock, but the code you located is for (2).

if you'd like to make the contributions, the places to change would be:
marimo/_server/api/endpoints/ai.py
marimo/_config/config.py
Ahhhhh, gotcha! Yeah I’d def want the stuff from 1!

I honestly don't think I currently have the GitHub knowledge to do a full-on contribution via a PR. At the very least, though, I could open an issue on GitHub after I get off work today.
Thanks again for the clarification!
Hi Ryan. I faced the exact same problem at work, and my solution ended up being writing (or, well, having AI write me) a proxy server that I run locally. In my org we also need to put some extra headers on each request, plus the authorization key is a token that only lasts an hour.
So with the proxy running on port 8989, I set the openai base url to http://localhost:8989 and then the proxy routes all requests to our internal API with the right headers, plus takes care of refreshing the key. Now all the AI features in Marimo work seamlessly.
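For anyone curious, the header-rewriting core of such a proxy can be tiny. Here's a sketch of just that step (the actual request forwarding and token refresh are org-specific, and the header names are made up for illustration):

```python
def build_upstream_headers(incoming, token, extra=None):
    """Rewrite an incoming request's headers before forwarding upstream.

    Drops hop-specific headers, swaps in a fresh bearer token, and adds any
    org-required extras. `token` would come from whatever refresh logic
    keeps the hourly key alive.
    """
    skip = {"host", "authorization", "content-length", "connection"}
    headers = {k: v for k, v in incoming.items() if k.lower() not in skip}
    headers["Authorization"] = f"Bearer {token}"
    headers.update(extra or {})
    return headers
```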
Aider exposes an option to skip verifying certs, https://aider.chat/docs/config/options.html#--verify-ssl, but it's really annoying that httpx doesn't let you just set an env var.
Yeah, one of the workarounds suggested by the folks hosting the model is using a Flask app as a proxy, and while that might work for one-offs, it's a lot more work than simply pip installing marimo and then adding the right config arguments to the TOML.
Honestly, I might ask for marimo to support all of the SSL options in httpx (https://www.python-httpx.org/advanced/ssl/), but for now I'm going to put in an issue for the edits to the right files.
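As a stopgap, httpx's verify= also accepts a ready-made ssl.SSLContext, which you can build with just the stdlib. A minimal sketch (the bundle path would be whatever your org ships):

```python
import ssl


def make_ssl_context(ca_bundle_path=None, verify=True):
    """Build an SSLContext of the kind httpx accepts via verify=.

    verify=False disables all certificate checking (insecure!); otherwise
    cafile points the context at a custom CA bundle instead of the defaults.
    """
    ctx = ssl.create_default_context(cafile=ca_bundle_path if verify else None)
    if not verify:
        ctx.check_hostname = False
        ctx.verify_mode = ssl.CERT_NONE
    return ctx

# usage sketch: httpx.Client(verify=make_ssl_context("/etc/ssl/internal-ca.pem"))
```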
Issue created! Not sure if I can contribute "formally" or not, as my org has some weird corporate rules about contributing to open source projects. https://github.com/marimo-team/marimo/issues/4071
(I found workable code that meets what I need and added it onto the issue. Still TBD if I can do the PR myself; I hope to get an answer by next week. In the meantime, I have a monkey-patch script that goes in and does the necessary changes.)
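For anyone in the same boat, one shape such a monkey patch could take is a pattern like this (a sketch, not my exact script; in practice you'd point it at httpx.Client before marimo builds its clients):

```python
def patch_default_verify(client_cls, ca_bundle_path):
    """Wrap client_cls.__init__ so `verify` defaults to a custom CA bundle.

    Explicit verify=... arguments still win; only the default changes.
    """
    orig_init = client_cls.__init__

    def patched_init(self, *args, **kwargs):
        kwargs.setdefault("verify", ca_bundle_path)
        orig_init(self, *args, **kwargs)

    client_cls.__init__ = patched_init

# e.g. patch_default_verify(httpx.Client, "/etc/ssl/internal-ca.pem")
```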