Get help from the marimo community

batch/array both clone their elements. Why is that needed? What would happen if a UIElement with children didn't do that?
1 comment
My code creates a long-running thread in the background and I'd like to have all the output of the thread go to a single cell. Currently just the first message goes to the cell. Demo code:

Plain Text
from threading import Thread
import time

def long_running():
    print("Thread starting:")
    i = 0
    while True:
        print(f"Loop {i}...")
        i += 1
        time.sleep(1)
    print("Thread exiting.")

t = Thread(target=long_running)
t.start()


Is there any way to capture the output from my thread in a specific cell?

(side note: this code also seems to cause auto export to raise an exception)
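
One possible direction, sketched below under assumptions: instead of relying on print() from the background thread reaching the cell, push messages into a thread-safe queue and render them from a cell that re-runs on a timer (mo.ui.refresh); the three-cell split shown in the comments is an assumption about how you would arrange it.

Plain Text
# An untested sketch: collect the thread's messages in a queue and render them
# from a cell that re-runs on a timer.
import queue
import time
from threading import Thread

import marimo as mo

messages: "queue.Queue[str]" = queue.Queue()
history: list[str] = []

def long_running():
    i = 0
    while True:
        messages.put(f"Loop {i}...")
        i += 1
        time.sleep(1)

Thread(target=long_running, daemon=True).start()

# --- in a second cell: a timer element that triggers periodic re-runs ---
refresh = mo.ui.refresh(default_interval="1s")
refresh

# --- in a third cell: reference `refresh` so it re-runs on every tick ---
refresh
while not messages.empty():
    history.append(messages.get())
mo.md("\n\n".join(history) or "waiting for output...")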
3 comments
Is there a way for my framework to detect if it is currently running in Marimo? e.g. - this is how it is done in Jupyter: https://stackoverflow.com/questions/15411967/how-can-i-check-if-code-is-executed-in-the-ipython-notebook
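
A hedged sketch of one approach, assuming your installed marimo version exposes marimo.running_in_notebook() (worth verifying against the docs):

Plain Text
# Returns True when executed inside a marimo notebook, assuming
# marimo.running_in_notebook() exists in the installed version.
def in_marimo() -> bool:
    try:
        import marimo
        return marimo.running_in_notebook()
    except (ImportError, AttributeError):
        return False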
2 comments
@RJ Budzyński , since the discussion in general has become sprawling, I'm making this thread to consolidate discussion on playground notebook permalinks.
9 comments
I have a table from which I want to select some values, use them to download attachments, zip them, and then present a download button. The process takes time, so I tried using mo.lazy to show a spinner, but it doesn't update more than once when it's inside an accordion. I therefore tried hacking my own solution using the output module. However, it struck me that it's not possible to get the output object from a cell. I'm thinking of something like:

Plain Text
spinner = mo.status.spinner()
mo.output.append(spinner)
_download_attachments = new_download_attachments_ui(heatpump_table)
download_attachments = await _download_attachments()
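
A hedged sketch of an alternative, assuming mo.status.spinner can be used as a context manager and that new_download_attachments_ui and heatpump_table are the names from the notebook described above:

Plain Text
# Show a spinner while the slow work runs; the names below are taken from the
# question and assumed to exist in the notebook.
with mo.status.spinner(title="Preparing attachments..."):
    _download_attachments = new_download_attachments_ui(heatpump_table)
    download_attachments = await _download_attachments()
download_attachments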
1 comment
Hi, the documentation for AI Completion says that we can only use either Copilot or Codeium for autocompletion. Is this still true? Or is there a way to use an OpenAI-compatible model instead?

There is a GitHub issue related to this, but it hasn't been resolved: https://github.com/marimo-team/marimo/issues/2193
7 comments
Hi everyone!

I am very excited about the experimental real-time collaboration features that I am trying to use with my university students learning Python!

However, I am experiencing stability issues: the feature works as expected, but after a short while (a few minutes) the remote marimo notebooks display "Failed to save / Failed to fetch" and "Failed to run / Failed to fetch" errors, and new connections become impossible ("Secure connection failed"; see screenshots). The local notebook still works properly.

Any clue on what may be happening? Is it something anyone else also experienced?

More context :
  • I am running the marimo notebook on localhost and exposing it via ngrok at a public address. The issue could be related to ngrok, but in my personal experience ngrok is quite stable (when used to expose other services, such as web servers).
  • I tried the same experiment with the jupyter stack (and the jupyter-collaboration package) and I ran into the same issues (it works as expected, but for a few minutes only).
Best regards,

Sébastien
7 comments
As you can see in the attached picture of my notebook's dataflow, the cell where I create my DB connection is not linked to the SQL cell below it. This is to be expected, because the two cells don't share variables, but at the same time it is wrong, because these cells depend on each other for execution. I don't know what a good solution for this is at the moment.

I think I suggested in another post adding a DB connection tab to the left side menu; it wouldn't solve this particular issue, but it might be beneficial. @Myles Scolnick
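
A hedged sketch of why the link is missing and one way to create it: if the SQL cell references the connection variable (for example via the engine argument of mo.sql, assuming your marimo version supports SQL engines), the dataflow graph should connect the two cells. duckdb is just an illustrative choice here.

Plain Text
import duckdb
import marimo as mo

# Cell 1: create the connection.
conn = duckdb.connect("example.db")

# Cell 2: reference `conn` so marimo links this cell to the one above.
result = mo.sql("SELECT 42 AS answer", engine=conn)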
13 comments
Hello, I'm new to Marimo.
I'm working with machine learning using CatBoost library for gradient boosting.
Some CatBoost features can display interactive widgets. CatBoost widgets work in JupyterLab but not in marimo.
I have installed the anywidget package, but that does not help.

To reproduce the problem you can run this simple code:

Plain Text
from catboost import CatBoostClassifier, Pool

train_data = [[1, 3], [0, 4], [1, 7], [0, 3]]
train_labels = [1, 0, 1, 1]
model = CatBoostClassifier(learning_rate=0.03)
model.fit(train_data, train_labels, verbose=False, plot=True)

The output is a widget with Logloss graph

Can someone help me with this?
Thanks,

Ivan
3 comments
Plain Text
$ head -n 2 zon.py
# /// script
# requires-python = ">=3.11"
$ uv run marimo run --sandbox zon.py
warning: No `requires-python` value found in the workspace. Defaulting to `>=3.12`.


I am a bit confused by this warning message: I have defined a valid requires-python attribute in the script metadata, yet the warning still appears.

On another front, is it incorrect to use sandboxes the way I am doing it, i.e. running uv run before the marimo call?
12 comments
I'm running a FastAPI server with Marimo and using SessionMiddleware to manage authentication via a cookie containing the access token. Here's the FastAPI setup:

Plain Text
import os

import marimo
from fastapi import FastAPI
from starlette.middleware.sessions import SessionMiddleware

server = marimo.create_asgi_app()
apps_dir = os.path.join(os.path.dirname(__file__), "apps")
for filename in sorted(os.listdir(apps_dir)):
    if filename.endswith(".py"):
        app_name = os.path.splitext(filename)[0]
        app_path = os.path.join(apps_dir, filename)
        server = server.with_app(path=f"/{app_name}", root=app_path)

app = FastAPI()
app.add_middleware(SessionMiddleware, secret_key="test")
app.add_middleware(auth_middleware)  # custom auth middleware, defined elsewhere

app.mount("/", server.build())  # was labs_app.mount(...), but only `app` is defined here


In the Marimo app code, I can access the access_token from the session middleware, but I can't figure out how to access the same value within a Marimo cell. Here's the cell code:

Plain Text
import marimo

app = marimo.App(width="medium")

access_token = context["access_token"]

@app.cell
def _():
    import marimo as mo
    mo.md(access_token)  # Trying to use the access_token from above but fails
    return mo

if __name__ == "__main__":
    app.run()



  1. How can I properly access the values outside the cell (like access_token) in a Marimo cell?
  2. Is there a recommended approach for passing session values from FastAPI's middleware into Marimo's context or cells?
Any guidance or examples would be appreciated!
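
A hedged sketch of one direction to explore: recent marimo versions expose request metadata to cells via mo.app_meta(); whether the .request attribute and the header name below are available depends on your marimo version and middleware, so treat all of it as an assumption to verify against the docs.

Plain Text
import marimo as mo

# Inside a cell: read the incoming request, if the running marimo exposes it.
request = mo.app_meta().request  # assumed attribute; check your marimo docs
token = None
if request is not None:
    # The header name is an assumption based on the middleware described above.
    token = request.headers.get("authorization")
mo.md(f"token present: {token is not None}")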
9 comments
I have an image represented as SVG text, and I want it rendered in a cell of a wasm notebook. How?
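
A hedged sketch, assuming the SVG markup is already in a string: wrapping it in mo.Html renders it as the cell output.

Plain Text
import marimo as mo

svg_text = """<svg xmlns="http://www.w3.org/2000/svg" width="120" height="120">
  <circle cx="60" cy="60" r="50" fill="teal"/>
</svg>"""
mo.Html(svg_text)  # make it the last expression so the cell displays it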
2 comments
Has anyone tried using a local API endpoint for an AI assistant instead of the OpenAI API?
I have Ollama running on my laptop, along with several models to assist with my tasks. That’s why I want to use this local model instead of relying on Anthropic, OpenAI, or Google. By the way, I’m using Llama and Open WebUI.
9 comments
Hi, quick (possibly dumb) question: in the latest update, 0.10.10, does Enter create a new line in insert mode? It doesn't seem to do that anymore for me. Thanks!
4 comments
When running a marimo notebook locally on my laptop, Copilot works as it should, but when I use the Visual Studio Code port-forwarding feature and connect to the same marimo notebook via the forwarded address, Copilot stops working and I get websockets.exceptions.InvalidURI: http://localhost:2918/copilot isn't a valid URI: scheme isn't ws or wss.
Any idea why this is happening, and how can it be fixed?
Hi everyone,
Here’s what I want to do:
I have a list a = []
I also have b = mo.ui.text().
I want a to keep appending b.value every time I enter new text into b.
The only working solution I’ve found so far is using mo.state().
I’m wondering if anyone has a better solution that doesn’t require mo.state().

Thank you! 🙏
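
A hedged sketch of one alternative: UI elements accept an on_change callback, so you can append to a plain list without mo.state. The trade-off is that cells reading a won't re-run automatically when it grows, which is exactly what mo.state provides.

Plain Text
import marimo as mo

a = []
b = mo.ui.text(on_change=lambda value: a.append(value))
b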
1 comment
I am trying to serve marimo apps on a company server (python, not webassembly).
I have fastapi working fine, but I am running up against a problem with imports.

Originally I thought the sandbox imports would work, but it looks like this is not the case (I found a message about this on this Discord).

That leaves a few options to solve the import problem:

  • use mo.install() to somehow pull in the sandbox imports? But if all the apps are in the same marimo process, does this create conflicting imports? Also, how would I read in the imports?
  • use marimo run --sandbox and give each app its own process and port that I have to forward to
  • use a global uv environment for all apps and keep it synced across apps somehow. This will eventually lead to conflicts.
Obviously my preference would be something where I can just have one marimo FastAPI app, but I am not sure what the best way to achieve this is while keeping apps sandboxed.
Generally each app will have many imports.

Thanks for any advice!
5 comments
How come some cells have a fullscreen button on the top left-hand side, but some don't? Is it generated based on a heuristic or something?
2 comments
For example, given
Plain Text
src/my_proj/processors/simple_processor.py
notebooks/nb1.py

I'd like to set the project root to src/my_proj and, from notebook nb1.py, import
Plain Text
from processors.simple_processor import SimpleProcessor
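
A hedged sketch of one common workaround, assuming the layout above and that __file__ is available in the notebook: put the source root on sys.path at the top of notebooks/nb1.py.

Plain Text
import sys
from pathlib import Path

# notebooks/nb1.py -> repo root -> src/my_proj
sys.path.insert(0, str(Path(__file__).resolve().parents[1] / "src" / "my_proj"))

from processors.simple_processor import SimpleProcessor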
5 comments
Is there a way to align two elements so that both the left and right-hand sides of the composite component are justified? Currently I get something like this, but I would like the elements to stretch so that their sides match:
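
A hedged sketch, assuming mo.hstack's justify option is what you're after: "space-between" pins the first element to the left edge and the last to the right edge of the row.

Plain Text
import marimo as mo

mo.hstack([mo.md("left side"), mo.md("right side")], justify="space-between")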
5 comments
Hello,

I have been trying Marimo over the weekend and I find it great.
All the problems that bothered me in Jupyter notebooks, and some I didn't even realize I had, seem to be addressed in marimo.

There is one thing I cannot seem to figure out:

If I create a plot (the most basic plot) using matplotlib, it shows correctly as long as I'm in edit mode, but as soon as I switch to view mode or slide mode, or I run marimo run my_notebook.py, the matplotlib plots are not shown anymore.

I have also tried with plotly and I get the same problem: the plots are only visible when the notebook is in edit mode.

altair plots seem to be fine though (they are also visible when I switch to view mode).

I thought this might be a limitation, but then I saw examples where matplotlib is used and the plots are successfully displayed, like the Neural Networks with Micrograd example:
https://marimo.io/p/@marimo/micrograd

I've seen that the example uses WASM, so I tried to generate a WASM application from my notebook using:
Plain Text
marimo export html-wasm notebook.py -o notebook.wasm.html

and then serving the generated html.

This did not seem to make any difference; the matplotlib plots still do not show.

Am I missing something?

Thank you in advance for any answers or suggestions.
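
A hedged sketch of one thing to check: in marimo a plot renders when the figure or axes object is the cell's last expression, rather than via plt.show(); if the plot is only created as a side effect, that might explain the difference between edit and run/view modes (this is an assumption about the notebook, not a confirmed diagnosis).

Plain Text
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
ax.plot([0, 1, 2, 3], [0, 1, 4, 9])
ax  # last expression: the axes become the cell output in edit *and* run mode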
7 comments
I have been using an online tool called Desmos to investigate various mathematical equations over the last year. This has been very useful but runs into some issues when examining computationally heavy expressions (e.g. integrals).

For that reason, I wanted to look into more performant options, and Marimo seemed like an interesting option to play with this idea. I was wondering, though, if anyone would know whether achieving a smoothly varying plot is feasible in Marimo?

Here is an example of a smoothly varying plot: https://www.desmos.com/calculator/4dkhcbjost

My hope is that I can use Sympy along with Marimo to achieve this, but I am still ramping up on Marimo basics before I try it, so figured I'd ask in the meantime.

Thanks!
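
A hedged sketch of the kind of interaction marimo supports out of the box: a slider-driven plot that re-renders as the slider moves, which is roughly the smoothly varying behavior described above (matplotlib is an arbitrary choice here).

Plain Text
import marimo as mo
import numpy as np
import matplotlib.pyplot as plt

# Cell 1: the control
a = mo.ui.slider(0.1, 5.0, step=0.1, value=1.0, label="frequency a")
a

# Cell 2: re-runs whenever the slider value changes
x = np.linspace(-5, 5, 400)
fig, ax = plt.subplots()
ax.plot(x, np.sin(a.value * x))
ax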
3 comments
I did the following:

  • uv init
  • uv venv
  • source .venv/bin/activate
  • uv add marimo (same problem if using ipykernel or jupyterlab)
  • open marimo, at which point I get this error:
ImportError: Cannot import 'TreeArgumentsWrapper' from 'jedi.inference.arguments' due to circular import.

marimo and jupyterlab work fine when installed and used with pip.
Hello there! Super glad to use this underrated feature (much easier to get started with than Jupyter!). I have a situation where the plot is a bit small, even in fullscreen. How can I make it bigger?
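
A hedged sketch, assuming the plot comes from matplotlib: the rendered size follows the figure size, so increasing figsize enlarges the output.

Plain Text
import matplotlib.pyplot as plt

fig, ax = plt.subplots(figsize=(12, 7))  # width, height in inches
ax.plot([0, 1, 2], [0, 1, 4])
ax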
3 comments
I use log messages to indicate where in the process the script is. Today I noticed that the more log messages get printed to one cell, the slower marimo becomes. I don't think this is an issue when running from the command line, but in edit mode, if I allow all messages to be printed, the notebook becomes completely unresponsive. Before Christmas I don't believe I had this issue and could print all the messages without any problems, but I'm not completely sure about that.
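
A hedged sketch of a workaround while the underlying slowdown is investigated: keep only the most recent messages in a bounded buffer and render that, instead of letting thousands of printed lines accumulate in the cell output.

Plain Text
from collections import deque

import marimo as mo

recent = deque(maxlen=200)  # keep only the last 200 log lines

def log(msg: str) -> None:
    recent.append(msg)

# At the end of the cell, render the bounded buffer as the output.
mo.md("\n\n".join(recent))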
2 comments