Reopen pull request for conflict resolution on notebook beta feature #553
Conversation
I think this is a great idea, but I think it has some challenges with consistent internal state that we might need to address, because the code is running in the Open Interpreter subprocess and not in the Jupyter Notebook. It looks like the model assumes that it can use code like it would in a Jupyter Notebook (for example: import pandas as pd in one cell making pd available in another), and so you run into a lot of failed and repeated execution of code when you ask for a slightly more complex task than just "Draw a graph." (See the attached video: OpenInterpreter-NotebookMode.mp4)
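The failure mode described above can be reproduced with two independent interpreter invocations. A minimal sketch, not project code; the helper name run_block is mine:

```python
import subprocess
import sys

def run_block(code):
    # Each block starts a fresh interpreter, mimicking per-block
    # subprocess execution where nothing is shared between calls.
    return subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True, text=True,
    )

first = run_block("import json\nprint('imported')")
second = run_block("print(json.dumps([1, 2]))")  # json was never imported here

print(first.stdout.strip())          # imported
print("NameError" in second.stderr)  # True: the import did not carry over
```

The second block fails exactly the way the model's Notebook-style assumption predicts it should succeed.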
Yeah, I have been playing with it and I agree.
It also loses state between calls, so the subprocess pipe that is kept open is lost.
I have been looking into proxying the display etc., but I have not got a solution yet.
Thanks
…On 2 Oct 2023 at 19:22 +0100, Eric Allen ***@***.***>, wrote:
I think this is great idea, but I think it has some challenges with consistent internal state that we might need to address because the code is running in the Open Interpreter subprocess and not in the Jupyter Notebook.
It looks like the model assumes that it can use code like it would in a Jupyter Notebook, (for example: import pandas as pd in one cell making pd available in another), and so you run into a lot of failed and repeated execution of code when you ask for a slightly more complex task than just "Draw a graph."
https://github.com/KillianLucas/open-interpreter/assets/1667415/787dafea-0cf5-4315-8984-a79587da6b17
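One way to keep that pipe and its state alive is a single long-lived worker process that execs each incoming snippet in one shared namespace. A rough sketch of the idea only, not the project's implementation; all names here are mine:

```python
import subprocess
import sys
import textwrap

# Hypothetical worker: reads one code line at a time from stdin and
# execs it in a shared namespace, so state persists between snippets.
WORKER = textwrap.dedent("""
    import sys
    ns = {}
    for line in sys.stdin:
        exec(line, ns)
""")

proc = subprocess.Popen(
    [sys.executable, "-u", "-c", WORKER],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)
proc.stdin.write("x = 21\n")                    # first "block" sets state
proc.stdin.write("print(x * 2, flush=True)\n")  # second "block" still sees x
proc.stdin.flush()
proc.stdin.close()
out = proc.stdout.read()
proc.wait()
print(out.strip())  # 42
```

A real version would need multi-line blocks and error handling, but the pipe staying open is what preserves the interpreter state between calls.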
I was going to try a really hacky, temporary solution and add instructions to the model to include code from each step in the next step, to make sure each code block is fully encapsulated.

Some other thoughts that might be totally off base: I also wondered if we could leverage some more advanced Jupyter functionality to actually create and execute dynamic cells. Maybe there's some combination of programmatic cell execution and programmatic cell creation that could work, but I'm not sure if we'd need to, or would be able to, pause and resume the main interpreter thread for that. It looks like there could be a way to run it in the background with multiprocessing, but I don't think we can get intermediate outputs from it.

If any of my tests prove to be useful, I'll report back with my findings. Just super excited about this functionality. Thanks so much for tackling it.
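The "fully encapsulated blocks" workaround amounts to replaying the accumulated code history on every step. A quick sketch of that idea, with illustrative names (history, run_step) that are not from the codebase:

```python
# Accumulate every block and re-execute the full transcript in a clean
# namespace each step, so no block depends on prior interpreter state.
history = []

def run_step(code):
    history.append(code)
    ns = {}
    exec("\n".join(history), ns)  # replay everything from scratch
    return ns

ns = run_step("import json")
ns = run_step("data = json.dumps({'a': 1})")  # sees json from step 1
print(ns["data"])  # {"a": 1}
```

The obvious cost is re-running side effects (downloads, long computations) on every step, which is why it is only a temporary hack.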
We definitely need to get it in. The problem is the matplotlib backend and the %matplotlib magic. We could use Bokeh, which has a remote server capability, and suggest to the LLM that it display all graphs in Bokeh.
Will play with it though and see if we can get it working
Thanks
…On 3 Oct 2023 at 14:38 +0100, Eric Allen ***@***.***>, wrote:
I was going to try a really hacky, temporary solution and add instructions to the model to include code from each step in the next step to make sure each code block is fully encapsulated.
Some other thoughts that might be totally off base:
I also wondered if we could leverage some more advanced Jupyter functionality to actually create and execute dynamic cells.
Maybe there’s some combination of programmatic cell execution and programmatic cell creation that could work, but I’m not sure if we’d need to or be able to pause and resume the main interpreter thread for that, though. It looks like there could be a way to run it in the background with multiprocessing, but I don’t think we can get intermediate outputs from it.
If any of my tests prove to be useful, I’ll report back with my findings.
Just super excited about this functionality. Thanks so much for tackling it.
I have a working example using the JSON serialisation built into Plotly, so using Plotly as the graphing library, which is much nicer regardless. Matplotlib is ancient rubbish with a totally non-understandable API :-(. The approach: serialise the graph to JSON, hand it back out, and have a handler intercept it and call show in the hosting notebook environment. I will get it pushed after some tests.
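The serialise-intercept-show flow described above looks roughly like this, with plain dicts standing in for Plotly figures. The marker string and both function names are my invention; the real change would use plotly.io.to_json(fig) and plotly.io.from_json(...):

```python
import json

# Hypothetical tag the subprocess prepends to serialised figures so the
# host can tell plot payloads apart from ordinary stdout lines.
MARKER = "__PLOT_JSON__:"

def subprocess_side(fig_dict):
    # Runs inside the interpreter subprocess: emit the figure as a
    # tagged JSON line instead of trying to render it there.
    return MARKER + json.dumps(fig_dict)

def host_side(line):
    # Runs in the hosting notebook: intercept tagged lines, decode the
    # figure, and (in the real flow) call .show() on it.
    if line.startswith(MARKER):
        return json.loads(line[len(MARKER):])
    return None

payload = subprocess_side({"data": [{"type": "bar", "y": [1, 2]}]})
fig = host_side(payload)
print(fig["data"][0]["type"])  # bar
```

Because only JSON text crosses the process boundary, the subprocess needs no display backend at all, which sidesteps the matplotlib backend problem entirely.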
Right, I have committed this, using serialisation for Plotly graphs. It now executes in the subprocess, as per the main architecture, and attempts to pass the JSON graph back out to the hosting notebook.
Describe the changes you have made:
Sorry, I accidentally closed the original request. I have resolved the new conflicts and re-opened.
Reference any relevant issue (Fixes #000)
I have tested the code on the following OS:
AI Language Model (if applicable)