AI assisted Script Editing: removed dependencies, refactored OpenAI code, added Anthropic Claude #70
haesleinhuepf wants to merge 1 commit into master
Conversation
This pull request has been mentioned on Image.sc Forum. There might be relevant details there: https://forum.image.sc/t/ai-assisted-code-generation-in-fijis-script-editor/100877/1
This pull request has been mentioned on Image.sc Forum. There might be relevant details there: https://forum.image.sc/t/globias-bioimage-analysis-seminar-series-fiji-with-python-power/113118/1
@haesleinhuepf did you continue this development at all? I currently get HTTP 401 errors from the OpenAI API. I am currently working on a workshop for introducing novice image analysts to LLMs, and am interested in exploring Fiji integration further in the near future.
Hi @hinerm, I hardly use Fiji anymore. HTTP 401 errors with the OpenAI API sound like you have no API key configured, or it is outdated. Feel free to open an issue if you need further assistance with this. Best,
This pull request has been mentioned on Image.sc Forum. There might be relevant details there:
Hi @ctrueden, I'm just curious: Is this something worth merging and distributing? Thanks! Best,
@haesleinhuepf I don't want to merge this without more testing. When I tried it, I was using the same API key in napari/Omega in the same time period, so it was either a problem with the access or with how we were handling key settings. Regardless, this is a great proof of concept, but I would like to explore a robust LLM integration that is not tied to the script editor (which is why I didn't debug further). I started gathering requirements. This is high on my personal priority list, so I'm hoping to start on it in the near future. If anyone wants to use this as a stopgap in the meantime, I can revisit and troubleshoot; I expect it is a minor issue.
Hi Curtis @ctrueden,
I was working on LLM integration in the Script Editor again. The library we used before was archived by its developer and had a lot of dependencies, which is why I removed it. I am now accessing the LLMs directly via their REST APIs. Thus, we now have minimal dependencies and no additional JAR files that need to be shipped via the update site.
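For illustration, a direct chat-completions request using only plain JDK classes looks roughly like the sketch below. The class, method, and model names are made up for this example (this is not the code in the PR); the endpoint, headers, and JSON layout follow OpenAI's public REST API:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Minimal sketch of a direct OpenAI chat-completions request using only JDK classes.
// Endpoint, headers and JSON layout follow OpenAI's public REST API; class and
// model names are illustrative.
public class OpenAIRequestSketch {

	public static String complete(String apiKey, String prompt) throws Exception {
		URL url = new URL("https://api.openai.com/v1/chat/completions");
		HttpURLConnection con = (HttpURLConnection) url.openConnection();
		con.setRequestMethod("POST");
		con.setRequestProperty("Authorization", "Bearer " + apiKey);
		con.setRequestProperty("Content-Type", "application/json");
		con.setDoOutput(true);

		// Request body: model plus a single user message
		// (JSON escaping of the prompt is omitted here for brevity).
		String body = "{\"model\":\"gpt-4o-mini\","
			+ "\"messages\":[{\"role\":\"user\",\"content\":\"" + prompt + "\"}]}";
		try (OutputStream os = con.getOutputStream()) {
			os.write(body.getBytes(StandardCharsets.UTF_8));
		}

		// An HTTP 401 at this point usually means the API key is missing or invalid.
		if (con.getResponseCode() != 200) {
			throw new RuntimeException("Request failed with HTTP " + con.getResponseCode());
		}

		// Return the raw JSON response; real code would parse choices[0].message.content.
		StringBuilder response = new StringBuilder();
		try (BufferedReader in = new BufferedReader(
				new InputStreamReader(con.getInputStream(), StandardCharsets.UTF_8))) {
			String line;
			while ((line = in.readLine()) != null) response.append(line);
		}
		return response.toString();
	}
}
```

Sticking to `HttpURLConnection` keeps things compatible with older Java runtimes and avoids pulling in an extra HTTP client library.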
Furthermore, I used the opportunity to refactor the code and add support for Anthropic's Claude. Support for Gemini, Ollama, etc. seems trivial and could be added via a follow-up PR. I also renamed "OpenAI Options" to "LLM Service Options" and optimized the default prompt a bit.
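For comparison, the Anthropic Messages API mainly differs in how the key is sent and in requiring a version header and `max_tokens`. Again a hedged sketch with illustrative names, not the exact PR code:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;

// Sketch of an Anthropic Messages API request. Compared to OpenAI, the key goes in an
// "x-api-key" header instead of "Authorization: Bearer", an "anthropic-version" header
// is required, and the body must include "max_tokens". Model name is an example.
public class ClaudeRequestSketch {

	public static String complete(String apiKey, String prompt) throws Exception {
		URL url = new URL("https://api.anthropic.com/v1/messages");
		HttpURLConnection con = (HttpURLConnection) url.openConnection();
		con.setRequestMethod("POST");
		con.setRequestProperty("x-api-key", apiKey);
		con.setRequestProperty("anthropic-version", "2023-06-01");
		con.setRequestProperty("content-type", "application/json");
		con.setDoOutput(true);

		// JSON escaping of the prompt is again omitted for brevity.
		String body = "{\"model\":\"claude-3-5-sonnet-latest\",\"max_tokens\":1024,"
			+ "\"messages\":[{\"role\":\"user\",\"content\":\"" + prompt + "\"}]}";
		try (OutputStream os = con.getOutputStream()) {
			os.write(body.getBytes(StandardCharsets.UTF_8));
		}

		// Return the raw JSON response; real code would parse content[0].text.
		StringBuilder response = new StringBuilder();
		try (BufferedReader in = new BufferedReader(
				new InputStreamReader(con.getInputStream(), StandardCharsets.UTF_8))) {
			String line;
			while ((line = in.readLine()) != null) response.append(line);
		}
		return response.toString();
	}
}
```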
Let me know what you think!
Best,
Robert