[–] [email protected] 1 points 1 week ago

@gabek @obsidianmd @emory I do not. It seems to function with all the features when you use local inferencing via Ollama.
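
For context on what "local inferencing via Ollama" involves: Ollama exposes an HTTP API on the local machine (port 11434 by default), and plugins like this talk to it instead of a cloud provider. A minimal sketch of building such a request, assuming a running `ollama serve` and that the `llama3` model tag is just an example of whatever model you have pulled:

```python
import json
import urllib.request

# Ollama's default local endpoint (assumes `ollama serve` is running locally).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a non-streaming /api/generate request for a local Ollama server.

    `llama3` is a placeholder model tag, not something the plugin requires.
    """
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )

# To actually send it (needs the Ollama server running):
#   with urllib.request.urlopen(build_request("Summarize this note.")) as resp:
#       print(json.loads(resp.read())["response"])
```

Nothing leaves the machine here, which is the point of pairing the plugin with Ollama rather than a hosted API.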

[–] [email protected] 1 points 1 week ago (7 children)

@emory @obsidianmd I'll be honest, that's a question you might be more equipped to answer than I am, but here are the links if you want to check it out. I see it has a folder named LLMProviders, but I'm not sure if that's what you mean.

obsidian://show-plugin?id=copilot

https://github.com/logancyang/obsidian-copilot

https://github.com/logancyang/obsidian-copilot/tree/master/src/LLMProviders

[–] [email protected] 1 points 1 week ago (9 children)

@emory @obsidianmd there is a great plugin called Copilot that has RAG built-in as well. I can personally vouch for it with local inferencing.
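
"RAG built-in" here means the plugin embeds your notes and retrieves the most relevant ones to include in the prompt. A toy sketch of just the retrieval half, with made-up 3-dimensional vectors standing in for real embeddings (hypothetical data and note names, not the plugin's actual index):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, note_vecs, k=2):
    """Return the ids of the k notes most similar to the query embedding."""
    ranked = sorted(note_vecs, key=lambda nid: cosine(query_vec, note_vecs[nid]),
                    reverse=True)
    return ranked[:k]

# Toy "embeddings" for three notes; real ones come from an embedding model.
notes = {
    "ollama-setup.md": [0.9, 0.1, 0.0],
    "gardening.md":    [0.0, 0.2, 0.9],
    "llm-notes.md":    [0.8, 0.3, 0.1],
}
query = [1.0, 0.2, 0.0]        # embedding of e.g. "how do I run a local model?"
context = top_k(query, notes)  # note ids whose text gets prepended to the prompt
```

The retrieved note text is then pasted into the prompt sent to the model (local or otherwise), which is all "retrieval-augmented generation" means at this level.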