Add initial LLM model support #2629
Open
MatthewKhouzam wants to merge 1 commit into eclipse-platform:master
Conversation
This creates a way to have a default LLM provider so plugins can perform simple inference. This was assisted by Claude Opus 4.7. Signed-off-by: Matthew Khouzam <matthew.khouzam@ericsson.com>
Author
I want to use this to have one place to query for events for Trace Compass. It could be in the wrong place, but I strongly doubt anyone can have a wider need for AI than this.
Contributor
Test Results: 57 files (+6), 57 suites (+6), 36m 25s ⏱️ (-41s). For more details on these errors, see this check. Results for commit 76f1859. Comparison against base commit 9e8a07f.
This creates a way to have a default LLM provider so plugins can perform simple inference.
The intent is for all LLMs to be usable from extensions that need inference, e.g. "explain this anomaly".
This was assisted by Claude Opus 4.7
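Since the PR's diff is not shown here, the shape of such a default-provider API can only be sketched. The interface name `ILlmProvider`, its `complete` method, and the `EchoLlmProvider` stand-in below are all illustrative assumptions, not the actual code introduced by this change; a real implementation would presumably be contributed as an OSGi service or via an extension point so that plugins can look it up without depending on a specific model backend.

```java
import java.util.concurrent.CompletableFuture;

// Hypothetical provider interface; the name and signature are assumptions,
// not the API actually added by this PR.
interface ILlmProvider {
    /** Send a prompt to the default model and return the completion text. */
    CompletableFuture<String> complete(String prompt);
}

/** A trivial stand-in provider, e.g. for tests or when no model is configured. */
class EchoLlmProvider implements ILlmProvider {
    @Override
    public CompletableFuture<String> complete(String prompt) {
        // A real provider would call a local or remote model here.
        return CompletableFuture.completedFuture("echo: " + prompt);
    }
}

public class LlmProviderSketch {
    public static void main(String[] args) throws Exception {
        ILlmProvider provider = new EchoLlmProvider();
        // A plugin such as Trace Compass could ask the shared provider
        // to explain an anomaly found in a trace:
        String answer = provider.complete("explain this anomaly").get();
        System.out.println(answer);
    }
}
```

Returning a `CompletableFuture` keeps inference off the UI thread, which matters in an Eclipse workbench context; whether the real API is asynchronous is, again, an assumption.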