[Obs Ai Assistant] Add system message #209773
Conversation
Pinging @elastic/obs-ai-assistant (Team:Obs AI Assistant)
@sorenlouv On chatComplete, we always replace the system message (using replaceSystemMessage) with the one generated by getSystemMessageFromInstructions. I've added a systemMessage argument and save it in the conversation, but I still use the system message generated by getSystemMessageFromInstructions when passing a system message to the LLM (inference.chat). I'm not sure if this is the intended behavior: should we always use the provided systemMessage, or is overriding it the correct approach?
I think we should not pass the system message to the complete function, because adHocInstructions are updated inside complete.
Instead, the system message should be constructed from applicationInstructions, userInstructions, and adHocInstructions at the point where they are all available.
That makes sense. So instead of passing a predefined systemMessage to complete, we always construct it dynamically from applicationInstructions, userInstructions, and adHocInstructions.
I've removed the systemMessage argument from complete; we are still saving systemMessage in the conversation.
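A minimal sketch of that dynamic construction inside complete, mirroring the getSystemMessageFromInstructions call used elsewhere in this PR (the surrounding variables are assumed to be in scope):

```typescript
// Sketch only: build the system message once applicationInstructions,
// userInstructions, and adHocInstructions are all available inside `complete`,
// instead of accepting a precomputed systemMessage argument.
const systemMessage = getSystemMessageFromInstructions({
  applicationInstructions: functionClient.getInstructions(),
  userInstructions,
  adHocInstructions: functionClient.getAdhocInstructions(),
  availableFunctionNames,
});
```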
@sorenlouv As you can see here, this was redundant since it gets replaced by the same message.
@sorenlouv - see lines 187 to 210 in 8a26cf6.
@arturoliduena the system message is stored in the conversation for debugging purposes. We don't strictly need to store it, but it's useful when folks copy the conversation.
Could we add a new system (or perhaps systemMessage) field to the conversation index in order to properly separate it from the list of messages? This would align well with the inference plugin.
@sorenlouv The message with the system role is now removed from the list of messages, and a new systemMessage property is stored on the conversation. The same applies when updating a conversation, ensuring better separation and alignment with the inference plugin.
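An illustrative sketch of the stored document shape after this change (only systemMessage is the new field; the other field names are simplified assumptions, not the actual mapping):

```typescript
// Sketch of the conversation document: the system prompt becomes a top-level
// field rather than the first entry of `messages`.
interface ConversationDocument {
  conversation: { id: string; title: string };
  systemMessage: string; // new: stored separately, useful for debugging/copying
  messages: Message[]; // user/assistant/function messages only
}
```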
I like this change - but let's make sure we test this properly. There are likely places where we do things like messages.length > 1 (not great anyway) to check whether there are user messages, as sketched below.
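A hedged sketch of the kind of call site to audit: a length check that implicitly counted the system message would now need an explicit role check.

```typescript
// Before (fragile): implicitly assumed messages[0] was the system message.
const hasUserMessages = messages.length > 1;

// After (sketch): the system message is no longer in `messages`, so check
// roles explicitly instead of relying on array length.
const hasUserMessagesExplicit = messages.some(
  (m) => m.message.role === MessageRole.User
);
```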
I am working on updating the current tests and I’ll also add tests to verify the expected behavior.
Thanks @arturoliduena !
Would it make sense to move this to getSystemMessageFromInstructions?

```typescript
adHocInstructions.push({
  instruction_type: 'application_instruction',
  text: `This conversation will be persisted in Kibana and available at this url: ${
    kibanaPublicUrl + `/app/observabilityAIAssistant/conversations/${conversationId}`
  }.`,
});
```
There are cases where this instruction should not be added to the system message:

kibana/x-pack/platform/plugins/shared/observability_ai_assistant/server/routes/functions/route.ts (lines 65 to 80 in 652385c):

```typescript
const functionDefinitions = functionClient.getFunctions().map((fn) => fn.definition);
const availableFunctionNames = functionDefinitions.map((def) => def.name);

return {
  functionDefinitions,
  systemMessage: getSystemMessageFromInstructions({
    applicationInstructions: functionClient.getInstructions(),
    userInstructions,
    adHocInstructions: functionClient.getAdhocInstructions(),
    availableFunctionNames,
  }),
};
```
I think we can get rid of getAdhocInstructions and registerAdhocInstruction and just use getInstructions and registerInstruction instead.
adhocInstructions should be limited to those instructions coming via the API. And perhaps we should just treat those as user instructions, and get rid of the concept of adhoc instructions entirely.
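A rough sketch of what that consolidation could look like (illustrative only; registerInstruction's exact signature here is an assumption, not the actual refactor):

```typescript
// Sketch: route API-provided ad-hoc instructions through the regular
// instruction registry, removing the separate ad-hoc concept entirely.
for (const adHoc of instructionsFromApi) {
  functionClient.registerInstruction(adHoc.text);
}

const systemMessage = getSystemMessageFromInstructions({
  applicationInstructions: functionClient.getInstructions(),
  userInstructions, // alternatively, treat API instructions as user instructions
  availableFunctionNames,
});
```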
@sorenlouv We are already validating that in these test cases:

```typescript
it('calls the llm to generate a new title', () => {
  expect(inferenceClientMock.chatComplete.mock.calls[0]).toEqual([
    expect.objectContaining({
      connectorId: 'foo',
      stream: false,
      system:
        'You are a helpful assistant for Elastic Observability. Assume the following message is the start of a conversation between you and a user; give this conversation a title based on the content below. DO NOT UNDER ANY CIRCUMSTANCES wrap this title in single or double quotes. This title is shown in a list of conversations to the user, so title it for the user, not for you.',
      // ...

it('calls the llm again with the messages', () => {
  expect(inferenceClientMock.chatComplete.mock.calls[1]).toEqual([
    {
      connectorId: 'foo',
      stream: true,
      system: EXPECTED_STORED_SYSTEM_MESSAGE,
      // ...
```
Starting backport for target branches: 8.x, 9.0
💚 All backports created successfully
Note: Successful backport PRs will be merged automatically after passing CI. Questions? Please refer to the Backport tool documentation.
# Backport
This will backport the following commits from `main` to `9.0` and `8.x`:
- [[Obs Ai Assistant] Add system message (#209773)](#209773)

Questions? Please refer to the [Backport tool documentation](https://github.com/sqren/backport)

Co-authored-by: Arturo Lidueña <[email protected]>
@arturoliduena We should also have a test that checks what we send to the LLM. We should not blindly trust the inference plugin - it could change in some way that we do not understand, causing it to no longer pass the system message to the LLM - a little bit like what just happened. It doesn't need to be a big convoluted test - just simply check that the system message is sent along. I think this is good practice for most of what we do, since it's so easy with the LLM Proxy.
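A hedged sketch of such a test; the helper names here (interceptConversation, interceptedRequests, chatClient.complete) are assumptions for illustration, not the LLM Proxy's confirmed API:

```typescript
// Sketch only: intercept the outgoing request with the LLM Proxy and assert
// that the system message actually reaches the LLM, independent of the
// inference plugin's internals.
it('sends the stored system message to the LLM', async () => {
  void llmProxy.interceptConversation('Hello from the LLM'); // assumed helper

  await chatClient.complete({ messages: 'What are my alerts?' }); // assumed helper

  const requestBody = llmProxy.interceptedRequests[0].requestBody; // assumed shape
  expect(requestBody.messages[0].role).toBe('system');
  expect(requestBody.messages[0].content).toContain(EXPECTED_STORED_SYSTEM_MESSAGE);
});
```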
add system message in copy conversation JSON payload (#212009)
Closes #212006
Problem: Due to #209773 we no longer add the system message to the output from the "Copy conversation" button. (cherry picked from commit 4783925)

# Backport
This will backport the following commits from `main` to `9.0` (#212071) and `8.x` (#212070):
- [add system message in copy conversation JSON payload (#212009)](#212009)

Co-authored-by: Arturo Lidueña <[email protected]>

# Backport
This will backport the following commits from `main` to `9.0` (#212181) and `8.x` (#212215):
- [[Obs AI Assistant] Fix bug with `get_alerts_dataset_info` (#212077)](#212077) - closes #212005; regression introduced in #209773

Co-authored-by: Søren Louv-Jansen <[email protected]>
Fix: System Message Missing in Inference Plugin
Closes #209548
Summary
A regression was introduced in 8.18 (#199286), where the system message is no longer passed to the inference plugin and, consequently, the LLM.
Currently, only user messages are being sent, which impacts conversation guidance and guardrails. The system message is crucial for steering responses and maintaining contextual integrity.
The filtering of the system message happens here:
kibana/x-pack/platform/plugins/shared/observability_ai_assistant/server/service/client/index.ts (lines 510 to 512 in 771a080)
Fix Approach
- Ensure the `system` message is included as a parameter in `inferenceClient.chatComplete` (see the snippet below).
- Add an API test to verify that the system message is correctly passed to the LLM.
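For reference, the change as it appears in this PR (an excerpt from the client's chat method; `stream`, `TStream`, and the rxjs helpers come from the surrounding code):

```typescript
const options = {
  connectorId,
  system,
  messages: convertMessagesForInference(messages),
  toolChoice,
  tools,
  functionCalling: (simulateFunctionCalling ? 'simulated' : 'native') as FunctionCallingMode,
};

if (stream) {
  return defer(() =>
    this.dependencies.inferenceClient.chatComplete({
      ...options,
      stream: true,
    })
  ).pipe(
    convertInferenceEventsToStreamingEvents(),
    instrumentAndCountTokens(name),
    failOnNonExistingFunctionCall({ functions }),
    tap((event) => {
      if (
        event.type === StreamingChatResponseEventType.ChatCompletionChunk &&
        this.dependencies.logger.isLevelEnabled('trace')
      ) {
        this.dependencies.logger.trace(`Received chunk: ${JSON.stringify(event.message)}`);
      }
    }),
    shareReplay()
  ) as TStream extends true
    ? Observable<ChatCompletionChunkEvent | TokenCountEvent | ChatCompletionMessageEvent>
    : never;
} else {
  return this.dependencies.inferenceClient.chatComplete({
    ...options,
    stream: false,
  }) as TStream extends true ? never : Promise<ChatCompleteResponse>;
}
```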
Checklist

Check the PR satisfies following conditions.
Reviewers should verify this PR satisfies this list as well.
- `release_note:breaking` label should be applied in these situations.
- `release_note:*` label is applied per the guidelines.

Identify risks
Does this PR introduce any risks? For example, consider risks like hard-to-test bugs, performance regressions, or potential data loss.
Describe the risk, its severity, and mitigation for each identified risk. Invite stakeholders and evaluate how to proceed before merging.