@arturoliduena (Contributor) commented Feb 5, 2025

Fix: System Message Missing in Inference Plugin
Closes #209548

Summary

A regression was introduced in 8.18 (#199286), where the system message is no longer passed to the inference plugin and, consequently, the LLM.

Currently, only user messages are being sent, which impacts conversation guidance and guardrails. The system message is crucial for steering responses and maintaining contextual integrity.

The filtering of the system message happens here:

```typescript
messages: convertMessagesForInference(
  messages.filter((message) => message.message.role !== MessageRole.System)
),
```

Fix Approach

  • Ensure the `system` message is included as a parameter in `inferenceClient.chatComplete`.

```typescript
const options = {
  connectorId,
  system,
  messages: convertMessagesForInference(messages),
  toolChoice,
  tools,
  functionCalling: (simulateFunctionCalling ? 'simulated' : 'native') as FunctionCallingMode,
};
if (stream) {
  return defer(() =>
    this.dependencies.inferenceClient.chatComplete({
      ...options,
      stream: true,
    })
  ).pipe(
    convertInferenceEventsToStreamingEvents(),
    instrumentAndCountTokens(name),
    failOnNonExistingFunctionCall({ functions }),
    tap((event) => {
      if (
        event.type === StreamingChatResponseEventType.ChatCompletionChunk &&
        this.dependencies.logger.isLevelEnabled('trace')
      ) {
        this.dependencies.logger.trace(`Received chunk: ${JSON.stringify(event.message)}`);
      }
    }),
    shareReplay()
  ) as TStream extends true
    ? Observable<ChatCompletionChunkEvent | TokenCountEvent | ChatCompletionMessageEvent>
    : never;
} else {
  return this.dependencies.inferenceClient.chatComplete({
    ...options,
    stream: false,
  }) as TStream extends true ? never : Promise<ChatCompleteResponse>;
}
```
  • Add an API test to verify that the system message is correctly passed to the LLM.

Checklist

Check that the PR satisfies the following conditions.

Reviewers should verify this PR satisfies this list as well.

  • Any text added follows EUI's writing guidelines, uses sentence case text and includes i18n support
  • Documentation was added for features that require explanation or tutorials
  • Unit or functional tests were updated or added to match the most common scenarios
  • If a plugin configuration key changed, check if it needs to be allowlisted in the cloud and added to the docker list
  • This was checked for breaking HTTP API changes, and any breaking changes have been approved by the breaking-change committee. The release_note:breaking label should be applied in these situations.
  • Flaky Test Runner was used on any tests changed
  • The PR description includes the appropriate Release Notes section, and the correct release_note:* label is applied per the guidelines

Identify risks

Does this PR introduce any risks? For example, consider risks like hard-to-test bugs, performance regression, or potential data loss.

Describe the risk, its severity, and mitigation for each identified risk. Invite stakeholders and evaluate how to proceed before merging.

@arturoliduena added the release_note:fix, Team:Obs AI Assistant, backport:version, and v8.18.0 labels on Feb 5, 2025
@arturoliduena arturoliduena requested review from a team as code owners February 5, 2025 13:55
@elasticmachine (Contributor)

Pinging @elastic/obs-ai-assistant (Team:Obs AI Assistant)

@arturoliduena (Contributor, Author)

@sorenlouv On chatComplete, we always replace the system message using replaceSystemMessage with the one generated by getSystemMessageFromInstructions. I've added a systemMessage argument and save it in the conversation, but I still use the generated system message from getSystemMessageFromInstructions when passing system to the LLM (inference.chat). I'm not sure if this is the intended behavior: should we always use the provided systemMessage, or is overriding it the correct approach?

@sorenlouv (Member)

I think we should not pass the system message to the complete function because adHocInstructions are updated inside complete.

Instead, the system message should be constructed from applicationInstructions, userInstructions, and adHocInstructions at the time when they are all available.
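
Roughly something like this - a sketch only, with names taken from this discussion rather than the exact Kibana signatures (the real getSystemMessageFromInstructions also takes availableFunctionNames):

```typescript
// Sketch: concatenate all instruction sources into one system message once
// they are all known, instead of accepting a pre-built systemMessage argument.
// The shapes below are assumptions for illustration, not the exact Kibana types.
interface Instruction {
  instruction_type: 'application_instruction' | 'user_instruction';
  text: string;
}

function getSystemMessageFromInstructions({
  applicationInstructions,
  userInstructions,
  adHocInstructions,
}: {
  applicationInstructions: Instruction[];
  userInstructions: Instruction[];
  adHocInstructions: Instruction[];
}): string {
  return [...applicationInstructions, ...userInstructions, ...adHocInstructions]
    .map((instruction) => instruction.text)
    .join('\n\n');
}
```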

@arturoliduena (Contributor, Author)

That makes sense. So instead of passing a predefined systemMessage to complete, we always construct it dynamically from applicationInstructions, userInstructions, and adHocInstructions.

I've removed the systemMessage argument from complete; we still save systemMessage in the conversation.

@arturoliduena (Contributor, Author)

@sorenlouv As you can see here, this was redundant since it gets replaced by the same message.

@arturoliduena (Contributor, Author)

@sorenlouv - the systemMessage is generated in the UI and then replaced in the chatComplete method. If the system message is needed in the UI, we should clarify whether it should persist or be overridden later; otherwise, we can just remove it:

```typescript
async initialize() {
  this.functionRegistry = new Map();
  const systemMessages: string[] = [];
  const scopePromise = this.apiClient('GET /internal/observability_ai_assistant/functions', {
    signal: this.abortSignal,
    params: {
      query: {
        scopes: this.getScopes(),
      },
    },
  }).then(({ functionDefinitions, systemMessage }) => {
    functionDefinitions.forEach((fn) => this.functionRegistry.set(fn.name, fn));
    systemMessages.push(systemMessage);
  });
  await Promise.all([
    scopePromise,
    ...this.registrations.map((registration) => {
      return registration({
        registerRenderFunction: (name, renderFn) => {
          this.renderFunctionRegistry.set(name, renderFn);
        },
      });
    }),
  ]);
```

@dgieselaar (Contributor)

@arturoliduena the system message is stored in the conversation for debug purposes; we don't strictly need to store it, but it's useful when folks copy the conversation.

@sorenlouv (Member) commented Feb 5, 2025

> @arturoliduena the system message is stored in the conversation for debug purposes; we don't strictly need to store it, but it's useful when folks copy the conversation.

Could we add a new system (or perhaps systemMessage) field to the conversation index in order to properly separate it from the list of messages? This would align well with the inference plugin.

@arturoliduena (Contributor, Author) commented Feb 6, 2025

> Could we add a new system (or perhaps systemMessage) field to the conversation index in order to properly separate it from the list of messages? This would align well with the inference plugin.

@sorenlouv The message with the system role is removed from the list of messages, and a new systemMessage property is stored on the conversation. The same applies when updating a conversation, ensuring better separation and alignment with the inference plugin.
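
For illustration, the stored shape this implies - a sketch with assumed field names, not the exact conversation schema:

```typescript
// Sketch: the system prompt moves out of the messages list into a dedicated
// top-level field, mirroring the inference plugin's separate `system` parameter.
// Field names are assumptions for illustration.
interface StoredConversation {
  systemMessage: string; // new dedicated field
  messages: Array<{
    '@timestamp': string;
    message: { role: 'user' | 'assistant'; content: string }; // no 'system' entries anymore
  }>;
}
```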

Contributor

I like this change - but let's make sure we test this properly. There are likely places where we do things like `messages.length > 1` (not great anyway) to check whether there are user messages.
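
For illustration, the kind of check that has to change once the system message leaves the messages list - a one-line sketch (MessageRole as used in the snippets above):

```typescript
// Before, index 0 was implicitly the system message, so "has user messages"
// was approximated as messages.length > 1. With systemMessage stored
// separately, the check can be explicit:
const hasUserMessages = messages.some((m) => m.message.role === MessageRole.User);
```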

@arturoliduena (Contributor, Author)

I am working on updating the current tests and I’ll also add tests to verify the expected behavior.

Member

Thanks @arturoliduena !

@elasticmachine (Contributor)

elasticmachine commented Feb 6, 2025

⏳ Build in-progress, with failures

Failed CI Steps

History

  • 💔 Build #273567 failed 92a6ce739626f0f86b975a5db99454a1181fe6e5

@sorenlouv (Member)

Would it make sense to move this to getSystemMessageFromInstructions?

```typescript
adHocInstructions.push({
  instruction_type: 'application_instruction',
  text: `This conversation will be persisted in Kibana and available at this url: ${
    kibanaPublicUrl + `/app/observabilityAIAssistant/conversations/${conversationId}`
  }.`,
});
```

@arturoliduena (Contributor, Author)

There are cases where this instruction should not be added to the system message:

```typescript
const functionDefinitions = functionClient.getFunctions().map((fn) => fn.definition);
const availableFunctionNames = functionDefinitions.map((def) => def.name);
return {
  functionDefinitions,
  systemMessage: getSystemMessageFromInstructions({
    applicationInstructions: functionClient.getInstructions(),
    userInstructions,
    adHocInstructions: functionClient.getAdhocInstructions(),
    availableFunctionNames,
  }),
};
```

@sorenlouv (Member) commented Feb 7, 2025

I think we can get rid of getAdhocInstructions and registerAdhocInstruction and just use getInstructions and registerInstruction instead.

adhocInstructions should be limited to those instructions coming via the API. And perhaps we should just treat those as user instructions, and get rid of the concept of adhoc instructions entirely.
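
For illustration, a sketch of what that consolidation could look like - the names mirror the registerInstruction/getInstructions pair mentioned above, and the rest is assumed:

```typescript
// Sketch: one registry for all instructions; API-supplied ("adhoc") instructions
// are folded in as user instructions instead of being a separate concept.
// Types and method bodies are assumptions for illustration.
type InstructionOrCallback =
  | string
  | ((params: { availableFunctionNames: string[] }) => string);

class FunctionClient {
  private readonly instructions: InstructionOrCallback[] = [];

  registerInstruction(instruction: InstructionOrCallback): void {
    this.instructions.push(instruction);
  }

  getInstructions(): InstructionOrCallback[] {
    return this.instructions;
  }
}
```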

@botelastic (bot) added the ci:project-deploy-observability (Create an Observability project) label on Feb 7, 2025
@github-actions (Contributor)

github-actions bot commented Feb 7, 2025

🤖 GitHub comments

Just comment with:

  • /oblt-deploy : Deploy a Kibana instance using the Observability test environments.
  • run docs-build : Re-trigger the docs validation. (use unformatted text in the comment!)

@arturoliduena force-pushed the obs-ai-assistant-209548-System-message-missing branch 2 times, most recently from 67e8d2a to e810b83 on February 11, 2025 15:51
@arturoliduena force-pushed the obs-ai-assistant-209548-System-message-missing branch from 8506ad3 to 9d5b2fd on February 16, 2025 18:39
@arturoliduena (Contributor, Author)

> Is the LLMProxy used for any of these tests? We shouldn't just mock the calls to the inference plugin. We need to validate that we actually send the system message to the LLM.

@sorenlouv We are already validating that inferenceClient.chatComplete is called twice (once for the title and once for the conversation). Additionally, we verify that each call includes the expected system message. Wouldn’t this be sufficient for the AI Assistant side? Why do we also need to check if it is passed to the LLM?

test: x-pack/platform/plugins/shared/observability_ai_assistant/server/service/client/index.test.ts

cases:

```typescript
it('calls the llm to generate a new title', () => {
  expect(inferenceClientMock.chatComplete.mock.calls[0]).toEqual([
    expect.objectContaining({
      connectorId: 'foo',
      stream: false,
      system:
        'You are a helpful assistant for Elastic Observability. Assume the following message is the start of a conversation between you and a user; give this conversation a title based on the content below. DO NOT UNDER ANY CIRCUMSTANCES wrap this title in single or double quotes. This title is shown in a list of conversations to the user, so title it for the user, not for you.',
      // ...

it('calls the llm again with the messages', () => {
  expect(inferenceClientMock.chatComplete.mock.calls[1]).toEqual([
    {
      connectorId: 'foo',
      stream: true,
      system: EXPECTED_STORED_SYSTEM_MESSAGE,
      // ...
```

@elasticmachine (Contributor)

⏳ Build in-progress

  • Buildkite Build
  • Commit: 66b4a12
  • Kibana Serverless Image: docker.elastic.co/kibana-ci/kibana-serverless:pr-209773-66b4a12578e8

History

@arturoliduena arturoliduena merged commit 0ae28aa into elastic:main Feb 17, 2025
10 checks passed
@kibanamachine (Contributor)

kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request Feb 17, 2025
Fix: System Message Missing in Inference Plugin
Closes elastic#209548
(cherry picked from commit 0ae28aa)
kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request Feb 17, 2025
Fix: System Message Missing in Inference Plugin
Closes elastic#209548
(cherry picked from commit 0ae28aa)
@kibanamachine (Contributor)

💚 All backports created successfully

Branches: 8.x, 9.0

Note: Successful backport PRs will be merged automatically after passing CI.

Questions ?

Please refer to the Backport tool documentation

kibanamachine added a commit that referenced this pull request Feb 17, 2025
# Backport

This will backport the following commits from `main` to `9.0`:
- [[Obs Ai Assistant] Add system message (#209773)](#209773)

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sqren/backport)

Co-authored-by: Arturo Lidueña <[email protected]>
kibanamachine added a commit that referenced this pull request Feb 17, 2025
# Backport

This will backport the following commits from `main` to `8.x`:
- [[Obs Ai Assistant] Add system message (#209773)](#209773)

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sqren/backport)

Co-authored-by: Arturo Lidueña <[email protected]>
@sorenlouv (Member)

> Is the LLMProxy used for any of these tests? We shouldn't just mock the calls to the inference plugin. We need to validate that we actually send the system message to the LLM.

> @sorenlouv We are already validating that inferenceClient.chatComplete is called twice (once for the title and once for the conversation). Additionally, we verify that each call includes the expected system message. Wouldn't this be sufficient for the AI Assistant side? Why do we also need to check if it is passed to the LLM?

@arturoliduena We should also have a test that checks what we send to the LLM. We should not blindly trust the inference plugin. It could change in some way that we do not understand, causing it to no longer pass the system message to the LLM - a little bit like what just happened.

It doesn't need to be a big convoluted test - just simply checking that the system message is sent along. I think this is good practice for most of what we do, since it's so easy with the LLM Proxy.
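
For illustration, the shape of such a wire-level check - a sketch using a plain Node HTTP server as a stand-in for the LLM Proxy; the endpoint and payload shape are assumptions, not Kibana's real helper:

```typescript
// Sketch: record what actually leaves the process toward the "LLM" and assert
// the system message is on the request. Payload shape is an assumption.
import http from 'node:http';
import { once } from 'node:events';
import type { AddressInfo } from 'node:net';

async function main() {
  const received: Array<{ messages: Array<{ role: string }> }> = [];
  const server = http.createServer((req, res) => {
    let body = '';
    req.on('data', (chunk) => (body += chunk));
    req.on('end', () => {
      received.push(JSON.parse(body));
      res.writeHead(200, { 'content-type': 'application/json' });
      res.end(JSON.stringify({ choices: [{ message: { role: 'assistant', content: 'ok' } }] }));
    });
  });
  server.listen(0);
  await once(server, 'listening');
  const { port } = server.address() as AddressInfo;

  // In the real test the connector would be pointed at this URL; here a
  // representative chat-completions payload is sent directly.
  await fetch(`http://127.0.0.1:${port}/v1/chat/completions`, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({
      messages: [
        { role: 'system', content: 'You are a helpful assistant for Elastic Observability.' },
        { role: 'user', content: 'hello' },
      ],
    }),
  });

  // The wire-level assertion: a system message actually went over HTTP.
  console.assert(received[0].messages.some((m) => m.role === 'system'));
  server.close();
}

main().catch(console.error);
```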

arturoliduena added a commit that referenced this pull request Feb 21, 2025
Closes #212006
## Problem
Due to #209773 we no longer add
the system message to the output from the "Copy conversation" button.
kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request Feb 21, 2025
Closes elastic#212006 (cherry picked from commit 4783925)
kibanamachine pushed a commit to kibanamachine/kibana that referenced this pull request Feb 21, 2025
Closes elastic#212006 (cherry picked from commit 4783925)
kibanamachine added a commit that referenced this pull request Feb 21, 2025
…212071)

# Backport

This will backport the following commits from `main` to `9.0`:
- [add system message in copy conversation JSON payload (#212009)](#212009)

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)

Co-authored-by: Arturo Lidueña <[email protected]>
kibanamachine added a commit that referenced this pull request Feb 21, 2025
…212070)

# Backport

This will backport the following commits from `main` to `8.x`:
- [add system message in copy conversation JSON payload (#212009)](#212009)

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)

Co-authored-by: Arturo Lidueña <[email protected]>
kibanamachine added a commit that referenced this pull request Feb 22, 2025
) (#212181)

# Backport

This will backport the following commits from `main` to `9.0`:
- [[Obs AI Assistant] Fix bug with `get_alerts_dataset_info` (#212077)](#212077)

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)

Co-authored-by: Søren Louv-Jansen <[email protected]>
kibanamachine added a commit that referenced this pull request Feb 24, 2025
) (#212215)

# Backport

This will backport the following commits from `main` to `8.x`:
- [[Obs AI Assistant] Fix bug with `get_alerts_dataset_info` (#212077)](#212077)

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)

Co-authored-by: Søren Louv-Jansen <[email protected]>
JoseLuisGJ pushed a commit to JoseLuisGJ/kibana that referenced this pull request Feb 27, 2025
Closes elastic#212006
## Problem
Due to elastic#209773 we no longer add
the system message to the output from the "Copy conversation" button.
SoniaSanzV pushed a commit to SoniaSanzV/kibana that referenced this pull request Mar 4, 2025
…12009) (elastic#212070)

# Backport

This will backport the following commits from `main` to `8.x`:
- [add system message in copy conversation JSON payload (elastic#212009)](elastic#212009)

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)

Co-authored-by: Arturo Lidueña <[email protected]>
SoniaSanzV pushed a commit to SoniaSanzV/kibana that referenced this pull request Mar 4, 2025
…tic#212077) (elastic#212215)

# Backport

This will backport the following commits from `main` to `8.x`:
- [[Obs AI Assistant] Fix bug with `get_alerts_dataset_info` (elastic#212077)](elastic#212077)

### Questions ?
Please refer to the [Backport tool documentation](https://github.com/sorenlouv/backport)

Co-authored-by: Søren Louv-Jansen <[email protected]>
CAWilson94 pushed a commit to CAWilson94/kibana that referenced this pull request Mar 22, 2025
Fix: System Message Missing in Inference Plugin
Closes elastic#209548
CAWilson94 pushed a commit to CAWilson94/kibana that referenced this pull request Mar 22, 2025
Closes elastic#212006
## Problem
Due to elastic#209773 we no longer add
the system message to the output from the "Copy conversation" button.

Labels

backport:version, ci:project-deploy-observability, release_note:fix, Team:Obs AI Assistant, v8.19.0, v9.0.0, v9.1.0


Development

Successfully merging this pull request may close these issues.

[Obs AI Assistant] System message is missing

6 participants