Compare commits
13 Commits
70a60edf1c...master

| SHA1 |
|---|
| 29c6dce0e5 |
| 5855b7edb8 |
| ac6d55f617 |
| 1e045db7f4 |
| 12b3d8c5ad |
| bd0200ac98 |
| 0c9b4d1ed3 |
| 30656842a7 |
| 8b580fd3e1 |
| 195e157e1a |
| c5dbd12587 |
| be072fd46d |
| f514c42de6 |
@@ -12,6 +12,9 @@ services:
       OPENAI_API_KEY: ${OPENAI_API_KEY:-}
       ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY:-}
       XAI_API_KEY: ${XAI_API_KEY:-}
+      HERMES_AGENT_API_BASE_URL: ${HERMES_AGENT_API_BASE_URL:-http://127.0.0.1:8642/v1}
+      HERMES_AGENT_API_KEY: ${HERMES_AGENT_API_KEY:-}
+      HERMES_AGENT_MODEL: ${HERMES_AGENT_MODEL:-}
       EXA_API_KEY: ${EXA_API_KEY:-}
       CHAT_WEB_SEARCH_ENGINE: ${CHAT_WEB_SEARCH_ENGINE:-exa}
       SEARXNG_BASE_URL: ${SEARXNG_BASE_URL:-}
@@ -33,11 +33,29 @@ Chat upload limits:
   "providers": {
     "openai": { "models": ["gpt-4.1-mini"], "loadedAt": "2026-02-14T00:00:00.000Z", "error": null },
     "anthropic": { "models": ["claude-3-5-sonnet-latest"], "loadedAt": null, "error": null },
-    "xai": { "models": ["grok-3-mini"], "loadedAt": null, "error": null }
+    "xai": { "models": ["grok-3-mini"], "loadedAt": null, "error": null },
+    "hermes-agent": { "models": ["hermes-agent"], "loadedAt": null, "error": null }
   }
 }
 ```
 - OpenAI model lists are filtered to models that are expected to work with the backend's Responses API implementation.
+- `hermes-agent` is included only when `HERMES_AGENT_API_KEY` is configured. Set it to the Hermes `API_SERVER_KEY`, or to any non-empty value if that local server does not require auth. `HERMES_AGENT_API_BASE_URL` defaults to `http://127.0.0.1:8642/v1`; set `HERMES_AGENT_MODEL` only when you need an additional fallback/override model id.
+
+## Active Runs
+
+### `GET /v1/active-runs`
+- Response:
+```json
+{
+  "chats": ["chat-id-with-active-stream"],
+  "searches": ["search-id-with-active-stream"]
+}
+```
+
+Behavior notes:
+- Lists in-memory chat/search streams that are still running on this server process.
+- Clients should use this after app start or page refresh to restore per-row generating indicators.
+- The lists are not durable across server restarts.
 
 ## Chats
 
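The active-runs response decodes with a one-line Codable call; the struct below mirrors the `ActiveRunsResponse` type that the iOS client adds elsewhere in this diff (a minimal sketch, not the full client):

```swift
import Foundation

// Mirrors the documented GET /v1/active-runs response body.
struct ActiveRunsResponse: Codable {
    var chats: [String]
    var searches: [String]
}

let json = #"{ "chats": ["chat-id-with-active-stream"], "searches": [] }"#
let runs = try! JSONDecoder().decode(ActiveRunsResponse.self, from: Data(json.utf8))
// A client would now re-show "generating" indicators for these ids.
print(runs.chats)     // ["chat-id-with-active-stream"]
print(runs.searches)  // []
```

A client would call this once on launch or refresh and then attach to each listed id.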
@@ -49,7 +67,7 @@ Chat upload limits:
 ```json
 {
   "title": "optional title",
-  "provider": "optional openai|anthropic|xai",
+  "provider": "optional openai|anthropic|xai|hermes-agent",
   "model": "optional model id",
   "messages": [
     {
@@ -136,7 +154,7 @@ Notes:
 ```json
 {
   "chatId": "optional-chat-id",
-  "provider": "openai|anthropic|xai",
+  "provider": "openai|anthropic|xai|hermes-agent",
   "model": "string",
   "messages": [
     {
@@ -190,11 +208,12 @@ Behavior notes:
 - Text files are forwarded as explicit text blocks rather than provider-managed file references. Large text attachments should already be truncated client-side before submission.
 - For `openai`, backend calls OpenAI's Responses API and enables internal tool use with an internal system instruction.
 - For `xai`, backend calls xAI's OpenAI-compatible Chat Completions API and enables internal tool use with the same internal system instruction.
+- For `hermes-agent`, backend calls the configured Hermes Agent OpenAI-compatible Chat Completions API without adding Sybil-managed tool definitions; Hermes Agent handles its own tools server-side.
 - For `openai`, image attachments are sent as Responses `input_image` items and text attachments are sent as `input_text` items.
-- For `xai`, image attachments are sent as Chat Completions content parts alongside text.
+- For `xai` and `hermes-agent`, image attachments are sent as Chat Completions content parts alongside text.
 - For `openai`, Responses calls that can enter the server-managed tool loop use `store: true` so reasoning and function-call items can be passed between tool rounds.
 - For `anthropic`, image attachments are sent as Messages API `image` blocks using base64 source data; text attachments are added as `text` blocks.
-- Available tool calls for chat: `web_search` and `fetch_url`. When `CHAT_CODEX_TOOL_ENABLED=true`, `codex_exec` is also available. When `CHAT_SHELL_TOOL_ENABLED=true`, `shell_exec` is also available.
+- Available Sybil-managed tool calls for `openai` and `xai`: `web_search` and `fetch_url`. When `CHAT_CODEX_TOOL_ENABLED=true`, `codex_exec` is also available. When `CHAT_SHELL_TOOL_ENABLED=true`, `shell_exec` is also available.
 - `web_search` returns ranked results with per-result summaries/snippets. Its backend engine is selected by `CHAT_WEB_SEARCH_ENGINE` (`exa` default, or `searxng` with `SEARXNG_BASE_URL` set). SearXNG mode requires the instance to allow `format=json`.
 - `fetch_url` fetches a URL and returns plaintext page content (HTML converted to text server-side).
 - `codex_exec` delegates coding, shell, repository inspection, and other complex software tasks to a persistent remote Codex CLI workspace over SSH. The server runs `codex exec --dangerously-bypass-approvals-and-sandbox --skip-git-repo-check <non-interactive wrapped prompt>` on the configured devbox inside `CHAT_CODEX_REMOTE_WORKDIR`, with SSH stdin closed.
@@ -260,6 +279,32 @@ Search run notes:
 - Persists answer text/citations + ranked results.
 - If both search and answer fail, endpoint returns an error.
+
+### `POST /v1/searches/:searchId/run/stream`
+- Body: same as `POST /v1/searches/:searchId/run`
+- Response: `text/event-stream`
+
+Events:
+- `search_results`: `{ "requestId": string|null, "results": SearchResultItem[] }`
+- `search_error`: `{ "error": string }`
+- `answer`: `{ "answerText": string|null, "answerRequestId": string|null, "answerCitations": SearchDetail["answerCitations"] }`
+- `answer_error`: `{ "error": string }`
+- terminal `done`: `{ "search": SearchDetail }`
+- terminal `error`: `{ "message": string }`
+
+Behavior notes:
+- The stream is owned by the backend after it starts. If the original HTTP client disconnects, the backend keeps running and persists the final search state.
+- While a search stream is active, `GET /v1/active-runs` includes the `searchId`.
+- If a stream is already active for the same `searchId`, this endpoint attaches to the existing stream instead of starting a second run.
+
+### `POST /v1/searches/:searchId/run/stream/attach`
+- Body: none
+- Response: `text/event-stream` with the same event names as `POST /v1/searches/:searchId/run/stream`
+- Not found: `404 { "message": "active search stream not found" }`
+
+Behavior notes:
+- Replays buffered events for the active in-memory stream, then emits new events until `done` or `error`.
+- Intended for clients that discovered a pending search via `GET /v1/active-runs`, such as after browser refresh.
 
 ## Type Shapes
 
 `ChatSummary`
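As a rough illustration of how a client might frame these events, here is a minimal SSE parsing sketch. It is a simplification of the event-stream format, not the full spec: it assumes one `data:` line per event and ignores `id:`/`retry:` fields (the real client in this diff handles more cases):

```swift
import Foundation

// Splits a buffered text/event-stream body into (event, data) pairs.
func parseSSE(_ body: String) -> [(event: String, data: String)] {
    var events: [(event: String, data: String)] = []
    for block in body.components(separatedBy: "\n\n") {
        var name = "message"  // SSE default event name
        var payload = ""
        for line in block.split(separator: "\n") {
            if line.hasPrefix("event:") {
                name = String(line.dropFirst("event:".count)).trimmingCharacters(in: .whitespaces)
            } else if line.hasPrefix("data:") {
                payload = String(line.dropFirst("data:".count)).trimmingCharacters(in: .whitespaces)
            }
        }
        if !payload.isEmpty {
            events.append((event: name, data: payload))
        }
    }
    return events
}

let sample = "event: search_results\ndata: {\"requestId\":null,\"results\":[]}\n\nevent: done\ndata: {\"search\":{}}\n"
let parsed = parseSSE(sample)
print(parsed.map { $0.event })  // ["search_results", "done"]
```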
@@ -269,9 +314,9 @@ Search run notes:
   "title": null,
   "createdAt": "...",
   "updatedAt": "...",
-  "initiatedProvider": "openai|anthropic|xai|null",
+  "initiatedProvider": "openai|anthropic|xai|hermes-agent|null",
   "initiatedModel": "string|null",
-  "lastUsedProvider": "openai|anthropic|xai|null",
+  "lastUsedProvider": "openai|anthropic|xai|hermes-agent|null",
   "lastUsedModel": "string|null"
 }
 ```
@@ -317,9 +362,9 @@ Search run notes:
   "title": null,
   "createdAt": "...",
   "updatedAt": "...",
-  "initiatedProvider": "openai|anthropic|xai|null",
+  "initiatedProvider": "openai|anthropic|xai|hermes-agent|null",
   "initiatedModel": "string|null",
-  "lastUsedProvider": "openai|anthropic|xai|null",
+  "lastUsedProvider": "openai|anthropic|xai|hermes-agent|null",
   "lastUsedModel": "string|null",
   "messages": [Message]
 }
@@ -4,6 +4,7 @@ This document defines the server-sent events (SSE) contract for chat completions
 
 Endpoint:
 - `POST /v1/chat-completions/stream`
+- `POST /v1/chats/:chatId/stream/attach`
 
 Transport:
 - HTTP response uses `Content-Type: text/event-stream; charset=utf-8`
@@ -20,7 +21,7 @@ Authentication:
 {
   "chatId": "optional-chat-id",
   "persist": true,
-  "provider": "openai|anthropic|xai",
+  "provider": "openai|anthropic|xai|hermes-agent",
   "model": "string",
   "messages": [
     {
@@ -61,6 +62,23 @@ Notes:
 - For persisted streams, backend stores only new non-assistant input history rows to avoid duplicates.
 - Attachments are optional and are persisted under `message.metadata.attachments` on stored user messages when `persist` is `true`.
+
+Persisted chat streams with a `chatId` are backend-owned active runs:
+- Once started, the backend keeps the stream running even if the HTTP client disconnects or refreshes.
+- While running, `GET /v1/active-runs` includes the `chatId`.
+- Starting a second persisted stream for the same active `chatId` returns `409`.
+- Clients can reattach with `POST /v1/chats/:chatId/stream/attach`.
+
+## Attach Endpoint
+
+`POST /v1/chats/:chatId/stream/attach`
+- Body: none.
+- Response uses the same `text/event-stream` transport and event names as `POST /v1/chat-completions/stream`.
+- Replays buffered events for the active in-memory stream, then emits new events until `done` or `error`.
+- Returns `404 { "message": "active chat stream not found" }` if no stream is currently active for that chat.
+- Authentication is the same as all other API endpoints.
+
+This endpoint is intended for clients that restored an active `chatId` from `GET /v1/active-runs`, especially after browser refresh. Replayed `delta` events may include text that was originally emitted before the client attached.
 
 ## Event Stream Contract
 
 Event order:
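The start-vs-attach rule above can be sketched as a small client-side decision. `actionForChat` is a hypothetical helper name for illustration, not part of the API or the client in this diff:

```swift
// Hypothetical helper: given the ids from GET /v1/active-runs, decide
// whether to start a new persisted stream or attach to the running one.
enum StreamAction: Equatable { case startNew, attach }

func actionForChat(chatId: String, activeChatIds: [String]) -> StreamAction {
    // Starting a second persisted stream for an active chat returns 409,
    // so prefer POST /v1/chats/:chatId/stream/attach in that case.
    activeChatIds.contains(chatId) ? .attach : .startNew
}

print(actionForChat(chatId: "abc", activeChatIds: ["abc"]))  // attach
print(actionForChat(chatId: "xyz", activeChatIds: ["abc"]))  // startNew
```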
@@ -134,8 +152,9 @@ For `persist: false` streams, `chatId` and `callId` are `null`.
 
 - `openai`: backend uses OpenAI's Responses API and may execute internal function tool calls (`web_search`, `fetch_url`, optional `codex_exec`, and optional `shell_exec`) before producing final text.
 - `xai`: backend uses xAI's OpenAI-compatible Chat Completions API and may execute the same internal tool calls before producing final text.
+- `hermes-agent`: backend uses the configured Hermes Agent OpenAI-compatible Chat Completions API. Sybil does not add its own tool definitions for this provider; Hermes Agent handles its own tools server-side. Custom Hermes stream events that do not produce text deltas are normalized away in this SSE contract.
 - `openai`: image attachments are sent as Responses `input_image` items; text attachments are sent as `input_text` items.
-- `xai`: image attachments are sent as Chat Completions content parts; text attachments are inlined as text parts.
+- `xai` and `hermes-agent`: image attachments are sent as Chat Completions content parts; text attachments are inlined as text parts.
 - `openai`: Responses calls that can enter the server-managed tool loop use `store: true` so reasoning and function-call items can be passed between tool rounds.
 - `anthropic`: streamed via event stream; emits `delta` from `content_block_delta` with `text_delta`. Image attachments are sent as base64 `image` blocks and text attachments are appended as `text` blocks.
 - `web_search` uses `CHAT_WEB_SEARCH_ENGINE` (`exa` default, or `searxng` with `SEARXNG_BASE_URL` set). SearXNG mode requires the instance to allow `format=json`. This only affects chat-mode tool calls, not search-mode endpoints.
||||
@@ -51,3 +51,4 @@ Instructions for work under `/Users/buzzert/src/sybil-2/ios`.
|
||||
- OpenAI: `gpt-4.1-mini`
|
||||
- Anthropic: `claude-3-5-sonnet-latest`
|
||||
- xAI: `grok-3-mini`
|
||||
- Hermes Agent: `hermes-agent`
|
||||
|
||||
17
ios/Apps/Sybil/Info.plist
Normal file
@@ -0,0 +1,17 @@
+<?xml version="1.0" encoding="UTF-8"?>
+<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
+<plist version="1.0">
+<dict>
+	<key>UIApplicationShortcutItems</key>
+	<array>
+		<dict>
+			<key>UIApplicationShortcutItemType</key>
+			<string>net.buzzert.sybil2.quick-question</string>
+			<key>UIApplicationShortcutItemTitle</key>
+			<string>Quick question</string>
+			<key>UIApplicationShortcutItemIconSymbolName</key>
+			<string>sparkles</string>
+		</dict>
+	</array>
+</dict>
+</plist>
@@ -5,6 +5,8 @@ import UIKit
 
 @main
 struct SybilApp: App
 {
+    @UIApplicationDelegateAdaptor(SybilAppDelegate.self) private var appDelegate
+
     var body: some Scene {
         WindowGroup {
             SplitView()
@@ -14,3 +16,79 @@ struct SybilApp: App
         }
     }
 }
+
+@MainActor
+final class SybilAppDelegate: NSObject, UIApplicationDelegate {
+    func application(
+        _ application: UIApplication,
+        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil
+    ) -> Bool {
+        SybilHomeScreenQuickActionHandler.configureQuickActions()
+        return true
+    }
+
+    func application(
+        _ application: UIApplication,
+        configurationForConnecting connectingSceneSession: UISceneSession,
+        options: UIScene.ConnectionOptions
+    ) -> UISceneConfiguration {
+        let configuration = UISceneConfiguration(
+            name: "Default Configuration",
+            sessionRole: connectingSceneSession.role
+        )
+        configuration.delegateClass = SybilSceneDelegate.self
+        return configuration
+    }
+
+    func application(
+        _ application: UIApplication,
+        performActionFor shortcutItem: UIApplicationShortcutItem,
+        completionHandler: @escaping (Bool) -> Void
+    ) {
+        completionHandler(SybilHomeScreenQuickActionHandler.handle(shortcutItem))
+    }
+}
+
+@MainActor
+final class SybilSceneDelegate: NSObject, UIWindowSceneDelegate {
+    func scene(
+        _ scene: UIScene,
+        willConnectTo session: UISceneSession,
+        options connectionOptions: UIScene.ConnectionOptions
+    ) {
+        if let shortcutItem = connectionOptions.shortcutItem {
+            _ = SybilHomeScreenQuickActionHandler.handle(shortcutItem)
+        }
+    }
+
+    func windowScene(
+        _ windowScene: UIWindowScene,
+        performActionFor shortcutItem: UIApplicationShortcutItem,
+        completionHandler: @escaping (Bool) -> Void
+    ) {
+        completionHandler(SybilHomeScreenQuickActionHandler.handle(shortcutItem))
+    }
+
+    func sceneWillResignActive(_ scene: UIScene) {
+        SybilHomeScreenQuickActionHandler.configureQuickActions()
+    }
+}
+
+@MainActor
+private enum SybilHomeScreenQuickActionHandler {
+    static func configureQuickActions() {
+        // The quick question action is static in Info.plist so it is available before first launch.
+        UIApplication.shared.shortcutItems = []
+    }
+
+    static func handle(_ shortcutItem: UIApplicationShortcutItem) -> Bool {
+        guard shortcutItem.type == SybilHomeScreenQuickAction.quickQuestionType else {
+            return false
+        }
+
+        Task { @MainActor in
+            SybilQuickActionRouter.shared.requestQuickQuestionPresentation()
+        }
+        return true
+    }
+}
||||
@@ -22,9 +22,10 @@ targets:
|
||||
SUPPORTS_MAC_DESIGNED_FOR_IPHONE_IPAD: NO
|
||||
TARGETED_DEVICE_FAMILY: "1,2,6"
|
||||
GENERATE_INFOPLIST_FILE: YES
|
||||
INFOPLIST_FILE: Apps/Sybil/Info.plist
|
||||
ASSETCATALOG_COMPILER_APPICON_NAME: AppIcon
|
||||
MARKETING_VERSION: 1.5
|
||||
CURRENT_PROJECT_VERSION: 6
|
||||
MARKETING_VERSION: 1.8
|
||||
CURRENT_PROJECT_VERSION: 9
|
||||
INFOPLIST_KEY_CFBundleDisplayName: Sybil
|
||||
INFOPLIST_KEY_ITSAppUsesNonExemptEncryption: NO
|
||||
INFOPLIST_KEY_UIApplicationSupportsIndirectInputEvents: YES
|
||||
|
||||
@@ -2,10 +2,14 @@ import SwiftUI
 
 public struct SplitView: View {
     @State private var viewModel = SybilViewModel()
+    @ObservedObject private var quickActionRouter = SybilQuickActionRouter.shared
     @Environment(\.horizontalSizeClass) private var horizontalSizeClass
     @Environment(\.scenePhase) private var scenePhase
     @State private var shouldRefreshOnForeground = false
     @State private var composerFocusRequest = 0
+    @State private var quickQuestionFocusRequest = 0
+    @State private var hasPendingQuickQuestionPresentation = false
+    @State private var isQuickQuestionPresented = false
     @State private var columnVisibility: NavigationSplitViewVisibility = .automatic
 
     private var keyboardActions: SybilKeyboardActions? {
@@ -74,8 +78,28 @@ public struct SplitView: View {
         .font(.sybil(.body))
         .preferredColorScheme(.dark)
         .focusedSceneValue(\.sybilKeyboardActions, keyboardActions)
+        .sheet(isPresented: $isQuickQuestionPresented, onDismiss: handleQuickQuestionDismissed) {
+            SybilQuickQuestionView(
+                viewModel: viewModel,
+                focusRequest: quickQuestionFocusRequest
+            )
+            .presentationDragIndicator(.visible)
+        }
         .task {
             await viewModel.bootstrap()
+            presentPendingQuickQuestionIfPossible()
         }
+        .onReceive(quickActionRouter.$quickQuestionPresentationRequest) { request in
+            guard request > 0 else {
+                return
+            }
+            queueQuickQuestionPresentation()
+        }
+        .onChange(of: viewModel.isCheckingSession) { _, _ in
+            presentPendingQuickQuestionIfPossible()
+        }
+        .onChange(of: viewModel.isAuthenticated) { _, _ in
+            presentPendingQuickQuestionIfPossible()
+        }
         .onChange(of: scenePhase) { _, nextPhase in
             switch nextPhase {
@@ -112,6 +136,28 @@ public struct SplitView: View {
             columnVisibility = .all
         }
     }
+
+    private func queueQuickQuestionPresentation() {
+        hasPendingQuickQuestionPresentation = true
+        presentPendingQuickQuestionIfPossible()
+    }
+
+    private func presentPendingQuickQuestionIfPossible() {
+        guard hasPendingQuickQuestionPresentation,
+              !viewModel.isCheckingSession,
+              viewModel.isAuthenticated
+        else {
+            return
+        }
+
+        hasPendingQuickQuestionPresentation = false
+        quickQuestionFocusRequest += 1
+        isQuickQuestionPresented = true
+    }
+
+    private func handleQuickQuestionDismissed() {
+        viewModel.cancelQuickQuestion()
+    }
 }
 
 public struct SybilCommands: Commands {
@@ -49,11 +49,16 @@ actor SybilAPIClient: SybilAPIClienting {
         return response.chats
     }
 
-    func createChat(title: String? = nil) async throws -> ChatSummary {
+    func createChat(
+        title: String? = nil,
+        provider: Provider? = nil,
+        model: String? = nil,
+        messages: [CompletionRequestMessage]? = nil
+    ) async throws -> ChatSummary {
         let response = try await request(
             "/v1/chats",
             method: "POST",
-            body: AnyEncodable(ChatCreateBody(title: title)),
+            body: AnyEncodable(ChatCreateBody(title: title, provider: provider, model: model, messages: messages)),
             responseType: ChatCreateResponse.self
         )
         return response.chat
@@ -116,6 +121,10 @@ actor SybilAPIClient: SybilAPIClienting {
         try await request("/v1/models", method: "GET", responseType: ModelCatalogResponse.self)
     }
 
+    func getActiveRuns() async throws -> ActiveRunsResponse {
+        try await request("/v1/active-runs", method: "GET", responseType: ActiveRunsResponse.self)
+    }
+
     func runCompletionStream(
         body: CompletionStreamRequest,
         onEvent: @escaping @Sendable (CompletionStreamEvent) async -> Void
@@ -133,43 +142,35 @@ actor SybilAPIClient: SybilAPIClienting {
         )
 
         try await stream(request: request) { eventName, dataText in
-            switch eventName {
-            case "meta":
-                let payload: CompletionStreamMeta = try Self.decodeEvent(dataText, as: CompletionStreamMeta.self, eventName: eventName)
-                await onEvent(.meta(payload))
-            case "tool_call":
-                let payload: CompletionStreamToolCall = try Self.decodeEvent(dataText, as: CompletionStreamToolCall.self, eventName: eventName)
-                await onEvent(.toolCall(payload))
-            case "delta":
-                let payload: CompletionStreamDelta = try Self.decodeEvent(dataText, as: CompletionStreamDelta.self, eventName: eventName)
-                await onEvent(.delta(payload))
-            case "done":
-                do {
-                    let payload: CompletionStreamDone = try Self.decodeEvent(dataText, as: CompletionStreamDone.self, eventName: eventName)
-                    await onEvent(.done(payload))
-                } catch {
-                    if let recovered = Self.decodeLastJSONLine(dataText, as: CompletionStreamDone.self) {
-                        SybilLog.warning(
-                            SybilLog.network,
-                            "Recovered chat stream done payload from concatenated SSE data"
-                        )
-                        await onEvent(.done(recovered))
-                    } else {
-                        throw error
-                    }
-                }
-            case "error":
-                let payload: StreamErrorPayload = try Self.decodeEvent(dataText, as: StreamErrorPayload.self, eventName: eventName)
-                await onEvent(.error(payload))
-            default:
-                SybilLog.warning(SybilLog.network, "Ignoring unknown chat stream event '\(eventName)'")
-                await onEvent(.ignored)
-            }
+            try await Self.handleCompletionStreamEvent(eventName: eventName, dataText: dataText, onEvent: onEvent)
         }
 
         SybilLog.info(SybilLog.network, "Chat stream completed")
     }
 
+    func attachCompletionStream(
+        chatID: String,
+        onEvent: @escaping @Sendable (CompletionStreamEvent) async -> Void
+    ) async throws {
+        let request = try makeRequest(
+            path: "/v1/chats/\(chatID)/stream/attach",
+            method: "POST",
+            body: nil,
+            acceptsSSE: true
+        )
+
+        SybilLog.info(
+            SybilLog.network,
+            "Attaching chat stream POST \(request.url?.absoluteString ?? "<unknown>")"
+        )
+
+        try await stream(request: request) { eventName, dataText in
+            try await Self.handleCompletionStreamEvent(eventName: eventName, dataText: dataText, onEvent: onEvent)
+        }
+
+        SybilLog.info(SybilLog.network, "Attached chat stream completed")
+    }
+
     func runSearchStream(
         searchID: String,
         body: SearchRunRequest,
@@ -188,34 +189,35 @@ actor SybilAPIClient: SybilAPIClienting {
         )
 
         try await stream(request: request) { eventName, dataText in
-            switch eventName {
-            case "search_results":
-                let payload: SearchResultsPayload = try Self.decodeEvent(dataText, as: SearchResultsPayload.self, eventName: eventName)
-                await onEvent(.searchResults(payload))
-            case "search_error":
-                let payload: SearchErrorPayload = try Self.decodeEvent(dataText, as: SearchErrorPayload.self, eventName: eventName)
-                await onEvent(.searchError(payload))
-            case "answer":
-                let payload: SearchAnswerPayload = try Self.decodeEvent(dataText, as: SearchAnswerPayload.self, eventName: eventName)
-                await onEvent(.answer(payload))
-            case "answer_error":
-                let payload: SearchErrorPayload = try Self.decodeEvent(dataText, as: SearchErrorPayload.self, eventName: eventName)
-                await onEvent(.answerError(payload))
-            case "done":
-                let payload: SearchDonePayload = try Self.decodeEvent(dataText, as: SearchDonePayload.self, eventName: eventName)
-                await onEvent(.done(payload))
-            case "error":
-                let payload: StreamErrorPayload = try Self.decodeEvent(dataText, as: StreamErrorPayload.self, eventName: eventName)
-                await onEvent(.error(payload))
-            default:
-                SybilLog.warning(SybilLog.network, "Ignoring unknown search stream event '\(eventName)'")
-                await onEvent(.ignored)
-            }
+            try await Self.handleSearchStreamEvent(eventName: eventName, dataText: dataText, onEvent: onEvent)
         }
 
         SybilLog.info(SybilLog.network, "Search stream completed")
     }
 
+    func attachSearchStream(
+        searchID: String,
+        onEvent: @escaping @Sendable (SearchStreamEvent) async -> Void
+    ) async throws {
+        let request = try makeRequest(
+            path: "/v1/searches/\(searchID)/run/stream/attach",
+            method: "POST",
+            body: nil,
+            acceptsSSE: true
+        )
+
+        SybilLog.info(
+            SybilLog.network,
+            "Attaching search stream POST \(request.url?.absoluteString ?? "<unknown>")"
+        )
+
+        try await stream(request: request) { eventName, dataText in
+            try await Self.handleSearchStreamEvent(eventName: eventName, dataText: dataText, onEvent: onEvent)
+        }
+
+        SybilLog.info(SybilLog.network, "Attached search stream completed")
+    }
+
     private func request<Response: Decodable>(
         _ path: String,
         method: String,
@@ -498,6 +500,75 @@ actor SybilAPIClient: SybilAPIClienting {
         return try? Self.decodeJSON(type, from: data)
     }
 
+    private static func handleCompletionStreamEvent(
+        eventName: String,
+        dataText: String,
+        onEvent: @escaping @Sendable (CompletionStreamEvent) async -> Void
+    ) async throws {
+        switch eventName {
+        case "meta":
+            let payload: CompletionStreamMeta = try Self.decodeEvent(dataText, as: CompletionStreamMeta.self, eventName: eventName)
+            await onEvent(.meta(payload))
+        case "tool_call":
+            let payload: CompletionStreamToolCall = try Self.decodeEvent(dataText, as: CompletionStreamToolCall.self, eventName: eventName)
+            await onEvent(.toolCall(payload))
+        case "delta":
+            let payload: CompletionStreamDelta = try Self.decodeEvent(dataText, as: CompletionStreamDelta.self, eventName: eventName)
+            await onEvent(.delta(payload))
+        case "done":
+            do {
+                let payload: CompletionStreamDone = try Self.decodeEvent(dataText, as: CompletionStreamDone.self, eventName: eventName)
+                await onEvent(.done(payload))
+            } catch {
+                if let recovered = Self.decodeLastJSONLine(dataText, as: CompletionStreamDone.self) {
+                    SybilLog.warning(
+                        SybilLog.network,
+                        "Recovered chat stream done payload from concatenated SSE data"
+                    )
+                    await onEvent(.done(recovered))
+                } else {
+                    throw error
+                }
+            }
+        case "error":
+            let payload: StreamErrorPayload = try Self.decodeEvent(dataText, as: StreamErrorPayload.self, eventName: eventName)
+            await onEvent(.error(payload))
+        default:
+            SybilLog.warning(SybilLog.network, "Ignoring unknown chat stream event '\(eventName)'")
+            await onEvent(.ignored)
+        }
+    }
+
+    private static func handleSearchStreamEvent(
+        eventName: String,
+        dataText: String,
+        onEvent: @escaping @Sendable (SearchStreamEvent) async -> Void
+    ) async throws {
+        switch eventName {
+        case "search_results":
+            let payload: SearchResultsPayload = try Self.decodeEvent(dataText, as: SearchResultsPayload.self, eventName: eventName)
+            await onEvent(.searchResults(payload))
+        case "search_error":
+            let payload: SearchErrorPayload = try Self.decodeEvent(dataText, as: SearchErrorPayload.self, eventName: eventName)
+            await onEvent(.searchError(payload))
+        case "answer":
+            let payload: SearchAnswerPayload = try Self.decodeEvent(dataText, as: SearchAnswerPayload.self, eventName: eventName)
+            await onEvent(.answer(payload))
+        case "answer_error":
+            let payload: SearchErrorPayload = try Self.decodeEvent(dataText, as: SearchErrorPayload.self, eventName: eventName)
+            await onEvent(.answerError(payload))
+        case "done":
+            let payload: SearchDonePayload = try Self.decodeEvent(dataText, as: SearchDonePayload.self, eventName: eventName)
+            await onEvent(.done(payload))
+        case "error":
+            let payload: StreamErrorPayload = try Self.decodeEvent(dataText, as: StreamErrorPayload.self, eventName: eventName)
+            await onEvent(.error(payload))
+        default:
+            SybilLog.warning(SybilLog.network, "Ignoring unknown search stream event '\(eventName)'")
+            await onEvent(.ignored)
+        }
+    }
+
     private static func flushSSEEvent(
         eventName: inout String,
         dataLines: inout [String]
@@ -551,6 +622,7 @@ actor SybilAPIClient: SybilAPIClienting {
 
 struct CompletionStreamRequest: Codable, Sendable {
     var chatId: String?
+    var persist: Bool? = nil
    var provider: Provider
     var model: String
     var messages: [CompletionRequestMessage]
@@ -558,6 +630,9 @@ struct CompletionStreamRequest: Codable, Sendable {
 
 private struct ChatCreateBody: Encodable {
     var title: String?
+    var provider: Provider?
+    var model: String?
+    var messages: [CompletionRequestMessage]?
 }
 
 private struct SearchCreateBody: Encodable {
@@ -3,7 +3,12 @@ import Foundation
 protocol SybilAPIClienting: Sendable {
     func verifySession() async throws -> AuthSession
     func listChats() async throws -> [ChatSummary]
-    func createChat(title: String?) async throws -> ChatSummary
+    func createChat(
+        title: String?,
+        provider: Provider?,
+        model: String?,
+        messages: [CompletionRequestMessage]?
+    ) async throws -> ChatSummary
     func getChat(chatID: String) async throws -> ChatDetail
     func deleteChat(chatID: String) async throws
     func suggestChatTitle(chatID: String, content: String) async throws -> ChatSummary
@@ -13,13 +18,28 @@ protocol SybilAPIClienting: Sendable {
     func createChatFromSearch(searchID: String, title: String?) async throws -> ChatSummary
     func deleteSearch(searchID: String) async throws
     func listModels() async throws -> ModelCatalogResponse
+    func getActiveRuns() async throws -> ActiveRunsResponse
     func runCompletionStream(
         body: CompletionStreamRequest,
         onEvent: @escaping @Sendable (CompletionStreamEvent) async -> Void
     ) async throws
+    func attachCompletionStream(
+        chatID: String,
+        onEvent: @escaping @Sendable (CompletionStreamEvent) async -> Void
+    ) async throws
     func runSearchStream(
         searchID: String,
         body: SearchRunRequest,
         onEvent: @escaping @Sendable (SearchStreamEvent) async -> Void
     ) async throws
+    func attachSearchStream(
+        searchID: String,
+        onEvent: @escaping @Sendable (SearchStreamEvent) async -> Void
+    ) async throws
 }
+
+extension SybilAPIClienting {
+    func createChat(title: String?) async throws -> ChatSummary {
+        try await createChat(title: title, provider: nil, model: nil, messages: nil)
+    }
+}
@@ -17,18 +17,6 @@ struct SybilChatTranscriptView: View {
     var body: some View {
         ScrollView {
             LazyVStack(alignment: .leading, spacing: 26) {
-                if isSending && !hasPendingAssistant {
-                    HStack(spacing: 8) {
-                        ProgressView()
-                            .controlSize(.small)
-                            .tint(SybilTheme.textMuted)
-                        Text("Assistant is typing…")
-                            .font(.sybil(.footnote))
-                            .foregroundStyle(SybilTheme.textMuted)
-                    }
-                    .scaleEffect(x: 1, y: -1)
-                }
-
                 ForEach(messages.reversed()) { message in
                     MessageBubble(message: message, isSending: isSending)
                         .frame(maxWidth: .infinity)
@@ -4,12 +4,14 @@ public enum Provider: String, Codable, CaseIterable, Hashable, Sendable {
    case openai
    case anthropic
    case xai
    case hermesAgent = "hermes-agent"

    public var displayName: String {
        switch self {
        case .openai: return "OpenAI"
        case .anthropic: return "Anthropic"
        case .xai: return "xAI"
        case .hermesAgent: return "Hermes Agent"
        }
    }
}
@@ -354,6 +356,16 @@ public struct SearchDetail: Codable, Identifiable, Hashable, Sendable {
    public var results: [SearchResultItem]
}

public struct ActiveRunsResponse: Codable, Hashable, Sendable {
    public var chats: [String]
    public var searches: [String]

    public init(chats: [String] = [], searches: [String] = []) {
        self.chats = chats
        self.searches = searches
    }
}

public struct SearchRunRequest: Codable, Sendable {
    public var query: String?
    public var title: String?
@@ -394,8 +406,8 @@ public struct CompletionRequestMessage: Codable, Sendable {
}

public struct CompletionStreamMeta: Codable, Sendable {
    public var chatId: String
    public var callId: String
    public var chatId: String?
    public var callId: String?
    public var provider: Provider
    public var model: String
}
@@ -22,75 +22,60 @@ enum PhoneRoute: Hashable {

struct SybilPhoneShellView: View {
@Bindable var viewModel: SybilViewModel
@State private var path: [PhoneRoute] = []
@State private var route: PhoneRoute = .draftChat
@Environment(\.scenePhase) private var scenePhase
@State private var shouldRefreshOnForeground = false
@State private var composerFocusRequest = 0
@State private var phoneStackWidth: CGFloat = BackSwipeMetrics.referenceWidth
@State private var backSwipeOffset: CGFloat = 0
@State private var backSwipeCompletionOffset: CGFloat = 0
@State private var backSwipeIsActive = false
@State private var backSwipeIsCompleting = false
@State private var backSwipeHasLatched = false
@State private var isSidebarOverlayPresented = false
@State private var sidebarSwipeOffset: CGFloat = 0
@State private var sidebarSwipeIsActive = false
@State private var sidebarSwipeIsCompleting = false
@State private var sidebarSwipeHasLatched = false
@State private var sidebarHighlightSelection: SidebarSelection?
@State private var sidebarHighlightClearTask: Task<Void, Never>?
@State private var openingSelectionRequestID: UUID?

private var canRecognizeBackSwipe: Bool {
!path.isEmpty && !backSwipeIsCompleting
private var canRecognizeSidebarSwipe: Bool {
!isSidebarOverlayPresented && !sidebarSwipeIsCompleting
}

private var backSwipeVisualOffset: CGFloat {
backSwipeOffset + backSwipeCompletionOffset
private var sidebarOverlayProgress: CGFloat {
if isSidebarOverlayPresented {
return 1
}

return SidebarOverlaySwipeMetrics.progress(
for: sidebarSwipeOffset,
width: phoneStackWidth
)
}

private var shouldRenderSidebarOverlay: Bool {
isSidebarOverlayPresented ||
sidebarSwipeIsActive ||
sidebarSwipeIsCompleting ||
sidebarOverlayProgress > 0.001
}

private var currentRouteSelection: SidebarSelection? {
switch route {
case let .chat(chatID):
return .chat(chatID)
case let .search(searchID):
return .search(searchID)
case .draftChat, .draftSearch, .settings:
return nil
}
}

private var highlightedSidebarSelection: SidebarSelection? {
sidebarHighlightSelection ?? currentRouteSelection
}

var body: some View {
GeometryReader { proxy in
ZStack(alignment: .topLeading) {
SybilPhoneSidebarRoot(viewModel: viewModel, path: $path)
.safeAreaInset(edge: .top, spacing: 0) {
phoneRootTopBar
}
.zIndex(0)

if let route = path.last {
SybilPhoneDestinationView(
viewModel: viewModel,
composerFocusRequest: $composerFocusRequest,
route: route,
onRequestBack: requestBack,
onRequestNewChat: startNewChatFromDestination
)
.background(SybilTheme.background)
.offset(x: backSwipeVisualOffset)
.shadow(
color: backSwipeVisualOffset > 0 ? Color.black.opacity(0.34) : Color.clear,
radius: backSwipeVisualOffset > 0 ? 18 : 0,
x: -8,
y: 0
)
.transition(.move(edge: .trailing))
.zIndex(1)
.background {
WorkspaceSwipePanInstaller(
direction: .right,
isEnabled: canRecognizeBackSwipe,
onBegan: { width in
beginBackSwipe(containerWidth: width)
},
onChanged: { translationX, width in
updateBackSwipe(with: translationX, containerWidth: width)
},
onEnded: { translationX, width, velocityX, didFinish in
finishBackSwipe(
translationX: translationX,
containerWidth: width,
velocityX: velocityX,
didFinish: didFinish
)
}
)
.frame(maxWidth: .infinity, maxHeight: .infinity)
}
}
}
}
phoneStack(width: proxy.size.width)
.frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
.onAppear {
updatePhoneStackWidth(proxy.size.width)
@@ -100,13 +85,8 @@ struct SybilPhoneShellView: View {
}
}
.tint(SybilTheme.primary)
.animation(.easeOut(duration: 0.22), value: path.last)
.onChange(of: path) { _, nextPath in
guard nextPath.isEmpty else {
return
}
resetBackSwipe(animated: false)
}
.animation(.easeOut(duration: 0.22), value: route)
.animation(.easeOut(duration: 0.18), value: isSidebarOverlayPresented)
.onChange(of: scenePhase) { _, nextPhase in
switch nextPhase {
case .background:
@@ -120,8 +100,8 @@ struct SybilPhoneShellView: View {
shouldRefreshOnForeground = false
Task {
await viewModel.refreshAfterAppBecameActive(
refreshCollections: path.isEmpty,
refreshSelection: !path.isEmpty && viewModel.hasRefreshableSelection
refreshCollections: isSidebarOverlayPresented,
refreshSelection: !isSidebarOverlayPresented && viewModel.hasRefreshableSelection
)
}
case .inactive:
@@ -133,16 +113,117 @@ struct SybilPhoneShellView: View {
}
}

private var phoneRootTopBar: some View {
HStack {
private func phoneStack(width: CGFloat) -> some View {
ZStack(alignment: .topLeading) {
phoneWorkspaceLayer
.zIndex(0)

phoneSidebarOverlayLayer(width: width)
.zIndex(1)
}
}

private var phoneWorkspaceLayer: some View {
SybilPhoneDestinationView(
viewModel: viewModel,
composerFocusRequest: $composerFocusRequest,
route: route,
onRequestBack: { _ in showSidebarOverlay() },
onRequestNewChat: sidebarWorkspaceNewChatAction,
onShowSidebar: showSidebarOverlay
)
.background(SybilTheme.background)
.blur(radius: SidebarOverlaySwipeMetrics.workspaceBlurRadius(for: sidebarOverlayProgress))
.opacity(SidebarOverlaySwipeMetrics.workspaceOpacity(for: sidebarOverlayProgress))
.allowsHitTesting(!shouldRenderSidebarOverlay)
.background {
sidebarSwipeInstaller
}
}

private func phoneSidebarOverlayLayer(width: CGFloat) -> some View {
VStack(spacing: 0) {
phoneOverlayTopBar

SybilPhoneSidebarRoot(
viewModel: viewModel,
highlightedSelection: highlightedSidebarSelection,
onSelect: openSidebarSelection,
onRoute: showRouteAndClearSidebarHighlight
)
}
.opacity(sidebarOverlayProgress)
.blur(radius: SidebarOverlaySwipeMetrics.overlayBlurRadius(for: sidebarOverlayProgress))
.offset(x: SidebarOverlaySwipeMetrics.overlayOffset(for: sidebarOverlayProgress, width: width))
.allowsHitTesting(isSidebarOverlayPresented)
.accessibilityHidden(!isSidebarOverlayPresented)
}

private var sidebarSwipeInstaller: some View {
WorkspaceSwipePanInstaller(
direction: .right,
isEnabled: canRecognizeSidebarSwipe,
onBegan: { width in
beginSidebarSwipe(containerWidth: width)
},
onChanged: { translationX, width in
updateSidebarSwipe(with: translationX, containerWidth: width)
},
onEnded: { translationX, width, velocityX, didFinish in
finishSidebarSwipe(
translationX: translationX,
containerWidth: width,
velocityX: velocityX,
didFinish: didFinish
)
}
)
.frame(maxWidth: .infinity, maxHeight: .infinity)
}

private var sidebarWorkspaceNewChatAction: (() -> Void)? {
guard !isSidebarOverlayPresented else {
return nil
}

return {
startNewChatFromDestination()
}
}

private var phoneOverlayTopBar: some View {
HStack(spacing: 12) {
SybilWordmark(size: 21)
Spacer()

Button {
hideSidebarOverlay()
} label: {
Image(systemName: "chevron.right.2")
.font(.system(size: 21, weight: .bold))
.foregroundStyle(SybilTheme.text)
.frame(width: 54, height: 54)
.background(
Circle()
.fill(.ultraThinMaterial)
.overlay(
Circle()
.fill(SybilTheme.surface.opacity(0.76))
)
)
.overlay(
Circle()
.stroke(SybilTheme.border.opacity(0.64), lineWidth: 1)
)
}
.buttonStyle(.plain)
.accessibilityLabel("Hide conversations")
}
.padding(.horizontal, 16)
.padding(.top, 10)
.padding(.bottom, 12)
.background {
SybilTheme.panelGradient
SybilPhoneOverlayBlurBand(edge: .top)
.ignoresSafeArea(edges: .top)
}
}
@@ -151,62 +232,98 @@ struct SybilPhoneShellView: View {
phoneStackWidth = max(width, 1)
}

private func requestBack(animateNavigation: Bool = true) {
guard !path.isEmpty, !backSwipeIsCompleting else {
return
}

if animateNavigation {
Task {
await completeBackSwipe(containerWidth: phoneStackWidth, releaseVelocityX: 0)
}
} else {
popRoute(disablesAnimations: true)
resetBackSwipe(animated: false)
}
}

private func startNewChatFromDestination() {
viewModel.startNewChat()
composerFocusRequest += 1
replaceTopRoute(with: .draftChat)
showRoute(.draftChat)
}

private func replaceTopRoute(with route: PhoneRoute) {
if path.isEmpty {
private func showRoute(_ nextRoute: PhoneRoute) {
let update = {
route = nextRoute
}

if isSidebarOverlayPresented {
withAnimation(.easeOut(duration: 0.22)) {
path = [route]
update()
isSidebarOverlayPresented = false
}
} else {
path[path.index(before: path.endIndex)] = route
update()
}

resetSidebarSwipe(animated: false)
}

private func popRoute(disablesAnimations: Bool) {
let pop = {
guard !path.isEmpty else {
private func showRouteAndClearSidebarHighlight(_ nextRoute: PhoneRoute) {
showRoute(nextRoute)
clearSidebarHighlight()
}

private func showSidebarOverlay() {
withAnimation(.easeOut(duration: 0.18)) {
isSidebarOverlayPresented = true
}
resetSidebarSwipe(animated: false)
}

private func hideSidebarOverlay() {
withAnimation(.easeOut(duration: 0.18)) {
isSidebarOverlayPresented = false
}
resetSidebarSwipe(animated: false)
}

private func openSidebarSelection(_ selection: SidebarSelection) {
if openingSelectionRequestID != nil, sidebarHighlightSelection == selection {
return
}

let requestID = UUID()
openingSelectionRequestID = requestID
setSidebarHighlight(selection)

Task {
await viewModel.selectForNavigation(selection)
guard openingSelectionRequestID == requestID else {
return
}
_ = path.removeLast()
}

if disablesAnimations {
var transaction = Transaction()
transaction.disablesAnimations = true
withTransaction(transaction) {
pop()
}
} else {
withAnimation(.easeOut(duration: 0.22)) {
pop()
}
showRoute(PhoneRoute.from(selection: selection))
openingSelectionRequestID = nil
clearSidebarHighlight(selection, after: .milliseconds(260))
}
}

private func beginBackSwipe(containerWidth: CGFloat) {
private func setSidebarHighlight(_ selection: SidebarSelection) {
sidebarHighlightClearTask?.cancel()
sidebarHighlightSelection = selection
}

private func clearSidebarHighlight(_ selection: SidebarSelection, after delay: Duration) {
sidebarHighlightClearTask?.cancel()
sidebarHighlightClearTask = Task { @MainActor in
try? await Task.sleep(for: delay)
guard !Task.isCancelled,
sidebarHighlightSelection == selection,
openingSelectionRequestID == nil else {
return
}
sidebarHighlightSelection = nil
}
}

private func clearSidebarHighlight() {
sidebarHighlightClearTask?.cancel()
openingSelectionRequestID = nil
sidebarHighlightSelection = nil
}

private func beginSidebarSwipe(containerWidth: CGFloat) {
let update = {
backSwipeIsActive = true
backSwipeHasLatched = false
phoneStackWidth = max(containerWidth, 1)
sidebarSwipeIsActive = true
sidebarSwipeHasLatched = false
}

var transaction = Transaction()
@@ -214,97 +331,79 @@ struct SybilPhoneShellView: View {
withTransaction(transaction, update)
}

private func updateBackSwipe(with rawTranslation: CGFloat, containerWidth: CGFloat) {
let nextOffset = BackSwipeMetrics.clampedOffset(for: rawTranslation, width: containerWidth)
let nextLatched = BackSwipeMetrics.isLatched(
private func updateSidebarSwipe(with rawTranslation: CGFloat, containerWidth: CGFloat) {
let nextOffset = SidebarOverlaySwipeMetrics.clampedOffset(for: rawTranslation, width: containerWidth)
let nextLatched = SidebarOverlaySwipeMetrics.isLatched(
offset: nextOffset,
width: containerWidth,
isCurrentlyLatched: backSwipeHasLatched
isCurrentlyLatched: sidebarSwipeHasLatched
)

var transaction = Transaction()
transaction.disablesAnimations = true
withTransaction(transaction) {
backSwipeOffset = nextOffset
backSwipeHasLatched = nextLatched
phoneStackWidth = max(containerWidth, 1)
sidebarSwipeOffset = nextOffset
sidebarSwipeHasLatched = nextLatched
}
}

private func finishBackSwipe(
private func finishSidebarSwipe(
translationX: CGFloat,
containerWidth: CGFloat,
velocityX: CGFloat,
didFinish: Bool
) {
guard backSwipeIsActive else {
resetBackSwipe(animated: false)
guard sidebarSwipeIsActive else {
resetSidebarSwipe(animated: false)
return
}

let finalOffset = BackSwipeMetrics.clampedOffset(for: translationX, width: containerWidth)
let finalLatched = BackSwipeMetrics.isLatched(
let finalOffset = SidebarOverlaySwipeMetrics.clampedOffset(for: translationX, width: containerWidth)
let finalLatched = SidebarOverlaySwipeMetrics.isLatched(
offset: finalOffset,
width: containerWidth,
isCurrentlyLatched: backSwipeHasLatched
isCurrentlyLatched: sidebarSwipeHasLatched
)
updateBackSwipe(with: translationX, containerWidth: containerWidth)
updateSidebarSwipe(with: translationX, containerWidth: containerWidth)

if didFinish && BackSwipeMetrics.shouldComplete(
if didFinish && SidebarOverlaySwipeMetrics.shouldComplete(
offset: finalOffset,
velocityX: velocityX,
width: containerWidth,
isLatched: finalLatched
) {
Task {
await completeBackSwipe(containerWidth: containerWidth, releaseVelocityX: velocityX)
}
completeSidebarSwipe()
return
}

resetBackSwipe(animated: true, velocityX: velocityX)
resetSidebarSwipe(animated: true, velocityX: velocityX)
}

@MainActor
private func completeBackSwipe(containerWidth: CGFloat, releaseVelocityX: CGFloat) async {
guard !path.isEmpty else {
resetBackSwipe(animated: false)
return
}
guard !backSwipeIsCompleting else {
private func completeSidebarSwipe() {
guard !sidebarSwipeIsCompleting else {
return
}

backSwipeIsCompleting = true
let targetOffset = BackSwipeMetrics.completionTargetOffset(for: containerWidth)

withAnimation(
BackSwipeMetrics.springAnimation(
currentOffset: backSwipeOffset,
targetOffset: targetOffset,
velocityX: releaseVelocityX
)
) {
backSwipeCompletionOffset = targetOffset - backSwipeOffset
sidebarSwipeIsCompleting = true
withAnimation(.easeOut(duration: 0.18)) {
isSidebarOverlayPresented = true
}

try? await Task.sleep(for: .milliseconds(BackSwipeMetrics.completionAnimationDelayMs))
popRoute(disablesAnimations: true)
resetBackSwipe(animated: false)
resetSidebarSwipe(animated: false)
}

private func resetBackSwipe(animated: Bool, velocityX: CGFloat = 0) {
let currentOffset = backSwipeOffset + backSwipeCompletionOffset
private func resetSidebarSwipe(animated: Bool, velocityX: CGFloat = 0) {
let currentOffset = sidebarSwipeOffset
let reset = {
backSwipeOffset = 0
backSwipeCompletionOffset = 0
backSwipeIsActive = false
backSwipeIsCompleting = false
backSwipeHasLatched = false
sidebarSwipeOffset = 0
sidebarSwipeIsActive = false
sidebarSwipeIsCompleting = false
sidebarSwipeHasLatched = false
}

if animated {
withAnimation(
BackSwipeMetrics.springAnimation(
SidebarOverlaySwipeMetrics.springAnimation(
currentOffset: currentOffset,
targetOffset: 0,
velocityX: velocityX
@@ -318,31 +417,79 @@ struct SybilPhoneShellView: View {
}
}

private struct SybilPhoneSidebarRoot: View {
@Bindable var viewModel: SybilViewModel
@Binding var path: [PhoneRoute]
@State private var openingSelection: SidebarSelection?
@State private var openingRequestID: UUID?
private enum SidebarOverlaySwipeMetrics {
static func clampedOffset(for rawTranslation: CGFloat, width: CGFloat) -> CGFloat {
BackSwipeMetrics.clampedOffset(for: rawTranslation, width: width)
}

private var highlightedSelection: SidebarSelection? {
if let openingSelection {
return openingSelection
}
static func progress(for offset: CGFloat, width: CGFloat) -> CGFloat {
BackSwipeMetrics.progress(for: offset, width: width)
}

guard let route = path.last else {
return nil
}
static func isLatched(offset: CGFloat, width: CGFloat, isCurrentlyLatched: Bool = false) -> Bool {
BackSwipeMetrics.isLatched(offset: offset, width: width, isCurrentlyLatched: isCurrentlyLatched)
}

switch route {
case let .chat(chatID):
return .chat(chatID)
case let .search(searchID):
return .search(searchID)
case .draftChat, .draftSearch, .settings:
return nil
static func shouldComplete(offset: CGFloat, velocityX: CGFloat, width: CGFloat, isLatched: Bool) -> Bool {
BackSwipeMetrics.shouldComplete(offset: offset, velocityX: velocityX, width: width, isLatched: isLatched)
}

static func springAnimation(currentOffset: CGFloat, targetOffset: CGFloat, velocityX: CGFloat) -> Animation {
BackSwipeMetrics.springAnimation(currentOffset: currentOffset, targetOffset: targetOffset, velocityX: velocityX)
}

static func overlayOffset(for progress: CGFloat, width: CGFloat) -> CGFloat {
-(1 - min(max(progress, 0), 1)) * min(max(width * 0.18, 44), 76)
}

static func overlayBlurRadius(for progress: CGFloat) -> CGFloat {
(1 - min(max(progress, 0), 1)) * 18
}

static func workspaceBlurRadius(for progress: CGFloat) -> CGFloat {
min(max(progress, 0), 1) * 14
}

static func workspaceOpacity(for progress: CGFloat) -> CGFloat {
1 - (min(max(progress, 0), 1) * 0.22)
}
}

private struct SybilPhoneOverlayBlurBand: View {
var edge: VerticalEdge

var body: some View {
ZStack {
Rectangle()
.fill(.ultraThinMaterial)
.opacity(0.34)

Rectangle()
.fill(
LinearGradient(
colors: gradientColors,
startPoint: edge == .top ? .top : .bottom,
endPoint: edge == .top ? .bottom : .top
)
)
}
}

private var gradientColors: [Color] {
[
Color.black.opacity(0.94),
SybilTheme.background.opacity(0.78),
Color.black.opacity(0)
]
}
}

private struct SybilPhoneSidebarRoot: View {
@Bindable var viewModel: SybilViewModel
var highlightedSelection: SidebarSelection?
var onSelect: (SidebarSelection) -> Void
var onRoute: (PhoneRoute) -> Void

var body: some View {
VStack(spacing: 0) {
if let errorMessage = viewModel.errorMessage {
@@ -357,64 +504,15 @@ private struct SybilPhoneSidebarRoot: View {
.overlay(SybilTheme.border)
}

if viewModel.isLoadingCollections && viewModel.sidebarItems.isEmpty {
VStack(alignment: .leading, spacing: 8) {
ProgressView()
.tint(SybilTheme.primary)
Text("Loading conversations…")
.font(.sybil(.footnote))
.foregroundStyle(SybilTheme.textMuted)
SybilSidebarItemList(
viewModel: viewModel,
isSelected: { item in
highlightedSelection == item.selection
},
onSelect: { item in
onSelect(item.selection)
}
.frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
.padding(16)
} else if viewModel.sidebarItems.isEmpty {
VStack(spacing: 10) {
Image(systemName: "message.badge")
.font(.system(size: 20, weight: .medium))
.foregroundStyle(SybilTheme.textMuted)
Text("Start a chat or run your first search.")
.font(.sybil(.footnote))
.multilineTextAlignment(.center)
.foregroundStyle(SybilTheme.textMuted)
}
.frame(maxWidth: .infinity, maxHeight: .infinity)
.padding(16)
} else {
ScrollView {
LazyVStack(alignment: .leading, spacing: 0) {
ForEach(viewModel.sidebarItems) { item in
Button {
open(item.selection)
} label: {
VStack(spacing: 0.0) {
SybilPhoneSidebarRow(item: item)
Divider()
}
}
.buttonStyle(
SybilPhoneSidebarRowButtonStyle(
isHighlighted: highlightedSelection == item.selection
)
)
.contextMenu {
Button(role: .destructive) {
Task {
await viewModel.deleteItem(item.selection)
}
} label: {
Label("Delete", systemImage: "trash")
}
}
}
}
}
.refreshable {
await viewModel.refreshVisibleContent(
refreshCollections: true,
refreshSelection: false
)
}
}
)
}
.background(SybilTheme.panelGradient)
.safeAreaInset(edge: .bottom, spacing: 0) {
@@ -429,22 +527,20 @@ private struct SybilPhoneSidebarRoot: View {

HStack(spacing: 12) {
toolbarIconButton(systemImage: "gearshape", accessibilityLabel: "Settings") {
clearOpeningSelection()
showRoute(.settings)
viewModel.openSettings()
onRoute(.settings)
}

Spacer()

toolbarIconButton(systemImage: "magnifyingglass", accessibilityLabel: "New search") {
clearOpeningSelection()
viewModel.startNewSearch()
showRoute(.draftSearch)
onRoute(.draftSearch)
}

toolbarIconButton(systemImage: "plus", accessibilityLabel: "New chat", isPrimary: true) {
clearOpeningSelection()
viewModel.startNewChat()
showRoute(.draftChat)
onRoute(.draftChat)
}
}
.padding(.horizontal, 18)
@@ -480,114 +576,6 @@ private struct SybilPhoneSidebarRoot: View {
.buttonStyle(.plain)
.accessibilityLabel(accessibilityLabel)
}

private func clearOpeningSelection() {
openingRequestID = nil
openingSelection = nil
}

private func showRoute(_ route: PhoneRoute) {
withAnimation(.easeOut(duration: 0.22)) {
path = [route]
}
}

private func open(_ selection: SidebarSelection) {
guard openingSelection != selection else {
return
}

let requestID = UUID()
openingRequestID = requestID
openingSelection = selection
Task {
await viewModel.selectForNavigation(selection)
guard openingRequestID == requestID else {
return
}
showRoute(PhoneRoute.from(selection: selection))
openingRequestID = nil
openingSelection = nil
}
}
}

private struct SybilPhoneSidebarRowIsActiveKey: EnvironmentKey {
static let defaultValue = false
}

private extension EnvironmentValues {
var sybilPhoneSidebarRowIsActive: Bool {
get { self[SybilPhoneSidebarRowIsActiveKey.self] }
set { self[SybilPhoneSidebarRowIsActiveKey.self] = newValue }
}
}

private struct SybilPhoneSidebarRowButtonStyle: ButtonStyle {
var isHighlighted: Bool

func makeBody(configuration: Configuration) -> some View {
configuration.label
.environment(\.sybilPhoneSidebarRowIsActive, isHighlighted || configuration.isPressed)
}
}

private struct SybilPhoneSidebarRow: View {
@Environment(\.sybilPhoneSidebarRowIsActive) private var isHighlighted
var item: SidebarItem

var body: some View {
let leadingWidth = 22.0

VStack(alignment: .leading, spacing: 8) {
HStack(spacing: 8) {
Image(systemName: item.kind == .chat ? "message" : "globe")
.font(.system(size: 12, weight: .semibold))
.foregroundStyle(isHighlighted ? SybilTheme.accent : SybilTheme.textMuted)
.frame(width: leadingWidth, height: leadingWidth)
.background(
Rectangle()
.fill(isHighlighted ? SybilTheme.accent.opacity(0.12) : SybilTheme.surface.opacity(0.72))
)

Text(item.title)
.font(.sybil(.subheadline, weight: .semibold))
.lineLimit(1)
}

HStack(spacing: 8) {
Spacer()
.frame(width: leadingWidth)

Text(item.updatedAt.sybilRelativeLabel)
.font(.sybil(.caption2))
.foregroundStyle(SybilTheme.textMuted)

if let initiated = item.initiatedLabel {
Spacer(minLength: 0)
Text(initiated)
.font(.sybil(.caption2))
.foregroundStyle(SybilTheme.textMuted.opacity(0.88))
.lineLimit(1)
.multilineTextAlignment(.trailing)
.frame(maxWidth: .infinity, alignment: .trailing)
}
}
}
.foregroundStyle(SybilTheme.text)
.padding(18.0)
.frame(maxWidth: .infinity, alignment: .leading)
.background(
Rectangle()
.fill(
isHighlighted
? SybilTheme.selectedRowGradient
: LinearGradient(colors: [SybilTheme.surface.opacity(0.56), SybilTheme.surface.opacity(0.36)], startPoint: .topLeading, endPoint: .bottomTrailing)
)
)
}
}

private struct SybilPhoneDestinationView: View {
@@ -595,12 +583,15 @@ private struct SybilPhoneDestinationView: View {
@Binding var composerFocusRequest: Int
let route: PhoneRoute
let onRequestBack: (_ animateNavigation: Bool) -> Void
let onRequestNewChat: () -> Void
let onRequestNewChat: (() -> Void)?
let onShowSidebar: () -> Void

var body: some View {
SybilWorkspaceView(
viewModel: viewModel,
composerFocusRequest: composerFocusRequest,
navigationLeadingControl: .showSidebar,
onShowSidebar: onShowSidebar,
onRequestBack: onRequestBack,
onRequestNewChat: onRequestNewChat
)
@@ -0,0 +1,19 @@
import Combine
import Foundation

public enum SybilHomeScreenQuickAction {
    public static let quickQuestionType = "net.buzzert.sybil2.quick-question"
}

@MainActor
public final class SybilQuickActionRouter: ObservableObject {
    public static let shared = SybilQuickActionRouter()

    @Published public private(set) var quickQuestionPresentationRequest = 0

    private init() {}

    public func requestQuickQuestionPresentation() {
        quickQuestionPresentationRequest += 1
    }
}
302
ios/Packages/Sybil/Sources/Sybil/SybilQuickQuestionView.swift
Normal file
@@ -0,0 +1,302 @@
import MarkdownUI
import Observation
import SwiftUI

struct SybilQuickQuestionView: View {
    @Bindable var viewModel: SybilViewModel
    var focusRequest: Int

    @Environment(\.dismiss) private var dismiss
    @FocusState private var promptFocused: Bool

    private var hasAnswerContent: Bool {
        !viewModel.quickQuestionMessages.isEmpty || viewModel.quickQuestionError != nil
    }

    var body: some View {
        VStack(spacing: 0) {
            VStack(alignment: .leading, spacing: 16) {
                header

                answerArea

                composer
            }
            .padding(.horizontal, 16)
            .padding(.top, 18)
            .padding(.bottom, 12)
            .frame(maxWidth: 640, maxHeight: .infinity, alignment: .top)
        }
        .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .top)
        .background(SybilTheme.backgroundGradient)
        .preferredColorScheme(.dark)
        .task(id: focusRequest) {
            try? await Task.sleep(for: .milliseconds(260))
            guard !Task.isCancelled else {
                return
            }
            promptFocused = true
        }
    }

    private var header: some View {
        HStack {
            Image(systemName: "sparkles")
                .font(.system(size: 21, weight: .semibold))
                .foregroundStyle(SybilTheme.primary)

            Text("Quick question")
                .font(.title3.weight(.semibold))
                .foregroundStyle(SybilTheme.text)
                .lineLimit(1)
        }
        .frame(maxWidth: .infinity, alignment: .leading)
    }

    private var answerArea: some View {
        ScrollView {
            VStack(alignment: .leading, spacing: 12) {
                if hasAnswerContent {
                    ForEach(viewModel.quickQuestionMessages) { message in
                        QuickQuestionMessageView(message: message, isSending: viewModel.isQuickQuestionSending)
                    }

                    if let error = viewModel.quickQuestionError {
                        Text(error)
                            .font(.caption)
                            .foregroundStyle(SybilTheme.danger)
                            .fixedSize(horizontal: false, vertical: true)
                    }
                }
            }
            .frame(maxWidth: .infinity, alignment: .topLeading)
            .padding(14)
        }
        .scrollDismissesKeyboard(.interactively)
        .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
        .background(
            RoundedRectangle(cornerRadius: 12)
                .fill(Color.black.opacity(0.36))
        )
        .overlay(
            RoundedRectangle(cornerRadius: 12)
                .stroke(SybilTheme.border.opacity(0.55), lineWidth: 1)
        )
    }

    private var composer: some View {
        VStack(alignment: .leading, spacing: 10) {
            HStack(alignment: .bottom, spacing: 10) {
                TextField(
                    "Ask anything...",
                    text: Binding(
                        get: { viewModel.quickQuestionPrompt },
                        set: { viewModel.updateQuickQuestionPrompt($0) }
                    ),
                    axis: .vertical
                )
                .focused($promptFocused)
                .font(.body)
                .textInputAutocapitalization(.sentences)
                .autocorrectionDisabled(false)
                .lineLimit(1 ... 6)
                .submitLabel(.send)
                .onSubmit(submitQuestion)
                .padding(.horizontal, 12)
                .padding(.vertical, 10)
                .background(
                    RoundedRectangle(cornerRadius: 12)
                        .fill(SybilTheme.composerGradient)
                        .opacity(0.98)
                )
                .foregroundStyle(SybilTheme.text)

                Button(action: submitQuestion) {
                    Image(systemName: "arrow.up")
                        .font(.body.weight(.semibold))
                        .frame(width: 40, height: 40)
                        .background(
                            Circle()
                                .fill(
                                    viewModel.canSendQuickQuestion
                                        ? AnyShapeStyle(SybilTheme.primaryGradient)
                                        : AnyShapeStyle(SybilTheme.surfaceStrong.opacity(0.92))
                                )
                        )
                        .foregroundStyle(viewModel.canSendQuickQuestion ? SybilTheme.text : SybilTheme.textMuted)
                }
                .buttonStyle(.plain)
                .disabled(!viewModel.canSendQuickQuestion)
                .accessibilityLabel("Ask quick question")
            }

            controlsRow
        }
    }

    private var convertButton: some View {
        Button {
            Task {
                let didConvert = await viewModel.convertQuickQuestionToChat()
                if didConvert {
                    dismiss()
                }
            }
        } label: {
            Label("Chat", systemImage: "bubble.left")
                .font(.caption.weight(.medium))
                .lineLimit(1)
                .minimumScaleFactor(0.8)
        }
        .buttonStyle(.plain)
        .foregroundStyle(viewModel.canConvertQuickQuestion ? SybilTheme.text : SybilTheme.textMuted)
        .padding(.horizontal, 10)
        .frame(maxWidth: .infinity, minHeight: 40)
        .background(
            RoundedRectangle(cornerRadius: 12)
                .fill(SybilTheme.surfaceStrong.opacity(0.78))
                .overlay(
                    RoundedRectangle(cornerRadius: 12)
                        .stroke(SybilTheme.border.opacity(0.78), lineWidth: 1)
                )
        )
        .disabled(!viewModel.canConvertQuickQuestion)
    }

    private var controlsRow: some View {
        HStack(alignment: .center, spacing: 10) {
            providerMenu
            modelMenu
            convertButton
        }
    }

    private var providerMenu: some View {
        Menu {
            ForEach(viewModel.providerOptions, id: \.self) { provider in
                Button {
                    viewModel.setQuickQuestionProvider(provider)
                } label: {
                    if viewModel.quickQuestionProvider == provider {
                        Label(provider.displayName, systemImage: "checkmark")
                    } else {
                        Text(provider.displayName)
                    }
                }
            }
        } label: {
            QuickQuestionPickerPill(title: viewModel.quickQuestionProvider.displayName)
        }
        .frame(maxWidth: .infinity)
        .disabled(viewModel.isQuickQuestionSending || viewModel.isConvertingQuickQuestion)
        .accessibilityLabel("Quick question provider")
    }

    private var modelMenu: some View {
        Menu {
            if viewModel.quickQuestionProviderModelOptions.isEmpty {
                Text("No models")
            } else {
                ForEach(viewModel.quickQuestionProviderModelOptions, id: \.self) { model in
                    Button {
                        viewModel.setQuickQuestionModel(model)
                    } label: {
                        if viewModel.quickQuestionModel == model {
                            Label(model, systemImage: "checkmark")
                        } else {
                            Text(model)
                        }
                    }
                }
            }
        } label: {
            QuickQuestionPickerPill(title: viewModel.quickQuestionModel.isEmpty ? "No model" : viewModel.quickQuestionModel)
        }
        .frame(maxWidth: .infinity)
        .disabled(viewModel.isQuickQuestionSending || viewModel.isConvertingQuickQuestion)
        .accessibilityLabel("Quick question model")
    }

    private func submitQuestion() {
        guard viewModel.canSendQuickQuestion else {
            return
        }

        promptFocused = false
        _ = viewModel.sendQuickQuestion()
    }
}

private struct QuickQuestionPickerPill: View {
    var title: String

    var body: some View {
        HStack(spacing: 8) {
            Text(title)
                .font(.caption.weight(.medium))
                .foregroundStyle(SybilTheme.text)
                .lineLimit(1)
                .minimumScaleFactor(0.8)

            Image(systemName: "chevron.down")
                .font(.caption.weight(.semibold))
                .foregroundStyle(SybilTheme.textMuted)
        }
        .padding(.horizontal, 10)
        .frame(maxWidth: .infinity, minHeight: 40)
        .background(
            RoundedRectangle(cornerRadius: 12)
                .fill(SybilTheme.surfaceStrong.opacity(0.78))
                .overlay(
                    RoundedRectangle(cornerRadius: 12)
                        .stroke(SybilTheme.border.opacity(0.78), lineWidth: 1)
                )
        )
    }
}

private struct QuickQuestionMessageView: View {
    var message: Message
    var isSending: Bool

    private var isPendingAssistant: Bool {
        message.id.hasPrefix("temp-assistant-quick-") &&
            isSending &&
            message.content.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty
    }

    var body: some View {
        if let metadata = message.toolCallMetadata {
            Text(toolCallSummary(for: metadata, fallbackContent: message.content))
                .font(.caption)
                .foregroundStyle(SybilTheme.textMuted)
                .fixedSize(horizontal: false, vertical: true)
        } else if isPendingAssistant {
            HStack(spacing: 8) {
                ProgressView()
                    .controlSize(.small)
                    .tint(SybilTheme.primary)
                Text("Thinking...")
                    .font(.caption)
                    .foregroundStyle(SybilTheme.textMuted)
            }
        } else if !message.content.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty {
            Markdown(message.content)
                .font(.body)
                .tint(SybilTheme.primary)
                .foregroundStyle(SybilTheme.text.opacity(0.96))
                .textSelection(.enabled)
        }
    }

    private func toolCallSummary(for metadata: ToolCallMetadata, fallbackContent: String) -> String {
        if let summary = metadata.summary?.trimmingCharacters(in: .whitespacesAndNewlines), !summary.isEmpty {
            return summary
        }
        if !fallbackContent.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty {
            return fallbackContent
        }
        return "Ran \(metadata.toolName ?? "tool")."
    }
}
@@ -11,6 +11,12 @@ final class SybilSettingsStore {
        static let preferredOpenAIModel = "sybil.ios.preferredOpenAIModel"
        static let preferredAnthropicModel = "sybil.ios.preferredAnthropicModel"
        static let preferredXAIModel = "sybil.ios.preferredXAIModel"
        static let preferredHermesAgentModel = "sybil.ios.preferredHermesAgentModel"
        static let quickQuestionPreferredProvider = "sybil.ios.quickQuestionPreferredProvider"
        static let quickQuestionPreferredOpenAIModel = "sybil.ios.quickQuestionPreferredOpenAIModel"
        static let quickQuestionPreferredAnthropicModel = "sybil.ios.quickQuestionPreferredAnthropicModel"
        static let quickQuestionPreferredXAIModel = "sybil.ios.quickQuestionPreferredXAIModel"
        static let quickQuestionPreferredHermesAgentModel = "sybil.ios.quickQuestionPreferredHermesAgentModel"
    }

    private let defaults: UserDefaults
@@ -19,6 +25,8 @@ final class SybilSettingsStore {
    var adminToken: String
    var preferredProvider: Provider
    var preferredModelByProvider: [Provider: String]
    var quickQuestionPreferredProvider: Provider
    var quickQuestionPreferredModelByProvider: [Provider: String]

    init(defaults: UserDefaults = .standard) {
        self.defaults = defaults
@@ -32,10 +40,21 @@ final class SybilSettingsStore {
        let provider = defaults.string(forKey: Keys.preferredProvider).flatMap(Provider.init(rawValue:)) ?? .openai
        self.preferredProvider = provider

-       self.preferredModelByProvider = [
+       let preferredModels: [Provider: String] = [
            .openai: defaults.string(forKey: Keys.preferredOpenAIModel) ?? "gpt-4.1-mini",
            .anthropic: defaults.string(forKey: Keys.preferredAnthropicModel) ?? "claude-3-5-sonnet-latest",
-           .xai: defaults.string(forKey: Keys.preferredXAIModel) ?? "grok-3-mini"
+           .xai: defaults.string(forKey: Keys.preferredXAIModel) ?? "grok-3-mini",
+           .hermesAgent: defaults.string(forKey: Keys.preferredHermesAgentModel) ?? "hermes-agent"
        ]
+       self.preferredModelByProvider = preferredModels
+
+       self.quickQuestionPreferredProvider =
+           defaults.string(forKey: Keys.quickQuestionPreferredProvider).flatMap(Provider.init(rawValue:)) ?? provider
+       self.quickQuestionPreferredModelByProvider = [
+           .openai: defaults.string(forKey: Keys.quickQuestionPreferredOpenAIModel) ?? preferredModels[.openai] ?? "gpt-4.1-mini",
+           .anthropic: defaults.string(forKey: Keys.quickQuestionPreferredAnthropicModel) ?? preferredModels[.anthropic] ?? "claude-3-5-sonnet-latest",
+           .xai: defaults.string(forKey: Keys.quickQuestionPreferredXAIModel) ?? preferredModels[.xai] ?? "grok-3-mini",
+           .hermesAgent: defaults.string(forKey: Keys.quickQuestionPreferredHermesAgentModel) ?? preferredModels[.hermesAgent] ?? "hermes-agent"
+       ]
    }
@@ -53,6 +72,13 @@ final class SybilSettingsStore {
        defaults.set(preferredModelByProvider[.openai], forKey: Keys.preferredOpenAIModel)
        defaults.set(preferredModelByProvider[.anthropic], forKey: Keys.preferredAnthropicModel)
        defaults.set(preferredModelByProvider[.xai], forKey: Keys.preferredXAIModel)
        defaults.set(preferredModelByProvider[.hermesAgent], forKey: Keys.preferredHermesAgentModel)

        defaults.set(quickQuestionPreferredProvider.rawValue, forKey: Keys.quickQuestionPreferredProvider)
        defaults.set(quickQuestionPreferredModelByProvider[.openai], forKey: Keys.quickQuestionPreferredOpenAIModel)
        defaults.set(quickQuestionPreferredModelByProvider[.anthropic], forKey: Keys.quickQuestionPreferredAnthropicModel)
        defaults.set(quickQuestionPreferredModelByProvider[.xai], forKey: Keys.quickQuestionPreferredXAIModel)
        defaults.set(quickQuestionPreferredModelByProvider[.hermesAgent], forKey: Keys.quickQuestionPreferredHermesAgentModel)
    }

    var trimmedTokenOrNil: String? {
@@ -68,7 +94,7 @@ final class SybilSettingsStore {
            raw.removeLast()
        }

-       guard var components = URLComponents(string: raw) else {
+       guard let components = URLComponents(string: raw) else {
            return nil
        }
@@ -4,13 +4,6 @@ import SwiftUI
struct SybilSidebarView: View {
    @Bindable var viewModel: SybilViewModel

    private func iconName(for item: SidebarItem) -> String {
        switch item.kind {
        case .chat: return "message"
        case .search: return "globe"
        }
    }

    private func isSelected(_ item: SidebarItem) -> Bool {
        viewModel.draftKind == nil && viewModel.selectedItem == item.selection
    }
@@ -57,105 +50,13 @@ struct SybilSidebarView: View {
            .overlay(SybilTheme.border)
        }

-       if viewModel.isLoadingCollections && viewModel.sidebarItems.isEmpty {
-           VStack(alignment: .leading, spacing: 8) {
-               ProgressView()
-                   .tint(SybilTheme.primary)
-               Text("Loading conversations…")
-                   .font(.sybil(.footnote))
-                   .foregroundStyle(SybilTheme.textMuted)
-           }
+       SybilSidebarItemList(
+           viewModel: viewModel,
+           isSelected: isSelected,
+           onSelect: { item in
+               viewModel.select(item.selection)
+           }
+       )
-           .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
-           .padding(16)
-       } else if viewModel.sidebarItems.isEmpty {
-           VStack(spacing: 10) {
-               Image(systemName: "message.badge")
-                   .font(.system(size: 20, weight: .medium))
-                   .foregroundStyle(SybilTheme.textMuted)
-               Text("Start a chat or run your first search.")
-                   .font(.sybil(.footnote))
-                   .multilineTextAlignment(.center)
-                   .foregroundStyle(SybilTheme.textMuted)
-           }
-           .frame(maxWidth: .infinity, maxHeight: .infinity)
-           .padding(16)
-       } else {
-           ScrollView {
-               LazyVStack(alignment: .leading, spacing: 8) {
-                   ForEach(viewModel.sidebarItems) { item in
-                       Button {
-                           viewModel.select(item.selection)
-                       } label: {
-                           VStack(alignment: .leading, spacing: 6) {
-                               HStack(spacing: 8) {
-                                   Image(systemName: iconName(for: item))
-                                       .font(.system(size: 12, weight: .semibold))
-                                       .foregroundStyle(isSelected(item) ? SybilTheme.accent : SybilTheme.textMuted)
-                                       .frame(width: 22, height: 22)
-                                       .background(
-                                           RoundedRectangle(cornerRadius: 7)
-                                               .fill(isSelected(item) ? SybilTheme.accent.opacity(0.12) : SybilTheme.surface.opacity(0.72))
-                                               .overlay(
-                                                   RoundedRectangle(cornerRadius: 7)
-                                                       .stroke(isSelected(item) ? SybilTheme.accent.opacity(0.36) : SybilTheme.border.opacity(0.72), lineWidth: 1)
-                                               )
-                                       )
-
-                                   Text(item.title)
-                                       .font(.sybil(.subheadline, weight: .semibold))
-                                       .lineLimit(1)
-                               }
-
-                               HStack(spacing: 8) {
-                                   Text(item.updatedAt.sybilRelativeLabel)
-                                       .font(.sybil(.caption2))
-                                       .foregroundStyle(SybilTheme.textMuted)
-
-                                   if let initiated = item.initiatedLabel {
-                                       Spacer(minLength: 0)
-                                       Text(initiated)
-                                           .font(.sybil(.caption2))
-                                           .foregroundStyle(SybilTheme.textMuted.opacity(0.88))
-                                           .lineLimit(1)
-                                           .multilineTextAlignment(.trailing)
-                                           .frame(maxWidth: .infinity, alignment: .trailing)
-                                   }
-                               }
-                           }
-                           .foregroundStyle(SybilTheme.text)
-                           .padding(.horizontal, 12)
-                           .padding(.vertical, 10)
-                           .frame(maxWidth: .infinity, alignment: .leading)
-                           .background(
-                               RoundedRectangle(cornerRadius: 12)
-                                   .fill(isSelected(item) ? SybilTheme.selectedRowGradient : LinearGradient(colors: [SybilTheme.surface.opacity(0.56), SybilTheme.surface.opacity(0.36)], startPoint: .topLeading, endPoint: .bottomTrailing))
-                           )
-                           .overlay(
-                               RoundedRectangle(cornerRadius: 12)
-                                   .stroke(isSelected(item) ? SybilTheme.primary.opacity(0.55) : SybilTheme.border.opacity(0.72), lineWidth: 1)
-                           )
-                       }
-                       .buttonStyle(.plain)
-                       .contextMenu {
-                           Button(role: .destructive) {
-                               Task {
-                                   await viewModel.deleteItem(item.selection)
-                               }
-                           } label: {
-                               Label("Delete", systemImage: "trash")
-                           }
-                       }
-                   }
-               }
-               .padding(10)
-           }
-           .refreshable {
-               await viewModel.refreshVisibleContent(
-                   refreshCollections: true,
-                   refreshSelection: false
-               )
-           }
-       }

        }
        .background(SybilTheme.panelGradient)
@@ -205,3 +106,148 @@ struct SybilSidebarView: View {
    .buttonStyle(.plain)
    }
}

struct SybilSidebarItemList: View {
    @Bindable var viewModel: SybilViewModel
    var isSelected: (SidebarItem) -> Bool
    var onSelect: (SidebarItem) -> Void

    var body: some View {
        if viewModel.isLoadingCollections && viewModel.sidebarItems.isEmpty {
            VStack(alignment: .leading, spacing: 8) {
                ProgressView()
                    .tint(SybilTheme.primary)
                Text("Loading conversations…")
                    .font(.sybil(.footnote))
                    .foregroundStyle(SybilTheme.textMuted)
            }
            .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
            .padding(16)
        } else if viewModel.sidebarItems.isEmpty {
            VStack(spacing: 10) {
                Image(systemName: "message.badge")
                    .font(.system(size: 20, weight: .medium))
                    .foregroundStyle(SybilTheme.textMuted)
                Text("Start a chat or run your first search.")
                    .font(.sybil(.footnote))
                    .multilineTextAlignment(.center)
                    .foregroundStyle(SybilTheme.textMuted)
            }
            .frame(maxWidth: .infinity, maxHeight: .infinity)
            .padding(16)
        } else {
            ScrollView {
                LazyVStack(alignment: .leading, spacing: 8) {
                    ForEach(viewModel.sidebarItems) { item in
                        Button {
                            onSelect(item)
                        } label: {
                            SybilSidebarRow(item: item, isSelected: isSelected(item))
                        }
                        .buttonStyle(.plain)
                        .contextMenu {
                            Button(role: .destructive) {
                                Task {
                                    await viewModel.deleteItem(item.selection)
                                }
                            } label: {
                                Label("Delete", systemImage: "trash")
                            }
                        }
                    }
                }
                .padding(10)
            }
            .refreshable {
                await viewModel.refreshSidebarCollectionsFromPullToRefresh()
            }
        }
    }
}

struct SybilSidebarRow: View {
    var item: SidebarItem
    var isSelected: Bool

    private var isHighlighted: Bool {
        isSelected
    }

    private var iconName: String {
        switch item.kind {
        case .chat: return "message"
        case .search: return "globe"
        }
    }

    var body: some View {
        VStack(alignment: .leading, spacing: 6) {
            HStack(spacing: 8) {
                Image(systemName: iconName)
                    .font(.system(size: 12, weight: .semibold))
                    .foregroundStyle(isHighlighted ? SybilTheme.accent : SybilTheme.textMuted)
                    .frame(width: 22, height: 22)
                    .background(
                        RoundedRectangle(cornerRadius: 7)
                            .fill(isHighlighted ? SybilTheme.accent.opacity(0.12) : SybilTheme.surface.opacity(0.72))
                            .overlay(
                                RoundedRectangle(cornerRadius: 7)
                                    .stroke(isHighlighted ? SybilTheme.accent.opacity(0.36) : SybilTheme.border.opacity(0.72), lineWidth: 1)
                            )
                    )

                Text(item.title)
                    .font(.sybil(.subheadline, weight: .semibold))
                    .lineLimit(1)
                    .layoutPriority(1)

                Spacer(minLength: 8)

                if item.isRunning {
                    SybilSidebarActivityIndicator()
                }
            }

            HStack(spacing: 8) {
                Text(item.updatedAt.sybilRelativeLabel)
                    .font(.sybil(.caption2))
                    .foregroundStyle(SybilTheme.textMuted)

                if let initiated = item.initiatedLabel {
                    Spacer(minLength: 0)
                    Text(initiated)
                        .font(.sybil(.caption2))
                        .foregroundStyle(SybilTheme.textMuted.opacity(0.88))
                        .lineLimit(1)
                        .multilineTextAlignment(.trailing)
                        .frame(maxWidth: .infinity, alignment: .trailing)
                }
            }
        }
        .foregroundStyle(SybilTheme.text)
        .padding(.horizontal, 12)
        .padding(.vertical, 10)
        .frame(maxWidth: .infinity, alignment: .leading)
        .background(
            RoundedRectangle(cornerRadius: 12)
                .fill(isHighlighted ? SybilTheme.selectedRowGradient : LinearGradient(colors: [SybilTheme.surface.opacity(0.56), SybilTheme.surface.opacity(0.36)], startPoint: .topLeading, endPoint: .bottomTrailing))
        )
        .overlay(
            RoundedRectangle(cornerRadius: 12)
                .stroke(isHighlighted ? SybilTheme.primary.opacity(0.55) : SybilTheme.border.opacity(0.72), lineWidth: 1)
        )
        .contentShape(RoundedRectangle(cornerRadius: 12))
    }
}

struct SybilSidebarActivityIndicator: View {
    var body: some View {
        ProgressView()
            .progressViewStyle(.circular)
            .controlSize(.small)
            .tint(SybilTheme.accent)
            .scaleEffect(0.82)
            .frame(width: 16, height: 16)
            .accessibilityLabel("Generating")
    }
}
File diff suppressed because it is too large
@@ -50,7 +50,7 @@ struct SybilWorkspaceView: View {
    }

    private var showsCustomWorkspaceNavigation: Bool {
-       usesCustomWorkspaceNavigation && (!isSettingsSelected || navigationLeadingControl == .back)
+       usesCustomWorkspaceNavigation && (!isSettingsSelected || navigationLeadingControl != .hidden)
    }

    private var transcriptScrollContextID: String {
@@ -75,7 +75,7 @@ struct SybilWorkspaceView: View {
        guard onRequestNewChat != nil else {
            return false
        }
-       guard !viewModel.isSending, viewModel.draftKind == nil else {
+       guard !viewModel.isActiveSelectionSending, viewModel.draftKind == nil else {
            return false
        }
        guard case .chat = viewModel.selectedItem else {
@@ -155,7 +155,7 @@ struct SybilWorkspaceView: View {
            workspaceContentStack

            if showsCustomWorkspaceNavigation {
-               SybilWorkspaceCharacterBackdrop(isBusy: viewModel.isSending)
+               SybilWorkspaceCharacterBackdrop(isBusy: viewModel.isActiveSelectionSending)
                    .allowsHitTesting(false)
                customWorkspaceNavigationBar
            }
@@ -232,13 +232,7 @@ struct SybilWorkspaceView: View {
        HStack(spacing: 14) {
            workspaceNavigationLeadingControl

-           Text(viewModel.selectedTitle)
-               .font(.sybil(size: 16, weight: .semibold))
-               .foregroundStyle(SybilTheme.text)
-               .lineLimit(1)
-               .minimumScaleFactor(0.78)
-               .frame(maxWidth: .infinity, alignment: .leading)
-               .multilineTextAlignment(.leading)
+           customWorkspaceNavigationTitle

            workspaceNavigationTrailingControl
        }
@@ -251,6 +245,32 @@ struct SybilWorkspaceView: View {
        }
    }

    private var selectedProviderModelSubtitle: String {
        let selectedModel = viewModel.model.trimmingCharacters(in: .whitespacesAndNewlines)
        guard !selectedModel.isEmpty else {
            return viewModel.provider.displayName
        }
        return "\(viewModel.provider.displayName) • \(selectedModel)"
    }

    private var customWorkspaceNavigationTitle: some View {
        VStack(alignment: .leading, spacing: 2) {
            Text(viewModel.selectedTitle)
                .font(.sybil(size: 16, weight: .semibold))
                .foregroundStyle(SybilTheme.text)
                .lineLimit(1)
                .minimumScaleFactor(0.78)

            Text(selectedProviderModelSubtitle)
                .font(.sybil(size: 10, weight: .medium))
                .foregroundStyle(SybilTheme.textMuted)
                .lineLimit(1)
                .minimumScaleFactor(0.82)
        }
        .frame(maxWidth: .infinity, alignment: .leading)
        .multilineTextAlignment(.leading)
    }

    @ViewBuilder
    private var workspaceNavigationLeadingControl: some View {
        switch navigationLeadingControl {
@@ -495,7 +515,7 @@ struct SybilWorkspaceView: View {

            Divider()

-           ForEach(Provider.allCases, id: \.self) { candidate in
+           ForEach(viewModel.providerOptions, id: \.self) { candidate in
                Menu(candidate.displayName) {
                    let models = viewModel.modelOptions(for: candidate)
                    if models.isEmpty {
@@ -560,10 +580,10 @@ struct SybilWorkspaceView: View {
                    Circle()
                        .stroke(SybilTheme.border.opacity(0.82), lineWidth: 1)
                )
-               .foregroundStyle(viewModel.isSending ? SybilTheme.textMuted : SybilTheme.text)
+               .foregroundStyle(viewModel.isActiveSelectionSending ? SybilTheme.textMuted : SybilTheme.text)
        }
        .buttonStyle(.plain)
-       .disabled(viewModel.isSending)
+       .disabled(viewModel.isActiveSelectionSending)
        .accessibilityLabel("Attach file")
    }
@@ -626,7 +646,7 @@ struct SybilWorkspaceView: View {
            }
        }
        .onDrop(of: [UTType.fileURL.identifier, UTType.image.identifier], isTargeted: $isComposerDropTargeted) { providers in
-           if viewModel.isSearchMode || viewModel.isSending {
+           if viewModel.isSearchMode || viewModel.isActiveSelectionSending {
                return false
            }
@@ -703,9 +723,7 @@ struct SybilWorkspaceView: View {
        }

        #if !targetEnvironment(macCatalyst)
-       if !viewModel.isSearchMode {
-           composerFocused = false
-       }
+       composerFocused = false
        #endif

        Task {
@@ -6,8 +6,20 @@ import Testing
private struct MockClientCallSnapshot: Sendable {
    var listChats = 0
    var listSearches = 0
    var createChat = 0
    var getChat = 0
    var getSearch = 0
    var getActiveRuns = 0
    var runCompletionStream = 0
    var attachCompletionStream = 0
    var attachSearchStream = 0
}

private struct ChatCreateCallSnapshot: Sendable {
    var title: String?
    var provider: Provider?
    var model: String?
    var messages: [CompletionRequestMessage]?
}

private struct UnexpectedClientCall: Error {}
@@ -18,38 +30,68 @@ private actor MockSybilClient: SybilAPIClienting {
    private let chatDetails: [String: ChatDetail]
    private let searchDetails: [String: SearchDetail]
    private let createChatResponse: ChatSummary?
    private let activeRunsResponse: ActiveRunsResponse

    private var snapshot = MockClientCallSnapshot()
    private var lastCreateChatCall: ChatCreateCallSnapshot?
    private var lastCompletionStreamBody: CompletionStreamRequest?
    private var completionStreamEvents: [CompletionStreamEvent]?
    private var listChatsDelayNanoseconds: UInt64 = 0
    private var listSearchesDelayNanoseconds: UInt64 = 0
    private var getChatDelayNanoseconds: UInt64 = 0
    private var getSearchDelayNanoseconds: UInt64 = 0
    private var completionStreamNetworkErrorMessage: String?
    private var completionStreamDelayNanoseconds: UInt64 = 0
    private var completionAttachEvents: [String: [CompletionStreamEvent]] = [:]
    private var completionAttachDelayNanoseconds: UInt64 = 0
    private var searchStreamNetworkErrorMessage: String?
    private var searchStreamDelayNanoseconds: UInt64 = 0
    private var searchAttachEvents: [String: [SearchStreamEvent]] = [:]
    private var searchAttachDelayNanoseconds: UInt64 = 0

    init(
        chatsResponse: [ChatSummary] = [],
        searchesResponse: [SearchSummary] = [],
        chatDetails: [String: ChatDetail] = [:],
        searchDetails: [String: SearchDetail] = [:],
-       createChatResponse: ChatSummary? = nil
+       createChatResponse: ChatSummary? = nil,
+       activeRunsResponse: ActiveRunsResponse = ActiveRunsResponse()
    ) {
        self.chatsResponse = chatsResponse
        self.searchesResponse = searchesResponse
        self.chatDetails = chatDetails
        self.searchDetails = searchDetails
        self.createChatResponse = createChatResponse
        self.activeRunsResponse = activeRunsResponse
    }

    func currentSnapshot() -> MockClientCallSnapshot {
        snapshot
    }

    func currentCreateChatCall() -> ChatCreateCallSnapshot? {
        lastCreateChatCall
    }

    func currentCompletionStreamBody() -> CompletionStreamRequest? {
        lastCompletionStreamBody
    }

    func setCompletionStreamEvents(_ events: [CompletionStreamEvent], delayNanoseconds: UInt64 = 0) {
        completionStreamEvents = events
        completionStreamDelayNanoseconds = delayNanoseconds
    }

    func setCompletionStreamNetworkError(_ message: String, delayNanoseconds: UInt64 = 0) {
        completionStreamNetworkErrorMessage = message
        completionStreamDelayNanoseconds = delayNanoseconds
    }

    func setListDelays(chats: UInt64 = 0, searches: UInt64 = 0) {
        listChatsDelayNanoseconds = chats
        listSearchesDelayNanoseconds = searches
    }

    func setGetChatDelay(_ delayNanoseconds: UInt64) {
        getChatDelayNanoseconds = delayNanoseconds
    }
@@ -63,16 +105,49 @@ private actor MockSybilClient: SybilAPIClienting {
        searchStreamDelayNanoseconds = delayNanoseconds
    }

    func setCompletionAttachEvents(
        chatID: String,
        events: [CompletionStreamEvent],
        delayNanoseconds: UInt64 = 0
    ) {
        completionAttachEvents[chatID] = events
        completionAttachDelayNanoseconds = delayNanoseconds
    }

    func setSearchAttachEvents(
        searchID: String,
        events: [SearchStreamEvent],
        delayNanoseconds: UInt64 = 0
    ) {
        searchAttachEvents[searchID] = events
        searchAttachDelayNanoseconds = delayNanoseconds
    }

    func verifySession() async throws -> AuthSession {
        AuthSession(authenticated: true, mode: "open")
    }

    func listChats() async throws -> [ChatSummary] {
        snapshot.listChats += 1
        if listChatsDelayNanoseconds > 0 {
            try await Task.sleep(nanoseconds: listChatsDelayNanoseconds)
        }
        return chatsResponse
    }

-   func createChat(title: String?) async throws -> ChatSummary {
+   func createChat(
+       title: String?,
+       provider: Provider?,
+       model: String?,
+       messages: [CompletionRequestMessage]?
+   ) async throws -> ChatSummary {
        snapshot.createChat += 1
        lastCreateChatCall = ChatCreateCallSnapshot(
            title: title,
            provider: provider,
            model: model,
            messages: messages
        )
        if let createChatResponse {
            return createChatResponse
        }
@@ -100,6 +175,9 @@ private actor MockSybilClient: SybilAPIClienting {

    func listSearches() async throws -> [SearchSummary] {
        snapshot.listSearches += 1
        if listSearchesDelayNanoseconds > 0 {
            try await Task.sleep(nanoseconds: listSearchesDelayNanoseconds)
        }
        return searchesResponse
    }
@@ -130,19 +208,46 @@ private actor MockSybilClient: SybilAPIClienting {
        ModelCatalogResponse(providers: [:])
    }

    func getActiveRuns() async throws -> ActiveRunsResponse {
        snapshot.getActiveRuns += 1
        return activeRunsResponse
    }

    func runCompletionStream(
        body: CompletionStreamRequest,
        onEvent: @escaping @Sendable (CompletionStreamEvent) async -> Void
    ) async throws {
        snapshot.runCompletionStream += 1
        lastCompletionStreamBody = body
        if completionStreamDelayNanoseconds > 0 {
            try await Task.sleep(nanoseconds: completionStreamDelayNanoseconds)
        }
        if let completionStreamNetworkErrorMessage {
            throw APIError.networkError(message: completionStreamNetworkErrorMessage)
        }
        if let completionStreamEvents {
            for event in completionStreamEvents {
                await onEvent(event)
            }
            return
        }
        throw UnexpectedClientCall()
    }

    func attachCompletionStream(
        chatID: String,
        onEvent: @escaping @Sendable (CompletionStreamEvent) async -> Void
    ) async throws {
        snapshot.attachCompletionStream += 1
        let events = completionAttachEvents[chatID] ?? []
        for event in events {
            await onEvent(event)
        }
        if completionAttachDelayNanoseconds > 0 {
            try await Task.sleep(nanoseconds: completionAttachDelayNanoseconds)
        }
    }

    func runSearchStream(
        searchID: String,
        body: SearchRunRequest,
@@ -156,6 +261,20 @@ private actor MockSybilClient: SybilAPIClienting {
|
||||
}
|
||||
throw UnexpectedClientCall()
|
||||
}
|
||||
|
||||
func attachSearchStream(
|
||||
searchID: String,
|
||||
onEvent: @escaping @Sendable (SearchStreamEvent) async -> Void
|
||||
) async throws {
|
||||
snapshot.attachSearchStream += 1
|
||||
let events = searchAttachEvents[searchID] ?? []
|
||||
for event in events {
|
||||
await onEvent(event)
|
||||
}
|
||||
if searchAttachDelayNanoseconds > 0 {
|
||||
try await Task.sleep(nanoseconds: searchAttachDelayNanoseconds)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
@MainActor
@@ -277,6 +396,33 @@ private func makeSearchDetail(id: String, date: Date, answer: String) -> SearchD
    #expect(viewModel.selectedItem == .chat("chat-1"))
}

@MainActor
@Test func pullToRefreshCompletesWhenRefreshableTaskIsCancelled() async throws {
    let date = Date(timeIntervalSince1970: 1_700_000_050)
    let chat = makeChatSummary(id: "chat-cancelled", date: date)
    let search = makeSearchSummary(id: "search-cancelled", date: date)
    let client = MockSybilClient(
        chatsResponse: [chat],
        searchesResponse: [search]
    )
    await client.setListDelays(chats: 50_000_000, searches: 50_000_000)
    let viewModel = SybilViewModel(settings: testSettings(named: #function)) { _ in client }
    viewModel.isAuthenticated = true
    viewModel.isCheckingSession = false

    let refreshTask = Task {
        await viewModel.refreshSidebarCollectionsFromPullToRefresh()
    }
    try await Task.sleep(nanoseconds: 10_000_000)
    refreshTask.cancel()
    await refreshTask.value

    #expect(viewModel.errorMessage == nil)
    #expect(!viewModel.isLoadingCollections)
    #expect(viewModel.chats.map(\.id) == ["chat-cancelled"])
    #expect(viewModel.searches.map(\.id) == ["search-cancelled"])
}

@MainActor
@Test func foregroundChatRefreshReloadsSelectedTranscript() async throws {
    let date = Date(timeIntervalSince1970: 1_700_000_100)
@@ -409,6 +555,170 @@ private func makeSearchDetail(id: String, date: Date, answer: String) -> SearchD
    await sendTask.value
}

@MainActor
@Test func quickQuestionRunsNonPersistentCompletionStream() async throws {
    let client = MockSybilClient()
    await client.setCompletionStreamEvents([
        .delta(CompletionStreamDelta(text: "Reset it from ")),
        .done(CompletionStreamDone(text: "Reset it from Settings."))
    ])
    let viewModel = SybilViewModel(settings: testSettings(named: #function)) { _ in client }
    viewModel.isAuthenticated = true
    viewModel.isCheckingSession = false
    viewModel.quickQuestionPrompt = "How do I reset my password?"

    let task = viewModel.sendQuickQuestion()
    await task?.value

    let snapshot = await client.currentSnapshot()
    let body = await client.currentCompletionStreamBody()
    #expect(snapshot.runCompletionStream == 1)
    #expect(body?.persist == false)
    #expect(body?.chatId == nil)
    #expect(body?.provider == .openai)
    #expect(body?.messages.first?.role == .user)
    #expect(body?.messages.first?.content == "How do I reset my password?")
    #expect(viewModel.quickQuestionAnswerText == "Reset it from Settings.")
    #expect(!viewModel.isQuickQuestionSending)
}

@MainActor
@Test func quickQuestionConvertCreatesSeededChat() async throws {
    let date = Date(timeIntervalSince1970: 1_700_000_250)
    let chat = makeChatSummary(id: "quick-chat", date: date)
    let detail = ChatDetail(
        id: chat.id,
        title: chat.title,
        createdAt: chat.createdAt,
        updatedAt: chat.updatedAt,
        initiatedProvider: .openai,
        initiatedModel: "gpt-4.1-mini",
        lastUsedProvider: .openai,
        lastUsedModel: "gpt-4.1-mini",
        messages: [
            Message(id: "quick-user", createdAt: date, role: .user, content: "How do I reset my password?", name: nil),
            Message(id: "quick-assistant", createdAt: date, role: .assistant, content: "Reset it from Settings.", name: nil)
        ]
    )
    let client = MockSybilClient(
        chatsResponse: [chat],
        chatDetails: [chat.id: detail],
        createChatResponse: chat
    )
    let viewModel = SybilViewModel(settings: testSettings(named: #function)) { _ in client }
    viewModel.isAuthenticated = true
    viewModel.isCheckingSession = false
    viewModel.quickQuestionSubmittedPrompt = "How do I reset my password?"
    viewModel.quickQuestionSubmittedProvider = .openai
    viewModel.quickQuestionSubmittedModel = "gpt-4.1-mini"
    viewModel.quickQuestionMessages = [
        Message(
            id: "temp-assistant-quick",
            createdAt: date,
            role: .assistant,
            content: "Reset it from Settings.",
            name: nil
        )
    ]

    let didConvert = await viewModel.convertQuickQuestionToChat()

    let snapshot = await client.currentSnapshot()
    let createCall = await client.currentCreateChatCall()
    #expect(didConvert)
    #expect(snapshot.createChat == 1)
    #expect(createCall?.title == "How do I reset my password?")
    #expect(createCall?.provider == .openai)
    #expect(createCall?.model == "gpt-4.1-mini")
    #expect(createCall?.messages?.map(\.role) == [.user, .assistant])
    #expect(createCall?.messages?.map(\.content) == ["How do I reset my password?", "Reset it from Settings."])
    #expect(viewModel.selectedItem == .chat("quick-chat"))
    #expect(viewModel.quickQuestionPrompt.isEmpty)
}

@MainActor
@Test func quickQuestionProviderAndModelSelectionPersistSeparately() async throws {
    let defaults = UserDefaults(suiteName: #function)!
    defaults.removePersistentDomain(forName: #function)
    let settings = SybilSettingsStore(defaults: defaults)
    settings.apiBaseURL = "http://127.0.0.1:8787"
    let viewModel = SybilViewModel(settings: settings) { _ in MockSybilClient() }
    viewModel.modelCatalog = [
        .openai: ProviderModelInfo(models: ["gpt-4.1-mini", "gpt-4o"], loadedAt: nil, error: nil),
        .anthropic: ProviderModelInfo(models: ["claude-3-5-sonnet-latest", "claude-3-haiku"], loadedAt: nil, error: nil)
    ]

    viewModel.setQuickQuestionProvider(.anthropic)
    viewModel.setQuickQuestionModel("claude-3-haiku")

    #expect(viewModel.quickQuestionProvider == .anthropic)
    #expect(viewModel.quickQuestionModel == "claude-3-haiku")
    #expect(settings.preferredProvider == .openai)

    let reloadedSettings = SybilSettingsStore(defaults: defaults)
    #expect(reloadedSettings.quickQuestionPreferredProvider == .anthropic)
    #expect(reloadedSettings.quickQuestionPreferredModelByProvider[.anthropic] == "claude-3-haiku")
    #expect(reloadedSettings.preferredProvider == .openai)

    let reloadedViewModel = SybilViewModel(settings: reloadedSettings) { _ in MockSybilClient() }
    #expect(reloadedViewModel.quickQuestionProvider == .anthropic)
    #expect(reloadedViewModel.quickQuestionModel == "claude-3-haiku")
    #expect(reloadedViewModel.provider == .openai)
}

@MainActor
@Test func reconnectAttachesSelectedActiveChatStream() async throws {
    let date = Date(timeIntervalSince1970: 1_700_000_260)
    let chat = makeChatSummary(id: "chat-active", date: date)
    let detail = makeChatDetail(id: "chat-active", date: date, body: "existing transcript")
    let client = MockSybilClient(
        chatsResponse: [chat],
        chatDetails: ["chat-active": detail],
        activeRunsResponse: ActiveRunsResponse(chats: ["chat-active"])
    )
    await client.setCompletionAttachEvents(
        chatID: "chat-active",
        events: [.delta(CompletionStreamDelta(text: "streaming"))],
        delayNanoseconds: 100_000_000
    )
    let viewModel = SybilViewModel(settings: testSettings(named: #function)) { _ in client }

    await viewModel.reconnect()
    try await Task.sleep(nanoseconds: 20_000_000)

    let snapshot = await client.currentSnapshot()
    #expect(snapshot.getActiveRuns >= 1)
    #expect(snapshot.attachCompletionStream == 1)
    #expect(viewModel.sidebarItems.first?.isRunning == true)
    #expect(viewModel.isSendingVisibleChat)
    #expect(viewModel.displayedMessages.last?.content == "streaming")
}

@MainActor
@Test func activeRunOnDifferentChatDoesNotDisableComposer() async throws {
    let date = Date(timeIntervalSince1970: 1_700_000_270)
    let activeChat = makeChatSummary(id: "chat-active", date: date)
    let idleChat = makeChatSummary(id: "chat-idle", date: date.addingTimeInterval(1))
    let client = MockSybilClient(
        chatsResponse: [idleChat, activeChat],
        chatDetails: [
            "chat-active": makeChatDetail(id: "chat-active", date: date, body: "active transcript"),
            "chat-idle": makeChatDetail(id: "chat-idle", date: date, body: "idle transcript")
        ],
        activeRunsResponse: ActiveRunsResponse(chats: ["chat-active"])
    )
    let viewModel = SybilViewModel(settings: testSettings(named: #function)) { _ in client }
    viewModel.selectedItem = .chat("chat-idle")
    viewModel.composer = "new message"

    await viewModel.reconnect()

    #expect(viewModel.selectedItem == .chat("chat-idle"))
    #expect(viewModel.sidebarItems.first(where: { $0.selection == .chat("chat-active") })?.isRunning == true)
    #expect(!viewModel.isActiveSelectionSending)
    #expect(viewModel.canSendComposer)
}

@MainActor
@Test func backgroundChatStreamInterruptionIsSuppressedUntilForegroundRefresh() async throws {
    let date = Date(timeIntervalSince1970: 1_700_000_300)

@@ -1,7 +1,7 @@
# Sybil Server

Backend API for:
- LLM multiplexer (OpenAI Responses / Anthropic / xAI Chat Completions-compatible Grok)
- LLM multiplexer (OpenAI Responses / Anthropic / xAI Chat Completions-compatible Grok / Hermes Agent)
- Personal chat database (chats/messages + LLM call log)

## Stack

@@ -43,6 +43,9 @@ If `ADMIN_TOKEN` is not set, the server runs in open mode (dev).
- `OPENAI_API_KEY`
- `ANTHROPIC_API_KEY`
- `XAI_API_KEY`
- `HERMES_AGENT_API_BASE_URL` (`http://127.0.0.1:8642/v1` by default; include the `/v1` suffix)
- `HERMES_AGENT_API_KEY` (enables the Hermes Agent provider; set to Hermes `API_SERVER_KEY`, or any non-empty value if that local server does not require auth)
- `HERMES_AGENT_MODEL` (optional fallback/override model id; defaults client-side to `hermes-agent`)
- `EXA_API_KEY`
- `CHAT_WEB_SEARCH_ENGINE` (`exa` by default, or `searxng` for chat tool calls only)
- `SEARXNG_BASE_URL` (required when `CHAT_WEB_SEARCH_ENGINE=searxng`; instance must allow `format=json`)

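The three Hermes variables above can be collected into a `.env` fragment; the values here are placeholders, not real keys:

```shell
# Hermes Agent provider (placeholder values)
HERMES_AGENT_API_BASE_URL=http://127.0.0.1:8642/v1  # keep the /v1 suffix
HERMES_AGENT_API_KEY=changeme                       # any non-empty value if the local server skips auth
HERMES_AGENT_MODEL=hermes-agent                     # optional fallback/override model id
```

Leaving `HERMES_AGENT_API_KEY` empty disables the provider; it is then excluded from the model catalog entirely.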
@@ -13,6 +13,7 @@ enum Provider {
  openai
  anthropic
  xai
  hermes_agent @map("hermes-agent")
}

enum MessageRole {

59
server/src/active-streams.ts
Normal file
@@ -0,0 +1,59 @@
export type SseStreamEvent = {
  event: string;
  data: unknown;
};

type SseStreamListener = (event: SseStreamEvent) => void;

export class ActiveSseStream {
  private readonly events: SseStreamEvent[] = [];
  private readonly listeners = new Set<SseStreamListener>();
  private completed = false;
  private resolveDone!: () => void;

  readonly done: Promise<void>;

  constructor() {
    this.done = new Promise((resolve) => {
      this.resolveDone = resolve;
    });
  }

  get isCompleted() {
    return this.completed;
  }

  emit(event: string, data: unknown) {
    if (this.completed) return;
    const entry = { event, data };
    this.events.push(entry);
    for (const listener of this.listeners) {
      listener(entry);
    }
  }

  complete(finalEvent?: SseStreamEvent) {
    if (this.completed) return;
    if (finalEvent) {
      this.emit(finalEvent.event, finalEvent.data);
    }
    this.completed = true;
    this.listeners.clear();
    this.resolveDone();
  }

  subscribe(listener: SseStreamListener) {
    for (const event of this.events) {
      listener(event);
    }

    if (this.completed) {
      return () => {};
    }

    this.listeners.add(listener);
    return () => {
      this.listeners.delete(listener);
    };
  }
}
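The new `ActiveSseStream` buffers every emitted event so a client that attaches mid-run is first replayed the backlog, then receives live events. A standalone sketch of that replay-then-live contract (the names `ReplayBuffer` and `Evt` are illustrative, not from the diff):

```typescript
type Evt = { event: string; data: unknown };

class ReplayBuffer {
  private events: Evt[] = [];
  private listeners = new Set<(e: Evt) => void>();
  private completed = false;

  emit(event: string, data: unknown) {
    if (this.completed) return;
    const entry = { event, data };
    this.events.push(entry);          // buffer for late subscribers
    for (const l of this.listeners) l(entry);
  }

  complete() {
    this.completed = true;
    this.listeners.clear();
  }

  subscribe(listener: (e: Evt) => void): () => void {
    for (const e of this.events) listener(e); // replay history first
    if (this.completed) return () => {};      // nothing more will arrive
    this.listeners.add(listener);             // then go live
    return () => this.listeners.delete(listener);
  }
}

const stream = new ReplayBuffer();
stream.emit("completion.delta", { text: "Hel" });

// A subscriber attaching mid-run sees the backlog, then live events.
const seen: string[] = [];
stream.subscribe((e) => seen.push(`${e.event}:${JSON.stringify(e.data)}`));
stream.emit("completion.done", { text: "Hello" });
stream.complete();

console.log(seen.length); // 2: one replayed delta + one live done event
```

This is what lets `GET /v1/active-runs` reconnection work: the route replays the buffered SSE events for a still-running chat before streaming new ones.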
@@ -11,6 +11,13 @@ const OptionalUrlSchema = z.preprocess(
  z.string().trim().url().optional()
);

const DEFAULT_HERMES_AGENT_API_BASE_URL = "http://127.0.0.1:8642/v1";

const HermesAgentApiBaseUrlSchema = z.preprocess(
  (value) => (typeof value === "string" && value.trim() === "" ? undefined : value),
  z.string().trim().url().default(DEFAULT_HERMES_AGENT_API_BASE_URL)
);

const ChatWebSearchEngineSchema = z.preprocess(
  (value) => {
    if (typeof value !== "string") return value;
@@ -59,6 +66,9 @@ const EnvSchema = z.object({
  OPENAI_API_KEY: z.string().optional(),
  ANTHROPIC_API_KEY: z.string().optional(),
  XAI_API_KEY: z.string().optional(),
  HERMES_AGENT_API_BASE_URL: HermesAgentApiBaseUrlSchema,
  HERMES_AGENT_API_KEY: OptionalTrimmedStringSchema,
  HERMES_AGENT_MODEL: OptionalTrimmedStringSchema,
  EXA_API_KEY: z.string().optional(),

  // Chat-mode web_search tool configuration. Search mode remains Exa-only for now.

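The `z.preprocess` step above maps an empty or whitespace-only env value to `undefined` so that zod's `.default()` fires. The effective resolution, simulated without zod (this sketch omits the `.url()` validation the real schema performs):

```typescript
const DEFAULT_HERMES_AGENT_API_BASE_URL = "http://127.0.0.1:8642/v1";

// Mirrors HermesAgentApiBaseUrlSchema: blank input -> default, otherwise trimmed value.
function resolveHermesBaseUrl(raw: string | undefined): string {
  if (typeof raw !== "string" || raw.trim() === "") return DEFAULT_HERMES_AGENT_API_BASE_URL;
  return raw.trim();
}

console.log(resolveHermesBaseUrl(undefined)); // http://127.0.0.1:8642/v1
console.log(resolveHermesBaseUrl("   "));     // http://127.0.0.1:8642/v1
console.log(resolveHermesBaseUrl(" http://gpu-box:8642/v1 ")); // http://gpu-box:8642/v1
```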
@@ -385,6 +385,10 @@ function normalizeIncomingMessages(messages: ChatMessage[]) {
  return [{ role: "system", content: CHAT_TOOL_SYSTEM_PROMPT }, ...normalized];
}

function normalizePlainIncomingMessages(messages: ChatMessage[]) {
  return messages.map((message) => buildOpenAIConversationMessage(message));
}

function normalizeIncomingResponsesInput(messages: ChatMessage[]) {
  const normalized = messages.map((message) => buildOpenAIResponsesInputMessage(message));

@@ -853,6 +857,20 @@ function extractResponsesText(response: any, fallback = "") {
  return parts.join("") || fallback;
}

function extractChatCompletionContent(message: any) {
  if (typeof message?.content === "string") return message.content;
  if (!Array.isArray(message?.content)) return "";

  return message.content
    .map((part: any) => {
      if (typeof part === "string") return part;
      if (typeof part?.text === "string") return part.text;
      if (typeof part?.content === "string") return part.content;
      return "";
    })
    .join("");
}

function getUnstreamedText(finalText: string, streamedText: string) {
  if (!finalText) return "";
  if (!streamedText) return finalText;
@@ -1093,6 +1111,26 @@ export async function runToolAwareChatCompletions(params: ToolAwareCompletionParams
  };
}

export async function runPlainChatCompletions(params: ToolAwareCompletionParams): Promise<ToolAwareCompletionResult> {
  const completion = await params.client.chat.completions.create({
    model: params.model,
    messages: normalizePlainIncomingMessages(params.messages),
    temperature: params.temperature,
    max_tokens: params.maxTokens,
  } as any);

  const usageAcc: Required<ToolAwareUsage> = { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
  const sawUsage = mergeUsage(usageAcc, completion?.usage);
  const message = completion?.choices?.[0]?.message;

  return {
    text: extractChatCompletionContent(message),
    usage: sawUsage ? usageAcc : undefined,
    raw: { response: completion, api: "chat.completions" },
    toolEvents: [],
  };
}

export async function* runToolAwareOpenAIChatStream(
  params: ToolAwareCompletionParams
): AsyncGenerator<ToolAwareStreamingEvent> {
@@ -1354,3 +1392,41 @@ export async function* runToolAwareChatCompletionsStream(
    },
  };
}

export async function* runPlainChatCompletionsStream(
  params: ToolAwareCompletionParams
): AsyncGenerator<ToolAwareStreamingEvent> {
  const rawResponses: unknown[] = [];
  const usageAcc: Required<ToolAwareUsage> = { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
  let sawUsage = false;
  let text = "";

  const stream = await params.client.chat.completions.create({
    model: params.model,
    messages: normalizePlainIncomingMessages(params.messages),
    temperature: params.temperature,
    max_tokens: params.maxTokens,
    stream: true,
  } as any);

  for await (const chunk of stream as any as AsyncIterable<any>) {
    rawResponses.push(chunk);
    sawUsage = mergeUsage(usageAcc, chunk?.usage) || sawUsage;

    const deltaText = chunk?.choices?.[0]?.delta?.content ?? "";
    if (typeof deltaText === "string" && deltaText.length) {
      text += deltaText;
      yield { type: "delta", text: deltaText };
    }
  }

  yield {
    type: "done",
    result: {
      text,
      usage: sawUsage ? usageAcc : undefined,
      raw: { streamed: true, responses: rawResponses, api: "chat.completions" },
      toolEvents: [],
    },
  };
}

@@ -1,5 +1,6 @@
import type { FastifyBaseLogger } from "fastify";
import { anthropicClient, openaiClient, xaiClient } from "./providers.js";
import { env } from "../env.js";
import { anthropicClient, hermesAgentClient, isHermesAgentConfigured, openaiClient, xaiClient } from "./providers.js";
import type { Provider } from "./types.js";

export type ProviderModelSnapshot = {
@@ -8,9 +9,9 @@ export type ProviderModelSnapshot = {
  error: string | null;
};

export type ModelCatalogSnapshot = Record<Provider, ProviderModelSnapshot>;
export type ModelCatalogSnapshot = Partial<Record<Provider, ProviderModelSnapshot>>;

const providers: Provider[] = ["openai", "anthropic", "xai"];
const baseProviders: Provider[] = ["openai", "anthropic", "xai"];
const MODEL_FETCH_TIMEOUT_MS = 15000;

const modelCatalog: ModelCatalogSnapshot = {
@@ -19,6 +20,10 @@ const modelCatalog: ModelCatalogSnapshot = {
  xai: { models: [], loadedAt: null, error: null },
};

function getCatalogProviders(): Provider[] {
  return isHermesAgentConfigured() ? [...baseProviders, "hermes-agent"] : baseProviders;
}

function uniqSorted(models: string[]) {
  return [...new Set(models.map((value) => value.trim()).filter(Boolean))].sort((a, b) => a.localeCompare(b));
}
@@ -59,8 +64,15 @@ async function fetchProviderModels(provider: Provider) {
    return uniqSorted(page.data.map((model) => model.id));
  }

  const page = await xaiClient().models.list();
  return uniqSorted(page.data.map((model) => model.id));
  if (provider === "xai") {
    const page = await xaiClient().models.list();
    return uniqSorted(page.data.map((model) => model.id));
  }

  const page = await hermesAgentClient().models.list();
  const models = page.data.map((model) => model.id);
  if (env.HERMES_AGENT_MODEL) models.push(env.HERMES_AGENT_MODEL);
  return uniqSorted(models);
}

async function refreshProviderModels(provider: Provider, logger?: FastifyBaseLogger) {
@@ -75,7 +87,7 @@ async function refreshProviderModels(provider: Provider, logger?: FastifyBaseLog
  } catch (err: any) {
    const message = err?.message ?? String(err);
    modelCatalog[provider] = {
      models: [],
      models: provider === "hermes-agent" && env.HERMES_AGENT_MODEL ? [env.HERMES_AGENT_MODEL] : [],
      loadedAt: new Date().toISOString(),
      error: message,
    };
@@ -84,25 +96,18 @@ async function refreshProviderModels(provider: Provider, logger?: FastifyBaseLog
}

export async function warmModelCatalog(logger?: FastifyBaseLogger) {
  await Promise.all(providers.map((provider) => refreshProviderModels(provider, logger)));
  await Promise.all(getCatalogProviders().map((provider) => refreshProviderModels(provider, logger)));
}

export function getModelCatalogSnapshot(): ModelCatalogSnapshot {
  return {
    openai: {
      models: [...modelCatalog.openai.models],
      loadedAt: modelCatalog.openai.loadedAt,
      error: modelCatalog.openai.error,
    },
    anthropic: {
      models: [...modelCatalog.anthropic.models],
      loadedAt: modelCatalog.anthropic.loadedAt,
      error: modelCatalog.anthropic.error,
    },
    xai: {
      models: [...modelCatalog.xai.models],
      loadedAt: modelCatalog.xai.loadedAt,
      error: modelCatalog.xai.error,
    },
  };
  const snapshot: ModelCatalogSnapshot = {};
  for (const provider of getCatalogProviders()) {
    const entry = modelCatalog[provider] ?? { models: [], loadedAt: null, error: null };
    snapshot[provider] = {
      models: [...entry.models],
      loadedAt: entry.loadedAt,
      error: entry.error,
    };
  }
  return snapshot;
}

@@ -1,13 +1,13 @@
import { performance } from "node:perf_hooks";
import { prisma } from "../db.js";
import { anthropicClient, openaiClient, xaiClient } from "./providers.js";
import { buildToolLogMessageData, runToolAwareChatCompletions, runToolAwareOpenAIChat } from "./chat-tools.js";
import { anthropicClient, hermesAgentClient, openaiClient, xaiClient } from "./providers.js";
import { buildToolLogMessageData, runPlainChatCompletions, runToolAwareChatCompletions, runToolAwareOpenAIChat } from "./chat-tools.js";
import { buildAnthropicConversationMessage, getAnthropicSystemPrompt } from "./message-content.js";
import { toPrismaProvider } from "./provider-ids.js";
import type { MultiplexRequest, MultiplexResponse, Provider } from "./types.js";

function asProviderEnum(p: Provider) {
  // Prisma enum values match these strings.
  return p;
  return toPrismaProvider(p);
}

export async function runMultiplex(req: MultiplexRequest): Promise<MultiplexResponse> {
@@ -84,6 +84,23 @@ export async function runMultiplex(req: MultiplexRequest): Promise<MultiplexResp
    outText = r.text;
    usage = r.usage;
    toolMessages = r.toolEvents.map((event) => buildToolLogMessageData(call.chatId, event));
  } else if (req.provider === "hermes-agent") {
    const client = hermesAgentClient();
    const r = await runPlainChatCompletions({
      client,
      model: req.model,
      messages: req.messages,
      temperature: req.temperature,
      maxTokens: req.maxTokens,
      logContext: {
        provider: req.provider,
        model: req.model,
        chatId,
      },
    });
    raw = r.raw;
    outText = r.text;
    usage = r.usage;
  } else if (req.provider === "anthropic") {
    const client = anthropicClient();

31
server/src/llm/provider-ids.ts
Normal file
@@ -0,0 +1,31 @@
import type { Provider } from "./types.js";

type PrismaProvider = Exclude<Provider, "hermes-agent"> | "hermes_agent";

export function toPrismaProvider(provider: Provider): PrismaProvider {
  return provider === "hermes-agent" ? "hermes_agent" : provider;
}

export function fromPrismaProvider(provider: unknown): Provider | null {
  if (provider === null || provider === undefined) return null;
  if (provider === "hermes_agent" || provider === "hermes-agent") return "hermes-agent";
  if (provider === "openai" || provider === "anthropic" || provider === "xai") return provider;
  return null;
}

export function serializeProviderFields<T extends Record<string, any>>(value: T): T {
  const next: Record<string, any> = { ...value };
  if ("initiatedProvider" in next) {
    next.initiatedProvider = fromPrismaProvider(next.initiatedProvider);
  }
  if ("lastUsedProvider" in next) {
    next.lastUsedProvider = fromPrismaProvider(next.lastUsedProvider);
  }
  if ("provider" in next) {
    next.provider = fromPrismaProvider(next.provider);
  }
  if (Array.isArray(next.calls)) {
    next.calls = next.calls.map((call: Record<string, any>) => serializeProviderFields(call));
  }
  return next as T;
}
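This mapping layer exists because a Prisma enum member cannot contain a hyphen, so the API-facing id `hermes-agent` is stored as `hermes_agent` (see the `@map("hermes-agent")` in the schema diff). A self-contained sketch of the round-trip, re-stating the two mapping functions from the new file:

```typescript
type Provider = "openai" | "anthropic" | "xai" | "hermes-agent";
type PrismaProvider = Exclude<Provider, "hermes-agent"> | "hermes_agent";

// API-facing id -> DB enum id (hyphen is illegal in a Prisma enum member).
function toPrismaProvider(provider: Provider): PrismaProvider {
  return provider === "hermes-agent" ? "hermes_agent" : provider;
}

// DB enum id -> API-facing id; unknown values become null rather than leaking.
function fromPrismaProvider(provider: unknown): Provider | null {
  if (provider === null || provider === undefined) return null;
  if (provider === "hermes_agent" || provider === "hermes-agent") return "hermes-agent";
  if (provider === "openai" || provider === "anthropic" || provider === "xai") return provider;
  return null;
}

const stored = toPrismaProvider("hermes-agent");
console.log(stored);                        // hermes_agent
console.log(fromPrismaProvider(stored));    // hermes-agent
console.log(fromPrismaProvider("mystery")); // null
```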
@@ -13,6 +13,18 @@ export function xaiClient() {
  return new OpenAI({ apiKey: env.XAI_API_KEY, baseURL: "https://api.x.ai/v1" });
}

export function isHermesAgentConfigured() {
  return Boolean(env.HERMES_AGENT_API_KEY);
}

export function hermesAgentClient() {
  if (!env.HERMES_AGENT_API_KEY) throw new Error("HERMES_AGENT_API_KEY not set");
  return new OpenAI({
    apiKey: env.HERMES_AGENT_API_KEY,
    baseURL: env.HERMES_AGENT_API_BASE_URL,
  });
}

export function anthropicClient() {
  if (!env.ANTHROPIC_API_KEY) throw new Error("ANTHROPIC_API_KEY not set");
  return new Anthropic({ apiKey: env.ANTHROPIC_API_KEY });

@@ -1,13 +1,15 @@
import { performance } from "node:perf_hooks";
import { prisma } from "../db.js";
import { anthropicClient, openaiClient, xaiClient } from "./providers.js";
import { anthropicClient, hermesAgentClient, openaiClient, xaiClient } from "./providers.js";
import {
  buildToolLogMessageData,
  runPlainChatCompletionsStream,
  runToolAwareChatCompletionsStream,
  runToolAwareOpenAIChatStream,
  type ToolExecutionEvent,
} from "./chat-tools.js";
import { buildAnthropicConversationMessage, getAnthropicSystemPrompt } from "./message-content.js";
import { toPrismaProvider } from "./provider-ids.js";
import type { MultiplexRequest, Provider } from "./types.js";

type StreamUsage = {
@@ -38,7 +40,7 @@ export async function* runMultiplexStream(req: MultiplexRequest): AsyncGenerator
    ? await prisma.llmCall.create({
        data: {
          chatId,
          provider: req.provider as any,
          provider: toPrismaProvider(req.provider) as any,
          model: req.model,
          request: req as any,
        },
@@ -51,14 +53,14 @@ export async function* runMultiplexStream(req: MultiplexRequest): AsyncGenerator
    prisma.chat.update({
      where: { id: chatId },
      data: {
        lastUsedProvider: req.provider as any,
        lastUsedProvider: toPrismaProvider(req.provider) as any,
        lastUsedModel: req.model,
      },
    }),
    prisma.chat.updateMany({
      where: { id: chatId, initiatedProvider: null },
      data: {
        initiatedProvider: req.provider as any,
        initiatedProvider: toPrismaProvider(req.provider) as any,
        initiatedModel: req.model,
      },
    }),
@@ -72,8 +74,8 @@ export async function* runMultiplexStream(req: MultiplexRequest): AsyncGenerator
  let raw: unknown = { streamed: true };

  try {
    if (req.provider === "openai" || req.provider === "xai") {
      const client = req.provider === "openai" ? openaiClient() : xaiClient();
    if (req.provider === "openai" || req.provider === "xai" || req.provider === "hermes-agent") {
      const client = req.provider === "openai" ? openaiClient() : req.provider === "xai" ? xaiClient() : hermesAgentClient();
      const streamEvents =
        req.provider === "openai"
          ? runToolAwareOpenAIChatStream({
@@ -88,6 +90,19 @@ export async function* runMultiplexStream(req: MultiplexRequest): AsyncGenerator
              chatId: chatId ?? undefined,
            },
          })
          : req.provider === "hermes-agent"
            ? runPlainChatCompletionsStream({
                client,
                model: req.model,
                messages: req.messages,
                temperature: req.temperature,
                maxTokens: req.maxTokens,
                logContext: {
                  provider: req.provider,
                  model: req.model,
                  chatId: chatId ?? undefined,
                },
              })
            : runToolAwareChatCompletionsStream({
                client,
                model: req.model,

@@ -1,4 +1,6 @@
export type Provider = "openai" | "anthropic" | "xai";
export const PROVIDERS = ["openai", "anthropic", "xai", "hermes-agent"] as const;

export type Provider = (typeof PROVIDERS)[number];

export type ChatImageAttachment = {
  kind: "image";

@@ -1,17 +1,21 @@
|
||||
import { performance } from "node:perf_hooks";
|
||||
import { z } from "zod";
|
||||
import type { FastifyInstance } from "fastify";
|
||||
import type { FastifyInstance, FastifyReply, FastifyRequest } from "fastify";
|
||||
import { ActiveSseStream, type SseStreamEvent } from "./active-streams.js";
|
||||
import { prisma } from "./db.js";
|
||||
import { requireAdmin } from "./auth.js";
|
||||
import { env } from "./env.js";
|
||||
import { buildComparableAttachments } from "./llm/message-content.js";
|
||||
import { runMultiplex } from "./llm/multiplexer.js";
|
||||
import { runMultiplexStream } from "./llm/streaming.js";
|
||||
import { runMultiplexStream, type StreamEvent } from "./llm/streaming.js";
|
||||
import { getModelCatalogSnapshot } from "./llm/model-catalog.js";
|
||||
import { openaiClient } from "./llm/providers.js";
|
||||
import { serializeProviderFields, toPrismaProvider } from "./llm/provider-ids.js";
|
||||
import { exaClient } from "./search/exa.js";
|
||||
import type { ChatAttachment } from "./llm/types.js";
|
||||
|
||||
const ProviderSchema = z.enum(["openai", "anthropic", "xai", "hermes-agent"]);
|
||||
|
||||
type IncomingChatMessage = {
|
||||
role: "system" | "user" | "assistant" | "tool";
|
||||
content: string;
|
||||
@@ -120,6 +124,26 @@ const CompletionMessageSchema = z
     }
   });

+const CompletionStreamBody = z
+  .object({
+    chatId: z.string().optional(),
+    persist: z.boolean().optional(),
+    provider: ProviderSchema,
+    model: z.string().min(1),
+    messages: z.array(CompletionMessageSchema),
+    temperature: z.number().min(0).max(2).optional(),
+    maxTokens: z.number().int().positive().optional(),
+  })
+  .superRefine((value, ctx) => {
+    if (value.persist === false && value.chatId) {
+      ctx.addIssue({
+        code: z.ZodIssueCode.custom,
+        message: "chatId must be omitted when persist is false",
+        path: ["chatId"],
+      });
+    }
+  });
+
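The `superRefine` in this schema enforces a cross-field invariant that a plain per-field schema cannot express. Stripped of zod, the rule reduces to this check (an illustrative sketch, not the actual schema code — the real schema also validates provider, model, and messages):

```typescript
// Mirror of the cross-field rule: a non-persisted stream must not name a
// chat row, since nothing would be written to it.
type StreamBodyLike = { chatId?: string; persist?: boolean };

function streamBodyIssues(body: StreamBodyLike): string[] {
  const issues: string[] = [];
  if (body.persist === false && body.chatId) {
    issues.push("chatId must be omitted when persist is false");
  }
  return issues;
}
```

Note the deliberate asymmetry: `persist` left undefined defaults to persisting, so `chatId` alone is valid; only an explicit `persist: false` combined with a `chatId` is rejected.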
function mergeAttachmentsIntoMetadata(metadata: unknown, attachments?: ChatAttachment[]) {
|
||||
if (!attachments?.length) return metadata as any;
|
||||
if (!metadata || typeof metadata !== "object" || Array.isArray(metadata)) {
|
||||
@@ -293,6 +317,246 @@ function buildSseHeaders(originHeader: string | undefined) {
  return headers;
}

type SearchRunRequest = z.infer<typeof SearchRunBody>;

const activeChatStreams = new Map<string, ActiveSseStream>();
const activeSearchStreams = new Map<string, ActiveSseStream>();

function getErrorMessage(err: unknown) {
  return err instanceof Error ? err.message : String(err);
}

function writeSseEvent(reply: FastifyReply, event: SseStreamEvent) {
  if (reply.raw.destroyed || reply.raw.writableEnded) return;
  reply.raw.write(`event: ${event.event}\n`);
  reply.raw.write(`data: ${JSON.stringify(event.data)}\n\n`);
}
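`writeSseEvent` frames each payload using the standard server-sent-events wire format: an `event:` line, a `data:` line carrying JSON, and a blank line terminating the frame. The framing logic in isolation, separated from the Fastify reply object:

```typescript
// Pure version of the SSE framing used by writeSseEvent: one "event:" line,
// one "data:" line with the JSON payload, and "\n\n" closing the frame.
function formatSseFrame(event: string, data: unknown): string {
  return `event: ${event}\ndata: ${JSON.stringify(data)}\n\n`;
}
```

Keeping the payload on a single `data:` line (JSON never contains raw newlines after `JSON.stringify`) avoids having to split it across multiple `data:` lines as the SSE format would otherwise require.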
|
||||
|
||||
async function streamActiveRun(req: FastifyRequest, reply: FastifyReply, stream: ActiveSseStream) {
  reply.raw.writeHead(200, buildSseHeaders(typeof req.headers.origin === "string" ? req.headers.origin : undefined));
  reply.raw.flushHeaders?.();

  let unsubscribe = () => {};
  let closed = false;
  const closedPromise = new Promise<void>((resolve) => {
    const onClose = () => {
      closed = true;
      unsubscribe();
      reply.raw.off("close", onClose);
      resolve();
    };
    reply.raw.on("close", onClose);
    stream.done.finally(() => {
      reply.raw.off("close", onClose);
    });
  });

  unsubscribe = stream.subscribe((event) => writeSseEvent(reply, event));
  await Promise.race([stream.done, closedPromise]);
  unsubscribe();

  if (!closed && !reply.raw.destroyed && !reply.raw.writableEnded) {
    reply.raw.end();
  }

  return reply;
}
|
||||
|
||||
function mapChatStreamEvent(ev: StreamEvent): SseStreamEvent {
  if (ev.type === "tool_call") return { event: "tool_call", data: ev.event };
  return { event: ev.type, data: ev };
}
|
||||
|
||||
function startActiveChatStream(chatId: string, body: z.infer<typeof CompletionStreamBody>) {
|
||||
const stream = new ActiveSseStream();
|
||||
activeChatStreams.set(chatId, stream);
|
||||
|
||||
void (async () => {
|
||||
let sawTerminalEvent = false;
|
||||
try {
|
||||
for await (const ev of runMultiplexStream(body)) {
|
||||
const event = mapChatStreamEvent(ev);
|
||||
if (ev.type === "done" || ev.type === "error") {
|
||||
sawTerminalEvent = true;
|
||||
stream.complete(event);
|
||||
break;
|
||||
}
|
||||
stream.emit(event.event, event.data);
|
||||
}
|
||||
|
||||
if (!sawTerminalEvent) {
|
||||
stream.complete({ event: "error", data: { message: "chat stream ended unexpectedly" } });
|
||||
}
|
||||
} catch (err) {
|
||||
stream.complete({ event: "error", data: { message: getErrorMessage(err) } });
|
||||
} finally {
|
||||
activeChatStreams.delete(chatId);
|
||||
}
|
||||
})();
|
||||
|
||||
return stream;
|
||||
}
|
||||
|
||||
async function executeSearchRunStream(searchId: string, body: SearchRunRequest, stream: ActiveSseStream) {
|
||||
const startedAt = performance.now();
|
||||
const query = body.query?.trim();
|
||||
if (!query) {
|
||||
stream.complete({ event: "error", data: { message: "query is required" } });
|
||||
return;
|
||||
}
|
||||
|
||||
const normalizedTitle = body.title?.trim() || query.slice(0, 80);
|
||||
|
||||
try {
|
||||
const exa = exaClient();
|
||||
const searchPromise = exa.search(query, {
|
||||
type: body.type ?? "auto",
|
||||
numResults: body.numResults ?? 10,
|
||||
includeDomains: body.includeDomains,
|
||||
excludeDomains: body.excludeDomains,
|
||||
moderation: true,
|
||||
userLocation: "US",
|
||||
contents: false,
|
||||
} as any);
|
||||
const answerPromise = exa.answer(query, {
|
||||
text: true,
|
||||
model: "exa",
|
||||
userLocation: "US",
|
||||
});
|
||||
|
||||
let searchResponse: any | null = null;
|
||||
let answerResponse: any | null = null;
|
||||
let enrichedResults: any[] | null = null;
|
||||
let searchError: string | null = null;
|
||||
let answerError: string | null = null;
|
||||
|
||||
const searchSettled = searchPromise.then(
|
||||
async (value) => {
|
||||
searchResponse = value;
|
||||
const previewResults = (value?.results ?? []).map((result: any, index: number) => mapSearchResultPreview(result, index));
|
||||
stream.emit("search_results", {
|
||||
requestId: value?.requestId ?? null,
|
||||
results: previewResults,
|
||||
});
|
||||
|
||||
const urls = (value?.results ?? []).map((result: any) => result?.url).filter((url: string | undefined) => typeof url === "string");
|
||||
if (!urls.length) return;
|
||||
|
||||
try {
|
||||
const contentsResponse = await exa.getContents(urls, {
|
||||
text: { maxCharacters: 1200 },
|
||||
highlights: {
|
||||
query,
|
||||
maxCharacters: 320,
|
||||
numSentences: 2,
|
||||
highlightsPerUrl: 2,
|
||||
},
|
||||
} as any);
|
||||
const byUrl = new Map<string, any>();
|
||||
for (const contentItem of contentsResponse?.results ?? []) {
|
||||
byUrl.set(normalizeUrlForMatch(contentItem?.url), contentItem);
|
||||
}
|
||||
|
||||
enrichedResults = (value?.results ?? []).map((result: any) => {
|
||||
const contentItem = byUrl.get(normalizeUrlForMatch(result?.url));
|
||||
if (!contentItem) return result;
|
||||
return {
|
||||
...result,
|
||||
text: contentItem.text ?? result.text ?? null,
|
||||
highlights: Array.isArray(contentItem.highlights) ? contentItem.highlights : result.highlights ?? null,
|
||||
highlightScores: Array.isArray(contentItem.highlightScores) ? contentItem.highlightScores : result.highlightScores ?? null,
|
||||
};
|
||||
});
|
||||
|
||||
stream.emit("search_results", {
|
||||
requestId: value?.requestId ?? null,
|
||||
results: enrichedResults.map((result: any, index: number) => mapSearchResultPreview(result, index)),
|
||||
});
|
||||
} catch {
|
||||
// keep preview results if content enrichment fails
|
||||
}
|
||||
},
|
||||
(reason) => {
|
||||
searchError = reason?.message ?? String(reason);
|
||||
stream.emit("search_error", { error: searchError });
|
||||
}
|
||||
);
|
||||
|
||||
const answerSettled = answerPromise.then(
|
||||
(value) => {
|
||||
answerResponse = value;
|
||||
stream.emit("answer", {
|
||||
answerText: parseAnswerText(value),
|
||||
answerRequestId: value?.requestId ?? null,
|
||||
answerCitations: (value?.citations as any) ?? null,
|
||||
});
|
||||
},
|
||||
(reason) => {
|
||||
answerError = reason?.message ?? String(reason);
|
||||
stream.emit("answer_error", { error: answerError });
|
||||
}
|
||||
);
|
||||
|
||||
await Promise.all([searchSettled, answerSettled]);
|
||||
|
||||
const latencyMs = Math.round(performance.now() - startedAt);
|
||||
const persistedResults = enrichedResults ?? searchResponse?.results ?? [];
|
||||
const rows = persistedResults.map((result: any, index: number) => mapSearchResultRow(searchId, result, index));
|
||||
const answerText = parseAnswerText(answerResponse);
|
||||
|
||||
await prisma.$transaction(async (tx) => {
|
||||
await tx.search.update({
|
||||
where: { id: searchId },
|
||||
data: {
|
||||
query,
|
||||
title: normalizedTitle,
|
||||
requestId: searchResponse?.requestId ?? null,
|
||||
rawResponse: searchResponse as any,
|
||||
latencyMs,
|
||||
error: searchError,
|
||||
answerText,
|
||||
answerRequestId: answerResponse?.requestId ?? null,
|
||||
answerCitations: (answerResponse?.citations as any) ?? null,
|
||||
answerRawResponse: answerResponse as any,
|
||||
answerError,
|
||||
},
|
||||
});
|
||||
await tx.searchResult.deleteMany({ where: { searchId } });
|
||||
if (rows.length) {
|
||||
await tx.searchResult.createMany({ data: rows as any });
|
||||
}
|
||||
});
|
||||
|
||||
const search = await prisma.search.findUnique({
|
||||
where: { id: searchId },
|
||||
include: { results: { orderBy: { rank: "asc" } } },
|
||||
});
|
||||
if (!search) {
|
||||
stream.complete({ event: "error", data: { message: "search not found" } });
|
||||
} else {
|
||||
stream.complete({ event: "done", data: { search } });
|
||||
}
|
||||
} catch (err) {
|
||||
const message = getErrorMessage(err);
|
||||
try {
|
||||
await prisma.search.update({
|
||||
where: { id: searchId },
|
||||
data: {
|
||||
query,
|
||||
title: normalizedTitle,
|
||||
latencyMs: Math.round(performance.now() - startedAt),
|
||||
error: message,
|
||||
},
|
||||
});
|
||||
} catch {
|
||||
// keep the stream terminal event even if the backing search row disappeared
|
||||
}
|
||||
stream.complete({ event: "error", data: { message } });
|
||||
} finally {
|
||||
activeSearchStreams.delete(searchId);
|
||||
}
|
||||
}
|
||||
|
||||
export async function registerRoutes(app: FastifyInstance) {
|
||||
app.get("/health", { logLevel: "silent" }, async () => ({ ok: true }));
|
||||
|
||||
@@ -306,6 +570,14 @@ export async function registerRoutes(app: FastifyInstance) {
     return { providers: getModelCatalogSnapshot() };
   });

+  app.get("/v1/active-runs", async (req) => {
+    requireAdmin(req);
+    return {
+      chats: Array.from(activeChatStreams.keys()),
+      searches: Array.from(activeSearchStreams.keys()),
+    };
+  });
+
   app.get("/v1/chats", async (req) => {
     requireAdmin(req);
     const chats = await prisma.chat.findMany({
|
||||
@@ -322,7 +594,7 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
lastUsedModel: true,
|
||||
},
|
||||
});
|
||||
return { chats };
|
||||
return { chats: chats.map((chat) => serializeProviderFields(chat)) };
|
||||
});
|
||||
|
||||
app.post("/v1/chats", async (req) => {
|
||||
@@ -330,7 +602,7 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
const Body = z
|
||||
.object({
|
||||
title: z.string().optional(),
|
||||
provider: z.enum(["openai", "anthropic", "xai"]).optional(),
|
||||
provider: ProviderSchema.optional(),
|
||||
model: z.string().trim().min(1).optional(),
|
||||
messages: z.array(CompletionMessageSchema).optional(),
|
||||
})
|
||||
@@ -356,9 +628,9 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
const chat = await prisma.chat.create({
|
||||
data: {
|
||||
title: body.title,
|
||||
initiatedProvider: body.provider as any,
|
||||
initiatedProvider: body.provider ? (toPrismaProvider(body.provider) as any) : undefined,
|
||||
initiatedModel: body.model,
|
||||
lastUsedProvider: body.provider as any,
|
||||
lastUsedProvider: body.provider ? (toPrismaProvider(body.provider) as any) : undefined,
|
||||
lastUsedModel: body.model,
|
||||
messages: body.messages?.length
|
||||
? {
|
||||
@@ -382,7 +654,7 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
lastUsedModel: true,
|
||||
},
|
||||
});
|
||||
return { chat };
|
||||
return { chat: serializeProviderFields(chat) };
|
||||
});
|
||||
|
||||
app.patch("/v1/chats/:chatId", async (req) => {
|
||||
@@ -413,7 +685,7 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
},
|
||||
});
|
||||
if (!chat) return app.httpErrors.notFound("chat not found");
|
||||
return { chat };
|
||||
return { chat: serializeProviderFields(chat) };
|
||||
});
|
||||
|
||||
app.post("/v1/chats/title/suggest", async (req) => {
|
||||
@@ -438,7 +710,7 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
},
|
||||
});
|
||||
if (!existing) return app.httpErrors.notFound("chat not found");
|
||||
if (existing.title?.trim()) return { chat: existing };
|
||||
if (existing.title?.trim()) return { chat: serializeProviderFields(existing) };
|
||||
|
||||
const fallback = body.content.split(/\r?\n/)[0]?.trim().slice(0, 48) || "New chat";
|
||||
const suggestedRaw = await generateChatTitle(body.content);
|
||||
@@ -459,7 +731,7 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
},
|
||||
});
|
||||
|
||||
return { chat };
|
||||
return { chat: serializeProviderFields(chat) };
|
||||
});
|
||||
|
||||
app.delete("/v1/chats/:chatId", async (req) => {
|
||||
@@ -579,7 +851,7 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
},
|
||||
});
|
||||
|
||||
return { chat };
|
||||
return { chat: serializeProviderFields(chat) };
|
||||
});
|
||||
|
||||
app.post("/v1/searches/:searchId/run", async (req) => {
|
||||
@@ -695,162 +967,24 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
const query = body.query?.trim() || existing.query?.trim();
|
||||
if (!query) return app.httpErrors.badRequest("query is required");
|
||||
|
||||
const startedAt = performance.now();
|
||||
const normalizedTitle = body.title?.trim() || query.slice(0, 80);
|
||||
|
||||
reply.raw.writeHead(200, buildSseHeaders(typeof req.headers.origin === "string" ? req.headers.origin : undefined));
|
||||
|
||||
const send = (event: string, data: any) => {
|
||||
if (reply.raw.writableEnded) return;
|
||||
reply.raw.write(`event: ${event}\n`);
|
||||
reply.raw.write(`data: ${JSON.stringify(data)}\n\n`);
|
||||
};
|
||||
|
||||
try {
|
||||
const exa = exaClient();
|
||||
const searchPromise = exa.search(query, {
|
||||
type: body.type ?? "auto",
|
||||
numResults: body.numResults ?? 10,
|
||||
includeDomains: body.includeDomains,
|
||||
excludeDomains: body.excludeDomains,
|
||||
moderation: true,
|
||||
userLocation: "US",
|
||||
contents: false,
|
||||
} as any);
|
||||
const answerPromise = exa.answer(query, {
|
||||
text: true,
|
||||
model: "exa",
|
||||
userLocation: "US",
|
||||
});
|
||||
|
||||
let searchResponse: any | null = null;
|
||||
let answerResponse: any | null = null;
|
||||
let enrichedResults: any[] | null = null;
|
||||
let searchError: string | null = null;
|
||||
let answerError: string | null = null;
|
||||
|
||||
const searchSettled = searchPromise.then(
|
||||
async (value) => {
|
||||
searchResponse = value;
|
||||
const previewResults = (value?.results ?? []).map((result: any, index: number) => mapSearchResultPreview(result, index));
|
||||
send("search_results", {
|
||||
requestId: value?.requestId ?? null,
|
||||
results: previewResults,
|
||||
});
|
||||
|
||||
const urls = (value?.results ?? []).map((result: any) => result?.url).filter((url: string | undefined) => typeof url === "string");
|
||||
if (!urls.length) return;
|
||||
|
||||
try {
|
||||
const contentsResponse = await exa.getContents(urls, {
|
||||
text: { maxCharacters: 1200 },
|
||||
highlights: {
|
||||
query,
|
||||
maxCharacters: 320,
|
||||
numSentences: 2,
|
||||
highlightsPerUrl: 2,
|
||||
},
|
||||
} as any);
|
||||
const byUrl = new Map<string, any>();
|
||||
for (const contentItem of contentsResponse?.results ?? []) {
|
||||
byUrl.set(normalizeUrlForMatch(contentItem?.url), contentItem);
|
||||
}
|
||||
|
||||
enrichedResults = (value?.results ?? []).map((result: any) => {
|
||||
const contentItem = byUrl.get(normalizeUrlForMatch(result?.url));
|
||||
if (!contentItem) return result;
|
||||
return {
|
||||
...result,
|
||||
text: contentItem.text ?? result.text ?? null,
|
||||
highlights: Array.isArray(contentItem.highlights) ? contentItem.highlights : result.highlights ?? null,
|
||||
highlightScores: Array.isArray(contentItem.highlightScores) ? contentItem.highlightScores : result.highlightScores ?? null,
|
||||
};
|
||||
});
|
||||
|
||||
send("search_results", {
|
||||
requestId: value?.requestId ?? null,
|
||||
results: enrichedResults.map((result: any, index: number) => mapSearchResultPreview(result, index)),
|
||||
});
|
||||
} catch {
|
||||
// keep preview results if content enrichment fails
|
||||
}
|
||||
},
|
||||
(reason) => {
|
||||
searchError = reason?.message ?? String(reason);
|
||||
send("search_error", { error: searchError });
|
||||
}
|
||||
);
|
||||
|
||||
const answerSettled = answerPromise.then(
|
||||
(value) => {
|
||||
answerResponse = value;
|
||||
send("answer", {
|
||||
answerText: parseAnswerText(value),
|
||||
answerRequestId: value?.requestId ?? null,
|
||||
answerCitations: (value?.citations as any) ?? null,
|
||||
});
|
||||
},
|
||||
(reason) => {
|
||||
answerError = reason?.message ?? String(reason);
|
||||
send("answer_error", { error: answerError });
|
||||
}
|
||||
);
|
||||
|
||||
await Promise.all([searchSettled, answerSettled]);
|
||||
|
||||
const latencyMs = Math.round(performance.now() - startedAt);
|
||||
const persistedResults = enrichedResults ?? searchResponse?.results ?? [];
|
||||
const rows = persistedResults.map((result: any, index: number) => mapSearchResultRow(searchId, result, index));
|
||||
const answerText = parseAnswerText(answerResponse);
|
||||
|
||||
await prisma.$transaction(async (tx) => {
|
||||
await tx.search.update({
|
||||
where: { id: searchId },
|
||||
data: {
|
||||
query,
|
||||
title: normalizedTitle,
|
||||
requestId: searchResponse?.requestId ?? null,
|
||||
rawResponse: searchResponse as any,
|
||||
latencyMs,
|
||||
error: searchError,
|
||||
answerText,
|
||||
answerRequestId: answerResponse?.requestId ?? null,
|
||||
answerCitations: (answerResponse?.citations as any) ?? null,
|
||||
answerRawResponse: answerResponse as any,
|
||||
answerError,
|
||||
},
|
||||
});
|
||||
await tx.searchResult.deleteMany({ where: { searchId } });
|
||||
if (rows.length) {
|
||||
await tx.searchResult.createMany({ data: rows as any });
|
||||
}
|
||||
});
|
||||
|
||||
const search = await prisma.search.findUnique({
|
||||
where: { id: searchId },
|
||||
include: { results: { orderBy: { rank: "asc" } } },
|
||||
});
|
||||
if (!search) {
|
||||
send("error", { message: "search not found" });
|
||||
} else {
|
||||
send("done", { search });
|
||||
}
|
||||
} catch (err: any) {
|
||||
await prisma.search.update({
|
||||
where: { id: searchId },
|
||||
data: {
|
||||
query,
|
||||
title: normalizedTitle,
|
||||
latencyMs: Math.round(performance.now() - startedAt),
|
||||
error: err?.message ?? String(err),
|
||||
},
|
||||
});
|
||||
send("error", { message: err?.message ?? String(err) });
|
||||
} finally {
|
||||
reply.raw.end();
|
||||
const existingStream = activeSearchStreams.get(searchId);
|
||||
if (existingStream) {
|
||||
return streamActiveRun(req, reply, existingStream);
|
||||
}
|
||||
|
||||
return reply;
|
||||
const stream = new ActiveSseStream();
|
||||
activeSearchStreams.set(searchId, stream);
|
||||
void executeSearchRunStream(searchId, { ...body, query }, stream);
|
||||
return streamActiveRun(req, reply, stream);
|
||||
});
|
||||
|
||||
  app.post("/v1/searches/:searchId/run/stream/attach", async (req, reply) => {
    requireAdmin(req);
    const Params = z.object({ searchId: z.string() });
    const { searchId } = Params.parse(req.params);
    const stream = activeSearchStreams.get(searchId);
    if (!stream) return app.httpErrors.notFound("active search stream not found");
    return streamActiveRun(req, reply, stream);
  });
|
||||
|
||||
app.get("/v1/chats/:chatId", async (req) => {
|
||||
@@ -863,7 +997,7 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
include: { messages: { orderBy: { createdAt: "asc" } }, calls: { orderBy: { createdAt: "desc" } } },
|
||||
});
|
||||
if (!chat) return app.httpErrors.notFound("chat not found");
|
||||
return { chat };
|
||||
return { chat: serializeProviderFields(chat) };
|
||||
});
|
||||
|
||||
app.post("/v1/chats/:chatId/messages", async (req) => {
|
||||
@@ -895,13 +1029,22 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
return { message: msg };
|
||||
});
|
||||
|
||||
app.post("/v1/chats/:chatId/stream/attach", async (req, reply) => {
|
||||
requireAdmin(req);
|
||||
const Params = z.object({ chatId: z.string() });
|
||||
const { chatId } = Params.parse(req.params);
|
||||
const stream = activeChatStreams.get(chatId);
|
||||
if (!stream) return app.httpErrors.notFound("active chat stream not found");
|
||||
return streamActiveRun(req, reply, stream);
|
||||
});
|
||||
|
||||
// Main: create a completion via provider+model and store everything.
|
||||
app.post("/v1/chat-completions", async (req) => {
|
||||
requireAdmin(req);
|
||||
|
||||
const Body = z.object({
|
||||
chatId: z.string().optional(),
|
||||
provider: z.enum(["openai", "anthropic", "xai"]),
|
||||
provider: ProviderSchema,
|
||||
model: z.string().min(1),
|
||||
messages: z.array(CompletionMessageSchema),
|
||||
temperature: z.number().min(0).max(2).optional(),
|
||||
@@ -935,27 +1078,7 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
app.post("/v1/chat-completions/stream", async (req, reply) => {
|
||||
requireAdmin(req);
|
||||
|
||||
const Body = z
|
||||
.object({
|
||||
chatId: z.string().optional(),
|
||||
persist: z.boolean().optional(),
|
||||
provider: z.enum(["openai", "anthropic", "xai"]),
|
||||
model: z.string().min(1),
|
||||
messages: z.array(CompletionMessageSchema),
|
||||
temperature: z.number().min(0).max(2).optional(),
|
||||
maxTokens: z.number().int().positive().optional(),
|
||||
})
|
||||
.superRefine((value, ctx) => {
|
||||
if (value.persist === false && value.chatId) {
|
||||
ctx.addIssue({
|
||||
code: z.ZodIssueCode.custom,
|
||||
message: "chatId must be omitted when persist is false",
|
||||
path: ["chatId"],
|
||||
});
|
||||
}
|
||||
});
|
||||
|
||||
const parsed = Body.safeParse(req.body);
|
||||
const parsed = CompletionStreamBody.safeParse(req.body);
|
||||
if (!parsed.success) return app.httpErrors.badRequest(parsed.error.message);
|
||||
const body = parsed.data;
|
||||
|
||||
@@ -970,23 +1093,24 @@ export async function registerRoutes(app: FastifyInstance) {
|
||||
await storeNonAssistantMessages(body.chatId, body.messages);
|
||||
}
|
||||
|
||||
if (body.persist !== false && body.chatId) {
|
||||
if (activeChatStreams.has(body.chatId)) {
|
||||
return app.httpErrors.conflict("chat completion already running");
|
||||
}
|
||||
const stream = startActiveChatStream(body.chatId, body);
|
||||
return streamActiveRun(req, reply, stream);
|
||||
}
|
||||
|
||||
reply.raw.writeHead(200, buildSseHeaders(typeof req.headers.origin === "string" ? req.headers.origin : undefined));
|
||||
reply.raw.flushHeaders();
|
||||
|
||||
const send = (event: string, data: any) => {
|
||||
reply.raw.write(`event: ${event}\n`);
|
||||
reply.raw.write(`data: ${JSON.stringify(data)}\n\n`);
|
||||
};
|
||||
|
||||
for await (const ev of runMultiplexStream(body)) {
|
||||
if (ev.type === "meta") send("meta", ev);
|
||||
else if (ev.type === "tool_call") send("tool_call", ev.event);
|
||||
else if (ev.type === "delta") send("delta", ev);
|
||||
else if (ev.type === "done") send("done", ev);
|
||||
else if (ev.type === "error") send("error", ev);
|
||||
writeSseEvent(reply, mapChatStreamEvent(ev));
|
||||
}
|
||||
|
||||
reply.raw.end();
|
||||
if (!reply.raw.destroyed && !reply.raw.writableEnded) {
|
||||
reply.raw.end();
|
||||
}
|
||||
return reply;
|
||||
});
|
||||
}
|
||||
|
||||
server/tests/active-streams.test.ts (new file, 34 lines)
@@ -0,0 +1,34 @@
import assert from "node:assert/strict";
import test from "node:test";
import { ActiveSseStream, type SseStreamEvent } from "../src/active-streams.js";

test("ActiveSseStream replays buffered events to late subscribers", () => {
  const stream = new ActiveSseStream();
  stream.emit("delta", { text: "hel" });
  stream.emit("delta", { text: "lo" });

  const events: SseStreamEvent[] = [];
  const unsubscribe = stream.subscribe((event) => events.push(event));
  unsubscribe();

  assert.deepEqual(events, [
    { event: "delta", data: { text: "hel" } },
    { event: "delta", data: { text: "lo" } },
  ]);
});

test("ActiveSseStream replays terminal events after completion", async () => {
  const stream = new ActiveSseStream();
  stream.emit("delta", { text: "done" });
  stream.complete({ event: "done", data: { text: "done" } });
  await stream.done;

  const events: SseStreamEvent[] = [];
  stream.subscribe((event) => events.push(event));

  assert.equal(stream.isCompleted, true);
  assert.deepEqual(events, [
    { event: "delta", data: { text: "done" } },
    { event: "done", data: { text: "done" } },
  ]);
});
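The behavior these tests pin down (buffered replay for late subscribers, a terminal event, and a `done` promise) can be satisfied by a small replay buffer. A hypothetical sketch inferred from the test expectations; the real implementation in `src/active-streams.ts` may differ:

```typescript
// Minimal sketch of an ActiveSseStream-like class: events are buffered so a
// subscriber that attaches late replays the full history before going live.
type SseStreamEvent = { event: string; data: unknown };
type Subscriber = (event: SseStreamEvent) => void;

class ReplayStream {
  private buffer: SseStreamEvent[] = [];
  private subscribers = new Set<Subscriber>();
  private resolveDone!: () => void;
  readonly done = new Promise<void>((resolve) => (this.resolveDone = resolve));
  isCompleted = false;

  // Buffer the event for late subscribers, then fan out to live ones.
  emit(event: string, data: unknown) {
    if (this.isCompleted) return;
    const entry = { event, data };
    this.buffer.push(entry);
    for (const subscriber of this.subscribers) subscriber(entry);
  }

  // Deliver a terminal event exactly once and resolve `done`.
  complete(event: SseStreamEvent) {
    if (this.isCompleted) return;
    this.buffer.push(event);
    for (const subscriber of this.subscribers) subscriber(event);
    this.isCompleted = true;
    this.resolveDone();
  }

  // Replay everything buffered so far, then receive live events.
  subscribe(subscriber: Subscriber) {
    for (const entry of this.buffer) subscriber(entry);
    this.subscribers.add(subscriber);
    return () => this.subscribers.delete(subscriber);
  }
}
```

Replaying the buffer before registering the subscriber is what makes `/stream/attach` reconnection work: a client that attaches mid-run still sees every delta from the beginning of the run.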
|
||||
@@ -1,6 +1,7 @@
|
||||
import assert from "node:assert/strict";
|
||||
import test from "node:test";
|
||||
import {
|
||||
runPlainChatCompletionsStream,
|
||||
runToolAwareChatCompletionsStream,
|
||||
runToolAwareOpenAIChatStream,
|
||||
type ToolAwareStreamingEvent,
|
||||
@@ -105,3 +106,37 @@ test("OpenAI-compatible Chat Completions stream emits text deltas as they arrive
|
||||
);
|
||||
assert.equal(events.at(-1)?.type === "done" ? events.at(-1)?.result.text : null, "Hello");
|
||||
});
|
||||
|
||||
test("plain Chat Completions stream does not send Sybil-managed tools", async () => {
|
||||
let requestBody: any = null;
|
||||
const client = {
|
||||
chat: {
|
||||
completions: {
|
||||
create: async (body: any) => {
|
||||
requestBody = body;
|
||||
return streamFrom([
|
||||
{ choices: [{ delta: { content: "Hi" } }] },
|
||||
{ choices: [{ delta: {}, finish_reason: "stop" }] },
|
||||
]);
|
||||
},
|
||||
},
|
||||
},
|
||||
};
|
||||
|
||||
const events = await collectEvents(
|
||||
runPlainChatCompletionsStream({
|
||||
client: client as any,
|
||||
model: "hermes-agent",
|
||||
messages: [{ role: "user", content: "Say hi" }],
|
||||
})
|
||||
);
|
||||
|
||||
assert.equal(requestBody.model, "hermes-agent");
|
||||
assert.equal(requestBody.stream, true);
|
||||
assert.equal("tools" in requestBody, false);
|
||||
assert.deepEqual(
|
||||
events.map((event) => event.type),
|
||||
["delta", "done"]
|
||||
);
|
||||
assert.equal(events.at(-1)?.type === "done" ? events.at(-1)?.result.text : null, "Hi");
|
||||
});
|
||||
|
||||
server/tests/provider-ids.test.ts (new file, 12 lines)
@@ -0,0 +1,12 @@
import assert from "node:assert/strict";
import test from "node:test";
import { fromPrismaProvider, serializeProviderFields, toPrismaProvider } from "../src/llm/provider-ids.js";

test("Hermes Agent provider id maps between API and Prisma enum forms", () => {
  assert.equal(toPrismaProvider("hermes-agent"), "hermes_agent");
  assert.equal(fromPrismaProvider("hermes_agent"), "hermes-agent");
  assert.deepEqual(serializeProviderFields({ initiatedProvider: "hermes_agent", lastUsedProvider: "xai" }), {
    initiatedProvider: "hermes-agent",
    lastUsedProvider: "xai",
  });
});
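A hypothetical sketch of the two mapping helpers this test exercises; the real `src/llm/provider-ids.ts` may differ in details. The motivation: Prisma enum members cannot contain `-`, so the API-facing id `hermes-agent` has to be stored as `hermes_agent` and mapped back on the way out (which is what `serializeProviderFields` does for the two provider columns):

```typescript
// Assumed shapes: the API uses kebab-case ids, the Prisma enum snake_case.
type ApiProvider = "openai" | "anthropic" | "xai" | "hermes-agent";
type PrismaProvider = "openai" | "anthropic" | "xai" | "hermes_agent";

// API id -> Prisma enum member ("hermes-agent" -> "hermes_agent").
function toPrismaProvider(provider: ApiProvider): PrismaProvider {
  return provider.replace(/-/g, "_") as PrismaProvider;
}

// Prisma enum member -> API id ("hermes_agent" -> "hermes-agent").
function fromPrismaProvider(provider: PrismaProvider): ApiProvider {
  return provider.replace(/_/g, "-") as ApiProvider;
}
```

The three existing ids contain neither character, so they map to themselves in both directions.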
@@ -23,7 +23,7 @@ Configuration is environment-only (no in-app settings).

 - `SYBIL_TUI_API_BASE_URL`: API base URL. Default: `http://127.0.0.1:8787`
 - `SYBIL_TUI_ADMIN_TOKEN`: optional bearer token for token-mode servers
-- `SYBIL_TUI_DEFAULT_PROVIDER`: `openai` | `anthropic` | `xai` (default: `openai`)
+- `SYBIL_TUI_DEFAULT_PROVIDER`: `openai` | `anthropic` | `xai` | `hermes-agent` (default: `openai`)
 - `SYBIL_TUI_DEFAULT_MODEL`: optional default model name
 - `SYBIL_TUI_SEARCH_NUM_RESULTS`: results per search run (default: `10`)
@@ -1,6 +1,6 @@
 import type { Provider } from "./types.js";

-const PROVIDERS: Provider[] = ["openai", "anthropic", "xai"];
+const PROVIDERS: Provider[] = ["openai", "anthropic", "xai", "hermes-agent"];

 function normalizeBaseUrl(value: string) {
   const trimmed = value.trim();
@@ -39,11 +39,13 @@ type ToolLogMetadata = {
   resultPreview?: string | null;
 };

-const PROVIDERS: Provider[] = ["openai", "anthropic", "xai"];
+const BASE_PROVIDERS: Provider[] = ["openai", "anthropic", "xai"];
+const PROVIDERS: Provider[] = [...BASE_PROVIDERS, "hermes-agent"];
 const PROVIDER_FALLBACK_MODELS: Record<Provider, string[]> = {
   openai: ["gpt-4.1-mini"],
   anthropic: ["claude-3-5-sonnet-latest"],
   xai: ["grok-3-mini"],
+  "hermes-agent": ["hermes-agent"],
 };

 const EMPTY_MODEL_CATALOG: ModelCatalogResponse["providers"] = {
@@ -74,6 +76,7 @@ function getProviderLabel(provider: Provider | null | undefined) {
   if (provider === "openai") return "OpenAI";
   if (provider === "anthropic") return "Anthropic";
   if (provider === "xai") return "xAI";
+  if (provider === "hermes-agent") return "Hermes Agent";
   return "";
 }

@@ -159,6 +162,10 @@ function getModelOptions(catalog: ModelCatalogResponse["providers"], provider: P
   return PROVIDER_FALLBACK_MODELS[provider];
 }

+function getVisibleProviders(catalog: ModelCatalogResponse["providers"]) {
+  return PROVIDERS.filter((provider) => provider !== "hermes-agent" || catalog[provider] !== undefined);
+}
+
 function pickProviderModel(options: string[], preferred: string | null, fallback: string | null = null) {
   if (fallback && options.includes(fallback)) return fallback;
   if (preferred && options.includes(preferred)) return preferred;
@@ -202,6 +209,7 @@ async function main() {
     openai: null,
     anthropic: null,
     xai: null,
+    "hermes-agent": null,
   };
   let model: string = config.defaultModel ?? pickProviderModel(getModelOptions(modelCatalog, provider), null);
   let errorMessage: string | null = null;
@@ -1257,8 +1265,10 @@ async function main() {
   }

   function cycleProvider() {
-    const currentIndex = PROVIDERS.indexOf(provider);
-    const nextProvider: Provider = PROVIDERS[(currentIndex + 1) % PROVIDERS.length] ?? "openai";
+    const visibleProviders = getVisibleProviders(modelCatalog);
+    const cycleProviders = visibleProviders.length ? visibleProviders : BASE_PROVIDERS;
+    const currentIndex = Math.max(0, cycleProviders.indexOf(provider));
+    const nextProvider: Provider = cycleProviders[(currentIndex + 1) % cycleProviders.length] ?? "openai";
     provider = nextProvider;
     syncModelForProvider();
     updateUI();
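The rewritten `cycleProvider` is a plain modulo rotation over the visible provider list, with `Math.max(0, indexOf(...))` guarding against a current value that is no longer visible. In isolation the rotation behaves like this sketch:

```typescript
// Standalone version of the rotation in cycleProvider: advance to the next
// entry, wrapping at the end; an unknown current value restarts at index 0,
// so the first advance lands on items[1].
function cycleNext<T>(items: readonly T[], current: T): T {
  const currentIndex = Math.max(0, items.indexOf(current));
  return items[(currentIndex + 1) % items.length]!;
}
```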
@@ -1,4 +1,4 @@
-export type Provider = "openai" | "anthropic" | "xai";
+export type Provider = "openai" | "anthropic" | "xai" | "hermes-agent";

 export type ProviderModelInfo = {
   models: string[];
@@ -7,7 +7,7 @@ export type ProviderModelInfo = {
 };

 export type ModelCatalogResponse = {
-  providers: Record<Provider, ProviderModelInfo>;
+  providers: Partial<Record<Provider, ProviderModelInfo>>;
 };

 export type ChatSummary = {
web/src/App.tsx (911 lines changed; diff suppressed because it is too large)
@@ -206,17 +206,31 @@ textarea {
 }

 .md-content code {
-  background: hsl(288 22% 23%);
-  border-radius: 0.25rem;
+  background: hsl(249 40% 10% / 0.78);
+  border-radius: 0.3rem;
   padding: 0.05rem 0.3rem;
   font-size: 0.86em;
+  box-decoration-break: clone;
+  -webkit-box-decoration-break: clone;
 }

 .md-content pre {
   overflow-x: auto;
-  border-radius: 0.5rem;
-  background: hsl(287 28% 13%);
-  padding: 0.6rem 0.75rem;
+  border: 1px solid hsl(253 31% 29% / 0.72);
+  border-radius: 0.625rem;
+  background: hsl(249 40% 10% / 0.82);
+  padding: 0.75rem;
+  box-shadow: inset 0 1px 0 hsl(258 80% 88% / 0.05);
 }

+.md-content pre code {
+  display: block;
+  background: transparent;
+  border-radius: 0;
+  padding: 0;
+  font-size: 0.88em;
+  line-height: 1.55;
+  white-space: pre;
+}
|
||||
|
||||
.md-content a {
|
||||
|
||||
```diff
@@ -127,7 +127,7 @@ export type CompletionRequestMessage = {
   attachments?: ChatAttachment[];
 };

-export type Provider = "openai" | "anthropic" | "xai";
+export type Provider = "openai" | "anthropic" | "xai" | "hermes-agent";

 export type ProviderModelInfo = {
   models: string[];
@@ -136,7 +136,12 @@ export type ProviderModelInfo = {
 };

 export type ModelCatalogResponse = {
-  providers: Record<Provider, ProviderModelInfo>;
+  providers: Partial<Record<Provider, ProviderModelInfo>>;
 };

+export type ActiveRunsResponse = {
+  chats: string[];
+  searches: string[];
+};
+
 type CompletionResponse = {
@@ -217,6 +222,10 @@ export async function listModels() {
   return api<ModelCatalogResponse>("/v1/models");
 }

+export async function getActiveRuns() {
+  return api<ActiveRunsResponse>("/v1/active-runs");
+}
+
 export async function createChat(input?: string | CreateChatRequest) {
   const body = typeof input === "string" ? { title: input } : input ?? {};
   const data = await api<{ chat: ChatSummary }>("/v1/chats", {
@@ -333,6 +342,85 @@ type RunSearchStreamHandlers = {
   onError?: (payload: { message: string }) => void;
 };

+async function readSseStream(response: Response, dispatch: (eventName: string, payload: any) => void) {
+  if (!response.ok) {
+    const fallback = `${response.status} ${response.statusText}`;
+    let message = fallback;
+    try {
+      const body = (await response.json()) as { message?: string };
+      if (body.message) message = body.message;
+    } catch {
+      // keep fallback message
+    }
+    throw new Error(message);
+  }
+
+  if (!response.body) {
+    throw new Error("No response stream");
+  }
+
+  const reader = response.body.getReader();
+  const decoder = new TextDecoder();
+  let buffer = "";
+  let eventName = "message";
+  let dataLines: string[] = [];
+
+  const flushEvent = () => {
+    if (!dataLines.length) {
+      eventName = "message";
+      return;
+    }
+
+    const dataText = dataLines.join("\n");
+    let payload: any = null;
+    try {
+      payload = JSON.parse(dataText);
+    } catch {
+      payload = { message: dataText };
+    }
+
+    dispatch(eventName, payload);
+
+    dataLines = [];
+    eventName = "message";
+  };
+
+  while (true) {
+    const { value, done } = await reader.read();
+    if (done) break;
+
+    buffer += decoder.decode(value, { stream: true });
+    let newlineIndex = buffer.indexOf("\n");
+
+    while (newlineIndex >= 0) {
+      const rawLine = buffer.slice(0, newlineIndex);
+      buffer = buffer.slice(newlineIndex + 1);
+      const line = rawLine.endsWith("\r") ? rawLine.slice(0, -1) : rawLine;
+
+      if (!line) {
+        flushEvent();
+      } else if (line.startsWith("event:")) {
+        eventName = line.slice("event:".length).trim();
+      } else if (line.startsWith("data:")) {
+        dataLines.push(line.slice("data:".length).trimStart());
+      }
+
+      newlineIndex = buffer.indexOf("\n");
+    }
+  }
+
+  buffer += decoder.decode();
+  if (buffer.length) {
+    const line = buffer.endsWith("\r") ? buffer.slice(0, -1) : buffer;
+    if (line.startsWith("event:")) {
+      eventName = line.slice("event:".length).trim();
+    } else if (line.startsWith("data:")) {
+      dataLines.push(line.slice("data:".length).trimStart());
+    }
+  }
+  flushEvent();
+}
+
 export async function runSearchStream(
   searchId: string,
   body: SearchRunRequest,
@@ -437,6 +525,30 @@ export async function runSearchStream(
   flushEvent();
 }

+export async function attachSearchStream(searchId: string, handlers: RunSearchStreamHandlers, options?: { signal?: AbortSignal }) {
+  const headers = new Headers({
+    Accept: "text/event-stream",
+  });
+  if (authToken) {
+    headers.set("Authorization", `Bearer ${authToken}`);
+  }
+
+  const response = await fetch(`${API_BASE_URL}/v1/searches/${searchId}/run/stream/attach`, {
+    method: "POST",
+    headers,
+    signal: options?.signal,
+  });
+
+  await readSseStream(response, (eventName, payload) => {
+    if (eventName === "search_results") handlers.onSearchResults?.(payload);
+    else if (eventName === "search_error") handlers.onSearchError?.(payload);
+    else if (eventName === "answer") handlers.onAnswer?.(payload);
+    else if (eventName === "answer_error") handlers.onAnswerError?.(payload);
+    else if (eventName === "done") handlers.onDone?.(payload);
+    else if (eventName === "error") handlers.onError?.(payload);
+  });
+}
+
 export async function runCompletion(body: {
   chatId: string;
   provider: Provider;
@@ -556,3 +668,26 @@ export async function runCompletionStream(
   }
   flushEvent();
 }
+
+export async function attachCompletionStream(chatId: string, handlers: CompletionStreamHandlers, options?: { signal?: AbortSignal }) {
+  const headers = new Headers({
+    Accept: "text/event-stream",
+  });
+  if (authToken) {
+    headers.set("Authorization", `Bearer ${authToken}`);
+  }
+
+  const response = await fetch(`${API_BASE_URL}/v1/chats/${chatId}/stream/attach`, {
+    method: "POST",
+    headers,
+    signal: options?.signal,
+  });
+
+  await readSseStream(response, (eventName, payload) => {
+    if (eventName === "meta") handlers.onMeta?.(payload);
+    else if (eventName === "tool_call") handlers.onToolCall?.(payload);
+    else if (eventName === "delta") handlers.onDelta?.(payload);
+    else if (eventName === "done") handlers.onDone?.(payload);
+    else if (eventName === "error") handlers.onError?.(payload);
+  });
+}
```
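The `readSseStream` helper added above implements a minimal SSE parser: it buffers decoded bytes, splits on newlines, lets `event:` set the event name, accumulates `data:` lines, and dispatches one event per blank line, falling back to `{ message }` when the data is not JSON. The same field-parsing rules can be exercised on a plain string in isolation (`parseSseText` is a test-only sketch, not part of the diff):

```typescript
type SseEvent = { event: string; payload: any };

// Test-only sketch mirroring readSseStream's parsing rules:
// "event:" sets the name, "data:" lines accumulate (joined with \n),
// a blank line flushes one event, non-JSON data becomes { message }.
function parseSseText(text: string): SseEvent[] {
  const events: SseEvent[] = [];
  let eventName = "message";
  let dataLines: string[] = [];

  const flush = () => {
    if (dataLines.length) {
      const dataText = dataLines.join("\n");
      let payload: any;
      try {
        payload = JSON.parse(dataText);
      } catch {
        payload = { message: dataText };
      }
      events.push({ event: eventName, payload });
    }
    eventName = "message";
    dataLines = [];
  };

  for (const rawLine of text.split("\n")) {
    const line = rawLine.endsWith("\r") ? rawLine.slice(0, -1) : rawLine;
    if (!line) flush();
    else if (line.startsWith("event:")) eventName = line.slice("event:".length).trim();
    else if (line.startsWith("data:")) dataLines.push(line.slice("data:".length).trimStart());
  }
  flush();
  return events;
}
```

Note that, as in the diff, an event with no `data:` lines is dropped rather than dispatched, and a missing `event:` field defaults the name to `"message"`.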