8 Commits

40 changed files with 2128 additions and 599 deletions

@@ -12,6 +12,9 @@ services:
OPENAI_API_KEY: ${OPENAI_API_KEY:-}
ANTHROPIC_API_KEY: ${ANTHROPIC_API_KEY:-}
XAI_API_KEY: ${XAI_API_KEY:-}
HERMES_AGENT_API_BASE_URL: ${HERMES_AGENT_API_BASE_URL:-http://127.0.0.1:8642/v1}
HERMES_AGENT_API_KEY: ${HERMES_AGENT_API_KEY:-}
HERMES_AGENT_MODEL: ${HERMES_AGENT_MODEL:-}
EXA_API_KEY: ${EXA_API_KEY:-}
CHAT_WEB_SEARCH_ENGINE: ${CHAT_WEB_SEARCH_ENGINE:-exa}
SEARXNG_BASE_URL: ${SEARXNG_BASE_URL:-}

@@ -33,11 +33,13 @@ Chat upload limits:
"providers": {
"openai": { "models": ["gpt-4.1-mini"], "loadedAt": "2026-02-14T00:00:00.000Z", "error": null },
"anthropic": { "models": ["claude-3-5-sonnet-latest"], "loadedAt": null, "error": null },
"xai": { "models": ["grok-3-mini"], "loadedAt": null, "error": null }
"xai": { "models": ["grok-3-mini"], "loadedAt": null, "error": null },
"hermes-agent": { "models": ["hermes-agent"], "loadedAt": null, "error": null }
}
}
```
- OpenAI model lists are filtered to models that are expected to work with the backend's Responses API implementation.
- `hermes-agent` is included only when `HERMES_AGENT_API_KEY` is configured. Set it to the Hermes `API_SERVER_KEY`, or to any non-empty value if the local server does not require auth. `HERMES_AGENT_API_BASE_URL` defaults to `http://127.0.0.1:8642/v1`; set `HERMES_AGENT_MODEL` only when you need an additional fallback/override model id.
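The gating rule above can be sketched as a small resolver. This is an illustrative Swift sketch, not the backend's actual code: the `HermesAgentConfig` type and `resolve` function are hypothetical, while the env var names and the default base URL come from the documentation.

```swift
// Hypothetical sketch of the documented gating: hermes-agent is advertised
// only when HERMES_AGENT_API_KEY is non-empty, and the base URL falls back
// to the documented default.
struct HermesAgentConfig {
    let baseURL: String
    let apiKey: String

    /// Returns nil when the provider should be hidden (no API key configured).
    static func resolve(env: [String: String]) -> HermesAgentConfig? {
        guard let key = env["HERMES_AGENT_API_KEY"], !key.isEmpty else {
            return nil
        }
        let base = env["HERMES_AGENT_API_BASE_URL"] ?? "http://127.0.0.1:8642/v1"
        return HermesAgentConfig(baseURL: base, apiKey: key)
    }
}
```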
## Active Runs
@@ -65,7 +67,7 @@ Behavior notes:
```json
{
"title": "optional title",
"provider": "optional openai|anthropic|xai",
"provider": "optional openai|anthropic|xai|hermes-agent",
"model": "optional model id",
"messages": [
{
@@ -152,7 +154,7 @@ Notes:
```json
{
"chatId": "optional-chat-id",
"provider": "openai|anthropic|xai",
"provider": "openai|anthropic|xai|hermes-agent",
"model": "string",
"messages": [
{
@@ -206,11 +208,12 @@ Behavior notes:
- Text files are forwarded as explicit text blocks rather than provider-managed file references. Large text attachments should already be truncated client-side before submission.
- For `openai`, backend calls OpenAI's Responses API and enables internal tool use with an internal system instruction.
- For `xai`, backend calls xAI's OpenAI-compatible Chat Completions API and enables internal tool use with the same internal system instruction.
- For `hermes-agent`, backend calls the configured Hermes Agent OpenAI-compatible Chat Completions API without adding Sybil-managed tool definitions; Hermes Agent handles its own tools server-side.
- For `openai`, image attachments are sent as Responses `input_image` items and text attachments are sent as `input_text` items.
- For `xai`, image attachments are sent as Chat Completions content parts alongside text.
- For `xai` and `hermes-agent`, image attachments are sent as Chat Completions content parts alongside text.
- For `openai`, Responses calls that can enter the server-managed tool loop use `store: true` so reasoning and function-call items can be passed between tool rounds.
- For `anthropic`, image attachments are sent as Messages API `image` blocks using base64 source data; text attachments are added as `text` blocks.
- Available tool calls for chat: `web_search` and `fetch_url`. When `CHAT_CODEX_TOOL_ENABLED=true`, `codex_exec` is also available. When `CHAT_SHELL_TOOL_ENABLED=true`, `shell_exec` is also available.
- Available Sybil-managed tool calls for `openai` and `xai`: `web_search` and `fetch_url`. When `CHAT_CODEX_TOOL_ENABLED=true`, `codex_exec` is also available. When `CHAT_SHELL_TOOL_ENABLED=true`, `shell_exec` is also available.
- `web_search` returns ranked results with per-result summaries/snippets. Its backend engine is selected by `CHAT_WEB_SEARCH_ENGINE` (`exa` default, or `searxng` with `SEARXNG_BASE_URL` set). SearXNG mode requires the instance to allow `format=json`.
- `fetch_url` fetches a URL and returns plaintext page content (HTML converted to text server-side).
- `codex_exec` delegates coding, shell, repository inspection, and other complex software tasks to a persistent remote Codex CLI workspace over SSH. The server runs `codex exec --dangerously-bypass-approvals-and-sandbox --skip-git-repo-check <non-interactive wrapped prompt>` on the configured devbox inside `CHAT_CODEX_REMOTE_WORKDIR`, with SSH stdin closed.
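The tool availability rules above reduce to a simple flag check. A minimal Swift sketch, assuming a hypothetical helper function; the tool names and env flags mirror the documentation, but this is not the backend implementation:

```swift
// Illustrative gating for openai/xai chat calls: web_search and fetch_url are
// always available; codex_exec and shell_exec are opt-in via env flags.
func availableChatTools(env: [String: String]) -> [String] {
    var tools = ["web_search", "fetch_url"]
    if env["CHAT_CODEX_TOOL_ENABLED"] == "true" {
        tools.append("codex_exec")
    }
    if env["CHAT_SHELL_TOOL_ENABLED"] == "true" {
        tools.append("shell_exec")
    }
    return tools
}
```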
@@ -311,9 +314,9 @@ Behavior notes:
"title": null,
"createdAt": "...",
"updatedAt": "...",
"initiatedProvider": "openai|anthropic|xai|null",
"initiatedProvider": "openai|anthropic|xai|hermes-agent|null",
"initiatedModel": "string|null",
"lastUsedProvider": "openai|anthropic|xai|null",
"lastUsedProvider": "openai|anthropic|xai|hermes-agent|null",
"lastUsedModel": "string|null"
}
```
@@ -359,9 +362,9 @@ Behavior notes:
"title": null,
"createdAt": "...",
"updatedAt": "...",
"initiatedProvider": "openai|anthropic|xai|null",
"initiatedProvider": "openai|anthropic|xai|hermes-agent|null",
"initiatedModel": "string|null",
"lastUsedProvider": "openai|anthropic|xai|null",
"lastUsedProvider": "openai|anthropic|xai|hermes-agent|null",
"lastUsedModel": "string|null",
"messages": [Message]
}

@@ -21,7 +21,7 @@ Authentication:
{
"chatId": "optional-chat-id",
"persist": true,
"provider": "openai|anthropic|xai",
"provider": "openai|anthropic|xai|hermes-agent",
"model": "string",
"messages": [
{
@@ -152,8 +152,9 @@ For `persist: false` streams, `chatId` and `callId` are `null`.
- `openai`: backend uses OpenAI's Responses API and may execute internal function tool calls (`web_search`, `fetch_url`, optional `codex_exec`, and optional `shell_exec`) before producing final text.
- `xai`: backend uses xAI's OpenAI-compatible Chat Completions API and may execute the same internal tool calls before producing final text.
- `hermes-agent`: backend uses the configured Hermes Agent OpenAI-compatible Chat Completions API. Sybil does not add its own tool definitions for this provider; Hermes Agent handles its own tools server-side. Custom Hermes stream events are normalized away unless they produce text deltas in this SSE contract.
- `openai`: image attachments are sent as Responses `input_image` items; text attachments are sent as `input_text` items.
- `xai`: image attachments are sent as Chat Completions content parts; text attachments are inlined as text parts.
- `xai` and `hermes-agent`: image attachments are sent as Chat Completions content parts; text attachments are inlined as text parts.
- `openai`: Responses calls that can enter the server-managed tool loop use `store: true` so reasoning and function-call items can be passed between tool rounds.
- `anthropic`: streamed via event stream; emits `delta` from `content_block_delta` with `text_delta`. Image attachments are sent as base64 `image` blocks and text attachments are appended as `text` blocks.
- `web_search` uses `CHAT_WEB_SEARCH_ENGINE` (`exa` default, or `searxng` with `SEARXNG_BASE_URL` set). SearXNG mode requires the instance to allow `format=json`. This only affects chat-mode tool calls, not search-mode endpoints.
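The hermes-agent normalization described above (custom stream events dropped unless they produce text deltas) can be sketched as a filter. The `UpstreamEvent` type is hypothetical, introduced only to illustrate the documented behavior:

```swift
// Hedged sketch: provider-specific stream events collapse into the shared SSE
// contract, and anything that does not carry a text delta is dropped.
enum UpstreamEvent {
    case textDelta(String)
    case custom(name: String)   // e.g. a Hermes tool-progress event
}

/// Keeps only the text deltas that the SSE contract forwards as `delta` events.
func normalize(_ events: [UpstreamEvent]) -> [String] {
    events.compactMap { event in
        if case let .textDelta(text) = event { return text }
        return nil
    }
}
```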

@@ -51,3 +51,4 @@ Instructions for work under `/Users/buzzert/src/sybil-2/ios`.
- OpenAI: `gpt-4.1-mini`
- Anthropic: `claude-3-5-sonnet-latest`
- xAI: `grok-3-mini`
- Hermes Agent: `hermes-agent`
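The default models listed above map one-to-one onto the app's providers. A sketch of that mapping; the enum mirrors the `Provider` type shown later in this diff, and the defaults come straight from the list, but the `defaultModel` function itself is illustrative:

```swift
// Default model id per provider, as documented above.
enum Provider: String {
    case openai, anthropic, xai
    case hermesAgent = "hermes-agent"
}

func defaultModel(for provider: Provider) -> String {
    switch provider {
    case .openai: return "gpt-4.1-mini"
    case .anthropic: return "claude-3-5-sonnet-latest"
    case .xai: return "grok-3-mini"
    case .hermesAgent: return "hermes-agent"
    }
}
```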

ios/Apps/Sybil/Info.plist (new file)
@@ -0,0 +1,17 @@
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
<key>UIApplicationShortcutItems</key>
<array>
<dict>
<key>UIApplicationShortcutItemType</key>
<string>net.buzzert.sybil2.quick-question</string>
<key>UIApplicationShortcutItemTitle</key>
<string>Quick question</string>
<key>UIApplicationShortcutItemIconSymbolName</key>
<string>sparkles</string>
</dict>
</array>
</dict>
</plist>

@@ -5,6 +5,8 @@ import UIKit
@main
struct SybilApp: App
{
@UIApplicationDelegateAdaptor(SybilAppDelegate.self) private var appDelegate
var body: some Scene {
WindowGroup {
SplitView()
@@ -14,3 +16,79 @@ struct SybilApp: App
}
}
}
@MainActor
final class SybilAppDelegate: NSObject, UIApplicationDelegate {
func application(
_ application: UIApplication,
didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil
) -> Bool {
SybilHomeScreenQuickActionHandler.configureQuickActions()
return true
}
func application(
_ application: UIApplication,
configurationForConnecting connectingSceneSession: UISceneSession,
options: UIScene.ConnectionOptions
) -> UISceneConfiguration {
let configuration = UISceneConfiguration(
name: "Default Configuration",
sessionRole: connectingSceneSession.role
)
configuration.delegateClass = SybilSceneDelegate.self
return configuration
}
func application(
_ application: UIApplication,
performActionFor shortcutItem: UIApplicationShortcutItem,
completionHandler: @escaping (Bool) -> Void
) {
completionHandler(SybilHomeScreenQuickActionHandler.handle(shortcutItem))
}
}
@MainActor
final class SybilSceneDelegate: NSObject, UIWindowSceneDelegate {
func scene(
_ scene: UIScene,
willConnectTo session: UISceneSession,
options connectionOptions: UIScene.ConnectionOptions
) {
if let shortcutItem = connectionOptions.shortcutItem {
_ = SybilHomeScreenQuickActionHandler.handle(shortcutItem)
}
}
func windowScene(
_ windowScene: UIWindowScene,
performActionFor shortcutItem: UIApplicationShortcutItem,
completionHandler: @escaping (Bool) -> Void
) {
completionHandler(SybilHomeScreenQuickActionHandler.handle(shortcutItem))
}
func sceneWillResignActive(_ scene: UIScene) {
SybilHomeScreenQuickActionHandler.configureQuickActions()
}
}
@MainActor
private enum SybilHomeScreenQuickActionHandler {
static func configureQuickActions() {
// The quick question action is static in Info.plist so it is available before first launch.
UIApplication.shared.shortcutItems = []
}
static func handle(_ shortcutItem: UIApplicationShortcutItem) -> Bool {
guard shortcutItem.type == SybilHomeScreenQuickAction.quickQuestionType else {
return false
}
Task { @MainActor in
SybilQuickActionRouter.shared.requestQuickQuestionPresentation()
}
return true
}
}

@@ -22,9 +22,10 @@ targets:
SUPPORTS_MAC_DESIGNED_FOR_IPHONE_IPAD: NO
TARGETED_DEVICE_FAMILY: "1,2,6"
GENERATE_INFOPLIST_FILE: YES
INFOPLIST_FILE: Apps/Sybil/Info.plist
ASSETCATALOG_COMPILER_APPICON_NAME: AppIcon
MARKETING_VERSION: 1.5
CURRENT_PROJECT_VERSION: 6
MARKETING_VERSION: 1.7
CURRENT_PROJECT_VERSION: 8
INFOPLIST_KEY_CFBundleDisplayName: Sybil
INFOPLIST_KEY_ITSAppUsesNonExemptEncryption: NO
INFOPLIST_KEY_UIApplicationSupportsIndirectInputEvents: YES

@@ -2,10 +2,14 @@ import SwiftUI
public struct SplitView: View {
@State private var viewModel = SybilViewModel()
@ObservedObject private var quickActionRouter = SybilQuickActionRouter.shared
@Environment(\.horizontalSizeClass) private var horizontalSizeClass
@Environment(\.scenePhase) private var scenePhase
@State private var shouldRefreshOnForeground = false
@State private var composerFocusRequest = 0
@State private var quickQuestionFocusRequest = 0
@State private var hasPendingQuickQuestionPresentation = false
@State private var isQuickQuestionPresented = false
@State private var columnVisibility: NavigationSplitViewVisibility = .automatic
private var keyboardActions: SybilKeyboardActions? {
@@ -74,8 +78,28 @@ public struct SplitView: View {
.font(.sybil(.body))
.preferredColorScheme(.dark)
.focusedSceneValue(\.sybilKeyboardActions, keyboardActions)
.sheet(isPresented: $isQuickQuestionPresented, onDismiss: handleQuickQuestionDismissed) {
SybilQuickQuestionView(
viewModel: viewModel,
focusRequest: quickQuestionFocusRequest
)
.presentationDragIndicator(.visible)
}
.task {
await viewModel.bootstrap()
presentPendingQuickQuestionIfPossible()
}
.onReceive(quickActionRouter.$quickQuestionPresentationRequest) { request in
guard request > 0 else {
return
}
queueQuickQuestionPresentation()
}
.onChange(of: viewModel.isCheckingSession) { _, _ in
presentPendingQuickQuestionIfPossible()
}
.onChange(of: viewModel.isAuthenticated) { _, _ in
presentPendingQuickQuestionIfPossible()
}
.onChange(of: scenePhase) { _, nextPhase in
switch nextPhase {
@@ -112,6 +136,28 @@ public struct SplitView: View {
columnVisibility = .all
}
}
private func queueQuickQuestionPresentation() {
hasPendingQuickQuestionPresentation = true
presentPendingQuickQuestionIfPossible()
}
private func presentPendingQuickQuestionIfPossible() {
guard hasPendingQuickQuestionPresentation,
!viewModel.isCheckingSession,
viewModel.isAuthenticated
else {
return
}
hasPendingQuickQuestionPresentation = false
quickQuestionFocusRequest += 1
isQuickQuestionPresented = true
}
private func handleQuickQuestionDismissed() {
viewModel.cancelQuickQuestion()
}
}
public struct SybilCommands: Commands {

@@ -49,11 +49,16 @@ actor SybilAPIClient: SybilAPIClienting {
return response.chats
}
func createChat(title: String? = nil) async throws -> ChatSummary {
func createChat(
title: String? = nil,
provider: Provider? = nil,
model: String? = nil,
messages: [CompletionRequestMessage]? = nil
) async throws -> ChatSummary {
let response = try await request(
"/v1/chats",
method: "POST",
body: AnyEncodable(ChatCreateBody(title: title)),
body: AnyEncodable(ChatCreateBody(title: title, provider: provider, model: model, messages: messages)),
responseType: ChatCreateResponse.self
)
return response.chat
@@ -617,6 +622,7 @@ actor SybilAPIClient: SybilAPIClienting {
struct CompletionStreamRequest: Codable, Sendable {
var chatId: String?
var persist: Bool? = nil
var provider: Provider
var model: String
var messages: [CompletionRequestMessage]
@@ -624,6 +630,9 @@ struct CompletionStreamRequest: Codable, Sendable {
private struct ChatCreateBody: Encodable {
var title: String?
var provider: Provider?
var model: String?
var messages: [CompletionRequestMessage]?
}
private struct SearchCreateBody: Encodable {

@@ -3,7 +3,12 @@ import Foundation
protocol SybilAPIClienting: Sendable {
func verifySession() async throws -> AuthSession
func listChats() async throws -> [ChatSummary]
func createChat(title: String?) async throws -> ChatSummary
func createChat(
title: String?,
provider: Provider?,
model: String?,
messages: [CompletionRequestMessage]?
) async throws -> ChatSummary
func getChat(chatID: String) async throws -> ChatDetail
func deleteChat(chatID: String) async throws
func suggestChatTitle(chatID: String, content: String) async throws -> ChatSummary
@@ -32,3 +37,9 @@ protocol SybilAPIClienting: Sendable {
onEvent: @escaping @Sendable (SearchStreamEvent) async -> Void
) async throws
}
extension SybilAPIClienting {
func createChat(title: String?) async throws -> ChatSummary {
try await createChat(title: title, provider: nil, model: nil, messages: nil)
}
}

@@ -7,6 +7,9 @@ struct SybilChatTranscriptView: View {
var isSending: Bool
var topContentInset: CGFloat = 0
var bottomContentInset: CGFloat = 0
var tailSpacerHeight: CGFloat = 0
var onViewportHeightChange: ((CGFloat) -> Void)? = nil
var onPendingAssistantHeightChange: ((CGFloat) -> Void)? = nil
private var hasPendingAssistant: Bool {
messages.contains { message in
@@ -17,21 +20,19 @@ struct SybilChatTranscriptView: View {
var body: some View {
ScrollView {
LazyVStack(alignment: .leading, spacing: 26) {
if isSending && !hasPendingAssistant {
HStack(spacing: 8) {
ProgressView()
.controlSize(.small)
.tint(SybilTheme.textMuted)
Text("Assistant is typing…")
.font(.sybil(.footnote))
.foregroundStyle(SybilTheme.textMuted)
}
.scaleEffect(x: 1, y: -1)
}
ForEach(messages.reversed()) { message in
MessageBubble(message: message, isSending: isSending)
.frame(maxWidth: .infinity)
.background {
if isStreamingPendingAssistant(message) {
GeometryReader { proxy in
Color.clear.preference(
key: SybilPendingAssistantHeightPreferenceKey.self,
value: proxy.size.height
)
}
}
}
.scaleEffect(x: 1, y: -1)
}
@@ -45,13 +46,39 @@ struct SybilChatTranscriptView: View {
}
.frame(maxWidth: .infinity, alignment: .leading)
.padding(.horizontal, 14)
.padding(.top, 18 + bottomContentInset)
.padding(.top, 18 + bottomContentInset + tailSpacerHeight)
.padding(.bottom, 18 + topContentInset)
}
.frame(maxWidth: .infinity, alignment: .leading)
.scrollDismissesKeyboard(.interactively)
.background {
GeometryReader { proxy in
Color.clear
.onAppear {
onViewportHeightChange?(proxy.size.height)
}
.onChange(of: proxy.size.height) { _, height in
onViewportHeightChange?(height)
}
}
}
.onPreferenceChange(SybilPendingAssistantHeightPreferenceKey.self) { height in
onPendingAssistantHeightChange?(height)
}
.scaleEffect(x: 1, y: -1)
}
private func isStreamingPendingAssistant(_ message: Message) -> Bool {
isSending && message.id.hasPrefix("temp-assistant-")
}
}
private struct SybilPendingAssistantHeightPreferenceKey: PreferenceKey {
static let defaultValue: CGFloat = 0
static func reduce(value: inout CGFloat, nextValue: () -> CGFloat) {
value = max(value, nextValue())
}
}
private struct MessageBubble: View {

@@ -4,12 +4,14 @@ public enum Provider: String, Codable, CaseIterable, Hashable, Sendable {
case openai
case anthropic
case xai
case hermesAgent = "hermes-agent"
public var displayName: String {
switch self {
case .openai: return "OpenAI"
case .anthropic: return "Anthropic"
case .xai: return "xAI"
case .hermesAgent: return "Hermes Agent"
}
}
}
@@ -404,8 +406,8 @@ public struct CompletionRequestMessage: Codable, Sendable {
}
public struct CompletionStreamMeta: Codable, Sendable {
public var chatId: String
public var callId: String
public var chatId: String?
public var callId: String?
public var provider: Provider
public var model: String
}

@@ -22,75 +22,60 @@ enum PhoneRoute: Hashable {
struct SybilPhoneShellView: View {
@Bindable var viewModel: SybilViewModel
@State private var path: [PhoneRoute] = []
@State private var route: PhoneRoute = .draftChat
@Environment(\.scenePhase) private var scenePhase
@State private var shouldRefreshOnForeground = false
@State private var composerFocusRequest = 0
@State private var phoneStackWidth: CGFloat = BackSwipeMetrics.referenceWidth
@State private var backSwipeOffset: CGFloat = 0
@State private var backSwipeCompletionOffset: CGFloat = 0
@State private var backSwipeIsActive = false
@State private var backSwipeIsCompleting = false
@State private var backSwipeHasLatched = false
@State private var isSidebarOverlayPresented = false
@State private var sidebarSwipeOffset: CGFloat = 0
@State private var sidebarSwipeIsActive = false
@State private var sidebarSwipeIsCompleting = false
@State private var sidebarSwipeHasLatched = false
@State private var sidebarHighlightSelection: SidebarSelection?
@State private var sidebarHighlightClearTask: Task<Void, Never>?
@State private var openingSelectionRequestID: UUID?
private var canRecognizeBackSwipe: Bool {
!path.isEmpty && !backSwipeIsCompleting
private var canRecognizeSidebarSwipe: Bool {
!isSidebarOverlayPresented && !sidebarSwipeIsCompleting
}
private var backSwipeVisualOffset: CGFloat {
backSwipeOffset + backSwipeCompletionOffset
private var sidebarOverlayProgress: CGFloat {
if isSidebarOverlayPresented {
return 1
}
return SidebarOverlaySwipeMetrics.progress(
for: sidebarSwipeOffset,
width: phoneStackWidth
)
}
private var shouldRenderSidebarOverlay: Bool {
isSidebarOverlayPresented ||
sidebarSwipeIsActive ||
sidebarSwipeIsCompleting ||
sidebarOverlayProgress > 0.001
}
private var currentRouteSelection: SidebarSelection? {
switch route {
case let .chat(chatID):
return .chat(chatID)
case let .search(searchID):
return .search(searchID)
case .draftChat, .draftSearch, .settings:
return nil
}
}
private var highlightedSidebarSelection: SidebarSelection? {
sidebarHighlightSelection ?? currentRouteSelection
}
var body: some View {
GeometryReader { proxy in
ZStack(alignment: .topLeading) {
SybilPhoneSidebarRoot(viewModel: viewModel, path: $path)
.safeAreaInset(edge: .top, spacing: 0) {
phoneRootTopBar
}
.zIndex(0)
if let route = path.last {
SybilPhoneDestinationView(
viewModel: viewModel,
composerFocusRequest: $composerFocusRequest,
route: route,
onRequestBack: requestBack,
onRequestNewChat: startNewChatFromDestination
)
.background(SybilTheme.background)
.offset(x: backSwipeVisualOffset)
.shadow(
color: backSwipeVisualOffset > 0 ? Color.black.opacity(0.34) : Color.clear,
radius: backSwipeVisualOffset > 0 ? 18 : 0,
x: -8,
y: 0
)
.transition(.move(edge: .trailing))
.zIndex(1)
.background {
WorkspaceSwipePanInstaller(
direction: .right,
isEnabled: canRecognizeBackSwipe,
onBegan: { width in
beginBackSwipe(containerWidth: width)
},
onChanged: { translationX, width in
updateBackSwipe(with: translationX, containerWidth: width)
},
onEnded: { translationX, width, velocityX, didFinish in
finishBackSwipe(
translationX: translationX,
containerWidth: width,
velocityX: velocityX,
didFinish: didFinish
)
}
)
.frame(maxWidth: .infinity, maxHeight: .infinity)
}
}
}
phoneStack(width: proxy.size.width)
.frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
.onAppear {
updatePhoneStackWidth(proxy.size.width)
@@ -100,13 +85,8 @@ struct SybilPhoneShellView: View {
}
}
.tint(SybilTheme.primary)
.animation(.easeOut(duration: 0.22), value: path.last)
.onChange(of: path) { _, nextPath in
guard nextPath.isEmpty else {
return
}
resetBackSwipe(animated: false)
}
.animation(.easeOut(duration: 0.22), value: route)
.animation(.easeOut(duration: 0.18), value: isSidebarOverlayPresented)
.onChange(of: scenePhase) { _, nextPhase in
switch nextPhase {
case .background:
@@ -120,8 +100,8 @@ struct SybilPhoneShellView: View {
shouldRefreshOnForeground = false
Task {
await viewModel.refreshAfterAppBecameActive(
refreshCollections: path.isEmpty,
refreshSelection: !path.isEmpty && viewModel.hasRefreshableSelection
refreshCollections: isSidebarOverlayPresented,
refreshSelection: !isSidebarOverlayPresented && viewModel.hasRefreshableSelection
)
}
case .inactive:
@@ -133,16 +113,117 @@ struct SybilPhoneShellView: View {
}
}
private var phoneRootTopBar: some View {
HStack {
private func phoneStack(width: CGFloat) -> some View {
ZStack(alignment: .topLeading) {
phoneWorkspaceLayer
.zIndex(0)
phoneSidebarOverlayLayer(width: width)
.zIndex(1)
}
}
private var phoneWorkspaceLayer: some View {
SybilPhoneDestinationView(
viewModel: viewModel,
composerFocusRequest: $composerFocusRequest,
route: route,
onRequestBack: { _ in showSidebarOverlay() },
onRequestNewChat: sidebarWorkspaceNewChatAction,
onShowSidebar: showSidebarOverlay
)
.background(SybilTheme.background)
.blur(radius: SidebarOverlaySwipeMetrics.workspaceBlurRadius(for: sidebarOverlayProgress))
.opacity(SidebarOverlaySwipeMetrics.workspaceOpacity(for: sidebarOverlayProgress))
.allowsHitTesting(!shouldRenderSidebarOverlay)
.background {
sidebarSwipeInstaller
}
}
private func phoneSidebarOverlayLayer(width: CGFloat) -> some View {
VStack(spacing: 0) {
phoneOverlayTopBar
SybilPhoneSidebarRoot(
viewModel: viewModel,
highlightedSelection: highlightedSidebarSelection,
onSelect: openSidebarSelection,
onRoute: showRouteAndClearSidebarHighlight
)
}
.opacity(sidebarOverlayProgress)
.blur(radius: SidebarOverlaySwipeMetrics.overlayBlurRadius(for: sidebarOverlayProgress))
.offset(x: SidebarOverlaySwipeMetrics.overlayOffset(for: sidebarOverlayProgress, width: width))
.allowsHitTesting(isSidebarOverlayPresented)
.accessibilityHidden(!isSidebarOverlayPresented)
}
private var sidebarSwipeInstaller: some View {
WorkspaceSwipePanInstaller(
direction: .right,
isEnabled: canRecognizeSidebarSwipe,
onBegan: { width in
beginSidebarSwipe(containerWidth: width)
},
onChanged: { translationX, width in
updateSidebarSwipe(with: translationX, containerWidth: width)
},
onEnded: { translationX, width, velocityX, didFinish in
finishSidebarSwipe(
translationX: translationX,
containerWidth: width,
velocityX: velocityX,
didFinish: didFinish
)
}
)
.frame(maxWidth: .infinity, maxHeight: .infinity)
}
private var sidebarWorkspaceNewChatAction: (() -> Void)? {
guard !isSidebarOverlayPresented else {
return nil
}
return {
startNewChatFromDestination()
}
}
private var phoneOverlayTopBar: some View {
HStack(spacing: 12) {
SybilWordmark(size: 21)
Spacer()
Button {
hideSidebarOverlay()
} label: {
Image(systemName: "chevron.right.2")
.font(.system(size: 21, weight: .bold))
.foregroundStyle(SybilTheme.text)
.frame(width: 54, height: 54)
.background(
Circle()
.fill(.ultraThinMaterial)
.overlay(
Circle()
.fill(SybilTheme.surface.opacity(0.76))
)
)
.overlay(
Circle()
.stroke(SybilTheme.border.opacity(0.64), lineWidth: 1)
)
}
.buttonStyle(.plain)
.accessibilityLabel("Hide conversations")
}
.padding(.horizontal, 16)
.padding(.top, 10)
.padding(.bottom, 12)
.background {
SybilTheme.panelGradient
SybilPhoneOverlayBlurBand(edge: .top)
.ignoresSafeArea(edges: .top)
}
}
@@ -151,62 +232,98 @@ struct SybilPhoneShellView: View {
phoneStackWidth = max(width, 1)
}
private func requestBack(animateNavigation: Bool = true) {
guard !path.isEmpty, !backSwipeIsCompleting else {
return
}
if animateNavigation {
Task {
await completeBackSwipe(containerWidth: phoneStackWidth, releaseVelocityX: 0)
}
} else {
popRoute(disablesAnimations: true)
resetBackSwipe(animated: false)
}
}
private func startNewChatFromDestination() {
viewModel.startNewChat()
composerFocusRequest += 1
replaceTopRoute(with: .draftChat)
showRoute(.draftChat)
}
private func replaceTopRoute(with route: PhoneRoute) {
if path.isEmpty {
private func showRoute(_ nextRoute: PhoneRoute) {
let update = {
route = nextRoute
}
if isSidebarOverlayPresented {
withAnimation(.easeOut(duration: 0.22)) {
path = [route]
update()
isSidebarOverlayPresented = false
}
} else {
path[path.index(before: path.endIndex)] = route
update()
}
resetSidebarSwipe(animated: false)
}
private func popRoute(disablesAnimations: Bool) {
let pop = {
guard !path.isEmpty else {
private func showRouteAndClearSidebarHighlight(_ nextRoute: PhoneRoute) {
showRoute(nextRoute)
clearSidebarHighlight()
}
private func showSidebarOverlay() {
withAnimation(.easeOut(duration: 0.18)) {
isSidebarOverlayPresented = true
}
resetSidebarSwipe(animated: false)
}
private func hideSidebarOverlay() {
withAnimation(.easeOut(duration: 0.18)) {
isSidebarOverlayPresented = false
}
resetSidebarSwipe(animated: false)
}
private func openSidebarSelection(_ selection: SidebarSelection) {
if openingSelectionRequestID != nil, sidebarHighlightSelection == selection {
return
}
let requestID = UUID()
openingSelectionRequestID = requestID
setSidebarHighlight(selection)
Task {
await viewModel.selectForNavigation(selection)
guard openingSelectionRequestID == requestID else {
return
}
_ = path.removeLast()
}
if disablesAnimations {
var transaction = Transaction()
transaction.disablesAnimations = true
withTransaction(transaction) {
pop()
}
} else {
withAnimation(.easeOut(duration: 0.22)) {
pop()
}
showRoute(PhoneRoute.from(selection: selection))
openingSelectionRequestID = nil
clearSidebarHighlight(selection, after: .milliseconds(260))
}
}
private func beginBackSwipe(containerWidth: CGFloat) {
private func setSidebarHighlight(_ selection: SidebarSelection) {
sidebarHighlightClearTask?.cancel()
sidebarHighlightSelection = selection
}
private func clearSidebarHighlight(_ selection: SidebarSelection, after delay: Duration) {
sidebarHighlightClearTask?.cancel()
sidebarHighlightClearTask = Task { @MainActor in
try? await Task.sleep(for: delay)
guard !Task.isCancelled,
sidebarHighlightSelection == selection,
openingSelectionRequestID == nil else {
return
}
sidebarHighlightSelection = nil
}
}
private func clearSidebarHighlight() {
sidebarHighlightClearTask?.cancel()
openingSelectionRequestID = nil
sidebarHighlightSelection = nil
}
private func beginSidebarSwipe(containerWidth: CGFloat) {
let update = {
backSwipeIsActive = true
backSwipeHasLatched = false
phoneStackWidth = max(containerWidth, 1)
sidebarSwipeIsActive = true
sidebarSwipeHasLatched = false
}
var transaction = Transaction()
@@ -214,97 +331,79 @@ struct SybilPhoneShellView: View {
withTransaction(transaction, update)
}
private func updateBackSwipe(with rawTranslation: CGFloat, containerWidth: CGFloat) {
let nextOffset = BackSwipeMetrics.clampedOffset(for: rawTranslation, width: containerWidth)
let nextLatched = BackSwipeMetrics.isLatched(
private func updateSidebarSwipe(with rawTranslation: CGFloat, containerWidth: CGFloat) {
let nextOffset = SidebarOverlaySwipeMetrics.clampedOffset(for: rawTranslation, width: containerWidth)
let nextLatched = SidebarOverlaySwipeMetrics.isLatched(
offset: nextOffset,
width: containerWidth,
isCurrentlyLatched: backSwipeHasLatched
isCurrentlyLatched: sidebarSwipeHasLatched
)
var transaction = Transaction()
transaction.disablesAnimations = true
withTransaction(transaction) {
backSwipeOffset = nextOffset
backSwipeHasLatched = nextLatched
phoneStackWidth = max(containerWidth, 1)
sidebarSwipeOffset = nextOffset
sidebarSwipeHasLatched = nextLatched
}
}
private func finishBackSwipe(
private func finishSidebarSwipe(
translationX: CGFloat,
containerWidth: CGFloat,
velocityX: CGFloat,
didFinish: Bool
) {
guard backSwipeIsActive else {
resetBackSwipe(animated: false)
guard sidebarSwipeIsActive else {
resetSidebarSwipe(animated: false)
return
}
let finalOffset = BackSwipeMetrics.clampedOffset(for: translationX, width: containerWidth)
let finalLatched = BackSwipeMetrics.isLatched(
let finalOffset = SidebarOverlaySwipeMetrics.clampedOffset(for: translationX, width: containerWidth)
let finalLatched = SidebarOverlaySwipeMetrics.isLatched(
offset: finalOffset,
width: containerWidth,
isCurrentlyLatched: backSwipeHasLatched
isCurrentlyLatched: sidebarSwipeHasLatched
)
updateBackSwipe(with: translationX, containerWidth: containerWidth)
updateSidebarSwipe(with: translationX, containerWidth: containerWidth)
if didFinish && BackSwipeMetrics.shouldComplete(
if didFinish && SidebarOverlaySwipeMetrics.shouldComplete(
offset: finalOffset,
velocityX: velocityX,
width: containerWidth,
isLatched: finalLatched
) {
Task {
await completeBackSwipe(containerWidth: containerWidth, releaseVelocityX: velocityX)
}
completeSidebarSwipe()
return
}
resetBackSwipe(animated: true, velocityX: velocityX)
resetSidebarSwipe(animated: true, velocityX: velocityX)
}
@MainActor
private func completeBackSwipe(containerWidth: CGFloat, releaseVelocityX: CGFloat) async {
guard !path.isEmpty else {
resetBackSwipe(animated: false)
return
}
guard !backSwipeIsCompleting else {
private func completeSidebarSwipe() {
guard !sidebarSwipeIsCompleting else {
return
}
backSwipeIsCompleting = true
let targetOffset = BackSwipeMetrics.completionTargetOffset(for: containerWidth)
withAnimation(
BackSwipeMetrics.springAnimation(
currentOffset: backSwipeOffset,
targetOffset: targetOffset,
velocityX: releaseVelocityX
)
) {
backSwipeCompletionOffset = targetOffset - backSwipeOffset
sidebarSwipeIsCompleting = true
withAnimation(.easeOut(duration: 0.18)) {
isSidebarOverlayPresented = true
}
try? await Task.sleep(for: .milliseconds(BackSwipeMetrics.completionAnimationDelayMs))
popRoute(disablesAnimations: true)
resetBackSwipe(animated: false)
resetSidebarSwipe(animated: false)
}
private func resetBackSwipe(animated: Bool, velocityX: CGFloat = 0) {
let currentOffset = backSwipeOffset + backSwipeCompletionOffset
private func resetSidebarSwipe(animated: Bool, velocityX: CGFloat = 0) {
let currentOffset = sidebarSwipeOffset
let reset = {
backSwipeOffset = 0
backSwipeCompletionOffset = 0
backSwipeIsActive = false
backSwipeIsCompleting = false
backSwipeHasLatched = false
sidebarSwipeOffset = 0
sidebarSwipeIsActive = false
sidebarSwipeIsCompleting = false
sidebarSwipeHasLatched = false
}
if animated {
withAnimation(
BackSwipeMetrics.springAnimation(
SidebarOverlaySwipeMetrics.springAnimation(
currentOffset: currentOffset,
targetOffset: 0,
velocityX: velocityX
@@ -318,31 +417,79 @@ struct SybilPhoneShellView: View {
}
}
private struct SybilPhoneSidebarRoot: View {
@Bindable var viewModel: SybilViewModel
@Binding var path: [PhoneRoute]
@State private var openingSelection: SidebarSelection?
@State private var openingRequestID: UUID?
private enum SidebarOverlaySwipeMetrics {
static func clampedOffset(for rawTranslation: CGFloat, width: CGFloat) -> CGFloat {
BackSwipeMetrics.clampedOffset(for: rawTranslation, width: width)
}
private var highlightedSelection: SidebarSelection? {
if let openingSelection {
return openingSelection
}
static func progress(for offset: CGFloat, width: CGFloat) -> CGFloat {
BackSwipeMetrics.progress(for: offset, width: width)
}
guard let route = path.last else {
return nil
}
static func isLatched(offset: CGFloat, width: CGFloat, isCurrentlyLatched: Bool = false) -> Bool {
BackSwipeMetrics.isLatched(offset: offset, width: width, isCurrentlyLatched: isCurrentlyLatched)
}
switch route {
case let .chat(chatID):
return .chat(chatID)
case let .search(searchID):
return .search(searchID)
case .draftChat, .draftSearch, .settings:
return nil
static func shouldComplete(offset: CGFloat, velocityX: CGFloat, width: CGFloat, isLatched: Bool) -> Bool {
BackSwipeMetrics.shouldComplete(offset: offset, velocityX: velocityX, width: width, isLatched: isLatched)
}
static func springAnimation(currentOffset: CGFloat, targetOffset: CGFloat, velocityX: CGFloat) -> Animation {
BackSwipeMetrics.springAnimation(currentOffset: currentOffset, targetOffset: targetOffset, velocityX: velocityX)
}
static func overlayOffset(for progress: CGFloat, width: CGFloat) -> CGFloat {
-(1 - min(max(progress, 0), 1)) * min(max(width * 0.18, 44), 76)
}
static func overlayBlurRadius(for progress: CGFloat) -> CGFloat {
(1 - min(max(progress, 0), 1)) * 18
}
static func workspaceBlurRadius(for progress: CGFloat) -> CGFloat {
min(max(progress, 0), 1) * 14
}
static func workspaceOpacity(for progress: CGFloat) -> CGFloat {
1 - (min(max(progress, 0), 1) * 0.22)
}
}
private struct SybilPhoneOverlayBlurBand: View {
var edge: VerticalEdge
var body: some View {
ZStack {
Rectangle()
.fill(.ultraThinMaterial)
.opacity(0.34)
Rectangle()
.fill(
LinearGradient(
colors: gradientColors,
startPoint: edge == .top ? .top : .bottom,
endPoint: edge == .top ? .bottom : .top
)
)
}
}
private var gradientColors: [Color] {
[
Color.black.opacity(0.94),
SybilTheme.background.opacity(0.78),
Color.black.opacity(0)
]
}
}
private struct SybilPhoneSidebarRoot: View {
@Bindable var viewModel: SybilViewModel
var highlightedSelection: SidebarSelection?
var onSelect: (SidebarSelection) -> Void
var onRoute: (PhoneRoute) -> Void
var body: some View {
VStack(spacing: 0) {
if let errorMessage = viewModel.errorMessage {
@@ -357,64 +504,15 @@ private struct SybilPhoneSidebarRoot: View {
.overlay(SybilTheme.border)
}
if viewModel.isLoadingCollections && viewModel.sidebarItems.isEmpty {
VStack(alignment: .leading, spacing: 8) {
ProgressView()
.tint(SybilTheme.primary)
Text("Loading conversations…")
.font(.sybil(.footnote))
.foregroundStyle(SybilTheme.textMuted)
SybilSidebarItemList(
viewModel: viewModel,
isSelected: { item in
highlightedSelection == item.selection
},
onSelect: { item in
onSelect(item.selection)
}
.frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
.padding(16)
} else if viewModel.sidebarItems.isEmpty {
VStack(spacing: 10) {
Image(systemName: "message.badge")
.font(.system(size: 20, weight: .medium))
.foregroundStyle(SybilTheme.textMuted)
Text("Start a chat or run your first search.")
.font(.sybil(.footnote))
.multilineTextAlignment(.center)
.foregroundStyle(SybilTheme.textMuted)
}
.frame(maxWidth: .infinity, maxHeight: .infinity)
.padding(16)
} else {
ScrollView {
LazyVStack(alignment: .leading, spacing: 0) {
ForEach(viewModel.sidebarItems) { item in
Button {
open(item.selection)
} label: {
VStack(spacing: 0.0) {
SybilPhoneSidebarRow(item: item)
Divider()
}
}
.buttonStyle(
SybilPhoneSidebarRowButtonStyle(
isHighlighted: highlightedSelection == item.selection
)
)
.contextMenu {
Button(role: .destructive) {
Task {
await viewModel.deleteItem(item.selection)
}
} label: {
Label("Delete", systemImage: "trash")
}
}
}
}
}
.refreshable {
await viewModel.refreshVisibleContent(
refreshCollections: true,
refreshSelection: false
)
}
}
)
}
.background(SybilTheme.panelGradient)
.safeAreaInset(edge: .bottom, spacing: 0) {
@@ -429,22 +527,20 @@ private struct SybilPhoneSidebarRoot: View {
HStack(spacing: 12) {
toolbarIconButton(systemImage: "gearshape", accessibilityLabel: "Settings") {
clearOpeningSelection()
showRoute(.settings)
viewModel.openSettings()
onRoute(.settings)
}
Spacer()
toolbarIconButton(systemImage: "magnifyingglass", accessibilityLabel: "New search") {
clearOpeningSelection()
viewModel.startNewSearch()
showRoute(.draftSearch)
onRoute(.draftSearch)
}
toolbarIconButton(systemImage: "plus", accessibilityLabel: "New chat", isPrimary: true) {
clearOpeningSelection()
viewModel.startNewChat()
showRoute(.draftChat)
onRoute(.draftChat)
}
}
.padding(.horizontal, 18)
@@ -480,121 +576,6 @@ private struct SybilPhoneSidebarRoot: View {
.buttonStyle(.plain)
.accessibilityLabel(accessibilityLabel)
}
private func clearOpeningSelection() {
openingRequestID = nil
openingSelection = nil
}
private func showRoute(_ route: PhoneRoute) {
withAnimation(.easeOut(duration: 0.22)) {
path = [route]
}
}
private func open(_ selection: SidebarSelection) {
guard openingSelection != selection else {
return
}
let requestID = UUID()
openingRequestID = requestID
openingSelection = selection
Task {
await viewModel.selectForNavigation(selection)
guard openingRequestID == requestID else {
return
}
showRoute(PhoneRoute.from(selection: selection))
openingRequestID = nil
openingSelection = nil
}
}
}
private struct SybilPhoneSidebarRowIsActiveKey: EnvironmentKey {
static let defaultValue = false
}
private extension EnvironmentValues {
var sybilPhoneSidebarRowIsActive: Bool {
get { self[SybilPhoneSidebarRowIsActiveKey.self] }
set { self[SybilPhoneSidebarRowIsActiveKey.self] = newValue }
}
}
private struct SybilPhoneSidebarRowButtonStyle: ButtonStyle {
var isHighlighted: Bool
func makeBody(configuration: Configuration) -> some View {
configuration.label
.environment(\.sybilPhoneSidebarRowIsActive, isHighlighted || configuration.isPressed)
}
}
private struct SybilPhoneSidebarRow: View {
@Environment(\.sybilPhoneSidebarRowIsActive) private var isHighlighted
var item: SidebarItem
var body: some View {
let leadingWidth = 22.0
VStack(alignment: .leading, spacing: 8) {
HStack(spacing: 8) {
Image(systemName: item.kind == .chat ? "message" : "globe")
.font(.system(size: 12, weight: .semibold))
.foregroundStyle(isHighlighted ? SybilTheme.accent : SybilTheme.textMuted)
.frame(width: leadingWidth, height: leadingWidth)
.background(
Rectangle()
.fill(isHighlighted ? SybilTheme.accent.opacity(0.12) : SybilTheme.surface.opacity(0.72))
)
Text(item.title)
.font(.sybil(.subheadline, weight: .semibold))
.lineLimit(1)
.layoutPriority(1)
Spacer(minLength: 8)
if item.isRunning {
SybilSidebarActivityIndicator()
}
}
HStack(spacing: 8) {
Spacer()
.frame(width: leadingWidth)
Text(item.updatedAt.sybilRelativeLabel)
.font(.sybil(.caption2))
.foregroundStyle(SybilTheme.textMuted)
if let initiated = item.initiatedLabel {
Spacer(minLength: 0)
Text(initiated)
.font(.sybil(.caption2))
.foregroundStyle(SybilTheme.textMuted.opacity(0.88))
.lineLimit(1)
.multilineTextAlignment(.trailing)
.frame(maxWidth: .infinity, alignment: .trailing)
}
}
}
.foregroundStyle(SybilTheme.text)
.padding(18.0)
.frame(maxWidth: .infinity, alignment: .leading)
.background(
Rectangle()
.fill(
isHighlighted
? SybilTheme.selectedRowGradient
: LinearGradient(colors: [SybilTheme.surface.opacity(0.56), SybilTheme.surface.opacity(0.36)], startPoint: .topLeading, endPoint: .bottomTrailing)
)
)
}
}
private struct SybilPhoneDestinationView: View {
@@ -602,12 +583,15 @@ private struct SybilPhoneDestinationView: View {
@Binding var composerFocusRequest: Int
let route: PhoneRoute
let onRequestBack: (_ animateNavigation: Bool) -> Void
let onRequestNewChat: () -> Void
let onRequestNewChat: (() -> Void)?
let onShowSidebar: () -> Void
var body: some View {
SybilWorkspaceView(
viewModel: viewModel,
composerFocusRequest: composerFocusRequest,
navigationLeadingControl: .showSidebar,
onShowSidebar: onShowSidebar,
onRequestBack: onRequestBack,
onRequestNewChat: onRequestNewChat
)


@@ -0,0 +1,19 @@
import Combine
import Foundation
public enum SybilHomeScreenQuickAction {
public static let quickQuestionType = "net.buzzert.sybil2.quick-question"
}
@MainActor
public final class SybilQuickActionRouter: ObservableObject {
public static let shared = SybilQuickActionRouter()
@Published public private(set) var quickQuestionPresentationRequest = 0
private init() {}
public func requestQuickQuestionPresentation() {
quickQuestionPresentationRequest += 1
}
}


@@ -0,0 +1,297 @@
import MarkdownUI
import Observation
import SwiftUI
struct SybilQuickQuestionView: View {
@Bindable var viewModel: SybilViewModel
var focusRequest: Int
@Environment(\.dismiss) private var dismiss
@FocusState private var promptFocused: Bool
private var hasAnswerContent: Bool {
!viewModel.quickQuestionMessages.isEmpty || viewModel.quickQuestionError != nil
}
var body: some View {
VStack(spacing: 0) {
VStack(alignment: .leading, spacing: 16) {
header
answerArea
composer
}
.padding(.horizontal, 16)
.padding(.top, 18)
.padding(.bottom, 12)
.frame(maxWidth: 640, maxHeight: .infinity, alignment: .top)
}
.frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .top)
.background(SybilTheme.backgroundGradient)
.preferredColorScheme(.dark)
.task(id: focusRequest) {
try? await Task.sleep(for: .milliseconds(260))
guard !Task.isCancelled else {
return
}
promptFocused = true
}
}
private var header: some View {
HStack {
Image(systemName: "sparkles")
.font(.system(size: 21, weight: .semibold))
.foregroundStyle(SybilTheme.primary)
Text("Quick question")
.font(.title3.weight(.semibold))
.foregroundStyle(SybilTheme.text)
.lineLimit(1)
}
.frame(maxWidth: .infinity, alignment: .leading)
}
private var answerArea: some View {
ScrollView {
VStack(alignment: .leading, spacing: 12) {
if hasAnswerContent {
ForEach(viewModel.quickQuestionMessages) { message in
QuickQuestionMessageView(message: message, isSending: viewModel.isQuickQuestionSending)
}
if let error = viewModel.quickQuestionError {
Text(error)
.font(.caption)
.foregroundStyle(SybilTheme.danger)
.fixedSize(horizontal: false, vertical: true)
}
}
}
.frame(maxWidth: .infinity, alignment: .topLeading)
.padding(14)
}
.scrollDismissesKeyboard(.interactively)
.frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
.background(
RoundedRectangle(cornerRadius: 12)
.fill(Color.black.opacity(0.36))
)
.overlay(
RoundedRectangle(cornerRadius: 12)
.stroke(SybilTheme.border.opacity(0.55), lineWidth: 1)
)
}
private var composer: some View {
VStack(alignment: .leading, spacing: 10) {
HStack(alignment: .bottom, spacing: 10) {
TextField(
"Ask anything...",
text: Binding(
get: { viewModel.quickQuestionPrompt },
set: { viewModel.updateQuickQuestionPrompt($0) }
),
axis: .vertical
)
.focused($promptFocused)
.font(.body)
.textInputAutocapitalization(.sentences)
.autocorrectionDisabled(false)
.lineLimit(1 ... 6)
.submitLabel(.send)
.onSubmit(submitQuestion)
.padding(.horizontal, 12)
.padding(.vertical, 10)
.background(
RoundedRectangle(cornerRadius: 12)
.fill(SybilTheme.composerGradient)
.opacity(0.98)
)
.foregroundStyle(SybilTheme.text)
Button(action: submitQuestion) {
Image(systemName: "arrow.up")
.font(.body.weight(.semibold))
.frame(width: 40, height: 40)
.background(
Circle()
.fill(
viewModel.canSendQuickQuestion
? AnyShapeStyle(SybilTheme.primaryGradient)
: AnyShapeStyle(SybilTheme.surfaceStrong.opacity(0.92))
)
)
.foregroundStyle(viewModel.canSendQuickQuestion ? SybilTheme.text : SybilTheme.textMuted)
}
.buttonStyle(.plain)
.disabled(!viewModel.canSendQuickQuestion)
.accessibilityLabel("Ask quick question")
}
controlsRow
}
}
private var convertButton: some View {
Button {
Task {
let didConvert = await viewModel.convertQuickQuestionToChat()
if didConvert {
dismiss()
}
}
} label: {
Label("Chat", systemImage: "bubble.left")
.font(.caption.weight(.medium))
.lineLimit(1)
.minimumScaleFactor(0.8)
}
.buttonStyle(.plain)
.foregroundStyle(viewModel.canConvertQuickQuestion ? SybilTheme.text : SybilTheme.textMuted)
.padding(.horizontal, 10)
.frame(maxWidth: .infinity, minHeight: 40)
.background(
RoundedRectangle(cornerRadius: 12)
.fill(SybilTheme.surfaceStrong.opacity(0.78))
.overlay(
RoundedRectangle(cornerRadius: 12)
.stroke(SybilTheme.border.opacity(0.78), lineWidth: 1)
)
)
.disabled(!viewModel.canConvertQuickQuestion)
}
private var controlsRow: some View {
HStack(alignment: .center, spacing: 10) {
providerMenu
modelMenu
convertButton
}
}
private var providerMenu: some View {
Menu {
ForEach(viewModel.providerOptions, id: \.self) { provider in
Button {
viewModel.setQuickQuestionProvider(provider)
} label: {
if viewModel.quickQuestionProvider == provider {
Label(provider.displayName, systemImage: "checkmark")
} else {
Text(provider.displayName)
}
}
}
} label: {
QuickQuestionPickerPill(title: viewModel.quickQuestionProvider.displayName)
}
.frame(maxWidth: .infinity)
.disabled(viewModel.isQuickQuestionSending || viewModel.isConvertingQuickQuestion)
.accessibilityLabel("Quick question provider")
}
private var modelMenu: some View {
Menu {
if viewModel.quickQuestionProviderModelOptions.isEmpty {
Text("No models")
} else {
ForEach(viewModel.quickQuestionProviderModelOptions, id: \.self) { model in
Button {
viewModel.setQuickQuestionModel(model)
} label: {
if viewModel.quickQuestionModel == model {
Label(model, systemImage: "checkmark")
} else {
Text(model)
}
}
}
}
} label: {
QuickQuestionPickerPill(title: viewModel.quickQuestionModel.isEmpty ? "No model" : viewModel.quickQuestionModel)
}
.frame(maxWidth: .infinity)
.disabled(viewModel.isQuickQuestionSending || viewModel.isConvertingQuickQuestion)
.accessibilityLabel("Quick question model")
}
private func submitQuestion() {
_ = viewModel.sendQuickQuestion()
}
}
private struct QuickQuestionPickerPill: View {
var title: String
var body: some View {
HStack(spacing: 8) {
Text(title)
.font(.caption.weight(.medium))
.foregroundStyle(SybilTheme.text)
.lineLimit(1)
.minimumScaleFactor(0.8)
Image(systemName: "chevron.down")
.font(.caption.weight(.semibold))
.foregroundStyle(SybilTheme.textMuted)
}
.padding(.horizontal, 10)
.frame(maxWidth: .infinity, minHeight: 40)
.background(
RoundedRectangle(cornerRadius: 12)
.fill(SybilTheme.surfaceStrong.opacity(0.78))
.overlay(
RoundedRectangle(cornerRadius: 12)
.stroke(SybilTheme.border.opacity(0.78), lineWidth: 1)
)
)
}
}
private struct QuickQuestionMessageView: View {
var message: Message
var isSending: Bool
private var isPendingAssistant: Bool {
message.id.hasPrefix("temp-assistant-quick-") &&
isSending &&
message.content.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty
}
var body: some View {
if let metadata = message.toolCallMetadata {
Text(toolCallSummary(for: metadata, fallbackContent: message.content))
.font(.caption)
.foregroundStyle(SybilTheme.textMuted)
.fixedSize(horizontal: false, vertical: true)
} else if isPendingAssistant {
HStack(spacing: 8) {
ProgressView()
.controlSize(.small)
.tint(SybilTheme.primary)
Text("Thinking...")
.font(.caption)
.foregroundStyle(SybilTheme.textMuted)
}
} else if !message.content.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty {
Markdown(message.content)
.font(.body)
.tint(SybilTheme.primary)
.foregroundStyle(SybilTheme.text.opacity(0.96))
.textSelection(.enabled)
}
}
private func toolCallSummary(for metadata: ToolCallMetadata, fallbackContent: String) -> String {
if let summary = metadata.summary?.trimmingCharacters(in: .whitespacesAndNewlines), !summary.isEmpty {
return summary
}
if !fallbackContent.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty {
return fallbackContent
}
return "Ran \(metadata.toolName ?? "tool")."
}
}


@@ -11,6 +11,12 @@ final class SybilSettingsStore {
static let preferredOpenAIModel = "sybil.ios.preferredOpenAIModel"
static let preferredAnthropicModel = "sybil.ios.preferredAnthropicModel"
static let preferredXAIModel = "sybil.ios.preferredXAIModel"
static let preferredHermesAgentModel = "sybil.ios.preferredHermesAgentModel"
static let quickQuestionPreferredProvider = "sybil.ios.quickQuestionPreferredProvider"
static let quickQuestionPreferredOpenAIModel = "sybil.ios.quickQuestionPreferredOpenAIModel"
static let quickQuestionPreferredAnthropicModel = "sybil.ios.quickQuestionPreferredAnthropicModel"
static let quickQuestionPreferredXAIModel = "sybil.ios.quickQuestionPreferredXAIModel"
static let quickQuestionPreferredHermesAgentModel = "sybil.ios.quickQuestionPreferredHermesAgentModel"
}
private let defaults: UserDefaults
@@ -19,6 +25,8 @@ final class SybilSettingsStore {
var adminToken: String
var preferredProvider: Provider
var preferredModelByProvider: [Provider: String]
var quickQuestionPreferredProvider: Provider
var quickQuestionPreferredModelByProvider: [Provider: String]
init(defaults: UserDefaults = .standard) {
self.defaults = defaults
@@ -32,10 +40,21 @@ final class SybilSettingsStore {
let provider = defaults.string(forKey: Keys.preferredProvider).flatMap(Provider.init(rawValue:)) ?? .openai
self.preferredProvider = provider
self.preferredModelByProvider = [
let preferredModels: [Provider: String] = [
.openai: defaults.string(forKey: Keys.preferredOpenAIModel) ?? "gpt-4.1-mini",
.anthropic: defaults.string(forKey: Keys.preferredAnthropicModel) ?? "claude-3-5-sonnet-latest",
.xai: defaults.string(forKey: Keys.preferredXAIModel) ?? "grok-3-mini"
.xai: defaults.string(forKey: Keys.preferredXAIModel) ?? "grok-3-mini",
.hermesAgent: defaults.string(forKey: Keys.preferredHermesAgentModel) ?? "hermes-agent"
]
self.preferredModelByProvider = preferredModels
self.quickQuestionPreferredProvider =
defaults.string(forKey: Keys.quickQuestionPreferredProvider).flatMap(Provider.init(rawValue:)) ?? provider
self.quickQuestionPreferredModelByProvider = [
.openai: defaults.string(forKey: Keys.quickQuestionPreferredOpenAIModel) ?? preferredModels[.openai] ?? "gpt-4.1-mini",
.anthropic: defaults.string(forKey: Keys.quickQuestionPreferredAnthropicModel) ?? preferredModels[.anthropic] ?? "claude-3-5-sonnet-latest",
.xai: defaults.string(forKey: Keys.quickQuestionPreferredXAIModel) ?? preferredModels[.xai] ?? "grok-3-mini",
.hermesAgent: defaults.string(forKey: Keys.quickQuestionPreferredHermesAgentModel) ?? preferredModels[.hermesAgent] ?? "hermes-agent"
]
}
@@ -53,6 +72,13 @@ final class SybilSettingsStore {
defaults.set(preferredModelByProvider[.openai], forKey: Keys.preferredOpenAIModel)
defaults.set(preferredModelByProvider[.anthropic], forKey: Keys.preferredAnthropicModel)
defaults.set(preferredModelByProvider[.xai], forKey: Keys.preferredXAIModel)
defaults.set(preferredModelByProvider[.hermesAgent], forKey: Keys.preferredHermesAgentModel)
defaults.set(quickQuestionPreferredProvider.rawValue, forKey: Keys.quickQuestionPreferredProvider)
defaults.set(quickQuestionPreferredModelByProvider[.openai], forKey: Keys.quickQuestionPreferredOpenAIModel)
defaults.set(quickQuestionPreferredModelByProvider[.anthropic], forKey: Keys.quickQuestionPreferredAnthropicModel)
defaults.set(quickQuestionPreferredModelByProvider[.xai], forKey: Keys.quickQuestionPreferredXAIModel)
defaults.set(quickQuestionPreferredModelByProvider[.hermesAgent], forKey: Keys.quickQuestionPreferredHermesAgentModel)
}
var trimmedTokenOrNil: String? {
@@ -68,7 +94,7 @@ final class SybilSettingsStore {
raw.removeLast()
}
guard var components = URLComponents(string: raw) else {
guard let components = URLComponents(string: raw) else {
return nil
}


@@ -4,13 +4,6 @@ import SwiftUI
struct SybilSidebarView: View {
@Bindable var viewModel: SybilViewModel
private func iconName(for item: SidebarItem) -> String {
switch item.kind {
case .chat: return "message"
case .search: return "globe"
}
}
private func isSelected(_ item: SidebarItem) -> Bool {
viewModel.draftKind == nil && viewModel.selectedItem == item.selection
}
@@ -57,112 +50,13 @@ struct SybilSidebarView: View {
.overlay(SybilTheme.border)
}
if viewModel.isLoadingCollections && viewModel.sidebarItems.isEmpty {
VStack(alignment: .leading, spacing: 8) {
ProgressView()
.tint(SybilTheme.primary)
Text("Loading conversations…")
.font(.sybil(.footnote))
.foregroundStyle(SybilTheme.textMuted)
SybilSidebarItemList(
viewModel: viewModel,
isSelected: isSelected,
onSelect: { item in
viewModel.select(item.selection)
}
.frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
.padding(16)
} else if viewModel.sidebarItems.isEmpty {
VStack(spacing: 10) {
Image(systemName: "message.badge")
.font(.system(size: 20, weight: .medium))
.foregroundStyle(SybilTheme.textMuted)
Text("Start a chat or run your first search.")
.font(.sybil(.footnote))
.multilineTextAlignment(.center)
.foregroundStyle(SybilTheme.textMuted)
}
.frame(maxWidth: .infinity, maxHeight: .infinity)
.padding(16)
} else {
ScrollView {
LazyVStack(alignment: .leading, spacing: 8) {
ForEach(viewModel.sidebarItems) { item in
Button {
viewModel.select(item.selection)
} label: {
VStack(alignment: .leading, spacing: 6) {
HStack(spacing: 8) {
Image(systemName: iconName(for: item))
.font(.system(size: 12, weight: .semibold))
.foregroundStyle(isSelected(item) ? SybilTheme.accent : SybilTheme.textMuted)
.frame(width: 22, height: 22)
.background(
RoundedRectangle(cornerRadius: 7)
.fill(isSelected(item) ? SybilTheme.accent.opacity(0.12) : SybilTheme.surface.opacity(0.72))
.overlay(
RoundedRectangle(cornerRadius: 7)
.stroke(isSelected(item) ? SybilTheme.accent.opacity(0.36) : SybilTheme.border.opacity(0.72), lineWidth: 1)
)
)
Text(item.title)
.font(.sybil(.subheadline, weight: .semibold))
.lineLimit(1)
.layoutPriority(1)
Spacer(minLength: 8)
if item.isRunning {
SybilSidebarActivityIndicator()
}
}
HStack(spacing: 8) {
Text(item.updatedAt.sybilRelativeLabel)
.font(.sybil(.caption2))
.foregroundStyle(SybilTheme.textMuted)
if let initiated = item.initiatedLabel {
Spacer(minLength: 0)
Text(initiated)
.font(.sybil(.caption2))
.foregroundStyle(SybilTheme.textMuted.opacity(0.88))
.lineLimit(1)
.multilineTextAlignment(.trailing)
.frame(maxWidth: .infinity, alignment: .trailing)
}
}
}
.foregroundStyle(SybilTheme.text)
.padding(.horizontal, 12)
.padding(.vertical, 10)
.frame(maxWidth: .infinity, alignment: .leading)
.background(
RoundedRectangle(cornerRadius: 12)
.fill(isSelected(item) ? SybilTheme.selectedRowGradient : LinearGradient(colors: [SybilTheme.surface.opacity(0.56), SybilTheme.surface.opacity(0.36)], startPoint: .topLeading, endPoint: .bottomTrailing))
)
.overlay(
RoundedRectangle(cornerRadius: 12)
.stroke(isSelected(item) ? SybilTheme.primary.opacity(0.55) : SybilTheme.border.opacity(0.72), lineWidth: 1)
)
}
.buttonStyle(.plain)
.contextMenu {
Button(role: .destructive) {
Task {
await viewModel.deleteItem(item.selection)
}
} label: {
Label("Delete", systemImage: "trash")
}
}
}
}
.padding(10)
}
.refreshable {
await viewModel.refreshVisibleContent(
refreshCollections: true,
refreshSelection: false
)
}
}
)
}
.background(SybilTheme.panelGradient)
@@ -213,6 +107,142 @@ struct SybilSidebarView: View {
}
}
struct SybilSidebarItemList: View {
@Bindable var viewModel: SybilViewModel
var isSelected: (SidebarItem) -> Bool
var onSelect: (SidebarItem) -> Void
var body: some View {
if viewModel.isLoadingCollections && viewModel.sidebarItems.isEmpty {
VStack(alignment: .leading, spacing: 8) {
ProgressView()
.tint(SybilTheme.primary)
Text("Loading conversations…")
.font(.sybil(.footnote))
.foregroundStyle(SybilTheme.textMuted)
}
.frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .topLeading)
.padding(16)
} else if viewModel.sidebarItems.isEmpty {
VStack(spacing: 10) {
Image(systemName: "message.badge")
.font(.system(size: 20, weight: .medium))
.foregroundStyle(SybilTheme.textMuted)
Text("Start a chat or run your first search.")
.font(.sybil(.footnote))
.multilineTextAlignment(.center)
.foregroundStyle(SybilTheme.textMuted)
}
.frame(maxWidth: .infinity, maxHeight: .infinity)
.padding(16)
} else {
ScrollView {
LazyVStack(alignment: .leading, spacing: 8) {
ForEach(viewModel.sidebarItems) { item in
Button {
onSelect(item)
} label: {
SybilSidebarRow(item: item, isSelected: isSelected(item))
}
.buttonStyle(.plain)
.contextMenu {
Button(role: .destructive) {
Task {
await viewModel.deleteItem(item.selection)
}
} label: {
Label("Delete", systemImage: "trash")
}
}
}
}
.padding(10)
}
.refreshable {
await viewModel.refreshVisibleContent(
refreshCollections: true,
refreshSelection: false
)
}
}
}
}
struct SybilSidebarRow: View {
var item: SidebarItem
var isSelected: Bool
private var isHighlighted: Bool {
isSelected
}
private var iconName: String {
switch item.kind {
case .chat: return "message"
case .search: return "globe"
}
}
var body: some View {
VStack(alignment: .leading, spacing: 6) {
HStack(spacing: 8) {
Image(systemName: iconName)
.font(.system(size: 12, weight: .semibold))
.foregroundStyle(isHighlighted ? SybilTheme.accent : SybilTheme.textMuted)
.frame(width: 22, height: 22)
.background(
RoundedRectangle(cornerRadius: 7)
.fill(isHighlighted ? SybilTheme.accent.opacity(0.12) : SybilTheme.surface.opacity(0.72))
.overlay(
RoundedRectangle(cornerRadius: 7)
.stroke(isHighlighted ? SybilTheme.accent.opacity(0.36) : SybilTheme.border.opacity(0.72), lineWidth: 1)
)
)
Text(item.title)
.font(.sybil(.subheadline, weight: .semibold))
.lineLimit(1)
.layoutPriority(1)
Spacer(minLength: 8)
if item.isRunning {
SybilSidebarActivityIndicator()
}
}
HStack(spacing: 8) {
Text(item.updatedAt.sybilRelativeLabel)
.font(.sybil(.caption2))
.foregroundStyle(SybilTheme.textMuted)
if let initiated = item.initiatedLabel {
Spacer(minLength: 0)
Text(initiated)
.font(.sybil(.caption2))
.foregroundStyle(SybilTheme.textMuted.opacity(0.88))
.lineLimit(1)
.multilineTextAlignment(.trailing)
.frame(maxWidth: .infinity, alignment: .trailing)
}
}
}
.foregroundStyle(SybilTheme.text)
.padding(.horizontal, 12)
.padding(.vertical, 10)
.frame(maxWidth: .infinity, alignment: .leading)
.background(
RoundedRectangle(cornerRadius: 12)
.fill(isHighlighted ? SybilTheme.selectedRowGradient : LinearGradient(colors: [SybilTheme.surface.opacity(0.56), SybilTheme.surface.opacity(0.36)], startPoint: .topLeading, endPoint: .bottomTrailing))
)
.overlay(
RoundedRectangle(cornerRadius: 12)
.stroke(isHighlighted ? SybilTheme.primary.opacity(0.55) : SybilTheme.border.opacity(0.72), lineWidth: 1)
)
.contentShape(RoundedRectangle(cornerRadius: 12))
}
}
struct SybilSidebarActivityIndicator: View {
var body: some View {
ProgressView()


@@ -111,6 +111,16 @@ final class SybilViewModel {
var provider: Provider
var modelCatalog: [Provider: ProviderModelInfo] = [:]
var model: String
var quickQuestionPrompt = ""
var quickQuestionMessages: [Message] = []
var quickQuestionError: String?
var quickQuestionProvider: Provider
var quickQuestionModel: String
var quickQuestionSubmittedPrompt: String?
var quickQuestionSubmittedProvider: Provider?
var quickQuestionSubmittedModel: String?
var isQuickQuestionSending = false
var isConvertingQuickQuestion = false
@ObservationIgnored
private var hasBootstrapped = false
@@ -132,6 +142,10 @@ final class SybilViewModel {
@ObservationIgnored
private var activeSearchAttachTasks: [String: Task<Void, Never>] = [:]
@ObservationIgnored
private var quickQuestionTask: Task<Void, Never>?
@ObservationIgnored
private var quickQuestionRunID: UUID?
@ObservationIgnored
private var isAppActive = true
@ObservationIgnored
private var appLifecycleGeneration = 0
@@ -141,7 +155,8 @@ final class SybilViewModel {
private let fallbackModels: [Provider: [String]] = [
.openai: ["gpt-4.1-mini"],
.anthropic: ["claude-3-5-sonnet-latest"],
.xai: ["grok-3-mini"]
.xai: ["grok-3-mini"],
.hermesAgent: ["hermes-agent"]
]
init(
@@ -152,14 +167,56 @@ final class SybilViewModel {
) {
self.settings = settings
self.clientFactory = clientFactory
self.provider = settings.preferredProvider
self.model = settings.preferredModelByProvider[settings.preferredProvider] ?? "gpt-4.1-mini"
let initialProvider = settings.preferredProvider
let initialModel = settings.preferredModelByProvider[initialProvider] ?? "gpt-4.1-mini"
self.provider = initialProvider
self.model = initialModel
let initialQuickQuestionProvider = settings.quickQuestionPreferredProvider
let initialQuickQuestionModel = settings.quickQuestionPreferredModelByProvider[initialQuickQuestionProvider] ?? initialModel
self.quickQuestionProvider = initialQuickQuestionProvider
self.quickQuestionModel = initialQuickQuestionModel
}
var providerModelOptions: [String] {
modelOptions(for: provider)
}
var providerOptions: [Provider] {
Provider.allCases.filter { candidate in
candidate != .hermesAgent || modelCatalog[candidate] != nil
}
}
var quickQuestionProviderModelOptions: [String] {
modelOptions(for: quickQuestionProvider)
}
var canSendQuickQuestion: Bool {
!isQuickQuestionSending &&
!isConvertingQuickQuestion &&
!quickQuestionPrompt.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty &&
!quickQuestionModel.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty
}
var quickQuestionAnswerText: String {
for message in quickQuestionMessages.reversed() where message.role == .assistant {
let content = message.content.trimmingCharacters(in: .whitespacesAndNewlines)
if !content.isEmpty {
return content
}
}
return ""
}
var canConvertQuickQuestion: Bool {
!isQuickQuestionSending &&
!isConvertingQuickQuestion &&
!(quickQuestionSubmittedPrompt?.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty ?? true) &&
!quickQuestionAnswerText.isEmpty &&
quickQuestionSubmittedProvider != nil &&
!(quickQuestionSubmittedModel?.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty ?? true)
}
func modelOptions(for candidate: Provider) -> [String] {
let serverModels = modelCatalog[candidate]?.models ?? []
if !serverModels.isEmpty {
@@ -415,6 +472,7 @@ final class SybilViewModel {
localActiveSearchIDs = []
serverActiveChatIDs = []
serverActiveSearchIDs = []
resetQuickQuestion()
draftIdentity = UUID()
composerAttachments = []
settings.persist()
@@ -487,6 +545,159 @@ final class SybilViewModel {
SybilLog.info(SybilLog.ui, "Provider changed to \(nextProvider.rawValue), model=\(nextModel)")
}
func setQuickQuestionProvider(_ nextProvider: Provider) {
quickQuestionProvider = nextProvider
let options = modelOptions(for: nextProvider)
if let preferred = settings.quickQuestionPreferredModelByProvider[nextProvider], options.contains(preferred) {
quickQuestionModel = preferred
} else if let first = options.first {
quickQuestionModel = first
} else {
quickQuestionModel = ""
}
persistQuickQuestionModelSelection()
}
func setQuickQuestionModel(_ nextModel: String) {
quickQuestionModel = nextModel
persistQuickQuestionModelSelection()
}
private func persistQuickQuestionModelSelection() {
settings.quickQuestionPreferredProvider = quickQuestionProvider
let trimmedModel = quickQuestionModel.trimmingCharacters(in: .whitespacesAndNewlines)
if !trimmedModel.isEmpty {
settings.quickQuestionPreferredModelByProvider[quickQuestionProvider] = trimmedModel
}
settings.persist()
}
func updateQuickQuestionPrompt(_ nextPrompt: String) {
guard nextPrompt != quickQuestionPrompt else {
return
}
if isQuickQuestionSending || quickQuestionSubmittedPrompt != nil || !quickQuestionMessages.isEmpty {
cancelQuickQuestion()
quickQuestionSubmittedPrompt = nil
quickQuestionSubmittedProvider = nil
quickQuestionSubmittedModel = nil
quickQuestionMessages = []
quickQuestionError = nil
}
quickQuestionPrompt = nextPrompt
}
func resetQuickQuestion() {
cancelQuickQuestion()
quickQuestionPrompt = ""
quickQuestionMessages = []
quickQuestionError = nil
quickQuestionSubmittedPrompt = nil
quickQuestionSubmittedProvider = nil
quickQuestionSubmittedModel = nil
isConvertingQuickQuestion = false
}
func cancelQuickQuestion() {
quickQuestionTask?.cancel()
quickQuestionTask = nil
quickQuestionRunID = nil
isQuickQuestionSending = false
}
@discardableResult
func sendQuickQuestion() -> Task<Void, Never>? {
let content = quickQuestionPrompt.trimmingCharacters(in: .whitespacesAndNewlines)
guard !content.isEmpty, !isQuickQuestionSending, !isConvertingQuickQuestion else {
return nil
}
let selectedModel = quickQuestionModel.trimmingCharacters(in: .whitespacesAndNewlines)
guard !selectedModel.isEmpty else {
quickQuestionError = "No model available for selected provider."
return nil
}
cancelQuickQuestion()
let selectedProvider = quickQuestionProvider
let task = Task { [weak self] in
guard let self else {
return
}
await self.runQuickQuestion(prompt: content, provider: selectedProvider, model: selectedModel)
}
quickQuestionTask = task
return task
}
@discardableResult
func convertQuickQuestionToChat() async -> Bool {
let question = quickQuestionSubmittedPrompt?.trimmingCharacters(in: .whitespacesAndNewlines) ?? ""
let answer = quickQuestionAnswerText
guard !question.isEmpty,
!answer.isEmpty,
let submittedProvider = quickQuestionSubmittedProvider,
let submittedModel = quickQuestionSubmittedModel?.trimmingCharacters(in: .whitespacesAndNewlines),
!submittedModel.isEmpty,
!isQuickQuestionSending,
!isConvertingQuickQuestion
else {
return false
}
isConvertingQuickQuestion = true
quickQuestionError = nil
defer {
isConvertingQuickQuestion = false
}
do {
let titleSeed = question.split(whereSeparator: \.isNewline).first.map(String.init) ?? question
let title = String(titleSeed.trimmingCharacters(in: .whitespacesAndNewlines).prefix(48))
let chat = try await client().createChat(
title: title.isEmpty ? "Quick question" : title,
provider: submittedProvider,
model: submittedModel,
messages: [
CompletionRequestMessage(role: .user, content: question),
CompletionRequestMessage(role: .assistant, content: answer)
]
)
setProvider(submittedProvider, model: submittedModel)
chats.removeAll(where: { $0.id == chat.id })
chats.insert(chat, at: 0)
draftKind = nil
selectedItem = .chat(chat.id)
selectedChat = ChatDetail(
id: chat.id,
title: chat.title,
createdAt: chat.createdAt,
updatedAt: chat.updatedAt,
initiatedProvider: chat.initiatedProvider,
initiatedModel: chat.initiatedModel,
lastUsedProvider: chat.lastUsedProvider,
lastUsedModel: chat.lastUsedModel,
messages: []
)
selectedSearch = nil
composer = ""
composerAttachments = []
await refreshCollections(preferredSelection: .chat(chat.id))
resetQuickQuestion()
return true
} catch {
quickQuestionError = normalizeAPIError(error)
SybilLog.error(SybilLog.ui, "Convert quick question to chat failed", error: error)
return false
}
}
func startNewChat() {
SybilLog.debug(SybilLog.ui, "Starting draft chat")
resetSelectionLoading()
@@ -830,6 +1041,91 @@ final class SybilViewModel {
isCreatingSearchChat = false
}
private func runQuickQuestion(prompt: String, provider: Provider, model: String) async {
let runID = UUID()
quickQuestionRunID = runID
quickQuestionError = nil
quickQuestionSubmittedPrompt = prompt
quickQuestionSubmittedProvider = provider
quickQuestionSubmittedModel = model
quickQuestionMessages = [
Message(
id: "temp-assistant-quick-\(UUID().uuidString)",
createdAt: Date(),
role: .assistant,
content: "",
name: nil
)
]
isQuickQuestionSending = true
defer {
if quickQuestionRunID == runID {
quickQuestionTask = nil
quickQuestionRunID = nil
isQuickQuestionSending = false
}
}
let streamStatus = CompletionStreamStatus()
do {
try await client().runCompletionStream(
body: CompletionStreamRequest(
chatId: nil,
persist: false,
provider: provider,
model: model,
messages: [CompletionRequestMessage(role: .user, content: prompt)]
)
) { [weak self] event in
guard let self else { return }
await self.applyQuickQuestionCompletionEvent(event, streamStatus: streamStatus)
}
if let streamError = await streamStatus.error() {
throw APIError.httpError(statusCode: 502, message: streamError)
}
} catch {
guard quickQuestionRunID == runID else {
return
}
if isCancellation(error) {
return
}
quickQuestionError = normalizeAPIError(error)
SybilLog.error(SybilLog.ui, "Quick question failed", error: error)
}
}
private func applyQuickQuestionCompletionEvent(_ event: CompletionStreamEvent, streamStatus: CompletionStreamStatus) async {
switch event {
case .meta:
break
case let .toolCall(payload):
insertQuickQuestionToolCallMessage(payload)
case let .delta(payload):
guard !payload.text.isEmpty else { return }
mutateQuickQuestionAssistantMessage { existing in
existing + payload.text
}
case let .done(payload):
mutateQuickQuestionAssistantMessage { _ in
payload.text
}
case let .error(payload):
await streamStatus.setError(payload.message)
case .ignored:
break
}
}
private func loadInitialData(using client: any SybilAPIClienting) async {
isLoadingCollections = true
errorMessage = nil
@@ -893,6 +1189,11 @@ final class SybilViewModel {
}
private func syncModelSelectionWithServerCatalog() {
if !providerOptions.contains(provider), let firstProvider = providerOptions.first {
provider = firstProvider
settings.preferredProvider = firstProvider
}
if !providerModelOptions.contains(model), let first = providerModelOptions.first {
model = first
settings.preferredModelByProvider[provider] = first
@@ -902,6 +1203,22 @@ final class SybilViewModel {
model = preferred
}
if !providerOptions.contains(quickQuestionProvider), let firstProvider = providerOptions.first {
quickQuestionProvider = firstProvider
settings.quickQuestionPreferredProvider = firstProvider
}
if !quickQuestionProviderModelOptions.contains(quickQuestionModel), let first = quickQuestionProviderModelOptions.first {
quickQuestionModel = first
settings.quickQuestionPreferredModelByProvider[quickQuestionProvider] = first
}
if let preferred = settings.quickQuestionPreferredModelByProvider[quickQuestionProvider],
quickQuestionProviderModelOptions.contains(preferred)
{
quickQuestionModel = preferred
}
settings.persist()
}
@@ -1752,6 +2069,15 @@ final class SybilViewModel {
pendingChatStates[chatID] = pending
}
private func mutateQuickQuestionAssistantMessage(_ transform: (String) -> String) {
let index = quickQuestionMessages.indices.last { quickQuestionMessages[$0].id.hasPrefix("temp-assistant-quick-") }
guard let index else {
return
}
quickQuestionMessages[index].content = transform(quickQuestionMessages[index].content)
}
private func insertPendingToolCallMessage(_ payload: CompletionStreamToolCall, chatID: String) {
guard var pending = pendingChatStates[chatID] else {
return
@@ -1761,6 +2087,31 @@ final class SybilViewModel {
return
}
let message = toolCallMessage(for: payload)
if let assistantIndex = pending.messages.indices.last(where: { pending.messages[$0].id.hasPrefix("temp-assistant-") }) {
pending.messages.insert(message, at: assistantIndex)
} else {
pending.messages.append(message)
}
pendingChatStates[chatID] = pending
}
private func insertQuickQuestionToolCallMessage(_ payload: CompletionStreamToolCall) {
if quickQuestionMessages.contains(where: { $0.toolCallMetadata?.toolCallId == payload.toolCallId }) {
return
}
let message = toolCallMessage(for: payload)
if let assistantIndex = quickQuestionMessages.indices.last(where: { quickQuestionMessages[$0].id.hasPrefix("temp-assistant-quick-") }) {
quickQuestionMessages.insert(message, at: assistantIndex)
} else {
quickQuestionMessages.append(message)
}
}
private func toolCallMessage(for payload: CompletionStreamToolCall) -> Message {
let metadata: JSONValue = .object([
"kind": .string("tool_call"),
"toolCallId": .string(payload.toolCallId),
@@ -1779,7 +2130,7 @@ final class SybilViewModel {
? "Ran tool '\(payload.name)'."
: payload.summary
let message = Message(
return Message(
id: "temp-tool-\(payload.toolCallId)",
createdAt: Date(),
role: .tool,
@@ -1787,14 +2138,6 @@ final class SybilViewModel {
name: payload.name,
metadata: metadata
)
if let assistantIndex = pending.messages.indices.last(where: { pending.messages[$0].id.hasPrefix("temp-assistant-") }) {
pending.messages.insert(message, at: assistantIndex)
} else {
pending.messages.append(message)
}
pendingChatStates[chatID] = pending
}
private var currentChatID: String? {


@@ -26,6 +26,10 @@ struct SybilWorkspaceView: View {
@State private var isShowingPhotoPicker = false
@State private var photoPickerItems: [PhotosPickerItem] = []
@State private var isComposerDropTargeted = false
@State private var transcriptTailSpacerHeight = SybilTranscriptTailSpacer.minimumHeight
@State private var transcriptTailSpacerTargetHeight = SybilTranscriptTailSpacer.minimumHeight
@State private var transcriptViewportHeight: CGFloat = 0
@State private var pendingAssistantBaselineHeight: CGFloat?
@State private var newChatSwipeOffset: CGFloat = 0
@State private var newChatSwipeCompletionOffset: CGFloat = 0
@State private var newChatSwipeContainerWidth: CGFloat = NewChatSwipeMetrics.referenceWidth
@@ -38,6 +42,10 @@ struct SybilWorkspaceView: View {
private let customWorkspaceNavigationContentInset: CGFloat = 96
private let composerOverlayContentInset: CGFloat = 112
private var visibleTranscriptTailSpacerHeight: CGFloat {
viewModel.showsComposer && !viewModel.isSearchMode ? transcriptTailSpacerHeight : 0
}
private var isSettingsSelected: Bool {
if case .settings = viewModel.selectedItem {
return true
@@ -50,7 +58,7 @@ struct SybilWorkspaceView: View {
}
private var showsCustomWorkspaceNavigation: Bool {
usesCustomWorkspaceNavigation && (!isSettingsSelected || navigationLeadingControl == .back)
usesCustomWorkspaceNavigation && (!isSettingsSelected || navigationLeadingControl != .hidden)
}
private var transcriptScrollContextID: String {
@@ -145,6 +153,17 @@ struct SybilWorkspaceView: View {
}
resetNewChatSwipe(animated: false)
}
.onChange(of: transcriptScrollContextID) { _, _ in
handleTranscriptContextChange()
}
.onChange(of: viewModel.isSendingVisibleChat) { wasSending, isSending in
handleVisibleChatSendingChange(wasSending: wasSending, isSending: isSending)
}
.onChange(of: viewModel.errorMessage) { _, message in
if message != nil && !viewModel.isSendingVisibleChat {
resetTranscriptTailSpacer(animated: true)
}
}
.task(id: composerFocusPolicyID) {
await applyComposerFocusPolicy()
}
@@ -194,7 +213,14 @@ struct SybilWorkspaceView: View {
isLoading: viewModel.isLoadingSelection,
isSending: viewModel.isSendingVisibleChat,
topContentInset: showsCustomWorkspaceNavigation ? customWorkspaceNavigationContentInset : 0,
bottomContentInset: viewModel.showsComposer ? composerOverlayContentInset : 0
bottomContentInset: viewModel.showsComposer ? composerOverlayContentInset : 0,
tailSpacerHeight: visibleTranscriptTailSpacerHeight,
onViewportHeightChange: { height in
handleTranscriptViewportHeightChange(height)
},
onPendingAssistantHeightChange: { height in
handlePendingAssistantHeightChange(height)
}
)
.id(transcriptScrollContextID)
}
@@ -285,6 +311,86 @@ struct SybilWorkspaceView: View {
}
}
private func handleTranscriptContextChange() {
resetTranscriptTailSpacer(animated: false)
}
private func handleVisibleChatSendingChange(wasSending: Bool, isSending: Bool) {
guard !viewModel.isSearchMode else {
resetTranscriptTailSpacer(animated: true)
return
}
if isSending {
prepareTranscriptTailSpacerForReply(animated: false)
return
}
if wasSending {
if viewModel.errorMessage != nil {
resetTranscriptTailSpacer(animated: true)
}
pendingAssistantBaselineHeight = nil
}
}
private func handleTranscriptViewportHeightChange(_ height: CGFloat) {
transcriptViewportHeight = height
if viewModel.isSendingVisibleChat,
transcriptTailSpacerTargetHeight <= SybilTranscriptTailSpacer.minimumHeight {
prepareTranscriptTailSpacerForReply(animated: false)
}
}
private func handlePendingAssistantHeightChange(_ height: CGFloat) {
guard viewModel.isSendingVisibleChat, !viewModel.isSearchMode, height > 0 else {
return
}
if pendingAssistantBaselineHeight == nil {
pendingAssistantBaselineHeight = height
}
let measuredHeight = SybilTranscriptTailSpacer.placeholderHeight(
targetHeight: transcriptTailSpacerTargetHeight,
baselineAssistantHeight: pendingAssistantBaselineHeight ?? height,
currentAssistantHeight: height
)
let nextHeight = min(transcriptTailSpacerHeight, measuredHeight)
setTranscriptTailSpacer(nextHeight, animated: false)
}
private func prepareTranscriptTailSpacerForReply(animated: Bool) {
let targetHeight = SybilTranscriptTailSpacer.replyBufferHeight(for: transcriptViewportHeight)
transcriptTailSpacerTargetHeight = targetHeight
pendingAssistantBaselineHeight = nil
setTranscriptTailSpacer(targetHeight, animated: animated)
}
private func resetTranscriptTailSpacer(animated: Bool) {
transcriptTailSpacerTargetHeight = SybilTranscriptTailSpacer.minimumHeight
pendingAssistantBaselineHeight = nil
setTranscriptTailSpacer(SybilTranscriptTailSpacer.minimumHeight, animated: animated)
}
private func setTranscriptTailSpacer(_ height: CGFloat, animated: Bool) {
let nextHeight = SybilTranscriptTailSpacer.clampedHeight(height)
guard abs(nextHeight - transcriptTailSpacerHeight) >= 0.5 else {
return
}
let update = {
transcriptTailSpacerHeight = nextHeight
}
if animated {
withAnimation(.easeOut(duration: 0.22), update)
} else {
update()
}
}
private func beginNewChatSwipe(containerWidth: CGFloat) {
let update = {
newChatSwipeContainerWidth = max(containerWidth, 1)
@@ -495,7 +601,7 @@ struct SybilWorkspaceView: View {
Divider()
ForEach(Provider.allCases, id: \.self) { candidate in
ForEach(viewModel.providerOptions, id: \.self) { candidate in
Menu(candidate.displayName) {
let models = viewModel.modelOptions(for: candidate)
if models.isEmpty {
@@ -702,6 +808,10 @@ struct SybilWorkspaceView: View {
return
}
if !viewModel.isSearchMode {
prepareTranscriptTailSpacerForReply(animated: false)
}
#if !targetEnvironment(macCatalyst)
if !viewModel.isSearchMode {
composerFocused = false
@@ -771,6 +881,37 @@ struct SybilWorkspaceView: View {
}
}
enum SybilTranscriptTailSpacer {
static let minimumHeight: CGFloat = 20
static let replyBufferMin: CGFloat = 288
static let replyBufferMax: CGFloat = 576
static let replyBufferViewportRatio: CGFloat = 0.52
static func replyBufferHeight(for viewportHeight: CGFloat) -> CGFloat {
guard viewportHeight > 0 else {
return replyBufferMin
}
return min(
replyBufferMax,
max(replyBufferMin, (viewportHeight * replyBufferViewportRatio).rounded())
)
}
static func clampedHeight(_ height: CGFloat) -> CGFloat {
max(minimumHeight, height.rounded(.up))
}
static func placeholderHeight(
targetHeight: CGFloat,
baselineAssistantHeight: CGFloat,
currentAssistantHeight: CGFloat
) -> CGFloat {
let consumedHeight = max(currentAssistantHeight - baselineAssistantHeight, 0)
return clampedHeight(targetHeight - consumedHeight)
}
}
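The spacer sizing above is pure arithmetic, so it can be checked outside SwiftUI. A TypeScript transliteration of `clampedHeight` and `placeholderHeight` (the app code is Swift; this sketch only mirrors `SybilTranscriptTailSpacer` for illustration):

```typescript
// Transliteration of the Swift tail-spacer math, for illustration only.
const MINIMUM_HEIGHT = 20;

function clampedHeight(height: number): number {
  // Swift's rounded(.up) is a ceil; the spacer never drops below the floor.
  return Math.max(MINIMUM_HEIGHT, Math.ceil(height));
}

function placeholderHeight(
  targetHeight: number,
  baselineAssistantHeight: number,
  currentAssistantHeight: number
): number {
  // The spacer gives back exactly the height the streaming reply has consumed.
  const consumedHeight = Math.max(currentAssistantHeight - baselineAssistantHeight, 0);
  return clampedHeight(targetHeight - consumedHeight);
}
```

As the assistant bubble grows during streaming, the placeholder shrinks one-for-one until it hits the 20pt floor, which matches the `transcriptTailSpacerContractsAsContentGrows` test below.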
enum NewChatSwipeMetrics {
static let referenceWidth: CGFloat = 390
static let horizontalActivationDistance: CGFloat = 18


@@ -6,13 +6,22 @@ import Testing
private struct MockClientCallSnapshot: Sendable {
var listChats = 0
var listSearches = 0
var createChat = 0
var getChat = 0
var getSearch = 0
var getActiveRuns = 0
var runCompletionStream = 0
var attachCompletionStream = 0
var attachSearchStream = 0
}
private struct ChatCreateCallSnapshot: Sendable {
var title: String?
var provider: Provider?
var model: String?
var messages: [CompletionRequestMessage]?
}
private struct UnexpectedClientCall: Error {}
private actor MockSybilClient: SybilAPIClienting {
@@ -24,6 +33,9 @@ private actor MockSybilClient: SybilAPIClienting {
private let activeRunsResponse: ActiveRunsResponse
private var snapshot = MockClientCallSnapshot()
private var lastCreateChatCall: ChatCreateCallSnapshot?
private var lastCompletionStreamBody: CompletionStreamRequest?
private var completionStreamEvents: [CompletionStreamEvent]?
private var getChatDelayNanoseconds: UInt64 = 0
private var getSearchDelayNanoseconds: UInt64 = 0
private var completionStreamNetworkErrorMessage: String?
@@ -55,6 +67,19 @@ private actor MockSybilClient: SybilAPIClienting {
snapshot
}
func currentCreateChatCall() -> ChatCreateCallSnapshot? {
lastCreateChatCall
}
func currentCompletionStreamBody() -> CompletionStreamRequest? {
lastCompletionStreamBody
}
func setCompletionStreamEvents(_ events: [CompletionStreamEvent], delayNanoseconds: UInt64 = 0) {
completionStreamEvents = events
completionStreamDelayNanoseconds = delayNanoseconds
}
func setCompletionStreamNetworkError(_ message: String, delayNanoseconds: UInt64 = 0) {
completionStreamNetworkErrorMessage = message
completionStreamDelayNanoseconds = delayNanoseconds
@@ -100,7 +125,19 @@ private actor MockSybilClient: SybilAPIClienting {
return chatsResponse
}
func createChat(title: String?) async throws -> ChatSummary {
func createChat(
title: String?,
provider: Provider?,
model: String?,
messages: [CompletionRequestMessage]?
) async throws -> ChatSummary {
snapshot.createChat += 1
lastCreateChatCall = ChatCreateCallSnapshot(
title: title,
provider: provider,
model: model,
messages: messages
)
if let createChatResponse {
return createChatResponse
}
@@ -167,12 +204,20 @@ private actor MockSybilClient: SybilAPIClienting {
body: CompletionStreamRequest,
onEvent: @escaping @Sendable (CompletionStreamEvent) async -> Void
) async throws {
snapshot.runCompletionStream += 1
lastCompletionStreamBody = body
if completionStreamDelayNanoseconds > 0 {
try await Task.sleep(nanoseconds: completionStreamDelayNanoseconds)
}
if let completionStreamNetworkErrorMessage {
throw APIError.networkError(message: completionStreamNetworkErrorMessage)
}
if let completionStreamEvents {
for event in completionStreamEvents {
await onEvent(event)
}
return
}
throw UnexpectedClientCall()
}
@@ -470,6 +515,117 @@ private func makeSearchDetail(id: String, date: Date, answer: String) -> SearchD
await sendTask.value
}
@MainActor
@Test func quickQuestionRunsNonPersistentCompletionStream() async throws {
let client = MockSybilClient()
await client.setCompletionStreamEvents([
.delta(CompletionStreamDelta(text: "Reset it from ")),
.done(CompletionStreamDone(text: "Reset it from Settings."))
])
let viewModel = SybilViewModel(settings: testSettings(named: #function)) { _ in client }
viewModel.isAuthenticated = true
viewModel.isCheckingSession = false
viewModel.quickQuestionPrompt = "How do I reset my password?"
let task = viewModel.sendQuickQuestion()
await task?.value
let snapshot = await client.currentSnapshot()
let body = await client.currentCompletionStreamBody()
#expect(snapshot.runCompletionStream == 1)
#expect(body?.persist == false)
#expect(body?.chatId == nil)
#expect(body?.provider == .openai)
#expect(body?.messages.first?.role == .user)
#expect(body?.messages.first?.content == "How do I reset my password?")
#expect(viewModel.quickQuestionAnswerText == "Reset it from Settings.")
#expect(!viewModel.isQuickQuestionSending)
}
@MainActor
@Test func quickQuestionConvertCreatesSeededChat() async throws {
let date = Date(timeIntervalSince1970: 1_700_000_250)
let chat = makeChatSummary(id: "quick-chat", date: date)
let detail = ChatDetail(
id: chat.id,
title: chat.title,
createdAt: chat.createdAt,
updatedAt: chat.updatedAt,
initiatedProvider: .openai,
initiatedModel: "gpt-4.1-mini",
lastUsedProvider: .openai,
lastUsedModel: "gpt-4.1-mini",
messages: [
Message(id: "quick-user", createdAt: date, role: .user, content: "How do I reset my password?", name: nil),
Message(id: "quick-assistant", createdAt: date, role: .assistant, content: "Reset it from Settings.", name: nil)
]
)
let client = MockSybilClient(
chatsResponse: [chat],
chatDetails: [chat.id: detail],
createChatResponse: chat
)
let viewModel = SybilViewModel(settings: testSettings(named: #function)) { _ in client }
viewModel.isAuthenticated = true
viewModel.isCheckingSession = false
viewModel.quickQuestionSubmittedPrompt = "How do I reset my password?"
viewModel.quickQuestionSubmittedProvider = .openai
viewModel.quickQuestionSubmittedModel = "gpt-4.1-mini"
viewModel.quickQuestionMessages = [
Message(
id: "temp-assistant-quick",
createdAt: date,
role: .assistant,
content: "Reset it from Settings.",
name: nil
)
]
let didConvert = await viewModel.convertQuickQuestionToChat()
let snapshot = await client.currentSnapshot()
let createCall = await client.currentCreateChatCall()
#expect(didConvert)
#expect(snapshot.createChat == 1)
#expect(createCall?.title == "How do I reset my password?")
#expect(createCall?.provider == .openai)
#expect(createCall?.model == "gpt-4.1-mini")
#expect(createCall?.messages?.map(\.role) == [.user, .assistant])
#expect(createCall?.messages?.map(\.content) == ["How do I reset my password?", "Reset it from Settings."])
#expect(viewModel.selectedItem == .chat("quick-chat"))
#expect(viewModel.quickQuestionPrompt.isEmpty)
}
@MainActor
@Test func quickQuestionProviderAndModelSelectionPersistSeparately() async throws {
let defaults = UserDefaults(suiteName: #function)!
defaults.removePersistentDomain(forName: #function)
let settings = SybilSettingsStore(defaults: defaults)
settings.apiBaseURL = "http://127.0.0.1:8787"
let viewModel = SybilViewModel(settings: settings) { _ in MockSybilClient() }
viewModel.modelCatalog = [
.openai: ProviderModelInfo(models: ["gpt-4.1-mini", "gpt-4o"], loadedAt: nil, error: nil),
.anthropic: ProviderModelInfo(models: ["claude-3-5-sonnet-latest", "claude-3-haiku"], loadedAt: nil, error: nil)
]
viewModel.setQuickQuestionProvider(.anthropic)
viewModel.setQuickQuestionModel("claude-3-haiku")
#expect(viewModel.quickQuestionProvider == .anthropic)
#expect(viewModel.quickQuestionModel == "claude-3-haiku")
#expect(settings.preferredProvider == .openai)
let reloadedSettings = SybilSettingsStore(defaults: defaults)
#expect(reloadedSettings.quickQuestionPreferredProvider == .anthropic)
#expect(reloadedSettings.quickQuestionPreferredModelByProvider[.anthropic] == "claude-3-haiku")
#expect(reloadedSettings.preferredProvider == .openai)
let reloadedViewModel = SybilViewModel(settings: reloadedSettings) { _ in MockSybilClient() }
#expect(reloadedViewModel.quickQuestionProvider == .anthropic)
#expect(reloadedViewModel.quickQuestionModel == "claude-3-haiku")
#expect(reloadedViewModel.provider == .openai)
}
@MainActor
@Test func reconnectAttachesSelectedActiveChatStream() async throws {
let date = Date(timeIntervalSince1970: 1_700_000_260)
@@ -628,3 +784,23 @@ private func makeSearchDetail(id: String, date: Date, answer: String) -> SearchD
#expect(BackSwipeMetrics.shouldComplete(offset: 24, velocityX: 800, width: width, isLatched: false))
#expect(!BackSwipeMetrics.shouldComplete(offset: latchDistance + 1, velocityX: -800, width: width, isLatched: true))
}
@Test func transcriptTailSpacerContractsAsContentGrows() async throws {
let targetHeight: CGFloat = 320
let baselineAssistantHeight: CGFloat = 28
#expect(
SybilTranscriptTailSpacer.placeholderHeight(
targetHeight: targetHeight,
baselineAssistantHeight: baselineAssistantHeight,
currentAssistantHeight: baselineAssistantHeight + 120
) == 200
)
#expect(
SybilTranscriptTailSpacer.placeholderHeight(
targetHeight: targetHeight,
baselineAssistantHeight: baselineAssistantHeight,
currentAssistantHeight: baselineAssistantHeight + 500
) == SybilTranscriptTailSpacer.minimumHeight
)
}


@@ -1,7 +1,7 @@
# Sybil Server
Backend API for:
- LLM multiplexer (OpenAI Responses / Anthropic / xAI Chat Completions-compatible Grok)
- LLM multiplexer (OpenAI Responses / Anthropic / xAI Chat Completions-compatible Grok / Hermes Agent)
- Personal chat database (chats/messages + LLM call log)
## Stack
@@ -43,6 +43,9 @@ If `ADMIN_TOKEN` is not set, the server runs in open mode (dev).
- `OPENAI_API_KEY`
- `ANTHROPIC_API_KEY`
- `XAI_API_KEY`
- `HERMES_AGENT_API_BASE_URL` (`http://127.0.0.1:8642/v1` by default; include the `/v1` suffix)
- `HERMES_AGENT_API_KEY` (enables the Hermes Agent provider; set to Hermes `API_SERVER_KEY`, or any non-empty value if that local server does not require auth)
- `HERMES_AGENT_MODEL` (optional fallback/override model id; defaults client-side to `hermes-agent`)
- `EXA_API_KEY`
- `CHAT_WEB_SEARCH_ENGINE` (`exa` by default, or `searxng` for chat tool calls only)
- `SEARXNG_BASE_URL` (required when `CHAT_WEB_SEARCH_ENGINE=searxng`; instance must allow `format=json`)
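For example, a minimal `.env` fragment enabling the Hermes Agent provider against a local server (values are placeholders, not real credentials):

```shell
# Hermes Agent provider (hypothetical local setup; adjust host/port to yours)
HERMES_AGENT_API_BASE_URL=http://127.0.0.1:8642/v1   # keep the /v1 suffix
HERMES_AGENT_API_KEY=local-dev                       # any non-empty value if the server skips auth
# HERMES_AGENT_MODEL=hermes-agent                    # optional fallback/override model id
```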


@@ -13,6 +13,7 @@ enum Provider {
openai
anthropic
xai
hermes_agent @map("hermes-agent")
}
enum MessageRole {


@@ -11,6 +11,13 @@ const OptionalUrlSchema = z.preprocess(
z.string().trim().url().optional()
);
const DEFAULT_HERMES_AGENT_API_BASE_URL = "http://127.0.0.1:8642/v1";
const HermesAgentApiBaseUrlSchema = z.preprocess(
(value) => (typeof value === "string" && value.trim() === "" ? undefined : value),
z.string().trim().url().default(DEFAULT_HERMES_AGENT_API_BASE_URL)
);
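The preprocess step above maps blank strings to `undefined` so zod's `.default(...)` applies. A dependency-free sketch of the resulting behavior (this helper is illustrative, not part of the codebase; the real schema additionally rejects non-URL strings via `.url()`):

```typescript
// Illustrative resolver mirroring HermesAgentApiBaseUrlSchema: blank or
// missing values fall back to the default, other strings are trimmed.
const DEFAULT_HERMES_AGENT_API_BASE_URL = "http://127.0.0.1:8642/v1";

function resolveHermesBaseUrl(value: unknown): string {
  if (typeof value === "string" && value.trim() !== "") return value.trim();
  return DEFAULT_HERMES_AGENT_API_BASE_URL;
}
```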
const ChatWebSearchEngineSchema = z.preprocess(
(value) => {
if (typeof value !== "string") return value;
@@ -59,6 +66,9 @@ const EnvSchema = z.object({
OPENAI_API_KEY: z.string().optional(),
ANTHROPIC_API_KEY: z.string().optional(),
XAI_API_KEY: z.string().optional(),
HERMES_AGENT_API_BASE_URL: HermesAgentApiBaseUrlSchema,
HERMES_AGENT_API_KEY: OptionalTrimmedStringSchema,
HERMES_AGENT_MODEL: OptionalTrimmedStringSchema,
EXA_API_KEY: z.string().optional(),
// Chat-mode web_search tool configuration. Search mode remains Exa-only for now.


@@ -385,6 +385,10 @@ function normalizeIncomingMessages(messages: ChatMessage[]) {
return [{ role: "system", content: CHAT_TOOL_SYSTEM_PROMPT }, ...normalized];
}
function normalizePlainIncomingMessages(messages: ChatMessage[]) {
return messages.map((message) => buildOpenAIConversationMessage(message));
}
function normalizeIncomingResponsesInput(messages: ChatMessage[]) {
const normalized = messages.map((message) => buildOpenAIResponsesInputMessage(message));
@@ -853,6 +857,20 @@ function extractResponsesText(response: any, fallback = "") {
return parts.join("") || fallback;
}
function extractChatCompletionContent(message: any) {
if (typeof message?.content === "string") return message.content;
if (!Array.isArray(message?.content)) return "";
return message.content
.map((part: any) => {
if (typeof part === "string") return part;
if (typeof part?.text === "string") return part.text;
if (typeof part?.content === "string") return part.content;
return "";
})
.join("");
}
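`extractChatCompletionContent` flattens both plain-string and multi-part message content. Restated from the diff above so the example runs standalone:

```typescript
// Restated for a self-contained example: flattens a chat.completions
// message's content, whether it is a string or an array of parts.
function extractChatCompletionContent(message: any): string {
  if (typeof message?.content === "string") return message.content;
  if (!Array.isArray(message?.content)) return "";
  return message.content
    .map((part: any) => {
      if (typeof part === "string") return part;
      if (typeof part?.text === "string") return part.text;
      if (typeof part?.content === "string") return part.content;
      return "";
    })
    .join("");
}
```

For instance, `{ content: [{ text: "Hi" }, " there"] }` flattens to `"Hi there"`, while a missing or non-array `content` yields an empty string.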
function getUnstreamedText(finalText: string, streamedText: string) {
if (!finalText) return "";
if (!streamedText) return finalText;
@@ -1093,6 +1111,26 @@ export async function runToolAwareChatCompletions(params: ToolAwareCompletionPar
};
}
export async function runPlainChatCompletions(params: ToolAwareCompletionParams): Promise<ToolAwareCompletionResult> {
const completion = await params.client.chat.completions.create({
model: params.model,
messages: normalizePlainIncomingMessages(params.messages),
temperature: params.temperature,
max_tokens: params.maxTokens,
} as any);
const usageAcc: Required<ToolAwareUsage> = { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
const sawUsage = mergeUsage(usageAcc, completion?.usage);
const message = completion?.choices?.[0]?.message;
return {
text: extractChatCompletionContent(message),
usage: sawUsage ? usageAcc : undefined,
raw: { response: completion, api: "chat.completions" },
toolEvents: [],
};
}
export async function* runToolAwareOpenAIChatStream(
params: ToolAwareCompletionParams
): AsyncGenerator<ToolAwareStreamingEvent> {
@@ -1354,3 +1392,41 @@ export async function* runToolAwareChatCompletionsStream(
},
};
}
export async function* runPlainChatCompletionsStream(
params: ToolAwareCompletionParams
): AsyncGenerator<ToolAwareStreamingEvent> {
const rawResponses: unknown[] = [];
const usageAcc: Required<ToolAwareUsage> = { inputTokens: 0, outputTokens: 0, totalTokens: 0 };
let sawUsage = false;
let text = "";
const stream = await params.client.chat.completions.create({
model: params.model,
messages: normalizePlainIncomingMessages(params.messages),
temperature: params.temperature,
max_tokens: params.maxTokens,
stream: true,
} as any);
for await (const chunk of stream as any as AsyncIterable<any>) {
rawResponses.push(chunk);
sawUsage = mergeUsage(usageAcc, chunk?.usage) || sawUsage;
const deltaText = chunk?.choices?.[0]?.delta?.content ?? "";
if (typeof deltaText === "string" && deltaText.length) {
text += deltaText;
yield { type: "delta", text: deltaText };
}
}
yield {
type: "done",
result: {
text,
usage: sawUsage ? usageAcc : undefined,
raw: { streamed: true, responses: rawResponses, api: "chat.completions" },
toolEvents: [],
},
};
}
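The streaming loop above only accumulates `choices[0].delta.content`. Its accumulation behavior can be sketched with mock chunks (the chunk shape is an assumption standing in for the OpenAI SDK stream, and the generator here is synchronous for brevity):

```typescript
// Mock-driven sketch of the plain streaming accumulation: delta chunks are
// concatenated, empty deltas are skipped, and "done" carries the full text.
type StreamEvent = { type: "delta"; text: string } | { type: "done"; text: string };
type MockChunk = { choices?: Array<{ delta?: { content?: string } }> };

function* accumulateDeltas(chunks: MockChunk[]): Generator<StreamEvent> {
  let text = "";
  for (const chunk of chunks) {
    const deltaText = chunk?.choices?.[0]?.delta?.content ?? "";
    if (deltaText.length) {
      text += deltaText;
      yield { type: "delta", text: deltaText };
    }
  }
  yield { type: "done", text };
}
```

A chunk with no choices produces no delta event, so consumers see only non-empty deltas followed by a single terminal `done`.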


@@ -1,5 +1,6 @@
import type { FastifyBaseLogger } from "fastify";
import { anthropicClient, openaiClient, xaiClient } from "./providers.js";
import { env } from "../env.js";
import { anthropicClient, hermesAgentClient, isHermesAgentConfigured, openaiClient, xaiClient } from "./providers.js";
import type { Provider } from "./types.js";
export type ProviderModelSnapshot = {
@@ -8,9 +9,9 @@ export type ProviderModelSnapshot = {
error: string | null;
};
export type ModelCatalogSnapshot = Record<Provider, ProviderModelSnapshot>;
export type ModelCatalogSnapshot = Partial<Record<Provider, ProviderModelSnapshot>>;
const providers: Provider[] = ["openai", "anthropic", "xai"];
const baseProviders: Provider[] = ["openai", "anthropic", "xai"];
const MODEL_FETCH_TIMEOUT_MS = 15000;
const modelCatalog: ModelCatalogSnapshot = {
@@ -19,6 +20,10 @@ const modelCatalog: ModelCatalogSnapshot = {
xai: { models: [], loadedAt: null, error: null },
};
function getCatalogProviders(): Provider[] {
return isHermesAgentConfigured() ? [...baseProviders, "hermes-agent"] : baseProviders;
}
function uniqSorted(models: string[]) {
return [...new Set(models.map((value) => value.trim()).filter(Boolean))].sort((a, b) => a.localeCompare(b));
}
@@ -59,8 +64,15 @@ async function fetchProviderModels(provider: Provider) {
return uniqSorted(page.data.map((model) => model.id));
}
const page = await xaiClient().models.list();
return uniqSorted(page.data.map((model) => model.id));
if (provider === "xai") {
const page = await xaiClient().models.list();
return uniqSorted(page.data.map((model) => model.id));
}
const page = await hermesAgentClient().models.list();
const models = page.data.map((model) => model.id);
if (env.HERMES_AGENT_MODEL) models.push(env.HERMES_AGENT_MODEL);
return uniqSorted(models);
}
async function refreshProviderModels(provider: Provider, logger?: FastifyBaseLogger) {
@@ -75,7 +87,7 @@ async function refreshProviderModels(provider: Provider, logger?: FastifyBaseLog
} catch (err: any) {
const message = err?.message ?? String(err);
modelCatalog[provider] = {
models: [],
models: provider === "hermes-agent" && env.HERMES_AGENT_MODEL ? [env.HERMES_AGENT_MODEL] : [],
loadedAt: new Date().toISOString(),
error: message,
};
@@ -84,25 +96,18 @@ async function refreshProviderModels(provider: Provider, logger?: FastifyBaseLog
}
export async function warmModelCatalog(logger?: FastifyBaseLogger) {
await Promise.all(providers.map((provider) => refreshProviderModels(provider, logger)));
await Promise.all(getCatalogProviders().map((provider) => refreshProviderModels(provider, logger)));
}
export function getModelCatalogSnapshot(): ModelCatalogSnapshot {
return {
openai: {
models: [...modelCatalog.openai.models],
loadedAt: modelCatalog.openai.loadedAt,
error: modelCatalog.openai.error,
},
anthropic: {
models: [...modelCatalog.anthropic.models],
loadedAt: modelCatalog.anthropic.loadedAt,
error: modelCatalog.anthropic.error,
},
xai: {
models: [...modelCatalog.xai.models],
loadedAt: modelCatalog.xai.loadedAt,
error: modelCatalog.xai.error,
},
};
const snapshot: ModelCatalogSnapshot = {};
for (const provider of getCatalogProviders()) {
const entry = modelCatalog[provider] ?? { models: [], loadedAt: null, error: null };
snapshot[provider] = {
models: [...entry.models],
loadedAt: entry.loadedAt,
error: entry.error,
};
}
return snapshot;
}


@@ -1,13 +1,13 @@
import { performance } from "node:perf_hooks";
import { prisma } from "../db.js";
import { anthropicClient, openaiClient, xaiClient } from "./providers.js";
import { buildToolLogMessageData, runToolAwareChatCompletions, runToolAwareOpenAIChat } from "./chat-tools.js";
import { anthropicClient, hermesAgentClient, openaiClient, xaiClient } from "./providers.js";
import { buildToolLogMessageData, runPlainChatCompletions, runToolAwareChatCompletions, runToolAwareOpenAIChat } from "./chat-tools.js";
import { buildAnthropicConversationMessage, getAnthropicSystemPrompt } from "./message-content.js";
import { toPrismaProvider } from "./provider-ids.js";
import type { MultiplexRequest, MultiplexResponse, Provider } from "./types.js";
function asProviderEnum(p: Provider) {
// Prisma enum values match these strings.
return p;
return toPrismaProvider(p);
}
export async function runMultiplex(req: MultiplexRequest): Promise<MultiplexResponse> {
@@ -84,6 +84,23 @@ export async function runMultiplex(req: MultiplexRequest): Promise<MultiplexResp
outText = r.text;
usage = r.usage;
toolMessages = r.toolEvents.map((event) => buildToolLogMessageData(call.chatId, event));
} else if (req.provider === "hermes-agent") {
const client = hermesAgentClient();
const r = await runPlainChatCompletions({
client,
model: req.model,
messages: req.messages,
temperature: req.temperature,
maxTokens: req.maxTokens,
logContext: {
provider: req.provider,
model: req.model,
chatId,
},
});
raw = r.raw;
outText = r.text;
usage = r.usage;
} else if (req.provider === "anthropic") {
const client = anthropicClient();

View File

@@ -0,0 +1,31 @@
import type { Provider } from "./types.js";
type PrismaProvider = Exclude<Provider, "hermes-agent"> | "hermes_agent";
export function toPrismaProvider(provider: Provider): PrismaProvider {
return provider === "hermes-agent" ? "hermes_agent" : provider;
}
export function fromPrismaProvider(provider: unknown): Provider | null {
if (provider === null || provider === undefined) return null;
if (provider === "hermes_agent" || provider === "hermes-agent") return "hermes-agent";
if (provider === "openai" || provider === "anthropic" || provider === "xai") return provider;
return null;
}
export function serializeProviderFields<T extends Record<string, any>>(value: T): T {
const next: Record<string, any> = { ...value };
if ("initiatedProvider" in next) {
next.initiatedProvider = fromPrismaProvider(next.initiatedProvider);
}
if ("lastUsedProvider" in next) {
next.lastUsedProvider = fromPrismaProvider(next.lastUsedProvider);
}
if ("provider" in next) {
next.provider = fromPrismaProvider(next.provider);
}
if (Array.isArray(next.calls)) {
next.calls = next.calls.map((call: Record<string, any>) => serializeProviderFields(call));
}
return next as T;
}
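The new `provider-ids.ts` module exists because the API spells the provider `hermes-agent` while a Prisma enum value cannot contain a hyphen. The round trip at that boundary can be exercised standalone (the functions below copy the diff's logic):

```typescript
// API <-> Prisma provider-id round trip, as introduced in provider-ids.ts.
type Provider = "openai" | "anthropic" | "xai" | "hermes-agent";
type PrismaProvider = Exclude<Provider, "hermes-agent"> | "hermes_agent";

function toPrismaProvider(provider: Provider): PrismaProvider {
  // Only the hyphenated id needs translation; the rest are valid enum names.
  return provider === "hermes-agent" ? "hermes_agent" : provider;
}

function fromPrismaProvider(provider: unknown): Provider | null {
  // Accept both spellings on read so stored rows and API payloads normalize.
  if (provider === "hermes_agent" || provider === "hermes-agent") return "hermes-agent";
  if (provider === "openai" || provider === "anthropic" || provider === "xai") return provider;
  return null;
}

// Every API id survives a write to and read from the database form.
for (const p of ["openai", "anthropic", "xai", "hermes-agent"] as const) {
  console.log(p, "->", toPrismaProvider(p), "->", fromPrismaProvider(toPrismaProvider(p)));
}
```

Returning `null` for unrecognized values (rather than throwing) is what lets `serializeProviderFields` scrub stale or foreign enum values out of API responses safely.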

View File

@@ -13,6 +13,18 @@ export function xaiClient() {
return new OpenAI({ apiKey: env.XAI_API_KEY, baseURL: "https://api.x.ai/v1" });
}
export function isHermesAgentConfigured() {
return Boolean(env.HERMES_AGENT_API_KEY);
}
export function hermesAgentClient() {
if (!env.HERMES_AGENT_API_KEY) throw new Error("HERMES_AGENT_API_KEY not set");
return new OpenAI({
apiKey: env.HERMES_AGENT_API_KEY,
baseURL: env.HERMES_AGENT_API_BASE_URL,
});
}
export function anthropicClient() {
if (!env.ANTHROPIC_API_KEY) throw new Error("ANTHROPIC_API_KEY not set");
return new Anthropic({ apiKey: env.ANTHROPIC_API_KEY });
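The `isHermesAgentConfigured()` / `hermesAgentClient()` pair above follows the same env-gating shape as the other clients. A minimal dependency-free sketch of that pattern, with plain objects standing in for the OpenAI SDK client (the `HERMES_AGENT_*` names match the diff; everything else is illustrative):

```typescript
// Env-gated client construction: presence of the key decides whether the
// provider is advertised at all; actual construction fails fast without it.
type Env = { HERMES_AGENT_API_KEY?: string; HERMES_AGENT_API_BASE_URL: string };

function isHermesAgentConfigured(env: Env): boolean {
  return Boolean(env.HERMES_AGENT_API_KEY);
}

function hermesAgentClientConfig(env: Env): { apiKey: string; baseURL: string } {
  // Throw at call time rather than building a client with no credentials.
  if (!env.HERMES_AGENT_API_KEY) throw new Error("HERMES_AGENT_API_KEY not set");
  return { apiKey: env.HERMES_AGENT_API_KEY, baseURL: env.HERMES_AGENT_API_BASE_URL };
}

const env: Env = {
  HERMES_AGENT_API_KEY: "local-key",
  HERMES_AGENT_API_BASE_URL: "http://127.0.0.1:8642/v1",
};
console.log(isHermesAgentConfigured(env), hermesAgentClientConfig(env).baseURL);
```

Splitting the cheap boolean check from the throwing constructor lets catalog code skip an unconfigured provider silently, while request paths still surface a clear error if they reach the client anyway.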

View File

@@ -1,13 +1,15 @@
import { performance } from "node:perf_hooks";
import { prisma } from "../db.js";
import { anthropicClient, openaiClient, xaiClient } from "./providers.js";
import { anthropicClient, hermesAgentClient, openaiClient, xaiClient } from "./providers.js";
import {
buildToolLogMessageData,
runPlainChatCompletionsStream,
runToolAwareChatCompletionsStream,
runToolAwareOpenAIChatStream,
type ToolExecutionEvent,
} from "./chat-tools.js";
import { buildAnthropicConversationMessage, getAnthropicSystemPrompt } from "./message-content.js";
import { toPrismaProvider } from "./provider-ids.js";
import type { MultiplexRequest, Provider } from "./types.js";
type StreamUsage = {
@@ -38,7 +40,7 @@ export async function* runMultiplexStream(req: MultiplexRequest): AsyncGenerator
? await prisma.llmCall.create({
data: {
chatId,
provider: req.provider as any,
provider: toPrismaProvider(req.provider) as any,
model: req.model,
request: req as any,
},
@@ -51,14 +53,14 @@ export async function* runMultiplexStream(req: MultiplexRequest): AsyncGenerator
prisma.chat.update({
where: { id: chatId },
data: {
lastUsedProvider: req.provider as any,
lastUsedProvider: toPrismaProvider(req.provider) as any,
lastUsedModel: req.model,
},
}),
prisma.chat.updateMany({
where: { id: chatId, initiatedProvider: null },
data: {
initiatedProvider: req.provider as any,
initiatedProvider: toPrismaProvider(req.provider) as any,
initiatedModel: req.model,
},
}),
@@ -72,8 +74,8 @@ export async function* runMultiplexStream(req: MultiplexRequest): AsyncGenerator
let raw: unknown = { streamed: true };
try {
if (req.provider === "openai" || req.provider === "xai") {
const client = req.provider === "openai" ? openaiClient() : xaiClient();
if (req.provider === "openai" || req.provider === "xai" || req.provider === "hermes-agent") {
const client = req.provider === "openai" ? openaiClient() : req.provider === "xai" ? xaiClient() : hermesAgentClient();
const streamEvents =
req.provider === "openai"
? runToolAwareOpenAIChatStream({
@@ -88,6 +90,19 @@ export async function* runMultiplexStream(req: MultiplexRequest): AsyncGenerator
chatId: chatId ?? undefined,
},
})
: req.provider === "hermes-agent"
? runPlainChatCompletionsStream({
client,
model: req.model,
messages: req.messages,
temperature: req.temperature,
maxTokens: req.maxTokens,
logContext: {
provider: req.provider,
model: req.model,
chatId: chatId ?? undefined,
},
})
: runToolAwareChatCompletionsStream({
client,
model: req.model,

View File

@@ -1,4 +1,6 @@
export type Provider = "openai" | "anthropic" | "xai";
export const PROVIDERS = ["openai", "anthropic", "xai", "hermes-agent"] as const;
export type Provider = (typeof PROVIDERS)[number];
export type ChatImageAttachment = {
kind: "image";
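The `PROVIDERS` change above swaps a hand-written union for a type derived from a readonly array, which also yields a runtime membership check for free. An illustrative sketch:

```typescript
// Deriving the Provider union from a const array keeps the runtime list and
// the compile-time type in sync from a single source of truth.
const PROVIDERS = ["openai", "anthropic", "xai", "hermes-agent"] as const;
type Provider = (typeof PROVIDERS)[number]; // "openai" | "anthropic" | "xai" | "hermes-agent"

function isProvider(value: unknown): value is Provider {
  return typeof value === "string" && (PROVIDERS as readonly string[]).includes(value);
}

console.log(isProvider("hermes-agent")); // true
console.log(isProvider("hermes_agent")); // false: the Prisma spelling is not an API id
```

Adding a fifth provider then means editing one array literal; the union type, any `Record<Provider, …>` maps, and the guard all pick it up automatically.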

View File

@@ -10,9 +10,12 @@ import { runMultiplex } from "./llm/multiplexer.js";
import { runMultiplexStream, type StreamEvent } from "./llm/streaming.js";
import { getModelCatalogSnapshot } from "./llm/model-catalog.js";
import { openaiClient } from "./llm/providers.js";
import { serializeProviderFields, toPrismaProvider } from "./llm/provider-ids.js";
import { exaClient } from "./search/exa.js";
import type { ChatAttachment } from "./llm/types.js";
const ProviderSchema = z.enum(["openai", "anthropic", "xai", "hermes-agent"]);
type IncomingChatMessage = {
role: "system" | "user" | "assistant" | "tool";
content: string;
@@ -125,7 +128,7 @@ const CompletionStreamBody = z
.object({
chatId: z.string().optional(),
persist: z.boolean().optional(),
provider: z.enum(["openai", "anthropic", "xai"]),
provider: ProviderSchema,
model: z.string().min(1),
messages: z.array(CompletionMessageSchema),
temperature: z.number().min(0).max(2).optional(),
@@ -591,7 +594,7 @@ export async function registerRoutes(app: FastifyInstance) {
lastUsedModel: true,
},
});
return { chats };
return { chats: chats.map((chat) => serializeProviderFields(chat)) };
});
app.post("/v1/chats", async (req) => {
@@ -599,7 +602,7 @@ export async function registerRoutes(app: FastifyInstance) {
const Body = z
.object({
title: z.string().optional(),
provider: z.enum(["openai", "anthropic", "xai"]).optional(),
provider: ProviderSchema.optional(),
model: z.string().trim().min(1).optional(),
messages: z.array(CompletionMessageSchema).optional(),
})
@@ -625,9 +628,9 @@ export async function registerRoutes(app: FastifyInstance) {
const chat = await prisma.chat.create({
data: {
title: body.title,
initiatedProvider: body.provider as any,
initiatedProvider: body.provider ? (toPrismaProvider(body.provider) as any) : undefined,
initiatedModel: body.model,
lastUsedProvider: body.provider as any,
lastUsedProvider: body.provider ? (toPrismaProvider(body.provider) as any) : undefined,
lastUsedModel: body.model,
messages: body.messages?.length
? {
@@ -651,7 +654,7 @@ export async function registerRoutes(app: FastifyInstance) {
lastUsedModel: true,
},
});
return { chat };
return { chat: serializeProviderFields(chat) };
});
app.patch("/v1/chats/:chatId", async (req) => {
@@ -682,7 +685,7 @@ export async function registerRoutes(app: FastifyInstance) {
},
});
if (!chat) return app.httpErrors.notFound("chat not found");
return { chat };
return { chat: serializeProviderFields(chat) };
});
app.post("/v1/chats/title/suggest", async (req) => {
@@ -707,7 +710,7 @@ export async function registerRoutes(app: FastifyInstance) {
},
});
if (!existing) return app.httpErrors.notFound("chat not found");
if (existing.title?.trim()) return { chat: existing };
if (existing.title?.trim()) return { chat: serializeProviderFields(existing) };
const fallback = body.content.split(/\r?\n/)[0]?.trim().slice(0, 48) || "New chat";
const suggestedRaw = await generateChatTitle(body.content);
@@ -728,7 +731,7 @@ export async function registerRoutes(app: FastifyInstance) {
},
});
return { chat };
return { chat: serializeProviderFields(chat) };
});
app.delete("/v1/chats/:chatId", async (req) => {
@@ -848,7 +851,7 @@ export async function registerRoutes(app: FastifyInstance) {
},
});
return { chat };
return { chat: serializeProviderFields(chat) };
});
app.post("/v1/searches/:searchId/run", async (req) => {
@@ -994,7 +997,7 @@ export async function registerRoutes(app: FastifyInstance) {
include: { messages: { orderBy: { createdAt: "asc" } }, calls: { orderBy: { createdAt: "desc" } } },
});
if (!chat) return app.httpErrors.notFound("chat not found");
return { chat };
return { chat: serializeProviderFields(chat) };
});
app.post("/v1/chats/:chatId/messages", async (req) => {
@@ -1041,7 +1044,7 @@ export async function registerRoutes(app: FastifyInstance) {
const Body = z.object({
chatId: z.string().optional(),
provider: z.enum(["openai", "anthropic", "xai"]),
provider: ProviderSchema,
model: z.string().min(1),
messages: z.array(CompletionMessageSchema),
temperature: z.number().min(0).max(2).optional(),

View File

@@ -1,6 +1,7 @@
import assert from "node:assert/strict";
import test from "node:test";
import {
runPlainChatCompletionsStream,
runToolAwareChatCompletionsStream,
runToolAwareOpenAIChatStream,
type ToolAwareStreamingEvent,
@@ -105,3 +106,37 @@ test("OpenAI-compatible Chat Completions stream emits text deltas as they arrive
);
assert.equal(events.at(-1)?.type === "done" ? events.at(-1)?.result.text : null, "Hello");
});
test("plain Chat Completions stream does not send Sybil-managed tools", async () => {
let requestBody: any = null;
const client = {
chat: {
completions: {
create: async (body: any) => {
requestBody = body;
return streamFrom([
{ choices: [{ delta: { content: "Hi" } }] },
{ choices: [{ delta: {}, finish_reason: "stop" }] },
]);
},
},
},
};
const events = await collectEvents(
runPlainChatCompletionsStream({
client: client as any,
model: "hermes-agent",
messages: [{ role: "user", content: "Say hi" }],
})
);
assert.equal(requestBody.model, "hermes-agent");
assert.equal(requestBody.stream, true);
assert.equal("tools" in requestBody, false);
assert.deepEqual(
events.map((event) => event.type),
["delta", "done"]
);
assert.equal(events.at(-1)?.type === "done" ? events.at(-1)?.result.text : null, "Hi");
});

View File

@@ -0,0 +1,12 @@
import assert from "node:assert/strict";
import test from "node:test";
import { fromPrismaProvider, serializeProviderFields, toPrismaProvider } from "../src/llm/provider-ids.js";
test("Hermes Agent provider id maps between API and Prisma enum forms", () => {
assert.equal(toPrismaProvider("hermes-agent"), "hermes_agent");
assert.equal(fromPrismaProvider("hermes_agent"), "hermes-agent");
assert.deepEqual(serializeProviderFields({ initiatedProvider: "hermes_agent", lastUsedProvider: "xai" }), {
initiatedProvider: "hermes-agent",
lastUsedProvider: "xai",
});
});

View File

@@ -23,7 +23,7 @@ Configuration is environment-only (no in-app settings).
- `SYBIL_TUI_API_BASE_URL`: API base URL. Default: `http://127.0.0.1:8787`
- `SYBIL_TUI_ADMIN_TOKEN`: optional bearer token for token-mode servers
- `SYBIL_TUI_DEFAULT_PROVIDER`: `openai` | `anthropic` | `xai` (default: `openai`)
- `SYBIL_TUI_DEFAULT_PROVIDER`: `openai` | `anthropic` | `xai` | `hermes-agent` (default: `openai`)
- `SYBIL_TUI_DEFAULT_MODEL`: optional default model name
- `SYBIL_TUI_SEARCH_NUM_RESULTS`: results per search run (default: `10`)

View File

@@ -1,6 +1,6 @@
import type { Provider } from "./types.js";
const PROVIDERS: Provider[] = ["openai", "anthropic", "xai"];
const PROVIDERS: Provider[] = ["openai", "anthropic", "xai", "hermes-agent"];
function normalizeBaseUrl(value: string) {
const trimmed = value.trim();

View File

@@ -39,11 +39,13 @@ type ToolLogMetadata = {
resultPreview?: string | null;
};
const PROVIDERS: Provider[] = ["openai", "anthropic", "xai"];
const BASE_PROVIDERS: Provider[] = ["openai", "anthropic", "xai"];
const PROVIDERS: Provider[] = [...BASE_PROVIDERS, "hermes-agent"];
const PROVIDER_FALLBACK_MODELS: Record<Provider, string[]> = {
openai: ["gpt-4.1-mini"],
anthropic: ["claude-3-5-sonnet-latest"],
xai: ["grok-3-mini"],
"hermes-agent": ["hermes-agent"],
};
const EMPTY_MODEL_CATALOG: ModelCatalogResponse["providers"] = {
@@ -74,6 +76,7 @@ function getProviderLabel(provider: Provider | null | undefined) {
if (provider === "openai") return "OpenAI";
if (provider === "anthropic") return "Anthropic";
if (provider === "xai") return "xAI";
if (provider === "hermes-agent") return "Hermes Agent";
return "";
}
@@ -159,6 +162,10 @@ function getModelOptions(catalog: ModelCatalogResponse["providers"], provider: P
return PROVIDER_FALLBACK_MODELS[provider];
}
function getVisibleProviders(catalog: ModelCatalogResponse["providers"]) {
return PROVIDERS.filter((provider) => provider !== "hermes-agent" || catalog[provider] !== undefined);
}
function pickProviderModel(options: string[], preferred: string | null, fallback: string | null = null) {
if (fallback && options.includes(fallback)) return fallback;
if (preferred && options.includes(preferred)) return preferred;
@@ -202,6 +209,7 @@ async function main() {
openai: null,
anthropic: null,
xai: null,
"hermes-agent": null,
};
let model: string = config.defaultModel ?? pickProviderModel(getModelOptions(modelCatalog, provider), null);
let errorMessage: string | null = null;
@@ -1257,8 +1265,10 @@ async function main() {
}
function cycleProvider() {
const currentIndex = PROVIDERS.indexOf(provider);
const nextProvider: Provider = PROVIDERS[(currentIndex + 1) % PROVIDERS.length] ?? "openai";
const visibleProviders = getVisibleProviders(modelCatalog);
const cycleProviders = visibleProviders.length ? visibleProviders : BASE_PROVIDERS;
const currentIndex = Math.max(0, cycleProviders.indexOf(provider));
const nextProvider: Provider = cycleProviders[(currentIndex + 1) % cycleProviders.length] ?? "openai";
provider = nextProvider;
syncModelForProvider();
updateUI();

View File

@@ -1,4 +1,4 @@
export type Provider = "openai" | "anthropic" | "xai";
export type Provider = "openai" | "anthropic" | "xai" | "hermes-agent";
export type ProviderModelInfo = {
models: string[];
@@ -7,7 +7,7 @@ export type ProviderModelInfo = {
};
export type ModelCatalogResponse = {
providers: Record<Provider, ProviderModelInfo>;
providers: Partial<Record<Provider, ProviderModelInfo>>;
};
export type ChatSummary = {

View File

@@ -95,6 +95,7 @@ const PROVIDER_FALLBACK_MODELS: Record<Provider, string[]> = {
openai: ["gpt-4.1-mini"],
anthropic: ["claude-3-5-sonnet-latest"],
xai: ["grok-3-mini"],
"hermes-agent": ["hermes-agent"],
};
const EMPTY_MODEL_CATALOG: ModelCatalogResponse["providers"] = {
@@ -103,6 +104,9 @@ const EMPTY_MODEL_CATALOG: ModelCatalogResponse["providers"] = {
xai: { models: [], loadedAt: null, error: null },
};
const BASE_PROVIDERS: Provider[] = ["openai", "anthropic", "xai"];
const ALL_PROVIDERS: Provider[] = [...BASE_PROVIDERS, "hermes-agent"];
const MODEL_PREFERENCES_STORAGE_KEY = "sybil:modelPreferencesByProvider";
const QUICK_QUESTION_MODEL_SELECTION_STORAGE_KEY = "sybil:quickQuestionModelSelection";
@@ -117,6 +121,7 @@ const EMPTY_MODEL_PREFERENCES: ProviderModelPreferences = {
openai: null,
anthropic: null,
xai: null,
"hermes-agent": null,
};
const EMPTY_ACTIVE_RUNS: ActiveRunsState = {
chats: {},
@@ -193,6 +198,10 @@ function getModelOptions(catalog: ModelCatalogResponse["providers"], provider: P
return PROVIDER_FALLBACK_MODELS[provider];
}
function getVisibleProviders(catalog: ModelCatalogResponse["providers"]) {
return ALL_PROVIDERS.filter((provider) => provider !== "hermes-agent" || catalog[provider] !== undefined);
}
function getReplyScrollBufferHeight() {
if (typeof window === "undefined") return REPLY_SCROLL_BUFFER_MIN;
return Math.min(
@@ -308,6 +317,8 @@ function loadStoredModelPreferences() {
openai: typeof parsed.openai === "string" && parsed.openai.trim() ? parsed.openai.trim() : null,
anthropic: typeof parsed.anthropic === "string" && parsed.anthropic.trim() ? parsed.anthropic.trim() : null,
xai: typeof parsed.xai === "string" && parsed.xai.trim() ? parsed.xai.trim() : null,
"hermes-agent":
typeof parsed["hermes-agent"] === "string" && parsed["hermes-agent"].trim() ? parsed["hermes-agent"].trim() : null,
};
} catch {
return EMPTY_MODEL_PREFERENCES;
@@ -315,17 +326,19 @@ function loadStoredModelPreferences() {
}
function normalizeStoredProvider(value: unknown): Provider {
return value === "anthropic" || value === "xai" || value === "openai" ? value : "openai";
return value === "anthropic" || value === "xai" || value === "openai" || value === "hermes-agent" ? value : "openai";
}
function normalizeStoredModelPreferences(value: unknown): ProviderModelPreferences {
if (!value || typeof value !== "object" || Array.isArray(value)) return EMPTY_MODEL_PREFERENCES;
const parsed = value as Partial<Record<Provider, unknown>>;
return {
openai: typeof parsed.openai === "string" && parsed.openai.trim() ? parsed.openai.trim() : null,
anthropic: typeof parsed.anthropic === "string" && parsed.anthropic.trim() ? parsed.anthropic.trim() : null,
xai: typeof parsed.xai === "string" && parsed.xai.trim() ? parsed.xai.trim() : null,
};
return {
openai: typeof parsed.openai === "string" && parsed.openai.trim() ? parsed.openai.trim() : null,
anthropic: typeof parsed.anthropic === "string" && parsed.anthropic.trim() ? parsed.anthropic.trim() : null,
xai: typeof parsed.xai === "string" && parsed.xai.trim() ? parsed.xai.trim() : null,
"hermes-agent":
typeof parsed["hermes-agent"] === "string" && parsed["hermes-agent"].trim() ? parsed["hermes-agent"].trim() : null,
};
}
function loadStoredQuickQuestionModelSelection(): QuickQuestionModelSelection {
@@ -354,6 +367,7 @@ function getProviderLabel(provider: Provider | null | undefined) {
if (provider === "openai") return "OpenAI";
if (provider === "anthropic") return "Anthropic";
if (provider === "xai") return "xAI";
if (provider === "hermes-agent") return "Hermes Agent";
return "";
}
@@ -715,6 +729,8 @@ export default function App() {
const wasSendingRef = useRef(false);
const pendingReplyScrollRef = useRef(false);
const transcriptTailSpacerHeightRef = useRef(TRANSCRIPT_BOTTOM_GAP);
const transcriptTailSpacerSettleFrameRef = useRef<number | null>(null);
const transcriptViewKeyRef = useRef<string | null>(null);
const [contextMenu, setContextMenu] = useState<ContextMenuState | null>(null);
const [isMobileSidebarOpen, setIsMobileSidebarOpen] = useState(false);
const [sidebarQuery, setSidebarQuery] = useState("");
@@ -739,6 +755,7 @@ export default function App() {
const settleTranscriptTailSpacer = () => {
const container = transcriptContainerRef.current;
const currentSpacerHeight = transcriptTailSpacerHeightRef.current;
if (currentSpacerHeight <= TRANSCRIPT_BOTTOM_GAP) return;
if (!container) {
setTranscriptTailSpacer(TRANSCRIPT_BOTTOM_GAP);
return;
@@ -746,7 +763,22 @@ export default function App() {
const scrollHeightWithoutSpacer = container.scrollHeight - currentSpacerHeight;
const requiredSpacerHeight = container.scrollTop + container.clientHeight - scrollHeightWithoutSpacer;
setTranscriptTailSpacer(requiredSpacerHeight);
setTranscriptTailSpacer(Math.min(currentSpacerHeight, requiredSpacerHeight));
};
const requestSettleTranscriptTailSpacer = () => {
if (transcriptTailSpacerHeightRef.current <= TRANSCRIPT_BOTTOM_GAP) return;
if (typeof window === "undefined") {
settleTranscriptTailSpacer();
return;
}
if (transcriptTailSpacerSettleFrameRef.current !== null) {
window.cancelAnimationFrame(transcriptTailSpacerSettleFrameRef.current);
}
transcriptTailSpacerSettleFrameRef.current = window.requestAnimationFrame(() => {
transcriptTailSpacerSettleFrameRef.current = null;
settleTranscriptTailSpacer();
});
};
const focusComposer = () => {
@@ -963,6 +995,7 @@ export default function App() {
const providerModelOptions = useMemo(() => getModelOptions(modelCatalog, provider), [modelCatalog, provider]);
const quickProviderModelOptions = useMemo(() => getModelOptions(modelCatalog, quickProvider), [modelCatalog, quickProvider]);
const providerOptions = useMemo(() => getVisibleProviders(modelCatalog), [modelCatalog]);
useEffect(() => {
if (model.trim()) return;
@@ -1017,6 +1050,7 @@ export default function App() {
}, [quickPrompt, isQuickQuestionOpen]);
const selectedKey = selectedItem ? `${selectedItem.kind}:${selectedItem.id}` : null;
const transcriptViewKey = draftKind ? `draft:${draftKind}` : selectedKey ?? "empty";
const selectedChatPendingState = selectedItem?.kind === "chat" ? pendingChatStates[selectedItem.id] ?? null : null;
const selectedSearchRunState = selectedItem?.kind === "search" ? runningSearchStates[selectedItem.id] ?? null : null;
const selectedChatIsActive = selectedItem?.kind === "chat" && (!!selectedChatPendingState || !!activeRuns.chats[selectedItem.id]);
@@ -1038,11 +1072,13 @@ export default function App() {
};
useEffect(() => {
const didViewChange = transcriptViewKeyRef.current !== transcriptViewKey;
transcriptViewKeyRef.current = transcriptViewKey;
shouldAutoScrollRef.current = true;
if (!isSendingActiveChat) {
if (didViewChange && !pendingReplyScrollRef.current) {
setTranscriptTailSpacer(TRANSCRIPT_BOTTOM_GAP);
}
}, [isSendingActiveChat, selectedKey]);
}, [transcriptViewKey]);
useEffect(() => {
selectedItemRef.current = selectedItem;
@@ -1101,6 +1137,10 @@ export default function App() {
useEffect(() => {
return () => {
if (transcriptTailSpacerSettleFrameRef.current !== null) {
window.cancelAnimationFrame(transcriptTailSpacerSettleFrameRef.current);
transcriptTailSpacerSettleFrameRef.current = null;
}
for (const controller of chatStreamAbortRefs.current.values()) {
controller.abort();
}
@@ -1129,6 +1169,16 @@ export default function App() {
if (selectedChatPendingState) return selectedChatPendingState.messages.filter(isDisplayableMessage);
return messages.filter(isDisplayableMessage);
}, [messages, selectedChatPendingState]);
const displayMessagesLayoutKey = useMemo(
() => displayMessages.map((message) => `${message.id}:${message.content.length}`).join("|"),
[displayMessages]
);
useEffect(() => {
if (isSearchMode || isSendingActiveChat) return;
requestSettleTranscriptTailSpacer();
}, [displayMessagesLayoutKey, isSearchMode, isSendingActiveChat, selectedKey]);
const quickAnswerText = useMemo(() => {
for (let index = quickQuestionMessages.length - 1; index >= 0; index -= 1) {
const message = quickQuestionMessages[index];
@@ -1465,6 +1515,11 @@ export default function App() {
};
const handleSendChat = async (content: string, attachments: ChatAttachment[]): Promise<SidebarSelection> => {
const selectedModel = model.trim();
if (!selectedModel) {
throw new Error("No model available for selected provider");
}
pendingReplyScrollRef.current = true;
expandTranscriptTailSpacer(getReplyScrollBufferHeight());
@@ -1548,11 +1603,6 @@ export default function App() {
},
];
const selectedModel = model.trim();
if (!selectedModel) {
throw new Error("No model available for selected provider");
}
const chatSummary = chats.find((chat) => chat.id === chatId);
const hasExistingTitle = Boolean(selectedChat?.id === chatId ? selectedChat.title?.trim() : chatSummary?.title?.trim());
if (!hasExistingTitle && !pendingTitleGenerationRef.current.has(chatId)) {
@@ -1672,13 +1722,17 @@ export default function App() {
if (currentSelection?.kind === "chat" && currentSelection.id === chatId) {
await refreshChat(chatId);
}
settleTranscriptTailSpacer();
removePendingChatState(chatId);
removeActiveRun("chat", chatId);
if (currentSelection?.kind === "chat" && currentSelection.id === chatId) {
requestSettleTranscriptTailSpacer();
}
return { kind: "chat", id: chatId };
} catch (err) {
removePendingChatState(chatId);
removeActiveRun("chat", chatId);
pendingReplyScrollRef.current = false;
setTranscriptTailSpacer(TRANSCRIPT_BOTTOM_GAP);
throw err;
}
};
@@ -1939,6 +1993,9 @@ export default function App() {
chatStreamAbortRefs.current.delete(chatId);
removePendingChatState(chatId);
removeActiveRun("chat", chatId);
if (isCurrentSelection(target)) {
requestSettleTranscriptTailSpacer();
}
}
};
@@ -2288,6 +2345,10 @@ export default function App() {
sentTarget = await handleSendChat(content, attachments);
}
} catch (err) {
if (!sentAsSearch) {
pendingReplyScrollRef.current = false;
setTranscriptTailSpacer(TRANSCRIPT_BOTTOM_GAP);
}
const message = err instanceof Error ? err.message : String(err);
if (message.includes("bearer token")) {
handleAuthFailure(message);
@@ -2512,9 +2573,11 @@ export default function App() {
}}
disabled={isActiveSelectionSending}
>
<option value="openai">OpenAI</option>
<option value="anthropic">Anthropic</option>
<option value="xai">xAI</option>
{providerOptions.map((candidate) => (
<option key={candidate} value={candidate}>
{getProviderLabel(candidate)}
</option>
))}
</select>
<ModelCombobox
options={providerModelOptions}
@@ -2547,6 +2610,9 @@ export default function App() {
if (!container) return;
const distanceFromBottom = container.scrollHeight - container.scrollTop - container.clientHeight;
shouldAutoScrollRef.current = distanceFromBottom < 96;
if (!isSearchMode && !isSendingActiveChat && distanceFromBottom > 0) {
settleTranscriptTailSpacer();
}
}}
>
{!isSearchMode ? (
@@ -2758,9 +2824,11 @@ export default function App() {
disabled={isQuickQuestionSending || isConvertingQuickQuestion}
aria-label="Quick question provider"
>
<option value="openai">OpenAI</option>
<option value="anthropic">Anthropic</option>
<option value="xai">xAI</option>
{providerOptions.map((candidate) => (
<option key={candidate} value={candidate}>
{getProviderLabel(candidate)}
</option>
))}
</select>
<ModelCombobox
options={quickProviderModelOptions}

View File

@@ -206,17 +206,31 @@ textarea {
}
.md-content code {
background: hsl(288 22% 23%);
border-radius: 0.25rem;
background: hsl(249 40% 10% / 0.78);
border-radius: 0.3rem;
padding: 0.05rem 0.3rem;
font-size: 0.86em;
box-decoration-break: clone;
-webkit-box-decoration-break: clone;
}
.md-content pre {
overflow-x: auto;
border-radius: 0.5rem;
background: hsl(287 28% 13%);
padding: 0.6rem 0.75rem;
border: 1px solid hsl(253 31% 29% / 0.72);
border-radius: 0.625rem;
background: hsl(249 40% 10% / 0.82);
padding: 0.75rem;
box-shadow: inset 0 1px 0 hsl(258 80% 88% / 0.05);
}
.md-content pre code {
display: block;
background: transparent;
border-radius: 0;
padding: 0;
font-size: 0.88em;
line-height: 1.55;
white-space: pre;
}
.md-content a {

View File

@@ -127,7 +127,7 @@ export type CompletionRequestMessage = {
attachments?: ChatAttachment[];
};
export type Provider = "openai" | "anthropic" | "xai";
export type Provider = "openai" | "anthropic" | "xai" | "hermes-agent";
export type ProviderModelInfo = {
models: string[];
@@ -136,7 +136,7 @@ export type ProviderModelInfo = {
};
export type ModelCatalogResponse = {
providers: Record<Provider, ProviderModelInfo>;
providers: Partial<Record<Provider, ProviderModelInfo>>;
};
export type ActiveRunsResponse = {