Merge pull request #899 from paperclipai/paperclip-subissues
Advanced Workspace Support
This commit is contained in:

- doc/plans/workspace-product-model-and-work-product.md (new file, 1263 lines; diff suppressed because it is too large)
- doc/plans/workspace-technical-implementation.md (new file, 882 lines)
# Workspace Technical Implementation Spec

## Role of This Document

This document translates [workspace-product-model-and-work-product.md](/Users/dotta/paperclip-subissues/doc/plans/workspace-product-model-and-work-product.md) into an implementation-ready engineering plan.

It is intentionally concrete:

- schema and migration shape
- shared contract updates
- route and service changes
- UI changes
- rollout and compatibility rules

This is the implementation target for the first workspace-aware delivery slice.
## Locked Decisions

These decisions are treated as settled for this implementation:

1. Add a new durable `execution_workspaces` table now.
2. Each issue has at most one current execution workspace at a time.
3. `issues` get explicit `project_workspace_id` and `execution_workspace_id`.
4. Workspace reuse is in scope for V1.
5. The feature is gated in the UI by `/instance/settings > Experimental > Workspaces`.
6. The gate is UI-only. Backend model changes and migrations always ship.
7. Existing users upgrade into compatibility-preserving defaults.
8. `project_workspaces` evolves in place rather than being replaced.
9. Work product is issue-first, with optional links to execution workspaces and runtime services.
10. GitHub is the only PR provider in the first slice.
11. Both `adapter_managed` and `cloud_sandbox` execution modes are in scope.
12. Workspace controls ship first inside existing project properties, not in a new global navigation area.
13. Subissues are out of scope for this implementation slice.
## Non-Goals

- Building a full code review system
- Solving subissue UX in this slice
- Implementing reusable shared workspace definitions across projects in this slice
- Reworking all current runtime service behavior before introducing execution workspaces

## Existing Baseline

The repo already has:

- `project_workspaces`
- `projects.execution_workspace_policy`
- `issues.execution_workspace_settings`
- runtime service persistence in `workspace_runtime_services`
- local git-worktree realization in `workspace-runtime.ts`

This implementation should build on that baseline rather than fork it.
## Terminology

- `Project workspace`: durable configured codebase/root for a project
- `Execution workspace`: actual runtime workspace used for one or more issues
- `Work product`: user-facing output such as a PR, preview, branch, commit, artifact, or document
- `Runtime service`: process or service owned or tracked for a workspace
- `Compatibility mode`: existing behavior preserved for upgraded installs with no explicit workspace opt-in
## Architecture Summary

The first slice should introduce three explicit layers:

1. `Project workspace`
   - existing durable project-scoped codebase record
   - extended to support local, git, non-git, and remote-managed shapes

2. `Execution workspace`
   - new durable runtime record
   - represents a shared, isolated, operator-branch, or remote-managed execution context

3. `Issue work product`
   - new durable output record
   - stores PRs, previews, branches, commits, artifacts, and documents

The issue remains the planning and ownership unit.

The execution workspace remains the runtime unit.

The work product remains the deliverable/output unit.
## Configuration and Deployment Topology

## Important correction

This repo already uses `PAPERCLIP_DEPLOYMENT_MODE` for auth/deployment behavior (`local_trusted | authenticated`).

Do not overload that variable for workspace execution topology.

## New env var

Add a separate execution-host hint:

- `PAPERCLIP_EXECUTION_TOPOLOGY=local|cloud|hybrid`

Default:

- if unset, treat as `local`

Purpose:

- influences defaults and validation for workspace configuration
- does not change current auth/deployment semantics
- does not break existing installs

### Semantics

- `local`
  - Paperclip may create host-local worktrees, processes, and paths
- `cloud`
  - Paperclip should assume no durable host-local execution workspace management
  - adapter-managed and cloud-sandbox flows should be treated as first-class
- `hybrid`
  - both local and remote execution strategies may exist

This is a guardrail and defaulting aid, not a hard policy engine in the first slice.
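As a sketch, the defaulting rule could look like the following; `resolveExecutionTopology` is a hypothetical helper name, and callers would pass `process.env`:

```typescript
// Sketch only: resolve the PAPERCLIP_EXECUTION_TOPOLOGY hint with a safe
// fallback. The helper name and signature are illustrative.
export type ExecutionTopology = "local" | "cloud" | "hybrid";

export function resolveExecutionTopology(
  env: Record<string, string | undefined>,
): ExecutionTopology {
  const raw = env.PAPERCLIP_EXECUTION_TOPOLOGY?.trim().toLowerCase();
  if (raw === "cloud" || raw === "hybrid") return raw;
  // Unset, empty, or unrecognized values behave as local so existing
  // installs are unaffected.
  return "local";
}
```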
## Instance Settings

Add a new `Experimental` section under `/instance/settings`.

### New setting

- `experimental.workspaces: boolean`

Rules:

- default `false`
- UI-only gate
- stored in instance config or the instance settings API response
- backend routes and migrations remain available even when false

### UI behavior when off

- hide workspace-specific issue controls
- hide workspace-specific project configuration
- hide the issue `Work Product` tab if it would otherwise be empty
- do not remove or invalidate any stored workspace data
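A minimal sketch of the gate check, assuming the settings response nests the flag under `experimental` (the `workspacesUiEnabled` helper name is illustrative):

```typescript
// Illustrative shape of the instance settings response; only the flag
// relevant here is modeled.
type InstanceSettings = { experimental?: { workspaces?: boolean } };

// UI-only gate: a missing or false flag hides workspace affordances, but
// never deletes or invalidates stored workspace data.
export function workspacesUiEnabled(settings: InstanceSettings): boolean {
  return settings.experimental?.workspaces === true;
}
```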
## Data Model

## 1. Extend `project_workspaces`

The current table exists and should evolve in place.

### New columns

- `source_type text not null default 'local_path'`
  - `local_path | git_repo | non_git_path | remote_managed`
- `default_ref text null`
- `visibility text not null default 'default'`
  - `default | advanced`
- `setup_command text null`
- `cleanup_command text null`
- `remote_provider text null`
  - examples: `github`, `openai`, `anthropic`, `custom`
- `remote_workspace_ref text null`
- `shared_workspace_key text null`
  - reserved for future cross-project shared workspace definitions

### Backfill rules

- if an existing row has `repo_url`, backfill `source_type='git_repo'`
- else if an existing row has `cwd`, backfill `source_type='local_path'`
- else backfill `source_type='remote_managed'`
- copy existing `repo_ref` into `default_ref`

### Indexes

- retain current indexes
- add `(project_id, source_type)`
- add `(company_id, shared_workspace_key)` non-unique for future support
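The backfill decision for `source_type` can be sketched as a pure function; the row type below is a hypothetical minimal projection of today's `project_workspaces` columns:

```typescript
// Sketch of the backfill decision; in the real migration this logic runs
// as SQL, but the branching is the same.
type LegacyProjectWorkspaceRow = {
  cwd: string | null;
  repo_url: string | null;
  repo_ref: string | null;
};

type ProjectWorkspaceSourceType =
  | "local_path"
  | "git_repo"
  | "non_git_path"
  | "remote_managed";

function deriveSourceType(
  row: LegacyProjectWorkspaceRow,
): ProjectWorkspaceSourceType {
  if (row.repo_url) return "git_repo"; // repo_url wins over cwd
  if (row.cwd) return "local_path";
  return "remote_managed";
}
```

In the same migration pass, `default_ref` is simply copied from the existing `repo_ref`.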
## 2. Add `execution_workspaces`

Create a new durable table.

### Columns

- `id uuid pk`
- `company_id uuid not null`
- `project_id uuid not null`
- `project_workspace_id uuid null`
- `source_issue_id uuid null`
- `mode text not null`
  - `shared_workspace | isolated_workspace | operator_branch | adapter_managed | cloud_sandbox`
- `strategy_type text not null`
  - `project_primary | git_worktree | adapter_managed | cloud_sandbox`
- `name text not null`
- `status text not null default 'active'`
  - `active | idle | in_review | archived | cleanup_failed`
- `cwd text null`
- `repo_url text null`
- `base_ref text null`
- `branch_name text null`
- `provider_type text not null default 'local_fs'`
  - `local_fs | git_worktree | adapter_managed | cloud_sandbox`
- `provider_ref text null`
- `derived_from_execution_workspace_id uuid null`
- `last_used_at timestamptz not null default now()`
- `opened_at timestamptz not null default now()`
- `closed_at timestamptz null`
- `cleanup_eligible_at timestamptz null`
- `cleanup_reason text null`
- `metadata jsonb null`
- `created_at timestamptz not null default now()`
- `updated_at timestamptz not null default now()`

### Foreign keys

- `company_id -> companies.id`
- `project_id -> projects.id`
- `project_workspace_id -> project_workspaces.id on delete set null`
- `source_issue_id -> issues.id on delete set null`
- `derived_from_execution_workspace_id -> execution_workspaces.id on delete set null`

### Indexes

- `(company_id, project_id, status)`
- `(company_id, project_workspace_id, status)`
- `(company_id, source_issue_id)`
- `(company_id, last_used_at desc)`
- `(company_id, branch_name)` non-unique
## 3. Extend `issues`

Add explicit workspace linkage.

### New columns

- `project_workspace_id uuid null`
- `execution_workspace_id uuid null`
- `execution_workspace_preference text null`
  - `inherit | shared_workspace | isolated_workspace | operator_branch | reuse_existing`

### Foreign keys

- `project_workspace_id -> project_workspaces.id on delete set null`
- `execution_workspace_id -> execution_workspaces.id on delete set null`

### Backfill rules

- all existing issues get null values
- null should be interpreted as compatibility/inherit behavior

### Invariants

- if `project_workspace_id` is set, it must belong to the issue's project and company
- if `execution_workspace_id` is set, it must belong to the issue's company
- if `execution_workspace_id` is set, the referenced workspace's `project_id` must match the issue's `project_id`
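These invariants can be sketched as a validation step; the record shapes below are illustrative minimal projections, not the real models:

```typescript
// Sketch of the issue/workspace linkage invariants. Run before persisting
// an issue update that sets either workspace id.
type IssueRef = { companyId: string; projectId: string };
type ProjectWorkspaceRef = { companyId: string; projectId: string };
type ExecutionWorkspaceRef = { companyId: string; projectId: string };

function assertIssueWorkspaceLinkage(
  issue: IssueRef,
  projectWorkspace: ProjectWorkspaceRef | null,
  executionWorkspace: ExecutionWorkspaceRef | null,
): void {
  if (projectWorkspace) {
    if (
      projectWorkspace.companyId !== issue.companyId ||
      projectWorkspace.projectId !== issue.projectId
    ) {
      throw new Error("project workspace must belong to the issue's project and company");
    }
  }
  if (executionWorkspace) {
    if (executionWorkspace.companyId !== issue.companyId) {
      throw new Error("execution workspace must belong to the issue's company");
    }
    if (executionWorkspace.projectId !== issue.projectId) {
      throw new Error("execution workspace project must match the issue's project");
    }
  }
}
```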
## 4. Add `issue_work_products`

Create a new durable table for outputs.

### Columns

- `id uuid pk`
- `company_id uuid not null`
- `project_id uuid null`
- `issue_id uuid not null`
- `execution_workspace_id uuid null`
- `runtime_service_id uuid null`
- `type text not null`
  - `preview_url | runtime_service | pull_request | branch | commit | artifact | document`
- `provider text not null`
  - `paperclip | github | vercel | s3 | custom`
- `external_id text null`
- `title text not null`
- `url text null`
- `status text not null`
  - `active | ready_for_review | approved | changes_requested | merged | closed | failed | archived`
- `review_state text not null default 'none'`
  - `none | needs_board_review | approved | changes_requested`
- `is_primary boolean not null default false`
- `health_status text not null default 'unknown'`
  - `unknown | healthy | unhealthy`
- `summary text null`
- `metadata jsonb null`
- `created_by_run_id uuid null`
- `created_at timestamptz not null default now()`
- `updated_at timestamptz not null default now()`

### Foreign keys

- `company_id -> companies.id`
- `project_id -> projects.id on delete set null`
- `issue_id -> issues.id on delete cascade`
- `execution_workspace_id -> execution_workspaces.id on delete set null`
- `runtime_service_id -> workspace_runtime_services.id on delete set null`
- `created_by_run_id -> heartbeat_runs.id on delete set null`

### Indexes

- `(company_id, issue_id, type)`
- `(company_id, execution_workspace_id, type)`
- `(company_id, provider, external_id)`
- `(company_id, updated_at desc)`
## 5. Extend `workspace_runtime_services`

This table already exists and should remain the system of record for owned/tracked services.

### New column

- `execution_workspace_id uuid null`

### Foreign key

- `execution_workspace_id -> execution_workspaces.id on delete set null`

### Behavior

- runtime services remain workspace-first
- issue UIs should surface them through linked execution workspaces and work products
## Shared Contracts

## 1. `packages/shared`

### Update project workspace types and validators

Add fields:

- `sourceType`
- `defaultRef`
- `visibility`
- `setupCommand`
- `cleanupCommand`
- `remoteProvider`
- `remoteWorkspaceRef`
- `sharedWorkspaceKey`

### Add execution workspace types and validators

New shared types:

- `ExecutionWorkspace`
- `ExecutionWorkspaceMode`
- `ExecutionWorkspaceStatus`
- `ExecutionWorkspaceProviderType`

### Add work product types and validators

New shared types:

- `IssueWorkProduct`
- `IssueWorkProductType`
- `IssueWorkProductStatus`
- `IssueWorkProductReviewState`

### Update issue types and validators

Add:

- `projectWorkspaceId`
- `executionWorkspaceId`
- `executionWorkspacePreference`
- `workProducts?: IssueWorkProduct[]`
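As a sketch of the new unions, the literal values mirror the enums in the data model section; validator style should follow whatever `packages/shared` already uses:

```typescript
// Sketch of two of the new shared unions, plus a runtime guard for route
// input validation. Shapes are assumptions consistent with the schema above.
export type ExecutionWorkspaceMode =
  | "shared_workspace"
  | "isolated_workspace"
  | "operator_branch"
  | "adapter_managed"
  | "cloud_sandbox";

export type IssueWorkProductType =
  | "preview_url"
  | "runtime_service"
  | "pull_request"
  | "branch"
  | "commit"
  | "artifact"
  | "document";

const WORK_PRODUCT_TYPES: readonly IssueWorkProductType[] = [
  "preview_url", "runtime_service", "pull_request",
  "branch", "commit", "artifact", "document",
];

export function isIssueWorkProductType(value: string): value is IssueWorkProductType {
  return (WORK_PRODUCT_TYPES as readonly string[]).includes(value);
}
```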
### Extend project execution policy contract

Replace the current narrow policy with a more explicit shape:

- `enabled`
- `defaultMode`
  - `shared_workspace | isolated_workspace | operator_branch | adapter_default`
- `allowIssueOverride`
- `defaultProjectWorkspaceId`
- `workspaceStrategy`
- `branchPolicy`
- `pullRequestPolicy`
- `runtimePolicy`
- `cleanupPolicy`

Do not try to encode every possible provider-specific field in V1. Keep provider-specific extensibility in nested JSON where needed.
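A hedged sketch of the policy shape; the field names follow the list above, and the nested policies are deliberately left as open JSON (`Record<string, unknown>`) rather than fully typed in V1:

```typescript
// Sketch only: exact validator wiring belongs in packages/shared.
export interface ExecutionWorkspacePolicy {
  enabled: boolean;
  defaultMode:
    | "shared_workspace"
    | "isolated_workspace"
    | "operator_branch"
    | "adapter_default";
  allowIssueOverride: boolean;
  defaultProjectWorkspaceId: string | null;
  // Provider-specific extensibility stays in nested JSON in V1.
  workspaceStrategy?: Record<string, unknown>;
  branchPolicy?: Record<string, unknown>;
  pullRequestPolicy?: Record<string, unknown>;
  runtimePolicy?: Record<string, unknown>;
  cleanupPolicy?: Record<string, unknown>;
}
```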
## Service Layer Changes

## 1. Project service

Update project workspace CRUD to handle the extended schema.

### Required rules

- when setting a primary workspace, clear `is_primary` on siblings
- `source_type=remote_managed` may have a null `cwd`
- local/git-backed workspaces should still require one of `cwd` or `repo_url`
- preserve current behavior for existing callers that only send `cwd/repoUrl/repoRef`
## 2. Issue service

Update create/update flows to handle explicit workspace binding.

### Create behavior

Resolve defaults in this order:

1. explicit `projectWorkspaceId` from the request
2. `project.executionWorkspacePolicy.defaultProjectWorkspaceId`
3. the project's primary workspace
4. null

Resolve `executionWorkspacePreference`:

1. explicit request field
2. project policy default
3. compatibility fallback to `inherit`

Do not create an execution workspace at issue creation time unless:

- `reuse_existing` is explicitly chosen and `executionWorkspaceId` is provided

Otherwise, workspace realization happens when execution starts.
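The two resolution orders above can be sketched as fallback chains; all shapes here are hypothetical minimal projections, not the real service signatures:

```typescript
// Sketch of issue-create default resolution.
type CreateIssueInput = {
  projectWorkspaceId?: string | null;
  executionWorkspacePreference?: string | null;
};
type ProjectContext = {
  policyDefaultProjectWorkspaceId: string | null;
  primaryWorkspaceId: string | null;
  policyDefaultPreference: string | null;
};

function resolveProjectWorkspaceId(
  input: CreateIssueInput,
  project: ProjectContext,
): string | null {
  return (
    input.projectWorkspaceId ??
    project.policyDefaultProjectWorkspaceId ??
    project.primaryWorkspaceId ??
    null
  );
}

function resolveExecutionPreference(
  input: CreateIssueInput,
  project: ProjectContext,
): string {
  // Compatibility fallback: `inherit` preserves pre-workspace behavior.
  return (
    input.executionWorkspacePreference ??
    project.policyDefaultPreference ??
    "inherit"
  );
}
```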
### Update behavior

- allow changing `projectWorkspaceId` only if the workspace belongs to the same project
- allow setting `executionWorkspaceId` only if it belongs to the same company and project
- do not automatically destroy or relink historical work products when workspace linkage changes
## 3. Workspace realization service

Refactor `workspace-runtime.ts` so realization produces or reuses an `execution_workspaces` row.

### New flow

Input:

- issue
- project workspace
- project execution policy
- execution topology hint
- adapter/runtime configuration

Output:

- realized execution workspace record
- runtime cwd/provider metadata

### Required modes

- `shared_workspace`
  - reuse a stable execution workspace representing the project primary/shared workspace
- `isolated_workspace`
  - create or reuse a derived isolated execution workspace
- `operator_branch`
  - create or reuse a long-lived branch workspace
- `adapter_managed`
  - create an execution workspace with provider references and an optionally null `cwd`
- `cloud_sandbox`
  - same as adapter-managed, but with explicit remote sandbox semantics

### Reuse rules

When `reuse_existing` is requested:

- only list active or recently used execution workspaces
- only for the same project
- only for the same project workspace, if one is specified
- exclude archived and cleanup-failed workspaces
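The reuse rules can be sketched as a single predicate; the candidate shape and the caller-supplied "recently used" cutoff are assumptions:

```typescript
// Sketch of the reuse-eligibility filter applied when listing candidates.
type ReuseCandidate = {
  projectId: string;
  projectWorkspaceId: string | null;
  status: "active" | "idle" | "in_review" | "archived" | "cleanup_failed";
  lastUsedAt: Date;
};

function isReuseEligible(
  ws: ReuseCandidate,
  filter: { projectId: string; projectWorkspaceId?: string; usedSince: Date },
): boolean {
  // Archived and cleanup-failed workspaces are never offered for reuse.
  if (ws.status === "archived" || ws.status === "cleanup_failed") return false;
  if (ws.projectId !== filter.projectId) return false;
  if (filter.projectWorkspaceId && ws.projectWorkspaceId !== filter.projectWorkspaceId) {
    return false;
  }
  // Active workspaces always qualify; others must be recently used.
  return ws.status === "active" || ws.lastUsedAt >= filter.usedSince;
}
```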
### Shared workspace realization

For compatibility mode and shared-workspace projects:

- create a stable execution workspace per project workspace when first needed
- reuse it for subsequent runs

This avoids a special-case branch in later work product linkage.
## 4. Runtime service integration

When runtime services are started or reused:

- populate `execution_workspace_id`
- continue populating `project_workspace_id`, `project_id`, and `issue_id`

When a runtime service yields a URL:

- optionally create or update a linked `issue_work_products` row of type `runtime_service` or `preview_url`
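A sketch of that linkage; the shapes are illustrative, and the URL-based choice between `preview_url` and `runtime_service` is an assumption, not a settled rule:

```typescript
// Sketch: turn a runtime-service URL into a work product record for upsert.
type RuntimeServiceEvent = {
  issueId: string;
  executionWorkspaceId: string | null;
  runtimeServiceId: string;
  url: string | null;
};

type NewWorkProduct = {
  issueId: string;
  executionWorkspaceId: string | null;
  runtimeServiceId: string;
  type: "runtime_service" | "preview_url";
  url: string;
};

function workProductFromRuntimeService(ev: RuntimeServiceEvent): NewWorkProduct | null {
  if (!ev.url) return null; // nothing user-facing to report yet
  return {
    issueId: ev.issueId,
    executionWorkspaceId: ev.executionWorkspaceId,
    runtimeServiceId: ev.runtimeServiceId,
    // Assumption: an http(s) URL is treated as a preview; anything else
    // stays a plain runtime_service record.
    type: ev.url.startsWith("http") ? "preview_url" : "runtime_service",
    url: ev.url,
  };
}
```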
## 5. PR and preview reporting

Add a service for creating/updating `issue_work_products`.

### Supported V1 product types

- `pull_request`
- `preview_url`
- `runtime_service`
- `branch`
- `commit`
- `artifact`
- `document`

### GitHub PR reporting

For V1, GitHub is the only provider with richer semantics.

Supported statuses:

- `draft`
- `ready_for_review`
- `approved`
- `changes_requested`
- `merged`
- `closed`

Represent these in `status` and `review_state` rather than inventing a separate PR table in V1.
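One way to sketch that representation is a mapping from GitHub PR lifecycle state onto the two columns; the specific pairings here (e.g. `draft` as an `active` product, `merged` implying an approved review) are assumptions to be confirmed:

```typescript
// Sketch of the GitHub PR state -> (status, review_state) mapping.
type GitHubPrState =
  | "draft"
  | "ready_for_review"
  | "approved"
  | "changes_requested"
  | "merged"
  | "closed";

function mapGitHubPrState(state: GitHubPrState): {
  status: string;
  reviewState: string;
} {
  switch (state) {
    case "draft":
      return { status: "active", reviewState: "none" };
    case "ready_for_review":
      return { status: "ready_for_review", reviewState: "needs_board_review" };
    case "approved":
      return { status: "approved", reviewState: "approved" };
    case "changes_requested":
      return { status: "changes_requested", reviewState: "changes_requested" };
    case "merged":
      return { status: "merged", reviewState: "approved" };
    case "closed":
      return { status: "closed", reviewState: "none" };
  }
}
```

Provider-specific details that do not fit these columns stay in `metadata`.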
## Routes and API

## 1. Project workspace routes

Extend existing routes:

- `GET /projects/:id/workspaces`
- `POST /projects/:id/workspaces`
- `PATCH /projects/:id/workspaces/:workspaceId`
- `DELETE /projects/:id/workspaces/:workspaceId`

### New accepted/returned fields

- `sourceType`
- `defaultRef`
- `visibility`
- `setupCommand`
- `cleanupCommand`
- `remoteProvider`
- `remoteWorkspaceRef`

## 2. Execution workspace routes

Add:

- `GET /companies/:companyId/execution-workspaces`
  - filters:
    - `projectId`
    - `projectWorkspaceId`
    - `status`
    - `issueId`
    - `reuseEligible=true`
- `GET /execution-workspaces/:id`
- `PATCH /execution-workspaces/:id`
  - update status/metadata/cleanup fields only in V1

Do not add top-level navigation for these routes yet.
## 3. Work product routes

Add:

- `GET /issues/:id/work-products`
- `POST /issues/:id/work-products`
- `PATCH /work-products/:id`
- `DELETE /work-products/:id`

### V1 mutation permissions

- the board can create/update/delete all work products
- agents can create/update work products for issues they are assigned to or are currently executing
- deletion should generally archive rather than hard-delete once a product is linked to historical output
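The mutation rule can be sketched as a small predicate; the actor shapes are hypothetical and would come from the existing auth context:

```typescript
// Sketch of the V1 work product mutation check.
type WorkProductActor =
  | { kind: "board" }
  | { kind: "agent"; assignedIssueIds: string[]; executingIssueIds: string[] };

function canMutateWorkProduct(actor: WorkProductActor, issueId: string): boolean {
  if (actor.kind === "board") return true; // board can mutate everything
  // Agents are limited to issues they are assigned to or currently executing.
  return (
    actor.assignedIssueIds.includes(issueId) ||
    actor.executingIssueIds.includes(issueId)
  );
}
```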
## 4. Issue routes

Extend existing create/update payloads to accept:

- `projectWorkspaceId`
- `executionWorkspacePreference`
- `executionWorkspaceId`

Extend `GET /issues/:id` to return:

- `projectWorkspaceId`
- `executionWorkspaceId`
- `executionWorkspacePreference`
- `currentExecutionWorkspace`
- `workProducts[]`

## 5. Instance settings routes

Add support for:

- reading/writing `experimental.workspaces`

This is a UI gate only.

If there is no generic instance settings storage yet, the first slice can store this in the existing config/instance settings mechanism used by `/instance/settings`.
## UI Changes

## 1. `/instance/settings`

Add section:

- `Experimental`
  - `Enable Workspaces`

When off:

- hide new workspace-specific affordances
- do not alter existing project or issue behavior

## 2. Project properties

Do not create a separate `Code` tab yet.

Ship inside existing project properties first.

### Add or re-enable sections

- `Project Workspaces`
- `Execution Defaults`
- `Provisioning`
- `Pull Requests`
- `Previews and Runtime`
- `Cleanup`

### Display rules

- only show when `experimental.workspaces=true`
- keep wording generic enough for local and remote setups
- only show git-specific fields when `sourceType=git_repo`
- only show local-path-specific fields when not `remote_managed`
## 3. Issue create dialog

When the workspace experimental flag is on and the selected project has workspace automation or configured workspaces:

### Basic fields

- `Codebase`
  - select from project workspaces
  - default to the policy default or primary workspace
- `Execution mode`
  - `Project default`
  - `Shared workspace`
  - `Isolated workspace`
  - `Operator branch`

### Advanced section

- `Reuse existing execution workspace`

This control should query only:

- the same project
- the same codebase, if one is selected
- active/recent workspaces

Results should use compact labels with the branch or workspace name.

Do not expose all execution workspaces in a noisy unfiltered list.
## 4. Issue detail

Add a `Work Product` tab when:

- the experimental flag is on, or
- the issue already has work products

### Show

- current execution workspace summary
- PR cards
- preview cards
- branch/commit rows
- artifacts/documents

Add compact header chips:

- codebase
- workspace
- PR count/status
- preview status
## 5. Execution workspace detail page

Add a detail route but no nav item.

Linked from:

- the issue work product tab
- project workspace/execution panels

### Show

- identity and status
- project workspace origin
- source issue
- linked issues
- branch/ref/provider info
- runtime services
- work products
- cleanup state
## Runtime and Adapter Behavior

## 1. Local adapters

For local adapters:

- continue to use existing cwd/worktree realization paths
- persist the result as execution workspaces
- attach runtime services and work products to the execution workspace and issue

## 2. Remote or cloud adapters

For remote adapters:

- allow execution workspaces with a null `cwd`
- require provider metadata sufficient to identify the remote workspace/session
- allow work product creation without any host-local process ownership

Examples:

- a cloud coding agent opens a branch and PR on GitHub
- a Vercel preview URL is reported back as a preview work product
- a remote sandbox emits artifact URLs
## 3. Approval-aware PR workflow

V1 should support richer PR state tracking, but not a full review engine.

### Required actions

- `open_pr`
- `mark_ready`

### Required review states

- `draft`
- `ready_for_review`
- `approved`
- `changes_requested`
- `merged`
- `closed`

### Storage approach

- represent these as `issue_work_products` with `type='pull_request'`
- use `status` and `review_state`
- store provider-specific details in `metadata`
## Migration Plan

## 1. Existing installs

The migration posture is backward-compatible by default.

### Guarantees

- no existing project needs to be edited to keep working
- no existing issue flow should start requiring workspace input
- all new nullable columns must preserve current behavior when absent

## 2. Project workspace migration

Migrate `project_workspaces` in place.

### Backfill

- derive `source_type`
- copy `repo_ref` to `default_ref`
- leave new optional fields null

## 3. Issue migration

Do not backfill `project_workspace_id` or `execution_workspace_id` on all existing issues.

Reason:

- the safest migration is to preserve current runtime behavior and bind explicitly only when new workspace-aware flows are used

Interpret old issues as:

- `executionWorkspacePreference = inherit`
- compatibility/shared behavior

## 4. Runtime history migration

Do not attempt a perfect historical reconstruction of execution workspaces in the migration itself.

Instead:

- create execution workspace records forward from the first new run
- optionally add a later backfill tool for recent runtime services if it proves valuable
## Rollout Order

## Phase 1: Schema and shared contracts

1. extend `project_workspaces`
2. add `execution_workspaces`
3. add `issue_work_products`
4. extend `issues`
5. extend `workspace_runtime_services`
6. update shared types and validators

## Phase 2: Service wiring

1. update project workspace CRUD
2. update issue create/update resolution
3. refactor workspace realization to persist execution workspaces
4. attach runtime services to execution workspaces
5. add work product service and persistence

## Phase 3: API and UI

1. add execution workspace routes
2. add work product routes
3. add instance experimental settings toggle
4. re-enable and revise project workspace UI behind the flag
5. add issue create/update controls behind the flag
6. add issue work product tab
7. add execution workspace detail page

## Phase 4: Provider integrations

1. GitHub PR reporting
2. preview URL reporting
3. runtime-service-to-work-product linking
4. remote/cloud provider references
## Acceptance Criteria
|
||||||
|
|
||||||
|
1. Existing installs continue to behave predictably with no required reconfiguration.
|
||||||
|
2. Projects can define local, git, non-git, and remote-managed project workspaces.
|
||||||
|
3. Issues can explicitly select a project workspace and execution preference.
|
||||||
|
4. Each issue can point to one current execution workspace.
|
||||||
|
5. Multiple issues can intentionally reuse the same execution workspace.
|
||||||
|
6. Execution workspaces are persisted for both local and remote execution flows.
|
||||||
|
7. Work products can be attached to issues with optional execution workspace linkage.
|
||||||
|
8. GitHub PRs can be represented with richer lifecycle states.
|
||||||
|
9. The main UI remains simple when the experimental flag is off.
|
||||||
|
10. No top-level workspace navigation is required for this first slice.
|
||||||
|
|
||||||
## Risks and Mitigations

### Risk: too many overlapping workspace concepts

Mitigation:

- keep issue UI to `Codebase` and `Execution mode`
- reserve execution workspace details for advanced pages

### Risk: breaking current projects on upgrade

Mitigation:

- nullable schema additions
- in-place `project_workspaces` migration
- compatibility defaults

### Risk: local-only assumptions leaking into cloud mode

Mitigation:

- make `cwd` optional for execution workspaces
- use `provider_type` and `provider_ref`
- use `PAPERCLIP_EXECUTION_TOPOLOGY` as a defaulting guardrail

### Risk: turning PRs into a bespoke subsystem too early

Mitigation:

- represent PRs as work products in V1
- keep provider-specific details in metadata
- defer a dedicated PR table unless usage proves it necessary
## Recommended First Engineering Slice

If we want the narrowest useful implementation:

1. extend `project_workspaces`
2. add `execution_workspaces`
3. extend `issues` with explicit workspace fields
4. persist execution workspaces from existing local workspace realization
5. add `issue_work_products`
6. show project workspace controls and issue workspace controls behind the experimental flag
7. add issue `Work Product` tab with PR/preview/runtime service display

This slice is enough to validate the model without yet building every provider integration or cleanup workflow.
@@ -94,7 +94,7 @@ export async function prepareWorktreeCodexHome(
   }

   await onLog(
-    "stderr",
+    "stdout",
     `[paperclip] Using worktree-isolated Codex home "${targetHome}" (seeded from "${sourceHome}").\n`,
   );
   return targetHome;
@@ -116,7 +116,7 @@ export async function ensureCodexSkillsInjected(
   );
   for (const skillName of removedSkills) {
     await onLog(
-      "stderr",
+      "stdout",
       `[paperclip] Removed maintainer-only Codex skill "${skillName}" from ${skillsHome}\n`,
     );
   }
@@ -143,7 +143,7 @@ export async function ensureCodexSkillsInjected(
       await fs.symlink(entry.source, target);
     }
     await onLog(
-      "stderr",
+      "stdout",
       `[paperclip] Repaired Codex skill "${entry.name}" into ${skillsHome}\n`,
     );
     continue;
@@ -154,7 +154,7 @@ export async function ensureCodexSkillsInjected(
     if (result === "skipped") continue;

     await onLog(
-      "stderr",
+      "stdout",
       `[paperclip] ${result === "repaired" ? "Repaired" : "Injected"} Codex skill "${entry.name}" into ${skillsHome}\n`,
     );
   } catch (err) {
@@ -364,7 +364,7 @@ export async function execute(ctx: AdapterExecutionContext): Promise<AdapterExec
       `Resolve any relative file references from ${instructionsDir}.\n\n`;
     instructionsChars = instructionsPrefix.length;
     await onLog(
-      "stderr",
+      "stdout",
       `[paperclip] Loaded agent instructions file: ${instructionsFilePath}\n`,
     );
   } catch (err) {
@@ -33,9 +33,9 @@
     "seed": "tsx src/seed.ts"
   },
   "dependencies": {
-    "embedded-postgres": "^18.1.0-beta.16",
     "@paperclipai/shared": "workspace:*",
     "drizzle-orm": "^0.38.4",
+    "embedded-postgres": "^18.1.0-beta.16",
     "postgres": "^3.4.5"
   },
   "devDependencies": {
157 packages/db/src/client.test.ts Normal file
@@ -0,0 +1,157 @@
import { createHash } from "node:crypto";
import fs from "node:fs";
import net from "node:net";
import os from "node:os";
import path from "node:path";
import { afterEach, describe, expect, it } from "vitest";
import postgres from "postgres";
import {
  applyPendingMigrations,
  ensurePostgresDatabase,
  inspectMigrations,
} from "./client.js";

type EmbeddedPostgresInstance = {
  initialise(): Promise<void>;
  start(): Promise<void>;
  stop(): Promise<void>;
};

type EmbeddedPostgresCtor = new (opts: {
  databaseDir: string;
  user: string;
  password: string;
  port: number;
  persistent: boolean;
  initdbFlags?: string[];
  onLog?: (message: unknown) => void;
  onError?: (message: unknown) => void;
}) => EmbeddedPostgresInstance;

const tempPaths: string[] = [];
const runningInstances: EmbeddedPostgresInstance[] = [];

async function getEmbeddedPostgresCtor(): Promise<EmbeddedPostgresCtor> {
  const mod = await import("embedded-postgres");
  return mod.default as EmbeddedPostgresCtor;
}

async function getAvailablePort(): Promise<number> {
  return await new Promise((resolve, reject) => {
    const server = net.createServer();
    server.unref();
    server.on("error", reject);
    server.listen(0, "127.0.0.1", () => {
      const address = server.address();
      if (!address || typeof address === "string") {
        server.close(() => reject(new Error("Failed to allocate test port")));
        return;
      }
      const { port } = address;
      server.close((error) => {
        if (error) reject(error);
        else resolve(port);
      });
    });
  });
}

async function createTempDatabase(): Promise<string> {
  const dataDir = fs.mkdtempSync(path.join(os.tmpdir(), "paperclip-db-client-"));
  tempPaths.push(dataDir);
  const port = await getAvailablePort();
  const EmbeddedPostgres = await getEmbeddedPostgresCtor();
  const instance = new EmbeddedPostgres({
    databaseDir: dataDir,
    user: "paperclip",
    password: "paperclip",
    port,
    persistent: true,
    initdbFlags: ["--encoding=UTF8", "--locale=C"],
    onLog: () => {},
    onError: () => {},
  });
  await instance.initialise();
  await instance.start();
  runningInstances.push(instance);

  const adminUrl = `postgres://paperclip:paperclip@127.0.0.1:${port}/postgres`;
  await ensurePostgresDatabase(adminUrl, "paperclip");
  return `postgres://paperclip:paperclip@127.0.0.1:${port}/paperclip`;
}

async function migrationHash(migrationFile: string): Promise<string> {
  const content = await fs.promises.readFile(
    new URL(`./migrations/${migrationFile}`, import.meta.url),
    "utf8",
  );
  return createHash("sha256").update(content).digest("hex");
}

afterEach(async () => {
  while (runningInstances.length > 0) {
    const instance = runningInstances.pop();
    if (!instance) continue;
    await instance.stop();
  }
  while (tempPaths.length > 0) {
    const tempPath = tempPaths.pop();
    if (!tempPath) continue;
    fs.rmSync(tempPath, { recursive: true, force: true });
  }
});

describe("applyPendingMigrations", () => {
  it(
    "applies an inserted earlier migration without replaying later legacy migrations",
    async () => {
      const connectionString = await createTempDatabase();

      await applyPendingMigrations(connectionString);

      const sql = postgres(connectionString, { max: 1, onnotice: () => {} });
      try {
        const richMagnetoHash = await migrationHash("0030_rich_magneto.sql");

        await sql.unsafe(
          `DELETE FROM "drizzle"."__drizzle_migrations" WHERE hash = '${richMagnetoHash}'`,
        );
        await sql.unsafe(`DROP TABLE "company_logos"`);
      } finally {
        await sql.end();
      }

      const pendingState = await inspectMigrations(connectionString);
      expect(pendingState).toMatchObject({
        status: "needsMigrations",
        pendingMigrations: ["0030_rich_magneto.sql"],
        reason: "pending-migrations",
      });

      await applyPendingMigrations(connectionString);

      const finalState = await inspectMigrations(connectionString);
      expect(finalState.status).toBe("upToDate");

      const verifySql = postgres(connectionString, { max: 1, onnotice: () => {} });
      try {
        const rows = await verifySql.unsafe<{ table_name: string }[]>(
          `
            SELECT table_name
            FROM information_schema.tables
            WHERE table_schema = 'public'
              AND table_name IN ('company_logos', 'execution_workspaces')
            ORDER BY table_name
          `,
        );
        expect(rows.map((row) => row.table_name)).toEqual([
          "company_logos",
          "execution_workspaces",
        ]);
      } finally {
        await verifySql.end();
      }
    },
    20_000,
  );
});
@@ -50,6 +50,21 @@ export function createDb(url: string) {
   return drizzlePg(sql, { schema });
 }

+export async function getPostgresDataDirectory(url: string): Promise<string | null> {
+  const sql = createUtilitySql(url);
+  try {
+    const rows = await sql<{ data_directory: string | null }[]>`
+      SELECT current_setting('data_directory', true) AS data_directory
+    `;
+    const actual = rows[0]?.data_directory;
+    return typeof actual === "string" && actual.length > 0 ? actual : null;
+  } catch {
+    return null;
+  } finally {
+    await sql.end();
+  }
+}
+
 async function listMigrationFiles(): Promise<string[]> {
   const entries = await readdir(MIGRATIONS_FOLDER, { withFileTypes: true });
   return entries
@@ -646,13 +661,26 @@ export async function applyPendingMigrations(url: string): Promise<void> {
   const initialState = await inspectMigrations(url);
   if (initialState.status === "upToDate") return;

-  const sql = createUtilitySql(url);
-  try {
-    const db = drizzlePg(sql);
-    await migratePg(db, { migrationsFolder: MIGRATIONS_FOLDER });
-  } finally {
-    await sql.end();
+  if (initialState.reason === "no-migration-journal-empty-db") {
+    const sql = createUtilitySql(url);
+    try {
+      const db = drizzlePg(sql);
+      await migratePg(db, { migrationsFolder: MIGRATIONS_FOLDER });
+    } finally {
+      await sql.end();
+    }
+
+    const bootstrappedState = await inspectMigrations(url);
+    if (bootstrappedState.status === "upToDate") return;
+    throw new Error(
+      `Failed to bootstrap migrations: ${bootstrappedState.pendingMigrations.join(", ")}`,
+    );
+  }
+
+  if (initialState.reason === "no-migration-journal-non-empty-db") {
+    throw new Error(
+      "Database has tables but no migration journal; automatic migration is unsafe. Initialize migration history manually.",
+    );
   }

   let state = await inspectMigrations(url);
@@ -665,7 +693,7 @@ export async function applyPendingMigrations(url: string): Promise<void> {
   }

   if (state.status !== "needsMigrations" || state.reason !== "pending-migrations") {
-    throw new Error("Migrations are still pending after attempted apply; run inspectMigrations for details.");
+    throw new Error("Migrations are still pending after migration-history reconciliation; run inspectMigrations for details.");
   }

   await applyPendingMigrationsManually(url, state.pendingMigrations);
@@ -1,5 +1,6 @@
 export {
   createDb,
+  getPostgresDataDirectory,
   ensurePostgresDatabase,
   inspectMigrations,
   applyPendingMigrations,
@@ -1,9 +1,7 @@
 import { existsSync, readFileSync, rmSync } from "node:fs";
-import { createRequire } from "node:module";
 import { createServer } from "node:net";
 import path from "node:path";
-import { fileURLToPath, pathToFileURL } from "node:url";
-import { ensurePostgresDatabase } from "./client.js";
+import { ensurePostgresDatabase, getPostgresDataDirectory } from "./client.js";
 import { resolveDatabaseTarget } from "./runtime-config.js";

 type EmbeddedPostgresInstance = {
@@ -90,17 +88,8 @@ async function findAvailablePort(startPort: number): Promise<number> {
 }

 async function loadEmbeddedPostgresCtor(): Promise<EmbeddedPostgresCtor> {
-  const require = createRequire(import.meta.url);
-  const resolveCandidates = [
-    path.resolve(fileURLToPath(new URL("../..", import.meta.url))),
-    path.resolve(fileURLToPath(new URL("../../server", import.meta.url))),
-    path.resolve(fileURLToPath(new URL("../../cli", import.meta.url))),
-    process.cwd(),
-  ];
-
   try {
-    const resolvedModulePath = require.resolve("embedded-postgres", { paths: resolveCandidates });
-    const mod = await import(pathToFileURL(resolvedModulePath).href);
+    const mod = await import("embedded-postgres");
     return mod.default as EmbeddedPostgresCtor;
   } catch {
     throw new Error(
@@ -116,8 +105,33 @@ async function ensureEmbeddedPostgresConnection(
   const EmbeddedPostgres = await loadEmbeddedPostgresCtor();
   const selectedPort = await findAvailablePort(preferredPort);
   const postmasterPidFile = path.resolve(dataDir, "postmaster.pid");
+  const pgVersionFile = path.resolve(dataDir, "PG_VERSION");
   const runningPid = readRunningPostmasterPid(postmasterPidFile);
   const runningPort = readPidFilePort(postmasterPidFile);
+  const preferredAdminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${preferredPort}/postgres`;
+
+  if (!runningPid && existsSync(pgVersionFile)) {
+    try {
+      const actualDataDir = await getPostgresDataDirectory(preferredAdminConnectionString);
+      const matchesDataDir =
+        typeof actualDataDir === "string" &&
+        path.resolve(actualDataDir) === path.resolve(dataDir);
+      if (!matchesDataDir) {
+        throw new Error("reachable postgres does not use the expected embedded data directory");
+      }
+      await ensurePostgresDatabase(preferredAdminConnectionString, "paperclip");
+      process.emitWarning(
+        `Adopting an existing PostgreSQL instance on port ${preferredPort} for embedded data dir ${dataDir} because postmaster.pid is missing.`,
+      );
+      return {
+        connectionString: `postgres://paperclip:paperclip@127.0.0.1:${preferredPort}/paperclip`,
+        source: `embedded-postgres@${preferredPort}`,
+        stop: async () => {},
+      };
+    } catch {
+      // Fall through and attempt to start the configured embedded cluster.
+    }
+  }
+
   if (runningPid) {
     const port = runningPort ?? preferredPort;
91 packages/db/src/migrations/0035_marvelous_satana.sql Normal file
@@ -0,0 +1,91 @@
CREATE TABLE "execution_workspaces" (
	"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
	"company_id" uuid NOT NULL,
	"project_id" uuid NOT NULL,
	"project_workspace_id" uuid,
	"source_issue_id" uuid,
	"mode" text NOT NULL,
	"strategy_type" text NOT NULL,
	"name" text NOT NULL,
	"status" text DEFAULT 'active' NOT NULL,
	"cwd" text,
	"repo_url" text,
	"base_ref" text,
	"branch_name" text,
	"provider_type" text DEFAULT 'local_fs' NOT NULL,
	"provider_ref" text,
	"derived_from_execution_workspace_id" uuid,
	"last_used_at" timestamp with time zone DEFAULT now() NOT NULL,
	"opened_at" timestamp with time zone DEFAULT now() NOT NULL,
	"closed_at" timestamp with time zone,
	"cleanup_eligible_at" timestamp with time zone,
	"cleanup_reason" text,
	"metadata" jsonb,
	"created_at" timestamp with time zone DEFAULT now() NOT NULL,
	"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE TABLE "issue_work_products" (
	"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
	"company_id" uuid NOT NULL,
	"project_id" uuid,
	"issue_id" uuid NOT NULL,
	"execution_workspace_id" uuid,
	"runtime_service_id" uuid,
	"type" text NOT NULL,
	"provider" text NOT NULL,
	"external_id" text,
	"title" text NOT NULL,
	"url" text,
	"status" text NOT NULL,
	"review_state" text DEFAULT 'none' NOT NULL,
	"is_primary" boolean DEFAULT false NOT NULL,
	"health_status" text DEFAULT 'unknown' NOT NULL,
	"summary" text,
	"metadata" jsonb,
	"created_by_run_id" uuid,
	"created_at" timestamp with time zone DEFAULT now() NOT NULL,
	"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
ALTER TABLE "issues" ADD COLUMN "project_workspace_id" uuid;--> statement-breakpoint
ALTER TABLE "issues" ADD COLUMN "execution_workspace_id" uuid;--> statement-breakpoint
ALTER TABLE "issues" ADD COLUMN "execution_workspace_preference" text;--> statement-breakpoint
ALTER TABLE "project_workspaces" ADD COLUMN "source_type" text DEFAULT 'local_path' NOT NULL;--> statement-breakpoint
ALTER TABLE "project_workspaces" ADD COLUMN "default_ref" text;--> statement-breakpoint
ALTER TABLE "project_workspaces" ADD COLUMN "visibility" text DEFAULT 'default' NOT NULL;--> statement-breakpoint
ALTER TABLE "project_workspaces" ADD COLUMN "setup_command" text;--> statement-breakpoint
ALTER TABLE "project_workspaces" ADD COLUMN "cleanup_command" text;--> statement-breakpoint
ALTER TABLE "project_workspaces" ADD COLUMN "remote_provider" text;--> statement-breakpoint
ALTER TABLE "project_workspaces" ADD COLUMN "remote_workspace_ref" text;--> statement-breakpoint
ALTER TABLE "project_workspaces" ADD COLUMN "shared_workspace_key" text;--> statement-breakpoint
ALTER TABLE "workspace_runtime_services" ADD COLUMN "execution_workspace_id" uuid;--> statement-breakpoint
ALTER TABLE "execution_workspaces" ADD CONSTRAINT "execution_workspaces_company_id_companies_id_fk" FOREIGN KEY ("company_id") REFERENCES "public"."companies"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "execution_workspaces" ADD CONSTRAINT "execution_workspaces_project_id_projects_id_fk" FOREIGN KEY ("project_id") REFERENCES "public"."projects"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "execution_workspaces" ADD CONSTRAINT "execution_workspaces_project_workspace_id_project_workspaces_id_fk" FOREIGN KEY ("project_workspace_id") REFERENCES "public"."project_workspaces"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "execution_workspaces" ADD CONSTRAINT "execution_workspaces_source_issue_id_issues_id_fk" FOREIGN KEY ("source_issue_id") REFERENCES "public"."issues"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "execution_workspaces" ADD CONSTRAINT "execution_workspaces_derived_from_execution_workspace_id_execution_workspaces_id_fk" FOREIGN KEY ("derived_from_execution_workspace_id") REFERENCES "public"."execution_workspaces"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "issue_work_products" ADD CONSTRAINT "issue_work_products_company_id_companies_id_fk" FOREIGN KEY ("company_id") REFERENCES "public"."companies"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "issue_work_products" ADD CONSTRAINT "issue_work_products_project_id_projects_id_fk" FOREIGN KEY ("project_id") REFERENCES "public"."projects"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "issue_work_products" ADD CONSTRAINT "issue_work_products_issue_id_issues_id_fk" FOREIGN KEY ("issue_id") REFERENCES "public"."issues"("id") ON DELETE cascade ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "issue_work_products" ADD CONSTRAINT "issue_work_products_execution_workspace_id_execution_workspaces_id_fk" FOREIGN KEY ("execution_workspace_id") REFERENCES "public"."execution_workspaces"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "issue_work_products" ADD CONSTRAINT "issue_work_products_runtime_service_id_workspace_runtime_services_id_fk" FOREIGN KEY ("runtime_service_id") REFERENCES "public"."workspace_runtime_services"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "issue_work_products" ADD CONSTRAINT "issue_work_products_created_by_run_id_heartbeat_runs_id_fk" FOREIGN KEY ("created_by_run_id") REFERENCES "public"."heartbeat_runs"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
CREATE INDEX "execution_workspaces_company_project_status_idx" ON "execution_workspaces" USING btree ("company_id","project_id","status");--> statement-breakpoint
CREATE INDEX "execution_workspaces_company_project_workspace_status_idx" ON "execution_workspaces" USING btree ("company_id","project_workspace_id","status");--> statement-breakpoint
CREATE INDEX "execution_workspaces_company_source_issue_idx" ON "execution_workspaces" USING btree ("company_id","source_issue_id");--> statement-breakpoint
CREATE INDEX "execution_workspaces_company_last_used_idx" ON "execution_workspaces" USING btree ("company_id","last_used_at");--> statement-breakpoint
CREATE INDEX "execution_workspaces_company_branch_idx" ON "execution_workspaces" USING btree ("company_id","branch_name");--> statement-breakpoint
CREATE INDEX "issue_work_products_company_issue_type_idx" ON "issue_work_products" USING btree ("company_id","issue_id","type");--> statement-breakpoint
CREATE INDEX "issue_work_products_company_execution_workspace_type_idx" ON "issue_work_products" USING btree ("company_id","execution_workspace_id","type");--> statement-breakpoint
CREATE INDEX "issue_work_products_company_provider_external_id_idx" ON "issue_work_products" USING btree ("company_id","provider","external_id");--> statement-breakpoint
CREATE INDEX "issue_work_products_company_updated_idx" ON "issue_work_products" USING btree ("company_id","updated_at");--> statement-breakpoint
ALTER TABLE "issues" ADD CONSTRAINT "issues_project_workspace_id_project_workspaces_id_fk" FOREIGN KEY ("project_workspace_id") REFERENCES "public"."project_workspaces"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "issues" ADD CONSTRAINT "issues_execution_workspace_id_execution_workspaces_id_fk" FOREIGN KEY ("execution_workspace_id") REFERENCES "public"."execution_workspaces"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "workspace_runtime_services" ADD CONSTRAINT "workspace_runtime_services_execution_workspace_id_execution_workspaces_id_fk" FOREIGN KEY ("execution_workspace_id") REFERENCES "public"."execution_workspaces"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
CREATE INDEX "issues_company_project_workspace_idx" ON "issues" USING btree ("company_id","project_workspace_id");--> statement-breakpoint
CREATE INDEX "issues_company_execution_workspace_idx" ON "issues" USING btree ("company_id","execution_workspace_id");--> statement-breakpoint
CREATE INDEX "project_workspaces_project_source_type_idx" ON "project_workspaces" USING btree ("project_id","source_type");--> statement-breakpoint
CREATE INDEX "project_workspaces_company_shared_key_idx" ON "project_workspaces" USING btree ("company_id","shared_workspace_key");--> statement-breakpoint
CREATE UNIQUE INDEX "project_workspaces_project_remote_ref_idx" ON "project_workspaces" USING btree ("project_id","remote_provider","remote_workspace_ref");--> statement-breakpoint
CREATE INDEX "workspace_runtime_services_company_execution_workspace_status_idx" ON "workspace_runtime_services" USING btree ("company_id","execution_workspace_id","status");
9 packages/db/src/migrations/0036_cheerful_nitro.sql Normal file
@@ -0,0 +1,9 @@
CREATE TABLE "instance_settings" (
	"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
	"singleton_key" text DEFAULT 'default' NOT NULL,
	"experimental" jsonb DEFAULT '{}'::jsonb NOT NULL,
	"created_at" timestamp with time zone DEFAULT now() NOT NULL,
	"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
CREATE UNIQUE INDEX "instance_settings_singleton_key_idx" ON "instance_settings" USING btree ("singleton_key");
29 packages/db/src/migrations/0037_friendly_eddie_brock.sql Normal file
@@ -0,0 +1,29 @@
CREATE TABLE "workspace_operations" (
	"id" uuid PRIMARY KEY DEFAULT gen_random_uuid() NOT NULL,
	"company_id" uuid NOT NULL,
	"execution_workspace_id" uuid,
	"heartbeat_run_id" uuid,
	"phase" text NOT NULL,
	"command" text,
	"cwd" text,
	"status" text DEFAULT 'running' NOT NULL,
	"exit_code" integer,
	"log_store" text,
	"log_ref" text,
	"log_bytes" bigint,
	"log_sha256" text,
	"log_compressed" boolean DEFAULT false NOT NULL,
	"stdout_excerpt" text,
	"stderr_excerpt" text,
	"metadata" jsonb,
	"started_at" timestamp with time zone DEFAULT now() NOT NULL,
	"finished_at" timestamp with time zone,
	"created_at" timestamp with time zone DEFAULT now() NOT NULL,
	"updated_at" timestamp with time zone DEFAULT now() NOT NULL
);
--> statement-breakpoint
ALTER TABLE "workspace_operations" ADD CONSTRAINT "workspace_operations_company_id_companies_id_fk" FOREIGN KEY ("company_id") REFERENCES "public"."companies"("id") ON DELETE no action ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "workspace_operations" ADD CONSTRAINT "workspace_operations_execution_workspace_id_execution_workspaces_id_fk" FOREIGN KEY ("execution_workspace_id") REFERENCES "public"."execution_workspaces"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
ALTER TABLE "workspace_operations" ADD CONSTRAINT "workspace_operations_heartbeat_run_id_heartbeat_runs_id_fk" FOREIGN KEY ("heartbeat_run_id") REFERENCES "public"."heartbeat_runs"("id") ON DELETE set null ON UPDATE no action;--> statement-breakpoint
CREATE INDEX "workspace_operations_company_run_started_idx" ON "workspace_operations" USING btree ("company_id","heartbeat_run_id","started_at");--> statement-breakpoint
CREATE INDEX "workspace_operations_company_workspace_started_idx" ON "workspace_operations" USING btree ("company_id","execution_workspace_id","started_at");
9959 packages/db/src/migrations/meta/0035_snapshot.json Normal file
File diff suppressed because it is too large
10023 packages/db/src/migrations/meta/0036_snapshot.json Normal file
File diff suppressed because it is too large
10263 packages/db/src/migrations/meta/0037_snapshot.json Normal file
File diff suppressed because it is too large
@@ -246,6 +246,27 @@
      "when": 1773697572188,
      "tag": "0034_fat_dormammu",
      "breakpoints": true
    },
    {
      "idx": 35,
      "version": "7",
      "when": 1773698696169,
      "tag": "0035_marvelous_satana",
      "breakpoints": true
    },
    {
      "idx": 36,
      "version": "7",
      "when": 1773756213455,
      "tag": "0036_cheerful_nitro",
      "breakpoints": true
    },
    {
      "idx": 37,
      "version": "7",
      "when": 1773756922363,
      "tag": "0037_friendly_eddie_brock",
      "breakpoints": true
    }
  ]
}
68
packages/db/src/schema/execution_workspaces.ts
Normal file
@@ -0,0 +1,68 @@
import {
  type AnyPgColumn,
  index,
  jsonb,
  pgTable,
  text,
  timestamp,
  uuid,
} from "drizzle-orm/pg-core";
import { companies } from "./companies.js";
import { issues } from "./issues.js";
import { projectWorkspaces } from "./project_workspaces.js";
import { projects } from "./projects.js";

export const executionWorkspaces = pgTable(
  "execution_workspaces",
  {
    id: uuid("id").primaryKey().defaultRandom(),
    companyId: uuid("company_id").notNull().references(() => companies.id),
    projectId: uuid("project_id").notNull().references(() => projects.id, { onDelete: "cascade" }),
    projectWorkspaceId: uuid("project_workspace_id").references(() => projectWorkspaces.id, { onDelete: "set null" }),
    sourceIssueId: uuid("source_issue_id").references((): AnyPgColumn => issues.id, { onDelete: "set null" }),
    mode: text("mode").notNull(),
    strategyType: text("strategy_type").notNull(),
    name: text("name").notNull(),
    status: text("status").notNull().default("active"),
    cwd: text("cwd"),
    repoUrl: text("repo_url"),
    baseRef: text("base_ref"),
    branchName: text("branch_name"),
    providerType: text("provider_type").notNull().default("local_fs"),
    providerRef: text("provider_ref"),
    derivedFromExecutionWorkspaceId: uuid("derived_from_execution_workspace_id")
      .references((): AnyPgColumn => executionWorkspaces.id, { onDelete: "set null" }),
    lastUsedAt: timestamp("last_used_at", { withTimezone: true }).notNull().defaultNow(),
    openedAt: timestamp("opened_at", { withTimezone: true }).notNull().defaultNow(),
    closedAt: timestamp("closed_at", { withTimezone: true }),
    cleanupEligibleAt: timestamp("cleanup_eligible_at", { withTimezone: true }),
    cleanupReason: text("cleanup_reason"),
    metadata: jsonb("metadata").$type<Record<string, unknown>>(),
    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
    updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
  },
  (table) => ({
    companyProjectStatusIdx: index("execution_workspaces_company_project_status_idx").on(
      table.companyId,
      table.projectId,
      table.status,
    ),
    companyProjectWorkspaceStatusIdx: index("execution_workspaces_company_project_workspace_status_idx").on(
      table.companyId,
      table.projectWorkspaceId,
      table.status,
    ),
    companySourceIssueIdx: index("execution_workspaces_company_source_issue_idx").on(
      table.companyId,
      table.sourceIssueId,
    ),
    companyLastUsedIdx: index("execution_workspaces_company_last_used_idx").on(
      table.companyId,
      table.lastUsedAt,
    ),
    companyBranchIdx: index("execution_workspaces_company_branch_idx").on(
      table.companyId,
      table.branchName,
    ),
  }),
);
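The `last_used_at` and `status` columns (plus the `execution_workspaces_company_project_workspace_status_idx` index) exist to support V1 workspace reuse. A minimal sketch of the selection rule, assuming "active" is the reusable status; the helper name and row shape are illustrative, not part of this PR:

```typescript
// Sketch only: prefer the most recently used still-active execution
// workspace for the given project workspace. Field names mirror the
// execution_workspaces schema; pickReusableWorkspace is a hypothetical helper.
interface ExecutionWorkspaceRow {
  id: string;
  projectWorkspaceId: string | null;
  status: string; // assumed values, e.g. "active" | "closed"
  lastUsedAt: Date;
}

function pickReusableWorkspace(
  rows: ExecutionWorkspaceRow[],
  projectWorkspaceId: string,
): ExecutionWorkspaceRow | null {
  const candidates = rows
    .filter((r) => r.projectWorkspaceId === projectWorkspaceId && r.status === "active")
    .sort((a, b) => b.lastUsedAt.getTime() - a.lastUsedAt.getTime());
  return candidates[0] ?? null;
}
```

In SQL terms this is `WHERE project_workspace_id = ? AND status = 'active' ORDER BY last_used_at DESC LIMIT 1`, which the company/workspace/status index can serve.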
@@ -1,6 +1,7 @@
export { companies } from "./companies.js";
export { companyLogos } from "./company_logos.js";
export { authUsers, authSessions, authAccounts, authVerifications } from "./auth.js";
export { instanceSettings } from "./instance_settings.js";
export { instanceUserRoles } from "./instance_user_roles.js";
export { agents } from "./agents.js";
export { companyMemberships } from "./company_memberships.js";
@@ -16,10 +17,13 @@ export { agentTaskSessions } from "./agent_task_sessions.js";
export { agentWakeupRequests } from "./agent_wakeup_requests.js";
export { projects } from "./projects.js";
export { projectWorkspaces } from "./project_workspaces.js";
export { executionWorkspaces } from "./execution_workspaces.js";
export { workspaceOperations } from "./workspace_operations.js";
export { workspaceRuntimeServices } from "./workspace_runtime_services.js";
export { projectGoals } from "./project_goals.js";
export { goals } from "./goals.js";
export { issues } from "./issues.js";
export { issueWorkProducts } from "./issue_work_products.js";
export { labels } from "./labels.js";
export { issueLabels } from "./issue_labels.js";
export { issueApprovals } from "./issue_approvals.js";
15
packages/db/src/schema/instance_settings.ts
Normal file
@@ -0,0 +1,15 @@
import { pgTable, uuid, text, timestamp, jsonb, uniqueIndex } from "drizzle-orm/pg-core";

export const instanceSettings = pgTable(
  "instance_settings",
  {
    id: uuid("id").primaryKey().defaultRandom(),
    singletonKey: text("singleton_key").notNull().default("default"),
    experimental: jsonb("experimental").$type<Record<string, unknown>>().notNull().default({}),
    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
    updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
  },
  (table) => ({
    singletonKeyIdx: uniqueIndex("instance_settings_singleton_key_idx").on(table.singletonKey),
  }),
);
64
packages/db/src/schema/issue_work_products.ts
Normal file
@@ -0,0 +1,64 @@
import {
  boolean,
  index,
  jsonb,
  pgTable,
  text,
  timestamp,
  uuid,
} from "drizzle-orm/pg-core";
import { companies } from "./companies.js";
import { executionWorkspaces } from "./execution_workspaces.js";
import { heartbeatRuns } from "./heartbeat_runs.js";
import { issues } from "./issues.js";
import { projects } from "./projects.js";
import { workspaceRuntimeServices } from "./workspace_runtime_services.js";

export const issueWorkProducts = pgTable(
  "issue_work_products",
  {
    id: uuid("id").primaryKey().defaultRandom(),
    companyId: uuid("company_id").notNull().references(() => companies.id),
    projectId: uuid("project_id").references(() => projects.id, { onDelete: "set null" }),
    issueId: uuid("issue_id").notNull().references(() => issues.id, { onDelete: "cascade" }),
    executionWorkspaceId: uuid("execution_workspace_id")
      .references(() => executionWorkspaces.id, { onDelete: "set null" }),
    runtimeServiceId: uuid("runtime_service_id")
      .references(() => workspaceRuntimeServices.id, { onDelete: "set null" }),
    type: text("type").notNull(),
    provider: text("provider").notNull(),
    externalId: text("external_id"),
    title: text("title").notNull(),
    url: text("url"),
    status: text("status").notNull(),
    reviewState: text("review_state").notNull().default("none"),
    isPrimary: boolean("is_primary").notNull().default(false),
    healthStatus: text("health_status").notNull().default("unknown"),
    summary: text("summary"),
    metadata: jsonb("metadata").$type<Record<string, unknown>>(),
    createdByRunId: uuid("created_by_run_id").references(() => heartbeatRuns.id, { onDelete: "set null" }),
    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
    updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
  },
  (table) => ({
    companyIssueTypeIdx: index("issue_work_products_company_issue_type_idx").on(
      table.companyId,
      table.issueId,
      table.type,
    ),
    companyExecutionWorkspaceTypeIdx: index("issue_work_products_company_execution_workspace_type_idx").on(
      table.companyId,
      table.executionWorkspaceId,
      table.type,
    ),
    companyProviderExternalIdIdx: index("issue_work_products_company_provider_external_id_idx").on(
      table.companyId,
      table.provider,
      table.externalId,
    ),
    companyUpdatedIdx: index("issue_work_products_company_updated_idx").on(
      table.companyId,
      table.updatedAt,
    ),
  }),
);
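Work product is issue-first (locked decision 9), and the `is_primary` flag marks which product an issue surfaces by default. A sketch of one plausible display rule, assuming primary wins and recency breaks ties; the helper name and row shape are illustrative, not defined by this PR:

```typescript
// Illustrative only: pick the work product to surface on an issue.
// An explicit is_primary row wins; otherwise fall back to the most
// recently updated one. Field names mirror issue_work_products columns.
interface WorkProductRow {
  id: string;
  isPrimary: boolean;
  updatedAt: Date;
}

function resolveDisplayedWorkProduct(rows: WorkProductRow[]): WorkProductRow | null {
  const primary = rows.find((r) => r.isPrimary);
  if (primary) return primary;
  return (
    rows
      .slice() // avoid mutating the caller's array
      .sort((a, b) => b.updatedAt.getTime() - a.updatedAt.getTime())[0] ?? null
  );
}
```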
@@ -14,6 +14,8 @@ import { projects } from "./projects.js";
import { goals } from "./goals.js";
import { companies } from "./companies.js";
import { heartbeatRuns } from "./heartbeat_runs.js";
import { projectWorkspaces } from "./project_workspaces.js";
import { executionWorkspaces } from "./execution_workspaces.js";

export const issues = pgTable(
  "issues",
@@ -21,6 +23,7 @@ export const issues = pgTable(
    id: uuid("id").primaryKey().defaultRandom(),
    companyId: uuid("company_id").notNull().references(() => companies.id),
    projectId: uuid("project_id").references(() => projects.id),
    projectWorkspaceId: uuid("project_workspace_id").references(() => projectWorkspaces.id, { onDelete: "set null" }),
    goalId: uuid("goal_id").references(() => goals.id),
    parentId: uuid("parent_id").references((): AnyPgColumn => issues.id),
    title: text("title").notNull(),
@@ -40,6 +43,9 @@ export const issues = pgTable(
    requestDepth: integer("request_depth").notNull().default(0),
    billingCode: text("billing_code"),
    assigneeAdapterOverrides: jsonb("assignee_adapter_overrides").$type<Record<string, unknown>>(),
    executionWorkspaceId: uuid("execution_workspace_id")
      .references((): AnyPgColumn => executionWorkspaces.id, { onDelete: "set null" }),
    executionWorkspacePreference: text("execution_workspace_preference"),
    executionWorkspaceSettings: jsonb("execution_workspace_settings").$type<Record<string, unknown>>(),
    startedAt: timestamp("started_at", { withTimezone: true }),
    completedAt: timestamp("completed_at", { withTimezone: true }),
@@ -62,6 +68,8 @@ export const issues = pgTable(
    ),
    parentIdx: index("issues_company_parent_idx").on(table.companyId, table.parentId),
    projectIdx: index("issues_company_project_idx").on(table.companyId, table.projectId),
    projectWorkspaceIdx: index("issues_company_project_workspace_idx").on(table.companyId, table.projectWorkspaceId),
    executionWorkspaceIdx: index("issues_company_execution_workspace_idx").on(table.companyId, table.executionWorkspaceId),
    identifierIdx: uniqueIndex("issues_identifier_idx").on(table.identifier),
  }),
);
@@ -5,6 +5,7 @@ import {
  pgTable,
  text,
  timestamp,
  uniqueIndex,
  uuid,
} from "drizzle-orm/pg-core";
import { companies } from "./companies.js";
@@ -17,9 +18,17 @@ export const projectWorkspaces = pgTable(
    companyId: uuid("company_id").notNull().references(() => companies.id),
    projectId: uuid("project_id").notNull().references(() => projects.id, { onDelete: "cascade" }),
    name: text("name").notNull(),
    sourceType: text("source_type").notNull().default("local_path"),
    cwd: text("cwd"),
    repoUrl: text("repo_url"),
    repoRef: text("repo_ref"),
    defaultRef: text("default_ref"),
    visibility: text("visibility").notNull().default("default"),
    setupCommand: text("setup_command"),
    cleanupCommand: text("cleanup_command"),
    remoteProvider: text("remote_provider"),
    remoteWorkspaceRef: text("remote_workspace_ref"),
    sharedWorkspaceKey: text("shared_workspace_key"),
    metadata: jsonb("metadata").$type<Record<string, unknown>>(),
    isPrimary: boolean("is_primary").notNull().default(false),
    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
@@ -28,5 +37,9 @@ export const projectWorkspaces = pgTable(
  (table) => ({
    companyProjectIdx: index("project_workspaces_company_project_idx").on(table.companyId, table.projectId),
    projectPrimaryIdx: index("project_workspaces_project_primary_idx").on(table.projectId, table.isPrimary),
    projectSourceTypeIdx: index("project_workspaces_project_source_type_idx").on(table.projectId, table.sourceType),
    companySharedKeyIdx: index("project_workspaces_company_shared_key_idx").on(table.companyId, table.sharedWorkspaceKey),
    projectRemoteRefIdx: uniqueIndex("project_workspaces_project_remote_ref_idx")
      .on(table.projectId, table.remoteProvider, table.remoteWorkspaceRef),
  }),
);
57
packages/db/src/schema/workspace_operations.ts
Normal file
@@ -0,0 +1,57 @@
import {
  bigint,
  boolean,
  index,
  integer,
  jsonb,
  pgTable,
  text,
  timestamp,
  uuid,
} from "drizzle-orm/pg-core";
import { companies } from "./companies.js";
import { executionWorkspaces } from "./execution_workspaces.js";
import { heartbeatRuns } from "./heartbeat_runs.js";

export const workspaceOperations = pgTable(
  "workspace_operations",
  {
    id: uuid("id").primaryKey().defaultRandom(),
    companyId: uuid("company_id").notNull().references(() => companies.id),
    executionWorkspaceId: uuid("execution_workspace_id").references(() => executionWorkspaces.id, {
      onDelete: "set null",
    }),
    heartbeatRunId: uuid("heartbeat_run_id").references(() => heartbeatRuns.id, {
      onDelete: "set null",
    }),
    phase: text("phase").notNull(),
    command: text("command"),
    cwd: text("cwd"),
    status: text("status").notNull().default("running"),
    exitCode: integer("exit_code"),
    logStore: text("log_store"),
    logRef: text("log_ref"),
    logBytes: bigint("log_bytes", { mode: "number" }),
    logSha256: text("log_sha256"),
    logCompressed: boolean("log_compressed").notNull().default(false),
    stdoutExcerpt: text("stdout_excerpt"),
    stderrExcerpt: text("stderr_excerpt"),
    metadata: jsonb("metadata").$type<Record<string, unknown>>(),
    startedAt: timestamp("started_at", { withTimezone: true }).notNull().defaultNow(),
    finishedAt: timestamp("finished_at", { withTimezone: true }),
    createdAt: timestamp("created_at", { withTimezone: true }).notNull().defaultNow(),
    updatedAt: timestamp("updated_at", { withTimezone: true }).notNull().defaultNow(),
  },
  (table) => ({
    companyRunStartedIdx: index("workspace_operations_company_run_started_idx").on(
      table.companyId,
      table.heartbeatRunId,
      table.startedAt,
    ),
    companyWorkspaceStartedIdx: index("workspace_operations_company_workspace_started_idx").on(
      table.companyId,
      table.executionWorkspaceId,
      table.startedAt,
    ),
  }),
);
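Operation rows start in the `"running"` default status with `exit_code` null, and are finalized when the command exits. A sketch of one plausible finalization rule, assuming `"succeeded"` and `"failed"` as the terminal status values (the schema stores status as free-form text, so these names are assumptions):

```typescript
// Illustrative only: derive a terminal status for a workspace operation
// from the process exit code. A null exit code (e.g. killed, never exited)
// is treated as failure.
function finishOperationStatus(exitCode: number | null): "succeeded" | "failed" {
  return exitCode === 0 ? "succeeded" : "failed";
}
```

The caller would write this status together with `finished_at` and the log excerpt columns in one update.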
@@ -10,6 +10,7 @@ import {
import { companies } from "./companies.js";
import { projects } from "./projects.js";
import { projectWorkspaces } from "./project_workspaces.js";
import { executionWorkspaces } from "./execution_workspaces.js";
import { issues } from "./issues.js";
import { agents } from "./agents.js";
import { heartbeatRuns } from "./heartbeat_runs.js";
@@ -21,6 +22,7 @@ export const workspaceRuntimeServices = pgTable(
    companyId: uuid("company_id").notNull().references(() => companies.id),
    projectId: uuid("project_id").references(() => projects.id, { onDelete: "set null" }),
    projectWorkspaceId: uuid("project_workspace_id").references(() => projectWorkspaces.id, { onDelete: "set null" }),
    executionWorkspaceId: uuid("execution_workspace_id").references(() => executionWorkspaces.id, { onDelete: "set null" }),
    issueId: uuid("issue_id").references(() => issues.id, { onDelete: "set null" }),
    scopeType: text("scope_type").notNull(),
    scopeId: text("scope_id"),
@@ -50,6 +52,11 @@ export const workspaceRuntimeServices = pgTable(
      table.projectWorkspaceId,
      table.status,
    ),
    companyExecutionWorkspaceStatusIdx: index("workspace_runtime_services_company_execution_workspace_status_idx").on(
      table.companyId,
      table.executionWorkspaceId,
      table.status,
    ),
    companyProjectStatusIdx: index("workspace_runtime_services_company_project_status_idx").on(
      table.companyId,
      table.projectId,
@@ -353,6 +353,7 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
      id: randomUUID(),
      companyId: input.companyId,
      projectId: input.projectId ?? null,
      projectWorkspaceId: null,
      goalId: input.goalId ?? null,
      parentId: input.parentId ?? null,
      title: input.title,
@@ -372,6 +373,8 @@ export function createTestHarness(options: TestHarnessOptions): TestHarness {
      requestDepth: 0,
      billingCode: null,
      assigneeAdapterOverrides: null,
      executionWorkspaceId: null,
      executionWorkspacePreference: null,
      executionWorkspaceSettings: null,
      startedAt: null,
      completedAt: null,
@@ -120,6 +120,8 @@ export {

export type {
  Company,
  InstanceExperimentalSettings,
  InstanceSettings,
  Agent,
  AgentPermissions,
  AgentKeyCreated,
@@ -130,14 +132,28 @@ export type {
  AdapterEnvironmentTestResult,
  AssetImage,
  Project,
  ProjectCodebase,
  ProjectCodebaseOrigin,
  ProjectGoalRef,
  ProjectWorkspace,
  ExecutionWorkspace,
  WorkspaceRuntimeService,
  WorkspaceOperation,
  WorkspaceOperationPhase,
  WorkspaceOperationStatus,
  ExecutionWorkspaceStrategyType,
  ExecutionWorkspaceMode,
  ExecutionWorkspaceProviderType,
  ExecutionWorkspaceStatus,
  ExecutionWorkspaceStrategy,
  ProjectExecutionWorkspacePolicy,
  ProjectExecutionWorkspaceDefaultMode,
  IssueExecutionWorkspaceSettings,
  IssueWorkProduct,
  IssueWorkProductType,
  IssueWorkProductProvider,
  IssueWorkProductStatus,
  IssueWorkProductReviewState,
  Issue,
  IssueAssigneeAdapterOverrides,
  IssueComment,
@@ -228,6 +244,12 @@ export type {
  ProviderQuotaResult,
} from "./types/index.js";

export {
  instanceExperimentalSettingsSchema,
  patchInstanceExperimentalSettingsSchema,
  type PatchInstanceExperimentalSettings,
} from "./validators/index.js";

export {
  createCompanySchema,
  updateCompanySchema,
@@ -269,6 +291,13 @@ export {
  addIssueCommentSchema,
  linkIssueApprovalSchema,
  createIssueAttachmentMetadataSchema,
  createIssueWorkProductSchema,
  updateIssueWorkProductSchema,
  issueWorkProductTypeSchema,
  issueWorkProductStatusSchema,
  issueWorkProductReviewStateSchema,
  updateExecutionWorkspaceSchema,
  executionWorkspaceStatusSchema,
  issueDocumentFormatSchema,
  issueDocumentKeySchema,
  upsertIssueDocumentSchema,
@@ -279,6 +308,9 @@ export {
  type AddIssueComment,
  type LinkIssueApproval,
  type CreateIssueAttachmentMetadata,
  type CreateIssueWorkProduct,
  type UpdateIssueWorkProduct,
  type UpdateExecutionWorkspace,
  type IssueDocumentFormat,
  type UpsertIssueDocument,
  createGoalSchema,
@@ -1,4 +1,5 @@
export type { Company } from "./company.js";
export type { InstanceExperimentalSettings, InstanceSettings } from "./instance.js";
export type {
  Agent,
  AgentPermissions,
@@ -10,15 +11,31 @@ export type {
  AdapterEnvironmentTestResult,
} from "./agent.js";
export type { AssetImage } from "./asset.js";
export type { Project, ProjectCodebase, ProjectCodebaseOrigin, ProjectGoalRef, ProjectWorkspace } from "./project.js";
export type {
  ExecutionWorkspace,
  WorkspaceRuntimeService,
  ExecutionWorkspaceStrategyType,
  ExecutionWorkspaceMode,
  ExecutionWorkspaceProviderType,
  ExecutionWorkspaceStatus,
  ExecutionWorkspaceStrategy,
  ProjectExecutionWorkspacePolicy,
  ProjectExecutionWorkspaceDefaultMode,
  IssueExecutionWorkspaceSettings,
} from "./workspace-runtime.js";
export type {
  WorkspaceOperation,
  WorkspaceOperationPhase,
  WorkspaceOperationStatus,
} from "./workspace-operation.js";
export type {
  IssueWorkProduct,
  IssueWorkProductType,
  IssueWorkProductProvider,
  IssueWorkProductStatus,
  IssueWorkProductReviewState,
} from "./work-product.js";
export type {
  Issue,
  IssueAssigneeAdapterOverrides,
10
packages/shared/src/types/instance.ts
Normal file
@@ -0,0 +1,10 @@
|
|||||||
|
export interface InstanceExperimentalSettings {
|
||||||
|
enableIsolatedWorkspaces: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface InstanceSettings {
|
||||||
|
id: string;
|
||||||
|
experimental: InstanceExperimentalSettings;
|
||||||
|
createdAt: Date;
|
||||||
|
updatedAt: Date;
|
||||||
|
}
|
||||||
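The DB column stores `experimental` as untyped jsonb defaulting to `{}`, so somewhere the raw value must be normalized into `InstanceExperimentalSettings`. A sketch under the assumption that a missing or non-boolean flag defaults to `false`, which keeps the workspace feature UI-gated off for upgraded installs; the helper name is illustrative, and the interface is re-declared so the sketch is self-contained:

```typescript
// Sketch only: normalize the raw experimental jsonb into the typed shape.
// Defaulting to false matches the compatibility-preserving upgrade story.
interface InstanceExperimentalSettings {
  enableIsolatedWorkspaces: boolean;
}

function normalizeExperimental(raw: Record<string, unknown>): InstanceExperimentalSettings {
  return {
    // Only an explicit boolean true enables the flag; anything else is off.
    enableIsolatedWorkspaces: raw["enableIsolatedWorkspaces"] === true,
  };
}
```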
@@ -1,7 +1,8 @@
import type { IssuePriority, IssueStatus } from "../constants.js";
import type { Goal } from "./goal.js";
import type { Project, ProjectWorkspace } from "./project.js";
import type { ExecutionWorkspace, IssueExecutionWorkspaceSettings } from "./workspace-runtime.js";
import type { IssueWorkProduct } from "./work-product.js";

export interface IssueAncestorProject {
  id: string;
@@ -97,6 +98,7 @@ export interface Issue {
  id: string;
  companyId: string;
  projectId: string | null;
  projectWorkspaceId: string | null;
  goalId: string | null;
  parentId: string | null;
  ancestors?: IssueAncestor[];
@@ -117,6 +119,8 @@ export interface Issue {
  requestDepth: number;
  billingCode: string | null;
  assigneeAdapterOverrides: IssueAssigneeAdapterOverrides | null;
  executionWorkspaceId: string | null;
  executionWorkspacePreference: string | null;
  executionWorkspaceSettings: IssueExecutionWorkspaceSettings | null;
  startedAt: Date | null;
  completedAt: Date | null;
@@ -129,6 +133,8 @@ export interface Issue {
  legacyPlanDocument?: LegacyPlanDocument | null;
  project?: Project | null;
  goal?: Goal | null;
  currentExecutionWorkspace?: ExecutionWorkspace | null;
  workProducts?: IssueWorkProduct[];
  mentionedProjects?: Project[];
  myLastTouchAt?: Date | null;
  lastExternalCommentAt?: Date | null;
@@ -1,6 +1,9 @@
 import type { PauseReason, ProjectStatus } from "../constants.js";
 import type { ProjectExecutionWorkspacePolicy, WorkspaceRuntimeService } from "./workspace-runtime.js";

+export type ProjectWorkspaceSourceType = "local_path" | "git_repo" | "remote_managed" | "non_git_path";
+export type ProjectWorkspaceVisibility = "default" | "advanced";
+
 export interface ProjectGoalRef {
   id: string;
   title: string;
@@ -11,9 +14,17 @@ export interface ProjectWorkspace {
   companyId: string;
   projectId: string;
   name: string;
+  sourceType: ProjectWorkspaceSourceType;
   cwd: string | null;
   repoUrl: string | null;
   repoRef: string | null;
+  defaultRef: string | null;
+  visibility: ProjectWorkspaceVisibility;
+  setupCommand: string | null;
+  cleanupCommand: string | null;
+  remoteProvider: string | null;
+  remoteWorkspaceRef: string | null;
+  sharedWorkspaceKey: string | null;
   metadata: Record<string, unknown> | null;
   isPrimary: boolean;
   runtimeServices?: WorkspaceRuntimeService[];
@@ -21,6 +32,20 @@ export interface ProjectWorkspace {
   updatedAt: Date;
 }

+export type ProjectCodebaseOrigin = "local_folder" | "managed_checkout";
+
+export interface ProjectCodebase {
+  workspaceId: string | null;
+  repoUrl: string | null;
+  repoRef: string | null;
+  defaultRef: string | null;
+  repoName: string | null;
+  localFolder: string | null;
+  managedFolder: string;
+  effectiveLocalFolder: string;
+  origin: ProjectCodebaseOrigin;
+}
+
 export interface Project {
   id: string;
   companyId: string;
@@ -38,6 +63,7 @@ export interface Project {
   pauseReason: PauseReason | null;
   pausedAt: Date | null;
   executionWorkspacePolicy: ProjectExecutionWorkspacePolicy | null;
+  codebase: ProjectCodebase;
   workspaces: ProjectWorkspace[];
   primaryWorkspace: ProjectWorkspace | null;
   archivedAt: Date | null;
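The new `ProjectCodebase` shape pairs an operator-chosen `localFolder` with a server-side `managedFolder` and collapses them into `effectiveLocalFolder` plus an `origin` tag. As a reading aid, here is one plausible resolution rule consistent with those field names; the helper and the preference order are illustrative assumptions, not taken from the diff:

```typescript
// Field names mirror ProjectCodebase from the diff above; the resolution
// rule itself (prefer an explicit local folder) is an assumption.
type ProjectCodebaseOrigin = "local_folder" | "managed_checkout";

interface ResolvedCodebaseFolders {
  localFolder: string | null;
  managedFolder: string;
  effectiveLocalFolder: string;
  origin: ProjectCodebaseOrigin;
}

// Hypothetical helper: use the operator's folder when one is set,
// otherwise fall back to the managed checkout path.
function resolveCodebaseFolders(
  localFolder: string | null,
  managedFolder: string,
): ResolvedCodebaseFolders {
  if (localFolder !== null && localFolder.trim().length > 0) {
    return { localFolder, managedFolder, effectiveLocalFolder: localFolder, origin: "local_folder" };
  }
  return { localFolder, managedFolder, effectiveLocalFolder: managedFolder, origin: "managed_checkout" };
}
```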
packages/shared/src/types/work-product.ts (new file, 55 lines)
@@ -0,0 +1,55 @@
+export type IssueWorkProductType =
+  | "preview_url"
+  | "runtime_service"
+  | "pull_request"
+  | "branch"
+  | "commit"
+  | "artifact"
+  | "document";
+
+export type IssueWorkProductProvider =
+  | "paperclip"
+  | "github"
+  | "vercel"
+  | "s3"
+  | "custom";
+
+export type IssueWorkProductStatus =
+  | "active"
+  | "ready_for_review"
+  | "approved"
+  | "changes_requested"
+  | "merged"
+  | "closed"
+  | "failed"
+  | "archived"
+  | "draft";
+
+export type IssueWorkProductReviewState =
+  | "none"
+  | "needs_board_review"
+  | "approved"
+  | "changes_requested";
+
+export interface IssueWorkProduct {
+  id: string;
+  companyId: string;
+  projectId: string | null;
+  issueId: string;
+  executionWorkspaceId: string | null;
+  runtimeServiceId: string | null;
+  type: IssueWorkProductType;
+  provider: IssueWorkProductProvider | string;
+  externalId: string | null;
+  title: string;
+  url: string | null;
+  status: IssueWorkProductStatus | string;
+  reviewState: IssueWorkProductReviewState;
+  isPrimary: boolean;
+  healthStatus: "unknown" | "healthy" | "unhealthy";
+  summary: string | null;
+  metadata: Record<string, unknown> | null;
+  createdByRunId: string | null;
+  createdAt: Date;
+  updatedAt: Date;
+}
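For orientation, here is what one row of the new work-product contract looks like in practice. The unions are trimmed copies of the definitions above; the sample values and the `needsBoardReview` helper are made up for illustration:

```typescript
// Trimmed local copies of unions from packages/shared/src/types/work-product.ts
// (see the diff above); sample values below are invented.
type IssueWorkProductType =
  | "preview_url" | "runtime_service" | "pull_request"
  | "branch" | "commit" | "artifact" | "document";
type IssueWorkProductReviewState = "none" | "needs_board_review" | "approved" | "changes_requested";

interface WorkProductRow {
  type: IssueWorkProductType;
  provider: string;
  title: string;
  url: string | null;
  status: string;
  reviewState: IssueWorkProductReviewState;
  isPrimary: boolean;
}

const pullRequest: WorkProductRow = {
  type: "pull_request",
  provider: "github",
  title: "Add execution workspace reuse",
  url: "https://github.com/example/repo/pull/123",
  status: "ready_for_review",
  reviewState: "needs_board_review",
  isPrimary: true,
};

// Hypothetical helper (not in the diff): a work product surfaces on the
// review board when it is both ready and flagged for board review.
function needsBoardReview(row: WorkProductRow): boolean {
  return row.status === "ready_for_review" && row.reviewState === "needs_board_review";
}
```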
packages/shared/src/types/workspace-operation.ts (new file, 31 lines)
@@ -0,0 +1,31 @@
+export type WorkspaceOperationPhase =
+  | "worktree_prepare"
+  | "workspace_provision"
+  | "workspace_teardown"
+  | "worktree_cleanup";
+
+export type WorkspaceOperationStatus = "running" | "succeeded" | "failed" | "skipped";
+
+export interface WorkspaceOperation {
+  id: string;
+  companyId: string;
+  executionWorkspaceId: string | null;
+  heartbeatRunId: string | null;
+  phase: WorkspaceOperationPhase;
+  command: string | null;
+  cwd: string | null;
+  status: WorkspaceOperationStatus;
+  exitCode: number | null;
+  logStore: string | null;
+  logRef: string | null;
+  logBytes: number | null;
+  logSha256: string | null;
+  logCompressed: boolean;
+  stdoutExcerpt: string | null;
+  stderrExcerpt: string | null;
+  metadata: Record<string, unknown> | null;
+  startedAt: Date;
+  finishedAt: Date | null;
+  createdAt: Date;
+  updatedAt: Date;
+}
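These operation rows support queries like "which phase failed for this workspace". A small illustrative filter over the new shape (field names come from the interface above; the first-failure rule is an assumption, reasonable since later phases depend on earlier ones):

```typescript
// Trimmed to the fields the query reads; full shape is in
// packages/shared/src/types/workspace-operation.ts above.
type WorkspaceOperationPhase =
  | "worktree_prepare" | "workspace_provision"
  | "workspace_teardown" | "worktree_cleanup";
type WorkspaceOperationStatus = "running" | "succeeded" | "failed" | "skipped";

interface OperationRow {
  phase: WorkspaceOperationPhase;
  status: WorkspaceOperationStatus;
  exitCode: number | null;
  stderrExcerpt: string | null;
}

// Hypothetical helper: report the earliest failed phase, if any.
function firstFailure(ops: OperationRow[]): OperationRow | null {
  return ops.find((op) => op.status === "failed") ?? null;
}

const ops: OperationRow[] = [
  { phase: "worktree_prepare", status: "succeeded", exitCode: 0, stderrExcerpt: null },
  { phase: "workspace_provision", status: "failed", exitCode: 1, stderrExcerpt: "npm install failed" },
  { phase: "workspace_teardown", status: "skipped", exitCode: null, stderrExcerpt: null },
];
```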
@@ -1,6 +1,35 @@
-export type ExecutionWorkspaceStrategyType = "project_primary" | "git_worktree";
+export type ExecutionWorkspaceStrategyType =
+  | "project_primary"
+  | "git_worktree"
+  | "adapter_managed"
+  | "cloud_sandbox";

-export type ExecutionWorkspaceMode = "inherit" | "project_primary" | "isolated" | "agent_default";
+export type ProjectExecutionWorkspaceDefaultMode =
+  | "shared_workspace"
+  | "isolated_workspace"
+  | "operator_branch"
+  | "adapter_default";
+
+export type ExecutionWorkspaceMode =
+  | "inherit"
+  | "shared_workspace"
+  | "isolated_workspace"
+  | "operator_branch"
+  | "reuse_existing"
+  | "agent_default";
+
+export type ExecutionWorkspaceProviderType =
+  | "local_fs"
+  | "git_worktree"
+  | "adapter_managed"
+  | "cloud_sandbox";
+
+export type ExecutionWorkspaceStatus =
+  | "active"
+  | "idle"
+  | "in_review"
+  | "archived"
+  | "cleanup_failed";

 export interface ExecutionWorkspaceStrategy {
   type: ExecutionWorkspaceStrategyType;
@@ -13,12 +42,14 @@ export interface ExecutionWorkspaceStrategy {

 export interface ProjectExecutionWorkspacePolicy {
   enabled: boolean;
-  defaultMode?: "project_primary" | "isolated";
+  defaultMode?: ProjectExecutionWorkspaceDefaultMode;
   allowIssueOverride?: boolean;
+  defaultProjectWorkspaceId?: string | null;
   workspaceStrategy?: ExecutionWorkspaceStrategy | null;
   workspaceRuntime?: Record<string, unknown> | null;
   branchPolicy?: Record<string, unknown> | null;
   pullRequestPolicy?: Record<string, unknown> | null;
+  runtimePolicy?: Record<string, unknown> | null;
   cleanupPolicy?: Record<string, unknown> | null;
 }

@@ -28,11 +59,39 @@ export interface IssueExecutionWorkspaceSettings {
   workspaceRuntime?: Record<string, unknown> | null;
 }

+export interface ExecutionWorkspace {
+  id: string;
+  companyId: string;
+  projectId: string;
+  projectWorkspaceId: string | null;
+  sourceIssueId: string | null;
+  mode: Exclude<ExecutionWorkspaceMode, "inherit" | "reuse_existing" | "agent_default"> | "adapter_managed" | "cloud_sandbox";
+  strategyType: ExecutionWorkspaceStrategyType;
+  name: string;
+  status: ExecutionWorkspaceStatus;
+  cwd: string | null;
+  repoUrl: string | null;
+  baseRef: string | null;
+  branchName: string | null;
+  providerType: ExecutionWorkspaceProviderType;
+  providerRef: string | null;
+  derivedFromExecutionWorkspaceId: string | null;
+  lastUsedAt: Date;
+  openedAt: Date;
+  closedAt: Date | null;
+  cleanupEligibleAt: Date | null;
+  cleanupReason: string | null;
+  metadata: Record<string, unknown> | null;
+  createdAt: Date;
+  updatedAt: Date;
+}
+
 export interface WorkspaceRuntimeService {
   id: string;
   companyId: string;
   projectId: string | null;
   projectWorkspaceId: string | null;
+  executionWorkspaceId: string | null;
   issueId: string | null;
   scopeType: "project_workspace" | "execution_workspace" | "run" | "agent";
   scopeId: string | null;
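The policy-helper tests at the end of this diff exercise a resolution order of explicit issue settings over project policy over a legacy compatibility flag. A sketch consistent with those expectations follows; the `adapter_default` mapping and the legacy fallback values are assumptions, not confirmed by the diff:

```typescript
// Sketch only: the real resolveExecutionWorkspaceMode lives in the server
// package. Precedence mirrors the tests in this diff; fallback mapping is assumed.
type ExecutionWorkspaceMode =
  | "inherit" | "shared_workspace" | "isolated_workspace"
  | "operator_branch" | "reuse_existing" | "agent_default";

interface ResolveInput {
  projectPolicy: { enabled: boolean; defaultMode?: string } | null;
  issueSettings: { mode?: ExecutionWorkspaceMode } | null;
  legacyUseProjectWorkspace: boolean;
}

function resolveExecutionWorkspaceMode(input: ResolveInput): string {
  // 1. An explicit, non-inherit issue mode always wins.
  const issueMode = input.issueSettings?.mode;
  if (issueMode && issueMode !== "inherit") return issueMode;
  // 2. Otherwise fall back to the enabled project policy's default.
  if (input.projectPolicy?.enabled && input.projectPolicy.defaultMode) {
    // Assumption: project-level "adapter_default" resolves to "agent_default".
    return input.projectPolicy.defaultMode === "adapter_default"
      ? "agent_default"
      : input.projectPolicy.defaultMode;
  }
  // 3. Assumption: the legacy flag preserves the old shared-workspace behavior.
  return input.legacyUseProjectWorkspace ? "shared_workspace" : "agent_default";
}
```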
packages/shared/src/validators/execution-workspace.ts (new file, 18 lines)
@@ -0,0 +1,18 @@
+import { z } from "zod";
+
+export const executionWorkspaceStatusSchema = z.enum([
+  "active",
+  "idle",
+  "in_review",
+  "archived",
+  "cleanup_failed",
+]);
+
+export const updateExecutionWorkspaceSchema = z.object({
+  status: executionWorkspaceStatusSchema.optional(),
+  cleanupEligibleAt: z.string().datetime().optional().nullable(),
+  cleanupReason: z.string().optional().nullable(),
+  metadata: z.record(z.unknown()).optional().nullable(),
+}).strict();
+
+export type UpdateExecutionWorkspace = z.infer<typeof updateExecutionWorkspaceSchema>;
@@ -1,3 +1,10 @@
+export {
+  instanceExperimentalSettingsSchema,
+  patchInstanceExperimentalSettingsSchema,
+  type InstanceExperimentalSettings,
+  type PatchInstanceExperimentalSettings,
+} from "./instance.js";
+
 export {
   upsertBudgetPolicySchema,
   resolveBudgetIncidentSchema,
@@ -88,6 +95,22 @@ export {
   type UpsertIssueDocument,
 } from "./issue.js";

+export {
+  createIssueWorkProductSchema,
+  updateIssueWorkProductSchema,
+  issueWorkProductTypeSchema,
+  issueWorkProductStatusSchema,
+  issueWorkProductReviewStateSchema,
+  type CreateIssueWorkProduct,
+  type UpdateIssueWorkProduct,
+} from "./work-product.js";
+
+export {
+  updateExecutionWorkspaceSchema,
+  executionWorkspaceStatusSchema,
+  type UpdateExecutionWorkspace,
+} from "./execution-workspace.js";
+
 export {
   createGoalSchema,
   updateGoalSchema,
packages/shared/src/validators/instance.ts (new file, 10 lines)
@@ -0,0 +1,10 @@
+import { z } from "zod";
+
+export const instanceExperimentalSettingsSchema = z.object({
+  enableIsolatedWorkspaces: z.boolean().default(false),
+}).strict();
+
+export const patchInstanceExperimentalSettingsSchema = instanceExperimentalSettingsSchema.partial();
+
+export type InstanceExperimentalSettings = z.infer<typeof instanceExperimentalSettingsSchema>;
+export type PatchInstanceExperimentalSettings = z.infer<typeof patchInstanceExperimentalSettingsSchema>;
@@ -3,7 +3,7 @@ import { ISSUE_PRIORITIES, ISSUE_STATUSES } from "../constants.js";

 const executionWorkspaceStrategySchema = z
   .object({
-    type: z.enum(["project_primary", "git_worktree"]).optional(),
+    type: z.enum(["project_primary", "git_worktree", "adapter_managed", "cloud_sandbox"]).optional(),
     baseRef: z.string().optional().nullable(),
     branchTemplate: z.string().optional().nullable(),
     worktreeParentDir: z.string().optional().nullable(),
@@ -14,7 +14,7 @@ const executionWorkspaceStrategySchema

 export const issueExecutionWorkspaceSettingsSchema = z
   .object({
-    mode: z.enum(["inherit", "project_primary", "isolated", "agent_default"]).optional(),
+    mode: z.enum(["inherit", "shared_workspace", "isolated_workspace", "operator_branch", "reuse_existing", "agent_default"]).optional(),
     workspaceStrategy: executionWorkspaceStrategySchema.optional().nullable(),
     workspaceRuntime: z.record(z.unknown()).optional().nullable(),
   })
@@ -29,6 +29,7 @@ export const issueAssigneeAdapterOverridesSchema

 export const createIssueSchema = z.object({
   projectId: z.string().uuid().optional().nullable(),
+  projectWorkspaceId: z.string().uuid().optional().nullable(),
   goalId: z.string().uuid().optional().nullable(),
   parentId: z.string().uuid().optional().nullable(),
   title: z.string().min(1),
@@ -40,6 +41,15 @@ export const createIssueSchema = z.object({
   requestDepth: z.number().int().nonnegative().optional().default(0),
   billingCode: z.string().optional().nullable(),
   assigneeAdapterOverrides: issueAssigneeAdapterOverridesSchema.optional().nullable(),
+  executionWorkspaceId: z.string().uuid().optional().nullable(),
+  executionWorkspacePreference: z.enum([
+    "inherit",
+    "shared_workspace",
+    "isolated_workspace",
+    "operator_branch",
+    "reuse_existing",
+    "agent_default",
+  ]).optional().nullable(),
   executionWorkspaceSettings: issueExecutionWorkspaceSettingsSchema.optional().nullable(),
   labelIds: z.array(z.string().uuid()).optional(),
 });
@@ -3,7 +3,7 @@ import { PROJECT_STATUSES } from "../constants.js";

 const executionWorkspaceStrategySchema = z
   .object({
-    type: z.enum(["project_primary", "git_worktree"]).optional(),
+    type: z.enum(["project_primary", "git_worktree", "adapter_managed", "cloud_sandbox"]).optional(),
     baseRef: z.string().optional().nullable(),
     branchTemplate: z.string().optional().nullable(),
     worktreeParentDir: z.string().optional().nullable(),
@@ -15,30 +15,54 @@ const executionWorkspaceStrategySchema
 export const projectExecutionWorkspacePolicySchema = z
   .object({
     enabled: z.boolean(),
-    defaultMode: z.enum(["project_primary", "isolated"]).optional(),
+    defaultMode: z.enum(["shared_workspace", "isolated_workspace", "operator_branch", "adapter_default"]).optional(),
     allowIssueOverride: z.boolean().optional(),
+    defaultProjectWorkspaceId: z.string().uuid().optional().nullable(),
     workspaceStrategy: executionWorkspaceStrategySchema.optional().nullable(),
     workspaceRuntime: z.record(z.unknown()).optional().nullable(),
     branchPolicy: z.record(z.unknown()).optional().nullable(),
     pullRequestPolicy: z.record(z.unknown()).optional().nullable(),
+    runtimePolicy: z.record(z.unknown()).optional().nullable(),
     cleanupPolicy: z.record(z.unknown()).optional().nullable(),
   })
   .strict();

+const projectWorkspaceSourceTypeSchema = z.enum(["local_path", "git_repo", "remote_managed", "non_git_path"]);
+const projectWorkspaceVisibilitySchema = z.enum(["default", "advanced"]);
+
 const projectWorkspaceFields = {
   name: z.string().min(1).optional(),
+  sourceType: projectWorkspaceSourceTypeSchema.optional(),
   cwd: z.string().min(1).optional().nullable(),
   repoUrl: z.string().url().optional().nullable(),
   repoRef: z.string().optional().nullable(),
+  defaultRef: z.string().optional().nullable(),
+  visibility: projectWorkspaceVisibilitySchema.optional(),
+  setupCommand: z.string().optional().nullable(),
+  cleanupCommand: z.string().optional().nullable(),
+  remoteProvider: z.string().optional().nullable(),
+  remoteWorkspaceRef: z.string().optional().nullable(),
+  sharedWorkspaceKey: z.string().optional().nullable(),
   metadata: z.record(z.unknown()).optional().nullable(),
 };

-export const createProjectWorkspaceSchema = z.object({
-  ...projectWorkspaceFields,
-  isPrimary: z.boolean().optional().default(false),
-}).superRefine((value, ctx) => {
+function validateProjectWorkspace(value: Record<string, unknown>, ctx: z.RefinementCtx) {
+  const sourceType = value.sourceType ?? "local_path";
   const hasCwd = typeof value.cwd === "string" && value.cwd.trim().length > 0;
   const hasRepo = typeof value.repoUrl === "string" && value.repoUrl.trim().length > 0;
+  const hasRemoteRef = typeof value.remoteWorkspaceRef === "string" && value.remoteWorkspaceRef.trim().length > 0;
+
+  if (sourceType === "remote_managed") {
+    if (!hasRemoteRef && !hasRepo) {
+      ctx.addIssue({
+        code: z.ZodIssueCode.custom,
+        message: "Remote-managed workspace requires remoteWorkspaceRef or repoUrl.",
+        path: ["remoteWorkspaceRef"],
+      });
+    }
+    return;
+  }
+
   if (!hasCwd && !hasRepo) {
     ctx.addIssue({
       code: z.ZodIssueCode.custom,
@@ -46,7 +70,12 @@ export const createProjectWorkspaceSchema = z.object({
       path: ["cwd"],
     });
   }
-});
+}
+
+export const createProjectWorkspaceSchema = z.object({
+  ...projectWorkspaceFields,
+  isPrimary: z.boolean().optional().default(false),
+}).superRefine(validateProjectWorkspace);

 export type CreateProjectWorkspace = z.infer<typeof createProjectWorkspaceSchema>;
packages/shared/src/validators/work-product.ts (new file, 54 lines)
@@ -0,0 +1,54 @@
+import { z } from "zod";
+
+export const issueWorkProductTypeSchema = z.enum([
+  "preview_url",
+  "runtime_service",
+  "pull_request",
+  "branch",
+  "commit",
+  "artifact",
+  "document",
+]);
+
+export const issueWorkProductStatusSchema = z.enum([
+  "active",
+  "ready_for_review",
+  "approved",
+  "changes_requested",
+  "merged",
+  "closed",
+  "failed",
+  "archived",
+  "draft",
+]);
+
+export const issueWorkProductReviewStateSchema = z.enum([
+  "none",
+  "needs_board_review",
+  "approved",
+  "changes_requested",
+]);
+
+export const createIssueWorkProductSchema = z.object({
+  projectId: z.string().uuid().optional().nullable(),
+  executionWorkspaceId: z.string().uuid().optional().nullable(),
+  runtimeServiceId: z.string().uuid().optional().nullable(),
+  type: issueWorkProductTypeSchema,
+  provider: z.string().min(1),
+  externalId: z.string().optional().nullable(),
+  title: z.string().min(1),
+  url: z.string().url().optional().nullable(),
+  status: issueWorkProductStatusSchema.default("active"),
+  reviewState: issueWorkProductReviewStateSchema.optional().default("none"),
+  isPrimary: z.boolean().optional().default(false),
+  healthStatus: z.enum(["unknown", "healthy", "unhealthy"]).optional().default("unknown"),
+  summary: z.string().optional().nullable(),
+  metadata: z.record(z.unknown()).optional().nullable(),
+  createdByRunId: z.string().uuid().optional().nullable(),
+});
+
+export type CreateIssueWorkProduct = z.infer<typeof createIssueWorkProductSchema>;
+
+export const updateIssueWorkProductSchema = createIssueWorkProductSchema.partial();
+
+export type UpdateIssueWorkProduct = z.infer<typeof updateIssueWorkProductSchema>;
server/src/__tests__/app-hmr-port.test.ts (new file, 19 lines)
@@ -0,0 +1,19 @@
+import { describe, expect, it } from "vitest";
+import { resolveViteHmrPort } from "../app.ts";
+
+describe("resolveViteHmrPort", () => {
+  it("uses serverPort + 10000 when the result stays in range", () => {
+    expect(resolveViteHmrPort(3100)).toBe(13_100);
+    expect(resolveViteHmrPort(55_535)).toBe(65_535);
+  });
+
+  it("falls back below the server port when adding 10000 would overflow", () => {
+    expect(resolveViteHmrPort(55_536)).toBe(45_536);
+    expect(resolveViteHmrPort(63_000)).toBe(53_000);
+  });
+
+  it("never returns a privileged or invalid port", () => {
+    expect(resolveViteHmrPort(65_535)).toBe(55_535);
+    expect(resolveViteHmrPort(9_000)).toBe(19_000);
+  });
+});
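This test file pins down the HMR port mapping: prefer `serverPort + 10000`, and drop below the server port only when adding would exceed 65535. One function satisfying exactly these assertions is sketched below; the real implementation lives in `server/src/app.ts` and may differ in detail:

```typescript
const MAX_PORT = 65_535;

// Sketch consistent with the tests above: prefer serverPort + 10000,
// and fall back to serverPort - 10000 only when adding would overflow.
// Since the fallback fires only for serverPort > 55535, the result
// stays well above the privileged range (< 1024).
function resolveViteHmrPort(serverPort: number): number {
  const preferred = serverPort + 10_000;
  return preferred <= MAX_PORT ? preferred : serverPort - 10_000;
}
```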
@@ -35,6 +35,11 @@ type CapturePayload = {
   paperclipEnvKeys: string[];
 };

+type LogEntry = {
+  stream: "stdout" | "stderr";
+  chunk: string;
+};
+
 describe("codex execute", () => {
   it("uses a worktree-isolated CODEX_HOME while preserving shared auth and config", async () => {
     const root = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-codex-execute-"));
@@ -62,6 +67,7 @@ describe("codex execute", () => {
     process.env.CODEX_HOME = sharedCodexHome;

     try {
+      const logs: LogEntry[] = [];
       const result = await execute({
         runId: "run-1",
         agent: {
@@ -87,7 +93,9 @@ describe("codex execute", () => {
         },
         context: {},
         authToken: "run-jwt-token",
-        onLog: async () => {},
+        onLog: async (stream, chunk) => {
+          logs.push({ stream, chunk });
+        },
       });

       expect(result.exitCode).toBe(0);
@@ -116,6 +124,18 @@ describe("codex execute", () => {
       expect((await fs.lstat(isolatedConfig)).isFile()).toBe(true);
       expect(await fs.readFile(isolatedConfig, "utf8")).toBe('model = "codex-mini-latest"\n');
       expect((await fs.lstat(isolatedSkill)).isSymbolicLink()).toBe(true);
+      expect(logs).toContainEqual(
+        expect.objectContaining({
+          stream: "stdout",
+          chunk: expect.stringContaining("Using worktree-isolated Codex home"),
+        }),
+      );
+      expect(logs).toContainEqual(
+        expect.objectContaining({
+          stream: "stdout",
+          chunk: expect.stringContaining('Injected Codex skill "paperclip"'),
+        }),
+      );
     } finally {
       if (previousHome === undefined) delete process.env.HOME;
       else process.env.HOME = previousHome;
@@ -50,10 +50,10 @@ describe("codex local adapter skill injection", () => {
     await createPaperclipRepoSkill(oldRepo, "paperclip");
     await fs.symlink(path.join(oldRepo, "skills", "paperclip"), path.join(skillsHome, "paperclip"));

-    const logs: string[] = [];
+    const logs: Array<{ stream: "stdout" | "stderr"; chunk: string }> = [];
     await ensureCodexSkillsInjected(
-      async (_stream, chunk) => {
-        logs.push(chunk);
+      async (stream, chunk) => {
+        logs.push({ stream, chunk });
       },
       {
         skillsHome,
@@ -64,7 +64,12 @@ describe("codex local adapter skill injection", () => {
     expect(await fs.realpath(path.join(skillsHome, "paperclip"))).toBe(
       await fs.realpath(path.join(currentRepo, "skills", "paperclip")),
     );
-    expect(logs.some((line) => line.includes('Repaired Codex skill "paperclip"'))).toBe(true);
+    expect(logs).toContainEqual(
+      expect.objectContaining({
+        stream: "stdout",
+        chunk: expect.stringContaining('Repaired Codex skill "paperclip"'),
+      }),
+    );
   });

   it("preserves a custom Codex skill symlink outside Paperclip repo checkouts", async () => {
```diff
@@ -2,6 +2,7 @@ import { describe, expect, it } from "vitest";
 import {
   buildExecutionWorkspaceAdapterConfig,
   defaultIssueExecutionWorkspaceSettingsForProject,
+  gateProjectExecutionWorkspacePolicy,
   parseIssueExecutionWorkspaceSettings,
   parseProjectExecutionWorkspacePolicy,
   resolveExecutionWorkspaceMode,
@@ -12,36 +13,36 @@ describe("execution workspace policy helpers", () => {
     expect(
       defaultIssueExecutionWorkspaceSettingsForProject({
         enabled: true,
-        defaultMode: "isolated",
+        defaultMode: "isolated_workspace",
       }),
-    ).toEqual({ mode: "isolated" });
+    ).toEqual({ mode: "isolated_workspace" });
     expect(
       defaultIssueExecutionWorkspaceSettingsForProject({
         enabled: true,
-        defaultMode: "project_primary",
+        defaultMode: "shared_workspace",
       }),
-    ).toEqual({ mode: "project_primary" });
+    ).toEqual({ mode: "shared_workspace" });
     expect(defaultIssueExecutionWorkspaceSettingsForProject(null)).toBeNull();
   });

   it("prefers explicit issue mode over project policy and legacy overrides", () => {
     expect(
       resolveExecutionWorkspaceMode({
-        projectPolicy: { enabled: true, defaultMode: "project_primary" },
-        issueSettings: { mode: "isolated" },
+        projectPolicy: { enabled: true, defaultMode: "shared_workspace" },
+        issueSettings: { mode: "isolated_workspace" },
         legacyUseProjectWorkspace: false,
       }),
-    ).toBe("isolated");
+    ).toBe("isolated_workspace");
   });

   it("falls back to project policy before legacy project-workspace compatibility flag", () => {
     expect(
       resolveExecutionWorkspaceMode({
-        projectPolicy: { enabled: true, defaultMode: "isolated" },
+        projectPolicy: { enabled: true, defaultMode: "isolated_workspace" },
         issueSettings: null,
         legacyUseProjectWorkspace: false,
       }),
-    ).toBe("isolated");
+    ).toBe("isolated_workspace");
     expect(
       resolveExecutionWorkspaceMode({
         projectPolicy: null,
@@ -58,7 +59,7 @@ describe("execution workspace policy helpers", () => {
         },
         projectPolicy: {
           enabled: true,
-          defaultMode: "isolated",
+          defaultMode: "isolated_workspace",
           workspaceStrategy: {
             type: "git_worktree",
             baseRef: "origin/main",
@@ -69,7 +70,7 @@ describe("execution workspace policy helpers", () => {
           },
         },
         issueSettings: null,
-        mode: "isolated",
+        mode: "isolated_workspace",
         legacyUseProjectWorkspace: null,
       });

@@ -92,9 +93,9 @@ describe("execution workspace policy helpers", () => {
     expect(
       buildExecutionWorkspaceAdapterConfig({
         agentConfig: baseConfig,
-        projectPolicy: { enabled: true, defaultMode: "isolated" },
-        issueSettings: { mode: "project_primary" },
-        mode: "project_primary",
+        projectPolicy: { enabled: true, defaultMode: "isolated_workspace" },
+        issueSettings: { mode: "shared_workspace" },
+        mode: "shared_workspace",
         legacyUseProjectWorkspace: null,
       }).workspaceStrategy,
     ).toBeUndefined();
@@ -124,7 +125,7 @@ describe("execution workspace policy helpers", () => {
       }),
     ).toEqual({
       enabled: true,
-      defaultMode: "isolated",
+      defaultMode: "isolated_workspace",
       workspaceStrategy: {
         type: "git_worktree",
         worktreeParentDir: ".paperclip/worktrees",
@@ -137,7 +138,22 @@ describe("execution workspace policy helpers", () => {
         mode: "project_primary",
       }),
     ).toEqual({
-      mode: "project_primary",
+      mode: "shared_workspace",
     });
   });

+  it("disables project execution workspace policy when the instance flag is off", () => {
+    expect(
+      gateProjectExecutionWorkspacePolicy(
+        { enabled: true, defaultMode: "isolated_workspace" },
+        false,
+      ),
+    ).toBeNull();
+    expect(
+      gateProjectExecutionWorkspacePolicy(
+        { enabled: true, defaultMode: "isolated_workspace" },
+        true,
+      ),
+    ).toEqual({ enabled: true, defaultMode: "isolated_workspace" });
+  });
 });
```
```diff
@@ -2,6 +2,7 @@ import { describe, expect, it } from "vitest";
 import type { agents } from "@paperclipai/db";
 import { resolveDefaultAgentWorkspaceDir } from "../home-paths.js";
 import {
+  prioritizeProjectWorkspaceCandidatesForRun,
   parseSessionCompactionPolicy,
   resolveRuntimeSessionParamsForWorkspace,
   shouldResetTaskSessionForWake,
@@ -180,6 +181,42 @@ describe("shouldResetTaskSessionForWake", () => {
   });
 });

+describe("prioritizeProjectWorkspaceCandidatesForRun", () => {
+  it("moves the explicitly selected workspace to the front", () => {
+    const rows = [
+      { id: "workspace-1", cwd: "/tmp/one" },
+      { id: "workspace-2", cwd: "/tmp/two" },
+      { id: "workspace-3", cwd: "/tmp/three" },
+    ];
+
+    expect(
+      prioritizeProjectWorkspaceCandidatesForRun(rows, "workspace-2").map((row) => row.id),
+    ).toEqual(["workspace-2", "workspace-1", "workspace-3"]);
+  });
+
+  it("keeps the original order when no preferred workspace is selected", () => {
+    const rows = [
+      { id: "workspace-1" },
+      { id: "workspace-2" },
+    ];
+
+    expect(
+      prioritizeProjectWorkspaceCandidatesForRun(rows, null).map((row) => row.id),
+    ).toEqual(["workspace-1", "workspace-2"]);
+  });
+
+  it("keeps the original order when the selected workspace is missing", () => {
+    const rows = [
+      { id: "workspace-1" },
+      { id: "workspace-2" },
+    ];
+
+    expect(
+      prioritizeProjectWorkspaceCandidatesForRun(rows, "workspace-9").map((row) => row.id),
+    ).toEqual(["workspace-1", "workspace-2"]);
+  });
+});
+
 describe("parseSessionCompactionPolicy", () => {
   it("disables Paperclip-managed rotation by default for codex and claude local", () => {
     expect(parseSessionCompactionPolicy(buildAgent("codex_local"))).toEqual({
```
`server/src/__tests__/instance-settings-routes.test.ts` (new file, 99 lines)

```ts
import express from "express";
import request from "supertest";
import { beforeEach, describe, expect, it, vi } from "vitest";
import { errorHandler } from "../middleware/index.js";
import { instanceSettingsRoutes } from "../routes/instance-settings.js";

const mockInstanceSettingsService = vi.hoisted(() => ({
  getExperimental: vi.fn(),
  updateExperimental: vi.fn(),
  listCompanyIds: vi.fn(),
}));
const mockLogActivity = vi.hoisted(() => vi.fn());

vi.mock("../services/index.js", () => ({
  instanceSettingsService: () => mockInstanceSettingsService,
  logActivity: mockLogActivity,
}));

function createApp(actor: any) {
  const app = express();
  app.use(express.json());
  app.use((req, _res, next) => {
    req.actor = actor;
    next();
  });
  app.use("/api", instanceSettingsRoutes({} as any));
  app.use(errorHandler);
  return app;
}

describe("instance settings routes", () => {
  beforeEach(() => {
    vi.clearAllMocks();
    mockInstanceSettingsService.getExperimental.mockResolvedValue({
      enableIsolatedWorkspaces: false,
    });
    mockInstanceSettingsService.updateExperimental.mockResolvedValue({
      id: "instance-settings-1",
      experimental: {
        enableIsolatedWorkspaces: true,
      },
    });
    mockInstanceSettingsService.listCompanyIds.mockResolvedValue(["company-1", "company-2"]);
  });

  it("allows local board users to read and update experimental settings", async () => {
    const app = createApp({
      type: "board",
      userId: "local-board",
      source: "local_implicit",
      isInstanceAdmin: true,
    });

    const getRes = await request(app).get("/api/instance/settings/experimental");
    expect(getRes.status).toBe(200);
    expect(getRes.body).toEqual({ enableIsolatedWorkspaces: false });

    const patchRes = await request(app)
      .patch("/api/instance/settings/experimental")
      .send({ enableIsolatedWorkspaces: true });

    expect(patchRes.status).toBe(200);
    expect(mockInstanceSettingsService.updateExperimental).toHaveBeenCalledWith({
      enableIsolatedWorkspaces: true,
    });
    expect(mockLogActivity).toHaveBeenCalledTimes(2);
  });

  it("rejects non-admin board users", async () => {
    const app = createApp({
      type: "board",
      userId: "user-1",
      source: "session",
      isInstanceAdmin: false,
      companyIds: ["company-1"],
    });

    const res = await request(app).get("/api/instance/settings/experimental");

    expect(res.status).toBe(403);
    expect(mockInstanceSettingsService.getExperimental).not.toHaveBeenCalled();
  });

  it("rejects agent callers", async () => {
    const app = createApp({
      type: "agent",
      agentId: "agent-1",
      companyId: "company-1",
      source: "agent_key",
    });

    const res = await request(app)
      .patch("/api/instance/settings/experimental")
      .send({ enableIsolatedWorkspaces: true });

    expect(res.status).toBe(403);
    expect(mockInstanceSettingsService.updateExperimental).not.toHaveBeenCalled();
  });
});
```
`server/src/__tests__/work-products.test.ts` (new file, 95 lines)

```ts
import { describe, expect, it, vi } from "vitest";
import { workProductService } from "../services/work-products.ts";

function createWorkProductRow(overrides: Partial<Record<string, unknown>> = {}) {
  const now = new Date("2026-03-17T00:00:00.000Z");
  return {
    id: "work-product-1",
    companyId: "company-1",
    projectId: "project-1",
    issueId: "issue-1",
    executionWorkspaceId: null,
    runtimeServiceId: null,
    type: "pull_request",
    provider: "github",
    externalId: null,
    title: "PR 1",
    url: "https://example.com/pr/1",
    status: "open",
    reviewState: "draft",
    isPrimary: true,
    healthStatus: "unknown",
    summary: null,
    metadata: null,
    createdByRunId: null,
    createdAt: now,
    updatedAt: now,
    ...overrides,
  };
}

describe("workProductService", () => {
  it("uses a transaction when creating a new primary work product", async () => {
    const updatedWhere = vi.fn(async () => undefined);
    const updateSet = vi.fn(() => ({ where: updatedWhere }));
    const txUpdate = vi.fn(() => ({ set: updateSet }));

    const insertedRow = createWorkProductRow();
    const insertReturning = vi.fn(async () => [insertedRow]);
    const insertValues = vi.fn(() => ({ returning: insertReturning }));
    const txInsert = vi.fn(() => ({ values: insertValues }));

    const tx = {
      update: txUpdate,
      insert: txInsert,
    };
    const transaction = vi.fn(async (callback: (input: typeof tx) => Promise<unknown>) => await callback(tx));

    const svc = workProductService({ transaction } as any);
    const result = await svc.createForIssue("issue-1", "company-1", {
      type: "pull_request",
      provider: "github",
      title: "PR 1",
      status: "open",
      reviewState: "draft",
      isPrimary: true,
    });

    expect(transaction).toHaveBeenCalledTimes(1);
    expect(txUpdate).toHaveBeenCalledTimes(1);
    expect(txInsert).toHaveBeenCalledTimes(1);
    expect(result?.id).toBe("work-product-1");
  });

  it("uses a transaction when promoting an existing work product to primary", async () => {
    const existingRow = createWorkProductRow({ isPrimary: false });

    const selectWhere = vi.fn(async () => [existingRow]);
    const selectFrom = vi.fn(() => ({ where: selectWhere }));
    const txSelect = vi.fn(() => ({ from: selectFrom }));

    const updateReturning = vi
      .fn()
      .mockResolvedValue([createWorkProductRow({ reviewState: "ready_for_review" })]);
    const updateWhere = vi.fn(() => ({ returning: updateReturning }));
    const updateSet = vi.fn(() => ({ where: updateWhere }));
    const txUpdate = vi.fn(() => ({ set: updateSet }));

    const tx = {
      select: txSelect,
      update: txUpdate,
    };
    const transaction = vi.fn(async (callback: (input: typeof tx) => Promise<unknown>) => await callback(tx));

    const svc = workProductService({ transaction } as any);
    const result = await svc.update("work-product-1", {
      isPrimary: true,
      reviewState: "ready_for_review",
    });

    expect(transaction).toHaveBeenCalledTimes(1);
    expect(txSelect).toHaveBeenCalledTimes(1);
    expect(txUpdate).toHaveBeenCalledTimes(2);
    expect(result?.reviewState).toBe("ready_for_review");
  });
});
```
@@ -5,12 +5,16 @@ import path from "node:path";
|
|||||||
import { promisify } from "node:util";
|
import { promisify } from "node:util";
|
||||||
import { afterEach, describe, expect, it } from "vitest";
|
import { afterEach, describe, expect, it } from "vitest";
|
||||||
import {
|
import {
|
||||||
|
cleanupExecutionWorkspaceArtifacts,
|
||||||
ensureRuntimeServicesForRun,
|
ensureRuntimeServicesForRun,
|
||||||
normalizeAdapterManagedRuntimeServices,
|
normalizeAdapterManagedRuntimeServices,
|
||||||
realizeExecutionWorkspace,
|
realizeExecutionWorkspace,
|
||||||
releaseRuntimeServicesForRun,
|
releaseRuntimeServicesForRun,
|
||||||
|
stopRuntimeServicesForExecutionWorkspace,
|
||||||
type RealizedExecutionWorkspace,
|
type RealizedExecutionWorkspace,
|
||||||
} from "../services/workspace-runtime.ts";
|
} from "../services/workspace-runtime.ts";
|
||||||
|
import type { WorkspaceOperation } from "@paperclipai/shared";
|
||||||
|
import type { WorkspaceOperationRecorder } from "../services/workspace-operations.ts";
|
||||||
|
|
||||||
const execFileAsync = promisify(execFile);
|
const execFileAsync = promisify(execFile);
|
||||||
const leasedRunIds = new Set<string>();
|
const leasedRunIds = new Set<string>();
|
||||||
@@ -48,6 +52,68 @@ function buildWorkspace(cwd: string): RealizedExecutionWorkspace {
|
|||||||
};
|
};
|
||||||
}
|
}
|
||||||
|
|
||||||
|
function createWorkspaceOperationRecorderDouble() {
|
||||||
|
const operations: Array<{
|
||||||
|
phase: string;
|
||||||
|
command: string | null;
|
||||||
|
cwd: string | null;
|
||||||
|
metadata: Record<string, unknown> | null;
|
||||||
|
result: {
|
||||||
|
status?: string;
|
||||||
|
exitCode?: number | null;
|
||||||
|
stdout?: string | null;
|
||||||
|
stderr?: string | null;
|
||||||
|
system?: string | null;
|
||||||
|
metadata?: Record<string, unknown> | null;
|
||||||
|
};
|
||||||
|
}> = [];
|
||||||
|
let executionWorkspaceId: string | null = null;
|
||||||
|
|
||||||
|
const recorder: WorkspaceOperationRecorder = {
|
||||||
|
attachExecutionWorkspaceId: async (nextExecutionWorkspaceId) => {
|
||||||
|
executionWorkspaceId = nextExecutionWorkspaceId;
|
||||||
|
},
|
||||||
|
recordOperation: async (input) => {
|
||||||
|
const result = await input.run();
|
||||||
|
operations.push({
|
||||||
|
phase: input.phase,
|
||||||
|
command: input.command ?? null,
|
||||||
|
cwd: input.cwd ?? null,
|
||||||
|
metadata: {
|
||||||
|
...(input.metadata ?? {}),
|
||||||
|
...(executionWorkspaceId ? { executionWorkspaceId } : {}),
|
||||||
|
},
|
||||||
|
result,
|
||||||
|
});
|
||||||
|
return {
|
||||||
|
id: `op-${operations.length}`,
|
||||||
|
companyId: "company-1",
|
||||||
|
executionWorkspaceId,
|
||||||
|
heartbeatRunId: "run-1",
|
||||||
|
phase: input.phase,
|
||||||
|
command: input.command ?? null,
|
||||||
|
cwd: input.cwd ?? null,
|
||||||
|
status: (result.status ?? "succeeded") as WorkspaceOperation["status"],
|
||||||
|
exitCode: result.exitCode ?? null,
|
||||||
|
logStore: "local_file",
|
||||||
|
logRef: `op-${operations.length}.ndjson`,
|
||||||
|
logBytes: 0,
|
||||||
|
logSha256: null,
|
||||||
|
logCompressed: false,
|
||||||
|
stdoutExcerpt: result.stdout ?? null,
|
||||||
|
stderrExcerpt: result.stderr ?? null,
|
||||||
|
metadata: input.metadata ?? null,
|
||||||
|
startedAt: new Date(),
|
||||||
|
finishedAt: new Date(),
|
||||||
|
createdAt: new Date(),
|
||||||
|
updatedAt: new Date(),
|
||||||
|
};
|
||||||
|
},
|
||||||
|
};
|
||||||
|
|
||||||
|
return { recorder, operations };
|
||||||
|
}
|
||||||
|
|
||||||
afterEach(async () => {
|
afterEach(async () => {
|
||||||
await Promise.all(
|
await Promise.all(
|
||||||
Array.from(leasedRunIds).map(async (runId) => {
|
Array.from(leasedRunIds).map(async (runId) => {
|
||||||
@@ -55,6 +121,10 @@ afterEach(async () => {
|
|||||||
leasedRunIds.delete(runId);
|
leasedRunIds.delete(runId);
|
||||||
}),
|
}),
|
||||||
);
|
);
|
||||||
|
delete process.env.PAPERCLIP_CONFIG;
|
||||||
|
delete process.env.PAPERCLIP_HOME;
|
||||||
|
delete process.env.PAPERCLIP_INSTANCE_ID;
|
||||||
|
delete process.env.DATABASE_URL;
|
||||||
});
|
});
|
||||||
|
|
||||||
describe("realizeExecutionWorkspace", () => {
|
describe("realizeExecutionWorkspace", () => {
|
||||||
@@ -211,6 +281,304 @@ describe("realizeExecutionWorkspace", () => {
|
|||||||
|
|
||||||
await expect(fs.readFile(path.join(reused.cwd, ".paperclip-provision-created"), "utf8")).resolves.toBe("false\n");
|
await expect(fs.readFile(path.join(reused.cwd, ".paperclip-provision-created"), "utf8")).resolves.toBe("false\n");
|
||||||
});
|
});
|
||||||
|
|
||||||
|
it("records worktree setup and provision operations when a recorder is provided", async () => {
|
||||||
|
const repoRoot = await createTempRepo();
|
||||||
|
const { recorder, operations } = createWorkspaceOperationRecorderDouble();
|
||||||
|
|
||||||
|
await fs.mkdir(path.join(repoRoot, "scripts"), { recursive: true });
|
||||||
|
await fs.writeFile(
|
||||||
|
path.join(repoRoot, "scripts", "provision.sh"),
|
||||||
|
[
|
||||||
|
"#!/usr/bin/env bash",
|
||||||
|
"set -euo pipefail",
|
||||||
|
"printf 'provisioned\\n'",
|
||||||
|
].join("\n"),
|
||||||
|
"utf8",
|
||||||
|
);
|
||||||
|
await runGit(repoRoot, ["add", "scripts/provision.sh"]);
|
||||||
|
await runGit(repoRoot, ["commit", "-m", "Add recorder provision script"]);
|
||||||
|
|
||||||
|
await realizeExecutionWorkspace({
|
||||||
|
base: {
|
||||||
|
baseCwd: repoRoot,
|
||||||
|
source: "project_primary",
|
||||||
|
projectId: "project-1",
|
||||||
|
workspaceId: "workspace-1",
|
||||||
|
repoUrl: null,
|
||||||
|
repoRef: "HEAD",
|
||||||
|
},
|
||||||
|
config: {
|
||||||
|
workspaceStrategy: {
|
||||||
|
type: "git_worktree",
|
||||||
|
branchTemplate: "{{issue.identifier}}-{{slug}}",
|
||||||
|
provisionCommand: "bash ./scripts/provision.sh",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
issue: {
|
||||||
|
id: "issue-1",
|
||||||
|
identifier: "PAP-540",
|
||||||
|
title: "Record workspace operations",
|
||||||
|
},
|
||||||
|
agent: {
|
||||||
|
id: "agent-1",
|
||||||
|
name: "Codex Coder",
|
||||||
|
companyId: "company-1",
|
||||||
|
},
|
||||||
|
recorder,
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(operations.map((operation) => operation.phase)).toEqual([
|
||||||
|
"worktree_prepare",
|
||||||
|
"workspace_provision",
|
||||||
|
]);
|
||||||
|
expect(operations[0]?.command).toContain("git worktree add");
|
||||||
|
expect(operations[0]?.metadata).toMatchObject({
|
||||||
|
branchName: "PAP-540-record-workspace-operations",
|
||||||
|
created: true,
|
||||||
|
});
|
||||||
|
expect(operations[1]?.command).toBe("bash ./scripts/provision.sh");
|
||||||
|
});
|
||||||
|
|
||||||
|
it("reuses an existing branch without resetting it when recreating a missing worktree", async () => {
|
||||||
|
const repoRoot = await createTempRepo();
|
||||||
|
const branchName = "PAP-450-recreate-missing-worktree";
|
||||||
|
|
||||||
|
await runGit(repoRoot, ["checkout", "-b", branchName]);
|
||||||
|
await fs.writeFile(path.join(repoRoot, "feature.txt"), "preserve me\n", "utf8");
|
||||||
|
await runGit(repoRoot, ["add", "feature.txt"]);
|
||||||
|
await runGit(repoRoot, ["commit", "-m", "Add preserved feature"]);
|
||||||
|
const expectedHead = (await execFileAsync("git", ["rev-parse", branchName], { cwd: repoRoot })).stdout.trim();
|
||||||
|
await runGit(repoRoot, ["checkout", "main"]);
|
||||||
|
|
||||||
|
const workspace = await realizeExecutionWorkspace({
|
||||||
|
base: {
|
||||||
|
baseCwd: repoRoot,
|
||||||
|
source: "project_primary",
|
||||||
|
projectId: "project-1",
|
||||||
|
workspaceId: "workspace-1",
|
||||||
|
repoUrl: null,
|
||||||
|
repoRef: "HEAD",
|
||||||
|
},
|
||||||
|
config: {
|
||||||
|
workspaceStrategy: {
|
||||||
|
type: "git_worktree",
|
||||||
|
branchTemplate: "{{issue.identifier}}-{{slug}}",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
issue: {
|
||||||
|
id: "issue-1",
|
||||||
|
identifier: "PAP-450",
|
||||||
|
title: "Recreate missing worktree",
|
||||||
|
},
|
||||||
|
agent: {
|
||||||
|
id: "agent-1",
|
||||||
|
name: "Codex Coder",
|
||||||
|
companyId: "company-1",
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(workspace.branchName).toBe(branchName);
|
||||||
|
await expect(fs.readFile(path.join(workspace.cwd, "feature.txt"), "utf8")).resolves.toBe("preserve me\n");
|
||||||
|
const actualHead = (await execFileAsync("git", ["rev-parse", "HEAD"], { cwd: workspace.cwd })).stdout.trim();
|
||||||
|
expect(actualHead).toBe(expectedHead);
|
||||||
|
});
|
||||||
|
|
||||||
|
it("removes a created git worktree and branch during cleanup", async () => {
|
||||||
|
const repoRoot = await createTempRepo();
|
||||||
|
|
||||||
|
const workspace = await realizeExecutionWorkspace({
|
||||||
|
base: {
|
||||||
|
baseCwd: repoRoot,
|
||||||
|
source: "project_primary",
|
||||||
|
projectId: "project-1",
|
||||||
|
workspaceId: "workspace-1",
|
||||||
|
repoUrl: null,
|
||||||
|
repoRef: "HEAD",
|
||||||
|
},
|
||||||
|
config: {
|
||||||
|
workspaceStrategy: {
|
||||||
|
type: "git_worktree",
|
||||||
|
branchTemplate: "{{issue.identifier}}-{{slug}}",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
issue: {
|
||||||
|
id: "issue-1",
|
||||||
|
identifier: "PAP-449",
|
||||||
|
title: "Cleanup workspace",
|
||||||
|
},
|
||||||
|
agent: {
|
||||||
|
id: "agent-1",
|
||||||
|
name: "Codex Coder",
|
||||||
|
companyId: "company-1",
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
const cleanup = await cleanupExecutionWorkspaceArtifacts({
|
||||||
|
workspace: {
|
||||||
|
id: "execution-workspace-1",
|
||||||
|
cwd: workspace.cwd,
|
||||||
|
providerType: "git_worktree",
|
||||||
|
providerRef: workspace.worktreePath,
|
||||||
|
branchName: workspace.branchName,
|
||||||
|
repoUrl: workspace.repoUrl,
|
||||||
|
baseRef: workspace.repoRef,
|
||||||
|
projectId: workspace.projectId,
|
||||||
|
projectWorkspaceId: workspace.workspaceId,
|
||||||
|
sourceIssueId: "issue-1",
|
||||||
|
metadata: {
|
||||||
|
createdByRuntime: true,
|
||||||
|
},
|
||||||
|
},
|
||||||
|
projectWorkspace: {
|
||||||
|
cwd: repoRoot,
|
||||||
|
cleanupCommand: null,
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(cleanup.cleaned).toBe(true);
|
||||||
|
expect(cleanup.warnings).toEqual([]);
|
||||||
|
await expect(fs.stat(workspace.cwd)).rejects.toThrow();
|
||||||
|
await expect(
|
||||||
|
execFileAsync("git", ["branch", "--list", workspace.branchName!], { cwd: repoRoot }),
|
||||||
|
).resolves.toMatchObject({
|
||||||
|
stdout: "",
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
it("keeps an unmerged runtime-created branch and warns instead of force deleting it", async () => {
|
||||||
|
const repoRoot = await createTempRepo();
|
||||||
|
|
||||||
|
const workspace = await realizeExecutionWorkspace({
|
||||||
|
base: {
|
||||||
|
baseCwd: repoRoot,
|
||||||
|
source: "project_primary",
|
||||||
|
projectId: "project-1",
|
||||||
|
workspaceId: "workspace-1",
|
||||||
|
repoUrl: null,
|
||||||
|
repoRef: "HEAD",
|
||||||
|
},
|
||||||
|
config: {
|
||||||
|
workspaceStrategy: {
|
||||||
|
type: "git_worktree",
|
||||||
|
branchTemplate: "{{issue.identifier}}-{{slug}}",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
issue: {
|
||||||
|
id: "issue-1",
|
||||||
|
identifier: "PAP-451",
|
||||||
|
title: "Keep unmerged branch",
|
||||||
|
},
|
||||||
|
agent: {
|
||||||
|
id: "agent-1",
|
||||||
|
name: "Codex Coder",
|
||||||
|
companyId: "company-1",
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
await fs.writeFile(path.join(workspace.cwd, "unmerged.txt"), "still here\n", "utf8");
|
||||||
|
await runGit(workspace.cwd, ["add", "unmerged.txt"]);
|
||||||
|
await runGit(workspace.cwd, ["commit", "-m", "Keep unmerged work"]);
|
||||||
|
|
||||||
|
const cleanup = await cleanupExecutionWorkspaceArtifacts({
|
||||||
|
workspace: {
|
||||||
|
id: "execution-workspace-1",
|
||||||
|
cwd: workspace.cwd,
|
||||||
|
providerType: "git_worktree",
|
||||||
|
providerRef: workspace.worktreePath,
|
||||||
|
branchName: workspace.branchName,
|
||||||
|
repoUrl: workspace.repoUrl,
|
||||||
|
baseRef: workspace.repoRef,
|
||||||
|
projectId: workspace.projectId,
|
||||||
|
projectWorkspaceId: workspace.workspaceId,
|
||||||
|
sourceIssueId: "issue-1",
|
||||||
|
metadata: {
|
||||||
|
createdByRuntime: true,
|
||||||
|
},
|
||||||
|
},
|
||||||
|
projectWorkspace: {
|
||||||
|
cwd: repoRoot,
|
||||||
|
cleanupCommand: null,
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(cleanup.cleaned).toBe(true);
|
||||||
|
expect(cleanup.warnings).toHaveLength(1);
|
||||||
|
expect(cleanup.warnings[0]).toContain(`Skipped deleting branch "${workspace.branchName}"`);
|
||||||
|
await expect(
|
||||||
|
execFileAsync("git", ["branch", "--list", workspace.branchName!], { cwd: repoRoot }),
|
||||||
|
).resolves.toMatchObject({
|
||||||
|
stdout: expect.stringContaining(workspace.branchName!),
|
||||||
|
});
|
||||||
|
});
|
||||||
|
|
||||||
|
it("records teardown and cleanup operations when a recorder is provided", async () => {
|
||||||
|
const repoRoot = await createTempRepo();
|
||||||
|
const { recorder, operations } = createWorkspaceOperationRecorderDouble();
|
||||||
|
|
||||||
|
const workspace = await realizeExecutionWorkspace({
|
||||||
|
base: {
|
||||||
|
baseCwd: repoRoot,
|
||||||
|
source: "project_primary",
|
||||||
|
projectId: "project-1",
|
||||||
|
workspaceId: "workspace-1",
|
||||||
|
repoUrl: null,
|
||||||
|
repoRef: "HEAD",
|
||||||
|
},
|
||||||
|
config: {
|
||||||
|
workspaceStrategy: {
|
||||||
|
type: "git_worktree",
|
||||||
|
branchTemplate: "{{issue.identifier}}-{{slug}}",
|
||||||
|
},
|
||||||
|
},
|
||||||
|
issue: {
|
||||||
|
id: "issue-1",
|
||||||
|
identifier: "PAP-541",
|
||||||
|
title: "Cleanup recorder",
|
||||||
|
},
|
||||||
|
agent: {
|
||||||
|
id: "agent-1",
|
||||||
|
name: "Codex Coder",
|
||||||
|
companyId: "company-1",
|
||||||
|
},
|
||||||
|
});
|
||||||
|
|
||||||
|
await cleanupExecutionWorkspaceArtifacts({
|
||||||
|
workspace: {
|
||||||
|
id: "execution-workspace-1",
|
||||||
|
cwd: workspace.cwd,
|
||||||
|
providerType: "git_worktree",
|
||||||
|
providerRef: workspace.worktreePath,
|
||||||
|
branchName: workspace.branchName,
|
||||||
|
repoUrl: workspace.repoUrl,
|
||||||
|
baseRef: workspace.repoRef,
|
||||||
|
projectId: workspace.projectId,
|
||||||
|
projectWorkspaceId: workspace.workspaceId,
|
||||||
|
sourceIssueId: "issue-1",
|
||||||
|
metadata: {
|
||||||
|
createdByRuntime: true,
|
||||||
|
},
|
||||||
|
},
|
||||||
|
projectWorkspace: {
|
||||||
|
cwd: repoRoot,
|
||||||
|
cleanupCommand: "printf 'cleanup ok\\n'",
|
||||||
|
},
|
||||||
|
recorder,
|
||||||
|
});
|
||||||
|
|
||||||
|
expect(operations.map((operation) => operation.phase)).toEqual([
|
||||||
|
"workspace_teardown",
|
||||||
|
"worktree_cleanup",
|
||||||
|
"worktree_cleanup",
|
||||||
|
]);
|
||||||
|
expect(operations[0]?.command).toBe("printf 'cleanup ok\\n'");
|
||||||
|
expect(operations[1]?.metadata).toMatchObject({
|
||||||
|
cleanupAction: "worktree_remove",
|
||||||
|
});
|
||||||
|
expect(operations[2]?.metadata).toMatchObject({
|
||||||
|
cleanupAction: "branch_delete",
|
||||||
|
});
|
||||||
|
});
|
||||||
});
|
});
|
||||||
|
|
||||||
describe("ensureRuntimeServicesForRun", () => {
|
describe("ensureRuntimeServicesForRun", () => {
|
||||||
@@ -312,6 +680,199 @@ describe("ensureRuntimeServicesForRun", () => {
     expect(third[0]?.reused).toBe(false);
     expect(third[0]?.id).not.toBe(first[0]?.id);
   });
+
+  it("does not leak parent Paperclip instance env into runtime service commands", async () => {
+    const workspaceRoot = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-runtime-env-"));
+    const workspace = buildWorkspace(workspaceRoot);
+    const envCapturePath = path.join(workspaceRoot, "captured-env.json");
+    const serviceCommand = [
+      "node -e",
+      JSON.stringify(
+        [
+          "const fs = require('node:fs');",
+          `fs.writeFileSync(${JSON.stringify(envCapturePath)}, JSON.stringify({`,
+          "paperclipConfig: process.env.PAPERCLIP_CONFIG ?? null,",
+          "paperclipHome: process.env.PAPERCLIP_HOME ?? null,",
+          "paperclipInstanceId: process.env.PAPERCLIP_INSTANCE_ID ?? null,",
+          "databaseUrl: process.env.DATABASE_URL ?? null,",
+          "customEnv: process.env.RUNTIME_CUSTOM_ENV ?? null,",
+          "port: process.env.PORT ?? null,",
+          "}));",
+          "require('node:http').createServer((req, res) => res.end('ok')).listen(Number(process.env.PORT), '127.0.0.1');",
+        ].join(" "),
+      ),
+    ].join(" ");
+
+    process.env.PAPERCLIP_CONFIG = "/tmp/base-paperclip-config.json";
+    process.env.PAPERCLIP_HOME = "/tmp/base-paperclip-home";
+    process.env.PAPERCLIP_INSTANCE_ID = "base-instance";
+    process.env.DATABASE_URL = "postgres://shared-db.example.com/paperclip";
+
+    const runId = "run-env";
+    leasedRunIds.add(runId);
+
+    const services = await ensureRuntimeServicesForRun({
+      runId,
+      agent: {
+        id: "agent-1",
+        name: "Codex Coder",
+        companyId: "company-1",
+      },
+      issue: null,
+      workspace,
+      executionWorkspaceId: "execution-workspace-1",
+      config: {
+        workspaceRuntime: {
+          services: [
+            {
+              name: "web",
+              command: serviceCommand,
+              port: { type: "auto" },
+              readiness: {
+                type: "http",
+                urlTemplate: "http://127.0.0.1:{{port}}",
+                timeoutSec: 10,
+                intervalMs: 100,
+              },
+              lifecycle: "shared",
+              reuseScope: "execution_workspace",
+              stopPolicy: {
+                type: "on_run_finish",
+              },
+            },
+          ],
+        },
+      },
+      adapterEnv: {
+        RUNTIME_CUSTOM_ENV: "from-adapter",
+      },
+    });
+
+    expect(services).toHaveLength(1);
+    const captured = JSON.parse(await fs.readFile(envCapturePath, "utf8")) as Record<string, string | null>;
+    expect(captured.paperclipConfig).toBeNull();
+    expect(captured.paperclipHome).toBeNull();
+    expect(captured.paperclipInstanceId).toBeNull();
+    expect(captured.databaseUrl).toBeNull();
+    expect(captured.customEnv).toBe("from-adapter");
+    expect(captured.port).toMatch(/^\d+$/);
+    expect(services[0]?.executionWorkspaceId).toBe("execution-workspace-1");
+    expect(services[0]?.scopeType).toBe("execution_workspace");
+    expect(services[0]?.scopeId).toBe("execution-workspace-1");
+  });
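The test above pins down the isolation contract: a runtime service child process must see the adapter-provided env plus an injected `PORT`, but none of the parent Paperclip instance's `PAPERCLIP_*` variables or `DATABASE_URL`. A minimal sketch of that scrubbing, assuming a blocklist approach (the helper name and blocklist are illustrative, not the actual implementation in `workspace-runtime.ts`):

```typescript
// Hypothetical sketch: strip Paperclip-instance variables from the parent
// environment, then layer adapter-provided env and the assigned port on top.
const BLOCKED_ENV_KEYS = new Set([
  "PAPERCLIP_CONFIG",
  "PAPERCLIP_HOME",
  "PAPERCLIP_INSTANCE_ID",
  "DATABASE_URL",
]);

function buildServiceEnv(
  parentEnv: Record<string, string | undefined>,
  adapterEnv: Record<string, string>,
  port: number,
): Record<string, string> {
  const env: Record<string, string> = {};
  for (const [key, value] of Object.entries(parentEnv)) {
    if (value !== undefined && !BLOCKED_ENV_KEYS.has(key)) {
      env[key] = value;
    }
  }
  // Adapter env wins over inherited values; PORT is always set last.
  Object.assign(env, adapterEnv, { PORT: String(port) });
  return env;
}
```

Built this way, the child keeps benign inherited variables such as `PATH` while the instance-identifying ones never reach it.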
+
+  it("stops execution workspace runtime services by executionWorkspaceId", async () => {
+    const workspaceRoot = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-runtime-stop-"));
+    const workspace = buildWorkspace(workspaceRoot);
+    const runId = "run-stop";
+    leasedRunIds.add(runId);
+
+    const services = await ensureRuntimeServicesForRun({
+      runId,
+      agent: {
+        id: "agent-1",
+        name: "Codex Coder",
+        companyId: "company-1",
+      },
+      issue: null,
+      workspace,
+      executionWorkspaceId: "execution-workspace-stop",
+      config: {
+        workspaceRuntime: {
+          services: [
+            {
+              name: "web",
+              command:
+                "node -e \"require('node:http').createServer((req,res)=>res.end('ok')).listen(Number(process.env.PORT), '127.0.0.1')\"",
+              port: { type: "auto" },
+              readiness: {
+                type: "http",
+                urlTemplate: "http://127.0.0.1:{{port}}",
+                timeoutSec: 10,
+                intervalMs: 100,
+              },
+              lifecycle: "shared",
+              reuseScope: "execution_workspace",
+              stopPolicy: {
+                type: "manual",
+              },
+            },
+          ],
+        },
+      },
+      adapterEnv: {},
+    });
+
+    expect(services[0]?.url).toBeTruthy();
+    await stopRuntimeServicesForExecutionWorkspace({
+      executionWorkspaceId: "execution-workspace-stop",
+      workspaceCwd: workspace.cwd,
+    });
+    await releaseRuntimeServicesForRun(runId);
+    leasedRunIds.delete(runId);
+    await new Promise((resolve) => setTimeout(resolve, 250));
+
+    await expect(fetch(services[0]!.url!)).rejects.toThrow();
+  });
+
+  it("does not stop services in sibling directories when matching by workspace cwd", async () => {
+    const workspaceParent = await fs.mkdtemp(path.join(os.tmpdir(), "paperclip-runtime-sibling-"));
+    const targetWorkspaceRoot = path.join(workspaceParent, "project");
+    const siblingWorkspaceRoot = path.join(workspaceParent, "project-extended", "service");
+    await fs.mkdir(targetWorkspaceRoot, { recursive: true });
+    await fs.mkdir(siblingWorkspaceRoot, { recursive: true });
+
+    const siblingWorkspace = buildWorkspace(siblingWorkspaceRoot);
+    const runId = "run-sibling";
+    leasedRunIds.add(runId);
+
+    const services = await ensureRuntimeServicesForRun({
+      runId,
+      agent: {
+        id: "agent-1",
+        name: "Codex Coder",
+        companyId: "company-1",
+      },
+      issue: null,
+      workspace: siblingWorkspace,
+      executionWorkspaceId: "execution-workspace-sibling",
+      config: {
+        workspaceRuntime: {
+          services: [
+            {
+              name: "web",
+              command:
+                "node -e \"require('node:http').createServer((req,res)=>res.end('ok')).listen(Number(process.env.PORT), '127.0.0.1')\"",
+              port: { type: "auto" },
+              readiness: {
+                type: "http",
+                urlTemplate: "http://127.0.0.1:{{port}}",
+                timeoutSec: 10,
+                intervalMs: 100,
+              },
+              lifecycle: "shared",
+              reuseScope: "execution_workspace",
+              stopPolicy: {
+                type: "manual",
+              },
+            },
+          ],
+        },
+      },
+      adapterEnv: {},
+    });
+
+    await stopRuntimeServicesForExecutionWorkspace({
+      executionWorkspaceId: "execution-workspace-target",
+      workspaceCwd: targetWorkspaceRoot,
+    });
+
+    const response = await fetch(services[0]!.url!);
+    expect(await response.text()).toBe("ok");
+
+    await releaseRuntimeServicesForRun(runId);
+    leasedRunIds.delete(runId);
+  });
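The sibling-directory test above guards against naive string-prefix matching: `/tmp/.../project` is a textual prefix of `/tmp/.../project-extended/service`, yet the second is not inside the first. A hedged sketch of a containment check that avoids that trap (helper name is an assumption, not the actual matcher used by `stopRuntimeServicesForExecutionWorkspace`):

```typescript
import path from "node:path";

// True only when `candidate` is the root itself or a descendant of it.
// path.relative avoids the prefix-string trap: a sibling like
// "project-extended" yields a relative path starting with "..".
function isWithinWorkspace(root: string, candidate: string): boolean {
  const relative = path.relative(path.resolve(root), path.resolve(candidate));
  return relative === "" || (!relative.startsWith("..") && !path.isAbsolute(relative));
}
```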
 });

 describe("normalizeAdapterManagedRuntimeServices", () => {
@@ -374,6 +935,7 @@ describe("normalizeAdapterManagedRuntimeServices", () => {
       companyId: "company-1",
       projectId: "project-1",
       projectWorkspaceId: "workspace-1",
+      executionWorkspaceId: null,
       issueId: "issue-1",
       serviceName: "preview",
       provider: "adapter_managed",
@@ -383,4 +945,33 @@ describe("normalizeAdapterManagedRuntimeServices", () => {
     });
     expect(first[0]?.id).toBe(second[0]?.id);
   });
+
+  it("prefers execution workspace ids over cwd for execution-scoped adapter services", () => {
+    const workspace = buildWorkspace("/tmp/project");
+
+    const refs = normalizeAdapterManagedRuntimeServices({
+      adapterType: "openclaw_gateway",
+      runId: "run-1",
+      agent: {
+        id: "agent-1",
+        name: "Gateway Agent",
+        companyId: "company-1",
+      },
+      issue: null,
+      workspace,
+      executionWorkspaceId: "execution-workspace-1",
+      reports: [
+        {
+          serviceName: "preview",
+          scopeType: "execution_workspace",
+        },
+      ],
+    });
+
+    expect(refs[0]).toMatchObject({
+      scopeType: "execution_workspace",
+      scopeId: "execution-workspace-1",
+      executionWorkspaceId: "execution-workspace-1",
+    });
+  });
 });
@@ -14,6 +14,7 @@ import { companyRoutes } from "./routes/companies.js";
 import { agentRoutes } from "./routes/agents.js";
 import { projectRoutes } from "./routes/projects.js";
 import { issueRoutes } from "./routes/issues.js";
+import { executionWorkspaceRoutes } from "./routes/execution-workspaces.js";
 import { goalRoutes } from "./routes/goals.js";
 import { approvalRoutes } from "./routes/approvals.js";
 import { secretRoutes } from "./routes/secrets.js";
@@ -21,6 +22,7 @@ import { costRoutes } from "./routes/costs.js";
 import { activityRoutes } from "./routes/activity.js";
 import { dashboardRoutes } from "./routes/dashboard.js";
 import { sidebarBadgeRoutes } from "./routes/sidebar-badges.js";
+import { instanceSettingsRoutes } from "./routes/instance-settings.js";
 import { llmRoutes } from "./routes/llms.js";
 import { assetRoutes } from "./routes/assets.js";
 import { accessRoutes } from "./routes/access.js";
@@ -46,6 +48,13 @@ import type { BetterAuthSessionResult } from "./auth/better-auth.js";

 type UiMode = "none" | "static" | "vite-dev";

+export function resolveViteHmrPort(serverPort: number): number {
+  if (serverPort <= 55_535) {
+    return serverPort + 10_000;
+  }
+  return Math.max(1_024, serverPort - 10_000);
+}
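The new `resolveViteHmrPort` keeps the derived HMR port inside the valid TCP range: adding 10,000 would overflow 65,535 for server ports above 55,535, so those subtract instead, floored at 1,024. Restated standalone for illustration:

```typescript
// Restated from the diff above: derive a Vite HMR port that stays in range.
function resolveViteHmrPort(serverPort: number): number {
  if (serverPort <= 55_535) {
    return serverPort + 10_000; // typical case: 3000 -> 13000
  }
  return Math.max(1_024, serverPort - 10_000); // high ports fall back downward
}
```

This replaces the previous unconditional `serverPort + 10000`, which could produce an invalid port (for example, 60000 would have become 70000).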

 export async function createApp(
   db: Db,
   opts: {
@@ -131,6 +140,7 @@ export async function createApp(
   api.use(assetRoutes(db, opts.storageService));
   api.use(projectRoutes(db));
   api.use(issueRoutes(db, opts.storageService));
+  api.use(executionWorkspaceRoutes(db));
   api.use(goalRoutes(db));
   api.use(approvalRoutes(db));
   api.use(secretRoutes(db));
@@ -138,6 +148,7 @@ export async function createApp(
   api.use(activityRoutes(db));
   api.use(dashboardRoutes(db));
   api.use(sidebarBadgeRoutes(db));
+  api.use(instanceSettingsRoutes(db));
   const hostServicesDisposers = new Map<string, () => void>();
   const workerManager = createPluginWorkerManager();
   const pluginRegistry = pluginRegistryService(db);
@@ -238,7 +249,7 @@ export async function createApp(

   if (opts.uiMode === "vite-dev") {
     const uiRoot = path.resolve(__dirname, "../../ui");
-    const hmrPort = opts.serverPort + 10000;
+    const hmrPort = resolveViteHmrPort(opts.serverPort);
     const { createServer: createViteServer } = await import("vite");
     const vite = await createViteServer({
       root: uiRoot,
@@ -4,6 +4,7 @@ import path from "node:path";
 const DEFAULT_INSTANCE_ID = "default";
 const INSTANCE_ID_RE = /^[a-zA-Z0-9_-]+$/;
 const PATH_SEGMENT_RE = /^[a-zA-Z0-9_-]+$/;
+const FRIENDLY_PATH_SEGMENT_RE = /[^a-zA-Z0-9._-]+/g;

 function expandHomePrefix(value: string): string {
   if (value === "~") return os.homedir();
@@ -61,6 +62,34 @@ export function resolveDefaultAgentWorkspaceDir(agentId: string): string {
   return path.resolve(resolvePaperclipInstanceRoot(), "workspaces", trimmed);
 }
+
+function sanitizeFriendlyPathSegment(value: string | null | undefined, fallback = "_default"): string {
+  const trimmed = value?.trim() ?? "";
+  if (!trimmed) return fallback;
+  const sanitized = trimmed
+    .replace(FRIENDLY_PATH_SEGMENT_RE, "-")
+    .replace(/^-+|-+$/g, "");
+  return sanitized || fallback;
+}
+
+export function resolveManagedProjectWorkspaceDir(input: {
+  companyId: string;
+  projectId: string;
+  repoName?: string | null;
+}): string {
+  const companyId = input.companyId.trim();
+  const projectId = input.projectId.trim();
+  if (!companyId || !projectId) {
+    throw new Error("Managed project workspace path requires companyId and projectId.");
+  }
+  return path.resolve(
+    resolvePaperclipInstanceRoot(),
+    "projects",
+    sanitizeFriendlyPathSegment(companyId, "company"),
+    sanitizeFriendlyPathSegment(projectId, "project"),
+    sanitizeFriendlyPathSegment(input.repoName, "_default"),
+  );
+}
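The sanitizer above collapses runs of disallowed characters to a single dash, strips leading and trailing dashes, and falls back to a placeholder when nothing survives, so managed project paths stay filesystem-safe. Restated standalone to show its behavior:

```typescript
const FRIENDLY_PATH_SEGMENT_RE = /[^a-zA-Z0-9._-]+/g;

// Restated from the diff above: map an arbitrary value to a safe path segment.
function sanitizeFriendlyPathSegment(value: string | null | undefined, fallback = "_default"): string {
  const trimmed = value?.trim() ?? "";
  if (!trimmed) return fallback;
  const sanitized = trimmed
    .replace(FRIENDLY_PATH_SEGMENT_RE, "-") // runs of bad chars -> one dash
    .replace(/^-+|-+$/g, ""); // no leading/trailing dashes
  return sanitized || fallback; // all-bad input falls back
}
```

So `"My Repo!"` becomes `"My-Repo"`, blank input becomes `"_default"`, and an all-punctuation value falls back to the caller-supplied placeholder.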

 export function resolveHomeAwarePath(value: string): string {
   return path.resolve(expandHomePrefix(value));
 }
@@ -10,6 +10,7 @@ import { and, eq } from "drizzle-orm";
 import {
   createDb,
   ensurePostgresDatabase,
+  getPostgresDataDirectory,
   inspectMigrations,
   applyPendingMigrations,
   reconcilePendingMigrationHistory,
@@ -320,45 +321,60 @@ export async function startServer(): Promise<StartedServer> {
     if (runningPid) {
       logger.warn(`Embedded PostgreSQL already running; reusing existing process (pid=${runningPid}, port=${port})`);
     } else {
-      const detectedPort = await detectPort(configuredPort);
-      if (detectedPort !== configuredPort) {
-        logger.warn(`Embedded PostgreSQL port is in use; using next free port (requestedPort=${configuredPort}, selectedPort=${detectedPort})`);
-      }
-      port = detectedPort;
-      logger.info(`Using embedded PostgreSQL because no DATABASE_URL set (dataDir=${dataDir}, port=${port})`);
-      embeddedPostgres = new EmbeddedPostgres({
-        databaseDir: dataDir,
-        user: "paperclip",
-        password: "paperclip",
-        port,
-        persistent: true,
-        initdbFlags: ["--encoding=UTF8", "--locale=C"],
-        onLog: appendEmbeddedPostgresLog,
-        onError: appendEmbeddedPostgresLog,
-      });
-
-      if (!clusterAlreadyInitialized) {
-        try {
-          await embeddedPostgres.initialise();
-        } catch (err) {
-          logEmbeddedPostgresFailure("initialise", err);
-          throw err;
-        }
-      } else {
-        logger.info(`Embedded PostgreSQL cluster already exists (${clusterVersionFile}); skipping init`);
-      }
-
-      if (existsSync(postmasterPidFile)) {
-        logger.warn("Removing stale embedded PostgreSQL lock file");
-        rmSync(postmasterPidFile, { force: true });
-      }
-      try {
-        await embeddedPostgres.start();
-      } catch (err) {
-        logEmbeddedPostgresFailure("start", err);
-        throw err;
-      }
-      embeddedPostgresStartedByThisProcess = true;
+      const configuredAdminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${configuredPort}/postgres`;
+      try {
+        const actualDataDir = await getPostgresDataDirectory(configuredAdminConnectionString);
+        if (
+          typeof actualDataDir !== "string" ||
+          resolve(actualDataDir) !== resolve(dataDir)
+        ) {
+          throw new Error("reachable postgres does not use the expected embedded data directory");
+        }
+        await ensurePostgresDatabase(configuredAdminConnectionString, "paperclip");
+        logger.warn(
+          `Embedded PostgreSQL appears to already be reachable without a pid file; reusing existing server on configured port ${configuredPort}`,
+        );
+      } catch {
+        const detectedPort = await detectPort(configuredPort);
+        if (detectedPort !== configuredPort) {
+          logger.warn(`Embedded PostgreSQL port is in use; using next free port (requestedPort=${configuredPort}, selectedPort=${detectedPort})`);
+        }
+        port = detectedPort;
+        logger.info(`Using embedded PostgreSQL because no DATABASE_URL set (dataDir=${dataDir}, port=${port})`);
+        embeddedPostgres = new EmbeddedPostgres({
+          databaseDir: dataDir,
+          user: "paperclip",
+          password: "paperclip",
+          port,
+          persistent: true,
+          initdbFlags: ["--encoding=UTF8", "--locale=C"],
+          onLog: appendEmbeddedPostgresLog,
+          onError: appendEmbeddedPostgresLog,
+        });
+
+        if (!clusterAlreadyInitialized) {
+          try {
+            await embeddedPostgres.initialise();
+          } catch (err) {
+            logEmbeddedPostgresFailure("initialise", err);
+            throw err;
+          }
+        } else {
+          logger.info(`Embedded PostgreSQL cluster already exists (${clusterVersionFile}); skipping init`);
+        }
+
+        if (existsSync(postmasterPidFile)) {
+          logger.warn("Removing stale embedded PostgreSQL lock file");
+          rmSync(postmasterPidFile, { force: true });
+        }
+        try {
+          await embeddedPostgres.start();
+        } catch (err) {
+          logEmbeddedPostgresFailure("start", err);
+          throw err;
+        }
+        embeddedPostgresStartedByThisProcess = true;
+
+      }
     }

 const embeddedAdminConnectionString = `postgres://paperclip:paperclip@127.0.0.1:${port}/postgres`;
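The startup change above first probes whether a postgres server is already reachable on the configured port, and reuses it only when its reported data directory resolves to the expected embedded cluster directory; any other outcome falls through to the normal init-and-start path. The core comparison is a resolved-path equality check. A minimal sketch of that guard (helper name is an assumption; the real check is inlined in the diff above):

```typescript
import { resolve } from "node:path";

// Reuse an already-running server only when its reported data directory is
// exactly the embedded cluster's directory, after path normalization.
function usesExpectedDataDirectory(actualDataDir: unknown, expectedDataDir: string): boolean {
  return typeof actualDataDir === "string" && resolve(actualDataDir) === resolve(expectedDataDir);
}
```

Resolving both sides means an unnormalized report such as `/var/pg/../pg/data` still matches `/var/pg/data`, while a server backed by any other cluster is rejected and a fresh embedded instance is started instead.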
@@ -29,6 +29,7 @@ import {
   issueService,
   logActivity,
   secretService,
+  workspaceOperationService,
 } from "../services/index.js";
 import { conflict, forbidden, notFound, unprocessable } from "../errors.js";
 import { assertBoard, assertCompanyAccess, getActorInfo } from "./authz.js";
@@ -62,6 +63,7 @@ export function agentRoutes(db: Db) {
   const heartbeat = heartbeatService(db);
   const issueApprovalsSvc = issueApprovalService(db);
   const secretsSvc = secretService(db);
+  const workspaceOperations = workspaceOperationService(db);
   const strictSecretsMode = process.env.PAPERCLIP_SECRETS_STRICT_MODE === "true";

   function canCreateAgents(agent: { role: string; permissions: Record<string, unknown> | null | undefined }) {
@@ -1560,6 +1562,40 @@ export function agentRoutes(db: Db) {
     res.json(result);
   });
+
+  router.get("/heartbeat-runs/:runId/workspace-operations", async (req, res) => {
+    const runId = req.params.runId as string;
+    const run = await heartbeat.getRun(runId);
+    if (!run) {
+      res.status(404).json({ error: "Heartbeat run not found" });
+      return;
+    }
+    assertCompanyAccess(req, run.companyId);
+
+    const context = asRecord(run.contextSnapshot);
+    const executionWorkspaceId = asNonEmptyString(context?.executionWorkspaceId);
+    const operations = await workspaceOperations.listForRun(runId, executionWorkspaceId);
+    res.json(redactCurrentUserValue(operations));
+  });
+
+  router.get("/workspace-operations/:operationId/log", async (req, res) => {
+    const operationId = req.params.operationId as string;
+    const operation = await workspaceOperations.getById(operationId);
+    if (!operation) {
+      res.status(404).json({ error: "Workspace operation not found" });
+      return;
+    }
+    assertCompanyAccess(req, operation.companyId);
+
+    const offset = Number(req.query.offset ?? 0);
+    const limitBytes = Number(req.query.limitBytes ?? 256000);
+    const result = await workspaceOperations.readLog(operationId, {
+      offset: Number.isFinite(offset) ? offset : 0,
+      limitBytes: Number.isFinite(limitBytes) ? limitBytes : 256000,
+    });

+    res.json(result);
+  });
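The log route above coerces `offset` and `limitBytes` with `Number(...)` and substitutes a default whenever the result is not a finite number, so garbage query strings degrade to the defaults rather than producing `NaN` reads. The pattern extracted as a standalone helper (the helper name is illustrative, not part of the route code):

```typescript
// Extracted pattern from the log route: coerce a query value to a finite
// number, falling back to a default for missing or non-numeric input.
function parseFiniteNumber(raw: unknown, fallback: number): number {
  const value = Number(raw ?? fallback);
  return Number.isFinite(value) ? value : fallback;
}
```

Note the route as written does not reject negative values, so any further clamping is presumably left to `readLog` itself.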
+
   router.get("/issues/:issueId/live-runs", async (req, res) => {
     const rawId = req.params.issueId as string;
     const issueSvc = issueService(db);

181  server/src/routes/execution-workspaces.ts  Normal file
@@ -0,0 +1,181 @@
+import { and, eq } from "drizzle-orm";
+import { Router } from "express";
+import type { Db } from "@paperclipai/db";
+import { issues, projects, projectWorkspaces } from "@paperclipai/db";
+import { updateExecutionWorkspaceSchema } from "@paperclipai/shared";
+import { validate } from "../middleware/validate.js";
+import { executionWorkspaceService, logActivity, workspaceOperationService } from "../services/index.js";
+import { parseProjectExecutionWorkspacePolicy } from "../services/execution-workspace-policy.js";
+import {
+  cleanupExecutionWorkspaceArtifacts,
+  stopRuntimeServicesForExecutionWorkspace,
+} from "../services/workspace-runtime.js";
+import { assertCompanyAccess, getActorInfo } from "./authz.js";
+
+const TERMINAL_ISSUE_STATUSES = new Set(["done", "cancelled"]);
+
+export function executionWorkspaceRoutes(db: Db) {
+  const router = Router();
+  const svc = executionWorkspaceService(db);
+  const workspaceOperationsSvc = workspaceOperationService(db);
+
+  router.get("/companies/:companyId/execution-workspaces", async (req, res) => {
+    const companyId = req.params.companyId as string;
+    assertCompanyAccess(req, companyId);
+    const workspaces = await svc.list(companyId, {
+      projectId: req.query.projectId as string | undefined,
+      projectWorkspaceId: req.query.projectWorkspaceId as string | undefined,
+      issueId: req.query.issueId as string | undefined,
+      status: req.query.status as string | undefined,
+      reuseEligible: req.query.reuseEligible === "true",
+    });
+    res.json(workspaces);
+  });
+
+  router.get("/execution-workspaces/:id", async (req, res) => {
+    const id = req.params.id as string;
+    const workspace = await svc.getById(id);
+    if (!workspace) {
+      res.status(404).json({ error: "Execution workspace not found" });
+      return;
+    }
+    assertCompanyAccess(req, workspace.companyId);
+    res.json(workspace);
+  });
+
+  router.patch("/execution-workspaces/:id", validate(updateExecutionWorkspaceSchema), async (req, res) => {
+    const id = req.params.id as string;
+    const existing = await svc.getById(id);
+    if (!existing) {
+      res.status(404).json({ error: "Execution workspace not found" });
+      return;
+    }
+    assertCompanyAccess(req, existing.companyId);
+    const patch: Record<string, unknown> = {
+      ...req.body,
+      ...(req.body.cleanupEligibleAt ? { cleanupEligibleAt: new Date(req.body.cleanupEligibleAt) } : {}),
+    };
+    let workspace = existing;
+    let cleanupWarnings: string[] = [];
+
+    if (req.body.status === "archived" && existing.status !== "archived") {
+      const linkedIssues = await db
+        .select({
+          id: issues.id,
+          status: issues.status,
+        })
+        .from(issues)
+        .where(and(eq(issues.companyId, existing.companyId), eq(issues.executionWorkspaceId, existing.id)));
+      const activeLinkedIssues = linkedIssues.filter((issue) => !TERMINAL_ISSUE_STATUSES.has(issue.status));
+
+      if (activeLinkedIssues.length > 0) {
+        res.status(409).json({
+          error: `Cannot archive execution workspace while ${activeLinkedIssues.length} linked issue(s) are still open`,
+        });
+        return;
+      }
+
+      const closedAt = new Date();
+      const archivedWorkspace = await svc.update(id, {
+        ...patch,
+        status: "archived",
+        closedAt,
+        cleanupReason: null,
+      });
+      if (!archivedWorkspace) {
+        res.status(404).json({ error: "Execution workspace not found" });
+        return;
+      }
+      workspace = archivedWorkspace;
+
+      try {
+        await stopRuntimeServicesForExecutionWorkspace({
+          db,
+          executionWorkspaceId: existing.id,
+          workspaceCwd: existing.cwd,
+        });
+        const projectWorkspace = existing.projectWorkspaceId
+          ? await db
+              .select({
+                cwd: projectWorkspaces.cwd,
+                cleanupCommand: projectWorkspaces.cleanupCommand,
+              })
+              .from(projectWorkspaces)
+              .where(
+                and(
+                  eq(projectWorkspaces.id, existing.projectWorkspaceId),
+                  eq(projectWorkspaces.companyId, existing.companyId),
+                ),
+              )
+              .then((rows) => rows[0] ?? null)
+          : null;
+        const projectPolicy = existing.projectId
+          ? await db
+              .select({
+                executionWorkspacePolicy: projects.executionWorkspacePolicy,
+              })
+              .from(projects)
+              .where(and(eq(projects.id, existing.projectId), eq(projects.companyId, existing.companyId)))
+              .then((rows) => parseProjectExecutionWorkspacePolicy(rows[0]?.executionWorkspacePolicy))
+          : null;
+        const cleanupResult = await cleanupExecutionWorkspaceArtifacts({
+          workspace: existing,
+          projectWorkspace,
+          teardownCommand: projectPolicy?.workspaceStrategy?.teardownCommand ?? null,
+          recorder: workspaceOperationsSvc.createRecorder({
+            companyId: existing.companyId,
+            executionWorkspaceId: existing.id,
+          }),
+        });
+        cleanupWarnings = cleanupResult.warnings;
+        const cleanupPatch: Record<string, unknown> = {
+          closedAt,
+          cleanupReason: cleanupWarnings.length > 0 ? cleanupWarnings.join(" | ") : null,
+        };
+        if (!cleanupResult.cleaned) {
+          cleanupPatch.status = "cleanup_failed";
+        }
+        if (cleanupResult.warnings.length > 0 || !cleanupResult.cleaned) {
+          workspace = (await svc.update(id, cleanupPatch)) ?? workspace;
+        }
+      } catch (error) {
+        const failureReason = error instanceof Error ? error.message : String(error);
+        workspace =
+          (await svc.update(id, {
+            status: "cleanup_failed",
+            closedAt,
+            cleanupReason: failureReason,
+          })) ?? workspace;
+        res.status(500).json({
+          error: `Failed to archive execution workspace: ${failureReason}`,
+        });
+        return;
+      }
+    } else {
+      const updatedWorkspace = await svc.update(id, patch);
+      if (!updatedWorkspace) {
+        res.status(404).json({ error: "Execution workspace not found" });
+        return;
+      }
+      workspace = updatedWorkspace;
+    }
+    const actor = getActorInfo(req);
+    await logActivity(db, {
+      companyId: existing.companyId,
+      actorType: actor.actorType,
+      actorId: actor.actorId,
+      agentId: actor.agentId,
+      runId: actor.runId,
+      action: "execution_workspace.updated",
+      entityType: "execution_workspace",
+      entityId: workspace.id,
+      details: {
+        changedKeys: Object.keys(req.body).sort(),
+        ...(cleanupWarnings.length > 0 ? { cleanupWarnings } : {}),
+      },
+    });
+    res.json(workspace);
+  });
+
+  return router;
+}
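The archive branch above refuses to archive an execution workspace while any linked issue is outside the terminal set (`done`, `cancelled`), returning 409 with a count. The guard reduces to a simple filter; restated standalone for illustration:

```typescript
const TERMINAL_ISSUE_STATUSES = new Set(["done", "cancelled"]);

// Restated from the route above: linked issues that still block archiving.
function activeLinkedIssues(
  linked: Array<{ id: string; status: string }>,
): Array<{ id: string; status: string }> {
  return linked.filter((issue) => !TERMINAL_ISSUE_STATUSES.has(issue.status));
}
```

Archiving proceeds only when this list is empty; otherwise the route responds 409 with the number of blocking issues.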
@@ -12,3 +12,4 @@ export { dashboardRoutes } from "./dashboard.js";
 export { sidebarBadgeRoutes } from "./sidebar-badges.js";
 export { llmRoutes } from "./llms.js";
 export { accessRoutes } from "./access.js";
+export { instanceSettingsRoutes } from "./instance-settings.js";

59  server/src/routes/instance-settings.ts  Normal file
@@ -0,0 +1,59 @@
+import { Router, type Request } from "express";
+import type { Db } from "@paperclipai/db";
+import { patchInstanceExperimentalSettingsSchema } from "@paperclipai/shared";
+import { forbidden } from "../errors.js";
+import { validate } from "../middleware/validate.js";
+import { instanceSettingsService, logActivity } from "../services/index.js";
+import { getActorInfo } from "./authz.js";
+
+function assertCanManageInstanceSettings(req: Request) {
+  if (req.actor.type !== "board") {
+    throw forbidden("Board access required");
+  }
+  if (req.actor.source === "local_implicit" || req.actor.isInstanceAdmin) {
+    return;
+  }
+  throw forbidden("Instance admin access required");
+}
+
+export function instanceSettingsRoutes(db: Db) {
+  const router = Router();
+  const svc = instanceSettingsService(db);
+
+  router.get("/instance/settings/experimental", async (req, res) => {
+    assertCanManageInstanceSettings(req);
+    res.json(await svc.getExperimental());
+  });
+
+  router.patch(
+    "/instance/settings/experimental",
+    validate(patchInstanceExperimentalSettingsSchema),
+    async (req, res) => {
+      assertCanManageInstanceSettings(req);
+      const updated = await svc.updateExperimental(req.body);
+      const actor = getActorInfo(req);
+      const companyIds = await svc.listCompanyIds();
+      await Promise.all(
+        companyIds.map((companyId) =>
+          logActivity(db, {
+            companyId,
+            actorType: actor.actorType,
+            actorId: actor.actorId,
+            agentId: actor.agentId,
+            runId: actor.runId,
+            action: "instance.settings.experimental_updated",
+            entityType: "instance_settings",
+            entityId: updated.id,
+            details: {
+              experimental: updated.experimental,
+              changedKeys: Object.keys(req.body).sort(),
+            },
+          }),
+        ),
+      );
+      res.json(updated.experimental);
+    },
+  );
+
+  return router;
+}
|
||||||
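The admin gate in `assertCanManageInstanceSettings` above can be read as a pure predicate: only board actors are considered, and of those, only the implicit local actor or an instance admin passes. A minimal standalone sketch (the `Actor` type and `canManageInstanceSettings` name here are illustrative, not part of the codebase):

```typescript
// Illustrative re-implementation of the gate logic as a pure predicate.
type Actor = { type: string; source?: string; isInstanceAdmin?: boolean };

function canManageInstanceSettings(actor: Actor): boolean {
  // Non-board actors (e.g. agents) are always rejected.
  if (actor.type !== "board") return false;
  // Board actors pass only as the implicit local user or an instance admin.
  return actor.source === "local_implicit" || actor.isInstanceAdmin === true;
}

console.log(canManageInstanceSettings({ type: "board", source: "local_implicit" })); // true
console.log(canManageInstanceSettings({ type: "board" })); // false
console.log(canManageInstanceSettings({ type: "agent" })); // false
```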
@@ -4,11 +4,13 @@ import type { Db } from "@paperclipai/db";
 import {
   addIssueCommentSchema,
   createIssueAttachmentMetadataSchema,
+  createIssueWorkProductSchema,
   createIssueLabelSchema,
   checkoutIssueSchema,
   createIssueSchema,
   linkIssueApprovalSchema,
   issueDocumentKeySchema,
+  updateIssueWorkProductSchema,
   upsertIssueDocumentSchema,
   updateIssueSchema,
 } from "@paperclipai/shared";
@@ -17,6 +19,7 @@ import { validate } from "../middleware/validate.js";
 import {
   accessService,
   agentService,
+  executionWorkspaceService,
   goalService,
   heartbeatService,
   issueApprovalService,
@@ -24,6 +27,7 @@ import {
   documentService,
   logActivity,
   projectService,
+  workProductService,
 } from "../services/index.js";
 import { logger } from "../middleware/logger.js";
 import { forbidden, HttpError, unauthorized } from "../errors.js";
@@ -42,6 +46,8 @@ export function issueRoutes(db: Db, storage: StorageService) {
   const projectsSvc = projectService(db);
   const goalsSvc = goalService(db);
   const issueApprovalsSvc = issueApprovalService(db);
+  const executionWorkspacesSvc = executionWorkspaceService(db);
+  const workProductsSvc = workProductService(db);
   const documentsSvc = documentService(db);
   const upload = multer({
     storage: multer.memoryStorage(),
@@ -311,6 +317,10 @@ export function issueRoutes(db: Db, storage: StorageService) {
     const mentionedProjects = mentionedProjectIds.length > 0
       ? await projectsSvc.listByIds(issue.companyId, mentionedProjectIds)
       : [];
+    const currentExecutionWorkspace = issue.executionWorkspaceId
+      ? await executionWorkspacesSvc.getById(issue.executionWorkspaceId)
+      : null;
+    const workProducts = await workProductsSvc.listForIssue(issue.id);
     res.json({
       ...issue,
       goalId: goal?.id ?? issue.goalId,
@@ -319,6 +329,8 @@ export function issueRoutes(db: Db, storage: StorageService) {
       project: project ?? null,
       goal: goal ?? null,
       mentionedProjects,
+      currentExecutionWorkspace,
+      workProducts,
     });
   });
 
@@ -395,6 +407,18 @@ export function issueRoutes(db: Db, storage: StorageService) {
     });
   });
 
+  router.get("/issues/:id/work-products", async (req, res) => {
+    const id = req.params.id as string;
+    const issue = await svc.getById(id);
+    if (!issue) {
+      res.status(404).json({ error: "Issue not found" });
+      return;
+    }
+    assertCompanyAccess(req, issue.companyId);
+    const workProducts = await workProductsSvc.listForIssue(issue.id);
+    res.json(workProducts);
+  });
+
   router.get("/issues/:id/documents", async (req, res) => {
     const id = req.params.id as string;
     const issue = await svc.getById(id);
@@ -535,6 +559,93 @@ export function issueRoutes(db: Db, storage: StorageService) {
     res.json({ ok: true });
   });
 
+  router.post("/issues/:id/work-products", validate(createIssueWorkProductSchema), async (req, res) => {
+    const id = req.params.id as string;
+    const issue = await svc.getById(id);
+    if (!issue) {
+      res.status(404).json({ error: "Issue not found" });
+      return;
+    }
+    assertCompanyAccess(req, issue.companyId);
+    const product = await workProductsSvc.createForIssue(issue.id, issue.companyId, {
+      ...req.body,
+      projectId: req.body.projectId ?? issue.projectId ?? null,
+    });
+    if (!product) {
+      res.status(422).json({ error: "Invalid work product payload" });
+      return;
+    }
+    const actor = getActorInfo(req);
+    await logActivity(db, {
+      companyId: issue.companyId,
+      actorType: actor.actorType,
+      actorId: actor.actorId,
+      agentId: actor.agentId,
+      runId: actor.runId,
+      action: "issue.work_product_created",
+      entityType: "issue",
+      entityId: issue.id,
+      details: { workProductId: product.id, type: product.type, provider: product.provider },
+    });
+    res.status(201).json(product);
+  });
+
+  router.patch("/work-products/:id", validate(updateIssueWorkProductSchema), async (req, res) => {
+    const id = req.params.id as string;
+    const existing = await workProductsSvc.getById(id);
+    if (!existing) {
+      res.status(404).json({ error: "Work product not found" });
+      return;
+    }
+    assertCompanyAccess(req, existing.companyId);
+    const product = await workProductsSvc.update(id, req.body);
+    if (!product) {
+      res.status(404).json({ error: "Work product not found" });
+      return;
+    }
+    const actor = getActorInfo(req);
+    await logActivity(db, {
+      companyId: existing.companyId,
+      actorType: actor.actorType,
+      actorId: actor.actorId,
+      agentId: actor.agentId,
+      runId: actor.runId,
+      action: "issue.work_product_updated",
+      entityType: "issue",
+      entityId: existing.issueId,
+      details: { workProductId: product.id, changedKeys: Object.keys(req.body).sort() },
+    });
+    res.json(product);
+  });
+
+  router.delete("/work-products/:id", async (req, res) => {
+    const id = req.params.id as string;
+    const existing = await workProductsSvc.getById(id);
+    if (!existing) {
+      res.status(404).json({ error: "Work product not found" });
+      return;
+    }
+    assertCompanyAccess(req, existing.companyId);
+    const removed = await workProductsSvc.remove(id);
+    if (!removed) {
+      res.status(404).json({ error: "Work product not found" });
+      return;
+    }
+    const actor = getActorInfo(req);
+    await logActivity(db, {
+      companyId: existing.companyId,
+      actorType: actor.actorType,
+      actorId: actor.actorId,
+      agentId: actor.agentId,
+      runId: actor.runId,
+      action: "issue.work_product_deleted",
+      entityType: "issue",
+      entityId: existing.issueId,
+      details: { workProductId: removed.id, type: removed.type },
+    });
+    res.json(removed);
+  });
+
   router.post("/issues/:id/read", async (req, res) => {
     const id = req.params.id as string;
     const issue = await svc.getById(id);
@@ -2,11 +2,12 @@ import type {
   ExecutionWorkspaceMode,
   ExecutionWorkspaceStrategy,
   IssueExecutionWorkspaceSettings,
+  ProjectExecutionWorkspaceDefaultMode,
   ProjectExecutionWorkspacePolicy,
 } from "@paperclipai/shared";
 import { asString, parseObject } from "../adapters/utils.js";
 
-type ParsedExecutionWorkspaceMode = Exclude<ExecutionWorkspaceMode, "inherit">;
+type ParsedExecutionWorkspaceMode = Exclude<ExecutionWorkspaceMode, "inherit" | "reuse_existing">;
 
 function cloneRecord(value: Record<string, unknown> | null | undefined): Record<string, unknown> | null {
   if (!value) return null;
@@ -16,7 +17,7 @@ function cloneRecord(value: Record<string, unknown> | null | undefined): Record<
 function parseExecutionWorkspaceStrategy(raw: unknown): ExecutionWorkspaceStrategy | null {
   const parsed = parseObject(raw);
   const type = asString(parsed.type, "");
-  if (type !== "project_primary" && type !== "git_worktree") {
+  if (type !== "project_primary" && type !== "git_worktree" && type !== "adapter_managed" && type !== "cloud_sandbox") {
     return null;
   }
   return {
@@ -33,16 +34,31 @@ export function parseProjectExecutionWorkspacePolicy(raw: unknown): ProjectExecu
   const parsed = parseObject(raw);
   if (Object.keys(parsed).length === 0) return null;
   const enabled = typeof parsed.enabled === "boolean" ? parsed.enabled : false;
+  const workspaceStrategy = parseExecutionWorkspaceStrategy(parsed.workspaceStrategy);
   const defaultMode = asString(parsed.defaultMode, "");
+  const defaultProjectWorkspaceId =
+    typeof parsed.defaultProjectWorkspaceId === "string" ? parsed.defaultProjectWorkspaceId : undefined;
   const allowIssueOverride =
     typeof parsed.allowIssueOverride === "boolean" ? parsed.allowIssueOverride : undefined;
+  const normalizedDefaultMode = (() => {
+    if (
+      defaultMode === "shared_workspace" ||
+      defaultMode === "isolated_workspace" ||
+      defaultMode === "operator_branch" ||
+      defaultMode === "adapter_default"
+    ) {
+      return defaultMode as ProjectExecutionWorkspaceDefaultMode;
+    }
+    if (defaultMode === "project_primary") return "shared_workspace";
+    if (defaultMode === "isolated") return "isolated_workspace";
+    return undefined;
+  })();
   return {
     enabled,
-    ...(defaultMode === "project_primary" || defaultMode === "isolated" ? { defaultMode } : {}),
+    ...(normalizedDefaultMode ? { defaultMode: normalizedDefaultMode } : {}),
     ...(allowIssueOverride !== undefined ? { allowIssueOverride } : {}),
-    ...(parseExecutionWorkspaceStrategy(parsed.workspaceStrategy)
-      ? { workspaceStrategy: parseExecutionWorkspaceStrategy(parsed.workspaceStrategy) }
-      : {}),
+    ...(defaultProjectWorkspaceId ? { defaultProjectWorkspaceId } : {}),
+    ...(workspaceStrategy ? { workspaceStrategy } : {}),
     ...(parsed.workspaceRuntime && typeof parsed.workspaceRuntime === "object" && !Array.isArray(parsed.workspaceRuntime)
       ? { workspaceRuntime: { ...(parsed.workspaceRuntime as Record<string, unknown>) } }
      : {}),
@@ -52,23 +68,48 @@ export function parseProjectExecutionWorkspacePolicy(raw: unknown): ProjectExecu
   ...(parsed.pullRequestPolicy && typeof parsed.pullRequestPolicy === "object" && !Array.isArray(parsed.pullRequestPolicy)
     ? { pullRequestPolicy: { ...(parsed.pullRequestPolicy as Record<string, unknown>) } }
     : {}),
+  ...(parsed.runtimePolicy && typeof parsed.runtimePolicy === "object" && !Array.isArray(parsed.runtimePolicy)
+    ? { runtimePolicy: { ...(parsed.runtimePolicy as Record<string, unknown>) } }
+    : {}),
   ...(parsed.cleanupPolicy && typeof parsed.cleanupPolicy === "object" && !Array.isArray(parsed.cleanupPolicy)
     ? { cleanupPolicy: { ...(parsed.cleanupPolicy as Record<string, unknown>) } }
     : {}),
   };
 }
 
+export function gateProjectExecutionWorkspacePolicy(
+  projectPolicy: ProjectExecutionWorkspacePolicy | null,
+  isolatedWorkspacesEnabled: boolean,
+): ProjectExecutionWorkspacePolicy | null {
+  if (!isolatedWorkspacesEnabled) return null;
+  return projectPolicy;
+}
+
 export function parseIssueExecutionWorkspaceSettings(raw: unknown): IssueExecutionWorkspaceSettings | null {
   const parsed = parseObject(raw);
   if (Object.keys(parsed).length === 0) return null;
+  const workspaceStrategy = parseExecutionWorkspaceStrategy(parsed.workspaceStrategy);
   const mode = asString(parsed.mode, "");
+  const normalizedMode = (() => {
+    if (
+      mode === "inherit" ||
+      mode === "shared_workspace" ||
+      mode === "isolated_workspace" ||
+      mode === "operator_branch" ||
+      mode === "reuse_existing" ||
+      mode === "agent_default"
+    ) {
+      return mode;
+    }
+    if (mode === "project_primary") return "shared_workspace";
+    if (mode === "isolated") return "isolated_workspace";
+    return "";
+  })();
   return {
-    ...(mode === "inherit" || mode === "project_primary" || mode === "isolated" || mode === "agent_default"
-      ? { mode }
-      : {}),
-    ...(parseExecutionWorkspaceStrategy(parsed.workspaceStrategy)
-      ? { workspaceStrategy: parseExecutionWorkspaceStrategy(parsed.workspaceStrategy) }
+    ...(normalizedMode
+      ? { mode: normalizedMode as IssueExecutionWorkspaceSettings["mode"] }
       : {}),
+    ...(workspaceStrategy ? { workspaceStrategy } : {}),
     ...(parsed.workspaceRuntime && typeof parsed.workspaceRuntime === "object" && !Array.isArray(parsed.workspaceRuntime)
       ? { workspaceRuntime: { ...(parsed.workspaceRuntime as Record<string, unknown>) } }
       : {}),
@@ -80,7 +121,14 @@ export function defaultIssueExecutionWorkspaceSettingsForProject(
 ): IssueExecutionWorkspaceSettings | null {
   if (!projectPolicy?.enabled) return null;
   return {
-    mode: projectPolicy.defaultMode === "isolated" ? "isolated" : "project_primary",
+    mode:
+      projectPolicy.defaultMode === "isolated_workspace"
+        ? "isolated_workspace"
+        : projectPolicy.defaultMode === "operator_branch"
+          ? "operator_branch"
+          : projectPolicy.defaultMode === "adapter_default"
+            ? "agent_default"
+            : "shared_workspace",
   };
 }
 
@@ -90,16 +138,19 @@ export function resolveExecutionWorkspaceMode(input: {
   legacyUseProjectWorkspace: boolean | null;
 }): ParsedExecutionWorkspaceMode {
   const issueMode = input.issueSettings?.mode;
-  if (issueMode && issueMode !== "inherit") {
+  if (issueMode && issueMode !== "inherit" && issueMode !== "reuse_existing") {
     return issueMode;
   }
   if (input.projectPolicy?.enabled) {
-    return input.projectPolicy.defaultMode === "isolated" ? "isolated" : "project_primary";
+    if (input.projectPolicy.defaultMode === "isolated_workspace") return "isolated_workspace";
+    if (input.projectPolicy.defaultMode === "operator_branch") return "operator_branch";
+    if (input.projectPolicy.defaultMode === "adapter_default") return "agent_default";
+    return "shared_workspace";
   }
   if (input.legacyUseProjectWorkspace === false) {
     return "agent_default";
   }
-  return "project_primary";
+  return "shared_workspace";
 }
 
 export function buildExecutionWorkspaceAdapterConfig(input: {
@@ -119,7 +170,7 @@ export function buildExecutionWorkspaceAdapterConfig(input: {
   const hasWorkspaceControl = projectHasPolicy || issueHasWorkspaceOverrides || input.legacyUseProjectWorkspace === false;
 
   if (hasWorkspaceControl) {
-    if (input.mode === "isolated") {
+    if (input.mode === "isolated_workspace") {
       const strategy =
         input.issueSettings?.workspaceStrategy ??
         input.projectPolicy?.workspaceStrategy ??
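The policy parser above repeatedly maps legacy mode spellings (`project_primary`, `isolated`) onto the new vocabulary (`shared_workspace`, `isolated_workspace`) while passing new values through unchanged. A minimal standalone sketch of that normalization (the `normalizeDefaultMode` name and `NormalizedMode` type here are illustrative, not part of the codebase):

```typescript
// Illustrative re-implementation of the defaultMode normalization in
// parseProjectExecutionWorkspacePolicy.
type NormalizedMode =
  | "shared_workspace"
  | "isolated_workspace"
  | "operator_branch"
  | "adapter_default";

function normalizeDefaultMode(raw: string): NormalizedMode | undefined {
  // New vocabulary passes through unchanged.
  if (
    raw === "shared_workspace" ||
    raw === "isolated_workspace" ||
    raw === "operator_branch" ||
    raw === "adapter_default"
  ) {
    return raw;
  }
  // Legacy spellings are mapped onto the new vocabulary.
  if (raw === "project_primary") return "shared_workspace";
  if (raw === "isolated") return "isolated_workspace";
  // Anything else is dropped from the parsed policy.
  return undefined;
}

console.log(normalizeDefaultMode("project_primary")); // "shared_workspace"
console.log(normalizeDefaultMode("isolated")); // "isolated_workspace"
console.log(normalizeDefaultMode("bogus")); // undefined
```

Keeping the mapping in one place means stored policies written before the rename keep resolving without a data migration.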
server/src/services/execution-workspaces.ts (new file, 99 lines)
@@ -0,0 +1,99 @@
+import { and, desc, eq, inArray } from "drizzle-orm";
+import type { Db } from "@paperclipai/db";
+import { executionWorkspaces } from "@paperclipai/db";
+import type { ExecutionWorkspace } from "@paperclipai/shared";
+
+type ExecutionWorkspaceRow = typeof executionWorkspaces.$inferSelect;
+
+function toExecutionWorkspace(row: ExecutionWorkspaceRow): ExecutionWorkspace {
+  return {
+    id: row.id,
+    companyId: row.companyId,
+    projectId: row.projectId,
+    projectWorkspaceId: row.projectWorkspaceId ?? null,
+    sourceIssueId: row.sourceIssueId ?? null,
+    mode: row.mode as ExecutionWorkspace["mode"],
+    strategyType: row.strategyType as ExecutionWorkspace["strategyType"],
+    name: row.name,
+    status: row.status as ExecutionWorkspace["status"],
+    cwd: row.cwd ?? null,
+    repoUrl: row.repoUrl ?? null,
+    baseRef: row.baseRef ?? null,
+    branchName: row.branchName ?? null,
+    providerType: row.providerType as ExecutionWorkspace["providerType"],
+    providerRef: row.providerRef ?? null,
+    derivedFromExecutionWorkspaceId: row.derivedFromExecutionWorkspaceId ?? null,
+    lastUsedAt: row.lastUsedAt,
+    openedAt: row.openedAt,
+    closedAt: row.closedAt ?? null,
+    cleanupEligibleAt: row.cleanupEligibleAt ?? null,
+    cleanupReason: row.cleanupReason ?? null,
+    metadata: (row.metadata as Record<string, unknown> | null) ?? null,
+    createdAt: row.createdAt,
+    updatedAt: row.updatedAt,
+  };
+}
+
+export function executionWorkspaceService(db: Db) {
+  return {
+    list: async (companyId: string, filters?: {
+      projectId?: string;
+      projectWorkspaceId?: string;
+      issueId?: string;
+      status?: string;
+      reuseEligible?: boolean;
+    }) => {
+      const conditions = [eq(executionWorkspaces.companyId, companyId)];
+      if (filters?.projectId) conditions.push(eq(executionWorkspaces.projectId, filters.projectId));
+      if (filters?.projectWorkspaceId) {
+        conditions.push(eq(executionWorkspaces.projectWorkspaceId, filters.projectWorkspaceId));
+      }
+      if (filters?.issueId) conditions.push(eq(executionWorkspaces.sourceIssueId, filters.issueId));
+      if (filters?.status) {
+        const statuses = filters.status.split(",").map((value) => value.trim()).filter(Boolean);
+        if (statuses.length === 1) conditions.push(eq(executionWorkspaces.status, statuses[0]!));
+        else if (statuses.length > 1) conditions.push(inArray(executionWorkspaces.status, statuses));
+      }
+      if (filters?.reuseEligible) {
+        conditions.push(inArray(executionWorkspaces.status, ["active", "idle", "in_review"]));
+      }
+
+      const rows = await db
+        .select()
+        .from(executionWorkspaces)
+        .where(and(...conditions))
+        .orderBy(desc(executionWorkspaces.lastUsedAt), desc(executionWorkspaces.createdAt));
+      return rows.map(toExecutionWorkspace);
+    },
+
+    getById: async (id: string) => {
+      const row = await db
+        .select()
+        .from(executionWorkspaces)
+        .where(eq(executionWorkspaces.id, id))
+        .then((rows) => rows[0] ?? null);
+      return row ? toExecutionWorkspace(row) : null;
+    },
+
+    create: async (data: typeof executionWorkspaces.$inferInsert) => {
+      const row = await db
+        .insert(executionWorkspaces)
+        .values(data)
+        .returning()
+        .then((rows) => rows[0] ?? null);
+      return row ? toExecutionWorkspace(row) : null;
+    },
+
+    update: async (id: string, patch: Partial<typeof executionWorkspaces.$inferInsert>) => {
+      const row = await db
+        .update(executionWorkspaces)
+        .set({ ...patch, updatedAt: new Date() })
+        .where(eq(executionWorkspaces.id, id))
+        .returning()
+        .then((rows) => rows[0] ?? null);
+      return row ? toExecutionWorkspace(row) : null;
+    },
+  };
+}
+
+export { toExecutionWorkspace };
@@ -1,5 +1,7 @@
 import fs from "node:fs/promises";
 import path from "node:path";
+import { execFile as execFileCallback } from "node:child_process";
+import { promisify } from "node:util";
 import { and, asc, desc, eq, gt, inArray, sql } from "drizzle-orm";
 import type { Db } from "@paperclipai/db";
 import type { BillingType } from "@paperclipai/shared";
@@ -25,22 +27,28 @@ import { parseObject, asBoolean, asNumber, appendWithCap, MAX_EXCERPT_BYTES } fr
 import { costService } from "./costs.js";
 import { budgetService, type BudgetEnforcementScope } from "./budgets.js";
 import { secretService } from "./secrets.js";
-import { resolveDefaultAgentWorkspaceDir } from "../home-paths.js";
+import { resolveDefaultAgentWorkspaceDir, resolveManagedProjectWorkspaceDir } from "../home-paths.js";
 import { summarizeHeartbeatRunResultJson } from "./heartbeat-run-summary.js";
 import {
   buildWorkspaceReadyComment,
+  cleanupExecutionWorkspaceArtifacts,
   ensureRuntimeServicesForRun,
   persistAdapterManagedRuntimeServices,
   realizeExecutionWorkspace,
   releaseRuntimeServicesForRun,
+  sanitizeRuntimeServiceBaseEnv,
 } from "./workspace-runtime.js";
 import { issueService } from "./issues.js";
+import { executionWorkspaceService } from "./execution-workspaces.js";
+import { workspaceOperationService } from "./workspace-operations.js";
 import {
   buildExecutionWorkspaceAdapterConfig,
+  gateProjectExecutionWorkspacePolicy,
   parseIssueExecutionWorkspaceSettings,
   parseProjectExecutionWorkspacePolicy,
   resolveExecutionWorkspaceMode,
 } from "./execution-workspace-policy.js";
+import { instanceSettingsService } from "./instance-settings.js";
 import { redactCurrentUserText, redactCurrentUserValue } from "../log-redaction.js";
 import {
   hasSessionCompactionThresholds,
@@ -54,6 +62,80 @@ const HEARTBEAT_MAX_CONCURRENT_RUNS_MAX = 10;
|
|||||||
const DEFERRED_WAKE_CONTEXT_KEY = "_paperclipWakeContext";
|
const DEFERRED_WAKE_CONTEXT_KEY = "_paperclipWakeContext";
|
||||||
const startLocksByAgent = new Map<string, Promise<void>>();
|
const startLocksByAgent = new Map<string, Promise<void>>();
|
||||||
const REPO_ONLY_CWD_SENTINEL = "/__paperclip_repo_only__";
|
const REPO_ONLY_CWD_SENTINEL = "/__paperclip_repo_only__";
|
||||||
|
const MANAGED_WORKSPACE_GIT_CLONE_TIMEOUT_MS = 10 * 60 * 1000;
|
||||||
|
const execFile = promisify(execFileCallback);
|
||||||
|
const SESSIONED_LOCAL_ADAPTERS = new Set([
|
||||||
|
"claude_local",
|
||||||
|
"codex_local",
|
||||||
|
"cursor",
|
||||||
|
"gemini_local",
|
||||||
|
"opencode_local",
|
||||||
|
"pi_local",
|
||||||
|
]);
|
||||||
|
|
||||||
|
function deriveRepoNameFromRepoUrl(repoUrl: string | null): string | null {
|
+  const trimmed = repoUrl?.trim() ?? "";
+  if (!trimmed) return null;
+  try {
+    const parsed = new URL(trimmed);
+    const cleanedPath = parsed.pathname.replace(/\/+$/, "");
+    const repoName = cleanedPath.split("/").filter(Boolean).pop()?.replace(/\.git$/i, "") ?? "";
+    return repoName || null;
+  } catch {
+    return null;
+  }
+}
+
+async function ensureManagedProjectWorkspace(input: {
+  companyId: string;
+  projectId: string;
+  repoUrl: string | null;
+}): Promise<{ cwd: string; warning: string | null }> {
+  const cwd = resolveManagedProjectWorkspaceDir({
+    companyId: input.companyId,
+    projectId: input.projectId,
+    repoName: deriveRepoNameFromRepoUrl(input.repoUrl),
+  });
+  await fs.mkdir(path.dirname(cwd), { recursive: true });
+  const stats = await fs.stat(cwd).catch(() => null);
+
+  if (!input.repoUrl) {
+    if (!stats) {
+      await fs.mkdir(cwd, { recursive: true });
+    }
+    return { cwd, warning: null };
+  }
+
+  const gitDirExists = await fs
+    .stat(path.resolve(cwd, ".git"))
+    .then((entry) => entry.isDirectory())
+    .catch(() => false);
+  if (gitDirExists) {
+    return { cwd, warning: null };
+  }
+
+  if (stats) {
+    const entries = await fs.readdir(cwd).catch(() => []);
+    if (entries.length > 0) {
+      return {
+        cwd,
+        warning: `Managed workspace path "${cwd}" already exists but is not a git checkout. Using it as-is.`,
+      };
+    }
+    await fs.rm(cwd, { recursive: true, force: true });
+  }
+
+  try {
+    await execFile("git", ["clone", input.repoUrl, cwd], {
+      env: sanitizeRuntimeServiceBaseEnv(process.env),
+      timeout: MANAGED_WORKSPACE_GIT_CLONE_TIMEOUT_MS,
+    });
+    return { cwd, warning: null };
+  } catch (error) {
+    const reason = error instanceof Error ? error.message : String(error);
+    throw new Error(`Failed to prepare managed checkout for "${input.repoUrl}" at "${cwd}": ${reason}`);
+  }
+}
+
 const heartbeatRunListColumns = {
   id: heartbeatRuns.id,
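Reproduced standalone so it can be run directly, the repo-name derivation above behaves like this. This is an illustrative sketch, not the shipped helper, which lives alongside the service code with its surrounding imports:

```typescript
// Standalone reproduction of the repo-name derivation shown in the diff (illustration only).
function deriveRepoNameFromRepoUrl(repoUrl: string | null): string | null {
  const trimmed = repoUrl?.trim() ?? "";
  if (!trimmed) return null;
  try {
    const parsed = new URL(trimmed);
    // Drop trailing slashes, take the last path segment, strip a ".git" suffix.
    const cleanedPath = parsed.pathname.replace(/\/+$/, "");
    const repoName = cleanedPath.split("/").filter(Boolean).pop()?.replace(/\.git$/i, "") ?? "";
    return repoName || null;
  } catch {
    // Unparseable URLs (including ssh-style remotes that the URL constructor rejects) yield null.
    return null;
  }
}

console.log(deriveRepoNameFromRepoUrl("https://github.com/paperclipai/paperclip.git")); // paperclip
console.log(deriveRepoNameFromRepoUrl("not a url")); // null
```

Note that a `null` repo name is a supported outcome: `ensureManagedProjectWorkspace` still resolves a managed directory for repo-less projects.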
@@ -158,6 +240,20 @@ export type ResolvedWorkspaceForRun = {
   warnings: string[];
 };
+
+type ProjectWorkspaceCandidate = {
+  id: string;
+};
+
+export function prioritizeProjectWorkspaceCandidatesForRun<T extends ProjectWorkspaceCandidate>(
+  rows: T[],
+  preferredWorkspaceId: string | null | undefined,
+): T[] {
+  if (!preferredWorkspaceId) return rows;
+  const preferredIndex = rows.findIndex((row) => row.id === preferredWorkspaceId);
+  if (preferredIndex <= 0) return rows;
+  return [rows[preferredIndex]!, ...rows.slice(0, preferredIndex), ...rows.slice(preferredIndex + 1)];
+}

 function readNonEmptyString(value: unknown): string | null {
   return typeof value === "string" && value.trim().length > 0 ? value : null;
 }
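The prioritization helper is pure and easy to exercise on its own; a minimal sketch of its reordering behavior (candidates reduced to bare `id` objects):

```typescript
type ProjectWorkspaceCandidate = { id: string };

// Same body as in the diff: move the preferred candidate to the front,
// keeping the relative order of everything else.
function prioritizeProjectWorkspaceCandidatesForRun<T extends ProjectWorkspaceCandidate>(
  rows: T[],
  preferredWorkspaceId: string | null | undefined,
): T[] {
  if (!preferredWorkspaceId) return rows;
  const preferredIndex = rows.findIndex((row) => row.id === preferredWorkspaceId);
  // preferredIndex <= 0 covers both "not found" (-1) and "already first" (0).
  if (preferredIndex <= 0) return rows;
  return [rows[preferredIndex]!, ...rows.slice(0, preferredIndex), ...rows.slice(preferredIndex + 1)];
}

const ordered = prioritizeProjectWorkspaceCandidatesForRun(
  [{ id: "a" }, { id: "b" }, { id: "c" }],
  "b",
);
console.log(ordered.map((r) => r.id).join(",")); // b,a,c
```

The input array is never mutated, so the original `createdAt`-ordered query result stays reusable.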
@@ -591,9 +687,13 @@ function resolveNextSessionState(input: {
 }

 export function heartbeatService(db: Db) {
+  const instanceSettings = instanceSettingsService(db);
+
   const runLogStore = getRunLogStore();
   const secretsSvc = secretService(db);
   const issuesSvc = issueService(db);
+  const executionWorkspacesSvc = executionWorkspaceService(db);
+  const workspaceOperationsSvc = workspaceOperationService(db);
   const activeRunExecutions = new Set<string>();
   const budgetHooks = {
     cancelWorkForScope: cancelBudgetScopeWork,
@@ -848,18 +948,25 @@
   ): Promise<ResolvedWorkspaceForRun> {
     const issueId = readNonEmptyString(context.issueId);
     const contextProjectId = readNonEmptyString(context.projectId);
-    const issueProjectId = issueId
+    const contextProjectWorkspaceId = readNonEmptyString(context.projectWorkspaceId);
+    const issueProjectRef = issueId
       ? await db
-          .select({ projectId: issues.projectId })
+          .select({
+            projectId: issues.projectId,
+            projectWorkspaceId: issues.projectWorkspaceId,
+          })
           .from(issues)
           .where(and(eq(issues.id, issueId), eq(issues.companyId, agent.companyId)))
-          .then((rows) => rows[0]?.projectId ?? null)
+          .then((rows) => rows[0] ?? null)
       : null;
+    const issueProjectId = issueProjectRef?.projectId ?? null;
+    const preferredProjectWorkspaceId =
+      issueProjectRef?.projectWorkspaceId ?? contextProjectWorkspaceId ?? null;
     const resolvedProjectId = issueProjectId ?? contextProjectId;
     const useProjectWorkspace = opts?.useProjectWorkspace !== false;
     const workspaceProjectId = useProjectWorkspace ? resolvedProjectId : null;

-    const projectWorkspaceRows = workspaceProjectId
+    const unorderedProjectWorkspaceRows = workspaceProjectId
       ? await db
           .select()
           .from(projectWorkspaces)
@@ -871,6 +978,10 @@
           )
           .orderBy(asc(projectWorkspaces.createdAt), asc(projectWorkspaces.id))
       : [];
+    const projectWorkspaceRows = prioritizeProjectWorkspaceCandidatesForRun(
+      unorderedProjectWorkspaceRows,
+      preferredProjectWorkspaceId,
+    );

     const workspaceHints = projectWorkspaceRows.map((workspace) => ({
       workspaceId: workspace.id,
@@ -880,12 +991,34 @@
     }));

     if (projectWorkspaceRows.length > 0) {
+      const preferredWorkspace = preferredProjectWorkspaceId
+        ? projectWorkspaceRows.find((workspace) => workspace.id === preferredProjectWorkspaceId) ?? null
+        : null;
       const missingProjectCwds: string[] = [];
       let hasConfiguredProjectCwd = false;
+      let preferredWorkspaceWarning: string | null = null;
+      if (preferredProjectWorkspaceId && !preferredWorkspace) {
+        preferredWorkspaceWarning =
+          `Selected project workspace "${preferredProjectWorkspaceId}" is not available on this project.`;
+      }
       for (const workspace of projectWorkspaceRows) {
-        const projectCwd = readNonEmptyString(workspace.cwd);
+        let projectCwd = readNonEmptyString(workspace.cwd);
+        let managedWorkspaceWarning: string | null = null;
         if (!projectCwd || projectCwd === REPO_ONLY_CWD_SENTINEL) {
-          continue;
+          try {
+            const managedWorkspace = await ensureManagedProjectWorkspace({
+              companyId: agent.companyId,
+              projectId: workspaceProjectId ?? resolvedProjectId ?? workspace.projectId,
+              repoUrl: readNonEmptyString(workspace.repoUrl),
+            });
+            projectCwd = managedWorkspace.cwd;
+            managedWorkspaceWarning = managedWorkspace.warning;
+          } catch (error) {
+            if (preferredWorkspace?.id === workspace.id) {
+              preferredWorkspaceWarning = error instanceof Error ? error.message : String(error);
+            }
+            continue;
+          }
         }
         hasConfiguredProjectCwd = true;
         const projectCwdExists = await fs
@@ -901,15 +1034,24 @@
             repoUrl: workspace.repoUrl,
             repoRef: workspace.repoRef,
             workspaceHints,
-            warnings: [],
+            warnings: [preferredWorkspaceWarning, managedWorkspaceWarning].filter(
+              (value): value is string => Boolean(value),
+            ),
           };
         }
+        if (preferredWorkspace?.id === workspace.id) {
+          preferredWorkspaceWarning =
+            `Selected project workspace path "${projectCwd}" is not available yet.`;
+        }
         missingProjectCwds.push(projectCwd);
       }

     const fallbackCwd = resolveDefaultAgentWorkspaceDir(agent.id);
     await fs.mkdir(fallbackCwd, { recursive: true });
     const warnings: string[] = [];
+    if (preferredWorkspaceWarning) {
+      warnings.push(preferredWorkspaceWarning);
+    }
     if (missingProjectCwds.length > 0) {
       const firstMissing = missingProjectCwds[0];
       const extraMissingCount = Math.max(0, missingProjectCwds.length - 1);
@@ -935,6 +1077,24 @@
       };
     }

+    if (workspaceProjectId) {
+      const managedWorkspace = await ensureManagedProjectWorkspace({
+        companyId: agent.companyId,
+        projectId: workspaceProjectId,
+        repoUrl: null,
+      });
+      return {
+        cwd: managedWorkspace.cwd,
+        source: "project_primary" as const,
+        projectId: resolvedProjectId,
+        workspaceId: null,
+        repoUrl: null,
+        repoRef: null,
+        workspaceHints,
+        warnings: managedWorkspace.warning ? [managedWorkspace.warning] : [],
+      };
+    }
+
     const sessionCwd = readNonEmptyString(previousSessionParams?.cwd);
     if (sessionCwd) {
       const sessionCwdExists = await fs
@@ -1473,10 +1633,16 @@
     const taskKey = deriveTaskKey(context, null);
     const sessionCodec = getAdapterSessionCodec(agent.adapterType);
     const issueId = readNonEmptyString(context.issueId);
-    const issueAssigneeConfig = issueId
+    const issueContext = issueId
       ? await db
           .select({
+            id: issues.id,
+            identifier: issues.identifier,
+            title: issues.title,
             projectId: issues.projectId,
+            projectWorkspaceId: issues.projectWorkspaceId,
+            executionWorkspaceId: issues.executionWorkspaceId,
+            executionWorkspacePreference: issues.executionWorkspacePreference,
             assigneeAgentId: issues.assigneeAgentId,
             assigneeAdapterOverrides: issues.assigneeAdapterOverrides,
             executionWorkspaceSettings: issues.executionWorkspaceSettings,
@@ -1486,22 +1652,27 @@
           .then((rows) => rows[0] ?? null)
       : null;
     const issueAssigneeOverrides =
-      issueAssigneeConfig && issueAssigneeConfig.assigneeAgentId === agent.id
+      issueContext && issueContext.assigneeAgentId === agent.id
         ? parseIssueAssigneeAdapterOverrides(
-            issueAssigneeConfig.assigneeAdapterOverrides,
+            issueContext.assigneeAdapterOverrides,
           )
         : null;
-    const issueExecutionWorkspaceSettings = parseIssueExecutionWorkspaceSettings(
-      issueAssigneeConfig?.executionWorkspaceSettings,
-    );
+    const isolatedWorkspacesEnabled = (await instanceSettings.getExperimental()).enableIsolatedWorkspaces;
+    const issueExecutionWorkspaceSettings = isolatedWorkspacesEnabled
+      ? parseIssueExecutionWorkspaceSettings(issueContext?.executionWorkspaceSettings)
+      : null;
     const contextProjectId = readNonEmptyString(context.projectId);
-    const executionProjectId = issueAssigneeConfig?.projectId ?? contextProjectId;
+    const executionProjectId = issueContext?.projectId ?? contextProjectId;
     const projectExecutionWorkspacePolicy = executionProjectId
       ? await db
           .select({ executionWorkspacePolicy: projects.executionWorkspacePolicy })
           .from(projects)
           .where(and(eq(projects.id, executionProjectId), eq(projects.companyId, agent.companyId)))
-          .then((rows) => parseProjectExecutionWorkspacePolicy(rows[0]?.executionWorkspacePolicy))
+          .then((rows) =>
+            gateProjectExecutionWorkspacePolicy(
+              parseProjectExecutionWorkspacePolicy(rows[0]?.executionWorkspacePolicy),
+              isolatedWorkspacesEnabled,
+            ))
       : null;
     const taskSession = taskKey
       ? await getTaskSession(agent.companyId, agent.id, agent.adapterType, taskKey)
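The diff calls `gateProjectExecutionWorkspacePolicy(...)` in two places but does not include its body (it lives in `execution-workspace-policy.ts`). Based on how `isolatedWorkspacesEnabled` is threaded into it, a plausible sketch of its contract — assumed, not confirmed by this diff — is that it suppresses the project policy entirely while the experimental flag is off:

```typescript
// Hypothetical sketch only: the real gateProjectExecutionWorkspacePolicy is not shown in this diff.
type ProjectExecutionWorkspacePolicy = {
  workspaceStrategy?: { teardownCommand?: string | null };
};

function gateProjectExecutionWorkspacePolicy(
  policy: ProjectExecutionWorkspacePolicy | null,
  isolatedWorkspacesEnabled: boolean,
): ProjectExecutionWorkspacePolicy | null {
  // Assumed behavior: with the Workspaces flag off, callers see no policy
  // and fall back to the compatibility-preserving defaults.
  return isolatedWorkspacesEnabled ? policy : null;
}

console.log(gateProjectExecutionWorkspacePolicy({ workspaceStrategy: {} }, false)); // null
```

Whatever its exact body, the visible effect in the diff is consistent with this: a disabled flag forces the legacy (null-policy) path in both the heartbeat and issue services.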
@@ -1538,17 +1709,24 @@
       agent.companyId,
       mergedConfig,
     );
-    const issueRef = issueId
-      ? await db
-          .select({
-            id: issues.id,
-            identifier: issues.identifier,
-            title: issues.title,
-          })
-          .from(issues)
-          .where(and(eq(issues.id, issueId), eq(issues.companyId, agent.companyId)))
-          .then((rows) => rows[0] ?? null)
+    const issueRef = issueContext
+      ? {
+          id: issueContext.id,
+          identifier: issueContext.identifier,
+          title: issueContext.title,
+          projectId: issueContext.projectId,
+          projectWorkspaceId: issueContext.projectWorkspaceId,
+          executionWorkspaceId: issueContext.executionWorkspaceId,
+          executionWorkspacePreference: issueContext.executionWorkspacePreference,
+        }
       : null;
+    const existingExecutionWorkspace =
+      issueRef?.executionWorkspaceId ? await executionWorkspacesSvc.getById(issueRef.executionWorkspaceId) : null;
+    const workspaceOperationRecorder = workspaceOperationsSvc.createRecorder({
+      companyId: agent.companyId,
+      heartbeatRunId: run.id,
+      executionWorkspaceId: existingExecutionWorkspace?.id ?? null,
+    });
     const executionWorkspace = await realizeExecutionWorkspace({
       base: {
         baseCwd: resolvedWorkspace.cwd,
@@ -1565,7 +1743,132 @@
         name: agent.name,
         companyId: agent.companyId,
       },
+      recorder: workspaceOperationRecorder,
     });
+    const resolvedProjectId = executionWorkspace.projectId ?? issueRef?.projectId ?? executionProjectId ?? null;
+    const resolvedProjectWorkspaceId = issueRef?.projectWorkspaceId ?? resolvedWorkspace.workspaceId ?? null;
+    const shouldReuseExisting =
+      issueRef?.executionWorkspacePreference === "reuse_existing" &&
+      existingExecutionWorkspace &&
+      existingExecutionWorkspace.status !== "archived";
+    let persistedExecutionWorkspace = null;
+    try {
+      persistedExecutionWorkspace = shouldReuseExisting && existingExecutionWorkspace
+        ? await executionWorkspacesSvc.update(existingExecutionWorkspace.id, {
+            cwd: executionWorkspace.cwd,
+            repoUrl: executionWorkspace.repoUrl,
+            baseRef: executionWorkspace.repoRef,
+            branchName: executionWorkspace.branchName,
+            providerType: executionWorkspace.strategy === "git_worktree" ? "git_worktree" : "local_fs",
+            providerRef: executionWorkspace.worktreePath,
+            status: "active",
+            lastUsedAt: new Date(),
+            metadata: {
+              ...(existingExecutionWorkspace.metadata ?? {}),
+              source: executionWorkspace.source,
+              createdByRuntime: executionWorkspace.created,
+            },
+          })
+        : resolvedProjectId
+          ? await executionWorkspacesSvc.create({
+              companyId: agent.companyId,
+              projectId: resolvedProjectId,
+              projectWorkspaceId: resolvedProjectWorkspaceId,
+              sourceIssueId: issueRef?.id ?? null,
+              mode:
+                executionWorkspaceMode === "isolated_workspace"
+                  ? "isolated_workspace"
+                  : executionWorkspaceMode === "operator_branch"
+                    ? "operator_branch"
+                    : executionWorkspaceMode === "agent_default"
+                      ? "adapter_managed"
+                      : "shared_workspace",
+              strategyType: executionWorkspace.strategy === "git_worktree" ? "git_worktree" : "project_primary",
+              name: executionWorkspace.branchName ?? issueRef?.identifier ?? `workspace-${agent.id.slice(0, 8)}`,
+              status: "active",
+              cwd: executionWorkspace.cwd,
+              repoUrl: executionWorkspace.repoUrl,
+              baseRef: executionWorkspace.repoRef,
+              branchName: executionWorkspace.branchName,
+              providerType: executionWorkspace.strategy === "git_worktree" ? "git_worktree" : "local_fs",
+              providerRef: executionWorkspace.worktreePath,
+              lastUsedAt: new Date(),
+              openedAt: new Date(),
+              metadata: {
+                source: executionWorkspace.source,
+                createdByRuntime: executionWorkspace.created,
+              },
+            })
+          : null;
+    } catch (error) {
+      if (executionWorkspace.created) {
+        try {
+          await cleanupExecutionWorkspaceArtifacts({
+            workspace: {
+              id: existingExecutionWorkspace?.id ?? `transient-${run.id}`,
+              cwd: executionWorkspace.cwd,
+              providerType: executionWorkspace.strategy === "git_worktree" ? "git_worktree" : "local_fs",
+              providerRef: executionWorkspace.worktreePath,
+              branchName: executionWorkspace.branchName,
+              repoUrl: executionWorkspace.repoUrl,
+              baseRef: executionWorkspace.repoRef,
+              projectId: resolvedProjectId,
+              projectWorkspaceId: resolvedProjectWorkspaceId,
+              sourceIssueId: issueRef?.id ?? null,
+              metadata: {
+                createdByRuntime: true,
+                source: executionWorkspace.source,
+              },
+            },
+            projectWorkspace: {
+              cwd: resolvedWorkspace.cwd,
+              cleanupCommand: null,
+            },
+            teardownCommand: projectExecutionWorkspacePolicy?.workspaceStrategy?.teardownCommand ?? null,
+            recorder: workspaceOperationRecorder,
+          });
+        } catch (cleanupError) {
+          logger.warn(
+            {
+              runId: run.id,
+              issueId,
+              executionWorkspaceCwd: executionWorkspace.cwd,
+              cleanupError: cleanupError instanceof Error ? cleanupError.message : String(cleanupError),
+            },
+            "Failed to cleanup realized execution workspace after persistence failure",
+          );
+        }
+      }
+      throw error;
+    }
+    await workspaceOperationRecorder.attachExecutionWorkspaceId(persistedExecutionWorkspace?.id ?? null);
+    if (
+      existingExecutionWorkspace &&
+      persistedExecutionWorkspace &&
+      existingExecutionWorkspace.id !== persistedExecutionWorkspace.id &&
+      existingExecutionWorkspace.status === "active"
+    ) {
+      await executionWorkspacesSvc.update(existingExecutionWorkspace.id, {
+        status: "idle",
+        cleanupReason: null,
+      });
+    }
+    if (issueId && persistedExecutionWorkspace && issueRef?.executionWorkspaceId !== persistedExecutionWorkspace.id) {
+      await issuesSvc.update(issueId, {
+        executionWorkspaceId: persistedExecutionWorkspace.id,
+        ...(resolvedProjectWorkspaceId ? { projectWorkspaceId: resolvedProjectWorkspaceId } : {}),
+      });
+    }
+    if (persistedExecutionWorkspace) {
+      context.executionWorkspaceId = persistedExecutionWorkspace.id;
+      await db
+        .update(heartbeatRuns)
+        .set({
+          contextSnapshot: context,
+          updatedAt: new Date(),
+        })
+        .where(eq(heartbeatRuns.id, run.id));
+    }
     const runtimeSessionResolution = resolveRuntimeSessionParamsForWorkspace({
       agentId: agent.id,
       previousSessionParams,
@@ -1769,6 +2072,7 @@
       },
       issue: issueRef,
       workspace: executionWorkspace,
+      executionWorkspaceId: persistedExecutionWorkspace?.id ?? issueRef?.executionWorkspaceId ?? null,
       config: resolvedConfig,
       adapterEnv,
       onLog,
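The persistence path above maps the runtime `executionWorkspaceMode` onto the stored `mode` column with a nested ternary. A behavior-equivalent restatement as a lookup table can make the mapping easier to audit; this is an illustrative sketch, not the shipped code:

```typescript
// Hypothetical table-driven restatement of the nested ternary in the diff;
// "shared_workspace" is the fallback branch, as in the original.
const PERSISTED_MODE_BY_RUNTIME_MODE: Record<string, string> = {
  isolated_workspace: "isolated_workspace",
  operator_branch: "operator_branch",
  agent_default: "adapter_managed",
};

function toPersistedMode(executionWorkspaceMode: string): string {
  return PERSISTED_MODE_BY_RUNTIME_MODE[executionWorkspaceMode] ?? "shared_workspace";
}

console.log(toPersistedMode("agent_default")); // adapter_managed
console.log(toPersistedMode("unknown_mode")); // shared_workspace
```

The one non-identity mapping is deliberate: a runtime mode of `agent_default` persists as `adapter_managed`, since the adapter (not this service) owns that workspace's lifecycle.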
@@ -16,7 +16,11 @@ export { heartbeatService } from "./heartbeat.js";
 export { dashboardService } from "./dashboard.js";
 export { sidebarBadgeService } from "./sidebar-badges.js";
 export { accessService } from "./access.js";
+export { instanceSettingsService } from "./instance-settings.js";
 export { companyPortabilityService } from "./company-portability.js";
+export { executionWorkspaceService } from "./execution-workspaces.js";
+export { workspaceOperationService } from "./workspace-operations.js";
+export { workProductService } from "./work-products.js";
 export { logActivity, type LogActivityInput } from "./activity-log.js";
 export { notifyHireApproved, type NotifyHireApprovedInput } from "./hire-hook.js";
 export { publishLiveEvent, subscribeCompanyLiveEvents } from "./live-events.js";
95
server/src/services/instance-settings.ts
Normal file
@@ -0,0 +1,95 @@
+import type { Db } from "@paperclipai/db";
+import { companies, instanceSettings } from "@paperclipai/db";
+import {
+  instanceExperimentalSettingsSchema,
+  type InstanceExperimentalSettings,
+  type InstanceSettings,
+  type PatchInstanceExperimentalSettings,
+} from "@paperclipai/shared";
+import { eq } from "drizzle-orm";
+
+const DEFAULT_SINGLETON_KEY = "default";
+
+function normalizeExperimentalSettings(raw: unknown): InstanceExperimentalSettings {
+  const parsed = instanceExperimentalSettingsSchema.safeParse(raw ?? {});
+  if (parsed.success) {
+    return {
+      enableIsolatedWorkspaces: parsed.data.enableIsolatedWorkspaces ?? false,
+    };
+  }
+  return {
+    enableIsolatedWorkspaces: false,
+  };
+}
+
+function toInstanceSettings(row: typeof instanceSettings.$inferSelect): InstanceSettings {
+  return {
+    id: row.id,
+    experimental: normalizeExperimentalSettings(row.experimental),
+    createdAt: row.createdAt,
+    updatedAt: row.updatedAt,
+  };
+}
+
+export function instanceSettingsService(db: Db) {
+  async function getOrCreateRow() {
+    const existing = await db
+      .select()
+      .from(instanceSettings)
+      .where(eq(instanceSettings.singletonKey, DEFAULT_SINGLETON_KEY))
+      .then((rows) => rows[0] ?? null);
+    if (existing) return existing;
+
+    const now = new Date();
+    const [created] = await db
+      .insert(instanceSettings)
+      .values({
+        singletonKey: DEFAULT_SINGLETON_KEY,
+        experimental: {},
+        createdAt: now,
+        updatedAt: now,
+      })
+      .onConflictDoUpdate({
+        target: [instanceSettings.singletonKey],
+        set: {
+          updatedAt: now,
+        },
+      })
+      .returning();
+
+    return created;
+  }
+
+  return {
+    get: async (): Promise<InstanceSettings> => toInstanceSettings(await getOrCreateRow()),
+
+    getExperimental: async (): Promise<InstanceExperimentalSettings> => {
+      const row = await getOrCreateRow();
+      return normalizeExperimentalSettings(row.experimental);
+    },
+
+    updateExperimental: async (patch: PatchInstanceExperimentalSettings): Promise<InstanceSettings> => {
+      const current = await getOrCreateRow();
+      const nextExperimental = normalizeExperimentalSettings({
+        ...normalizeExperimentalSettings(current.experimental),
+        ...patch,
+      });
+      const now = new Date();
+      const [updated] = await db
+        .update(instanceSettings)
+        .set({
+          experimental: { ...nextExperimental },
+          updatedAt: now,
+        })
+        .where(eq(instanceSettings.id, current.id))
+        .returning();
+      return toInstanceSettings(updated ?? current);
+    },
+
+    listCompanyIds: async (): Promise<string[]> =>
+      db
+        .select({ id: companies.id })
+        .from(companies)
+        .then((rows) => rows.map((row) => row.id)),
+  };
+}
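`updateExperimental` in the new service normalizes the current settings, overlays the patch, and normalizes again, so partial patches never drop unrelated keys and unknown input collapses to defaults. A minimal sketch of that merge semantics, with the zod schema replaced by a plain default check (since `instanceExperimentalSettingsSchema` itself is not shown in this diff):

```typescript
type InstanceExperimentalSettings = { enableIsolatedWorkspaces: boolean };

// Simplified stand-in for the schema-based normalizer: anything that is not
// explicitly `true` collapses to the safe default `false`.
function normalizeExperimentalSettings(raw: unknown): InstanceExperimentalSettings {
  const candidate = raw as Partial<InstanceExperimentalSettings> | null | undefined;
  return { enableIsolatedWorkspaces: candidate?.enableIsolatedWorkspaces === true };
}

function mergeExperimental(
  current: unknown,
  patch: Partial<InstanceExperimentalSettings>,
): InstanceExperimentalSettings {
  // Same shape as updateExperimental: normalize current, overlay patch, normalize again.
  return normalizeExperimentalSettings({ ...normalizeExperimentalSettings(current), ...patch });
}

console.log(mergeExperimental(null, { enableIsolatedWorkspaces: true })); // { enableIsolatedWorkspaces: true }
console.log(mergeExperimental({ enableIsolatedWorkspaces: true }, {})); // { enableIsolatedWorkspaces: true }
```

The double normalization also means a corrupted `experimental` JSON column cannot poison the stored value: the next write rewrites it in canonical shape.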
@@ -8,6 +8,7 @@ import {
|
|||||||
documents,
|
documents,
|
||||||
goals,
|
goals,
|
||||||
heartbeatRuns,
|
heartbeatRuns,
|
||||||
|
executionWorkspaces,
|
||||||
issueAttachments,
|
issueAttachments,
|
||||||
issueLabels,
|
issueLabels,
|
||||||
issueComments,
|
issueComments,
|
||||||
@@ -22,8 +23,10 @@ import { extractProjectMentionIds } from "@paperclipai/shared";
|
|||||||
import { conflict, notFound, unprocessable } from "../errors.js";
|
import { conflict, notFound, unprocessable } from "../errors.js";
|
||||||
import {
|
import {
|
||||||
defaultIssueExecutionWorkspaceSettingsForProject,
|
defaultIssueExecutionWorkspaceSettingsForProject,
|
||||||
|
gateProjectExecutionWorkspacePolicy,
|
||||||
parseProjectExecutionWorkspacePolicy,
|
parseProjectExecutionWorkspacePolicy,
|
||||||
} from "./execution-workspace-policy.js";
|
} from "./execution-workspace-policy.js";
|
||||||
|
import { instanceSettingsService } from "./instance-settings.js";
|
||||||
import { redactCurrentUserText } from "../log-redaction.js";
|
import { redactCurrentUserText } from "../log-redaction.js";
|
||||||
import { resolveIssueGoalId, resolveNextIssueGoalId } from "./issue-goal-fallback.js";
|
import { resolveIssueGoalId, resolveNextIssueGoalId } from "./issue-goal-fallback.js";
|
||||||
import { getDefaultCompanyGoal } from "./goals.js";
|
import { getDefaultCompanyGoal } from "./goals.js";
|
||||||
@@ -315,6 +318,8 @@ function withActiveRuns(
|
|||||||
}
|
}
|
||||||
|
|
||||||
export function issueService(db: Db) {
|
export function issueService(db: Db) {
|
||||||
|
const instanceSettings = instanceSettingsService(db);
|
||||||
|
|
||||||
async function assertAssignableAgent(companyId: string, agentId: string) {
|
async function assertAssignableAgent(companyId: string, agentId: string) {
|
||||||
const assignee = await db
|
const assignee = await db
|
||||||
.select({
|
.select({
|
||||||
@@ -356,6 +361,40 @@ export function issueService(db: Db) {
|
|||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
async function assertValidProjectWorkspace(companyId: string, projectId: string | null | undefined, projectWorkspaceId: string) {
|
||||||
|
const workspace = await db
|
||||||
|
.select({
|
||||||
|
id: projectWorkspaces.id,
|
||||||
|
companyId: projectWorkspaces.companyId,
|
||||||
|
projectId: projectWorkspaces.projectId,
|
||||||
|
})
|
||||||
|
.from(projectWorkspaces)
|
||||||
|
.where(eq(projectWorkspaces.id, projectWorkspaceId))
|
||||||
|
.then((rows) => rows[0] ?? null);
|
||||||
|
if (!workspace) throw notFound("Project workspace not found");
|
||||||
|
if (workspace.companyId !== companyId) throw unprocessable("Project workspace must belong to same company");
|
||||||
|
if (projectId && workspace.projectId !== projectId) {
|
||||||
|
throw unprocessable("Project workspace must belong to the selected project");
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
async function assertValidExecutionWorkspace(companyId: string, projectId: string | null | undefined, executionWorkspaceId: string) {
|
||||||
|
const workspace = await db
|
||||||
|
.select({
|
||||||
|
id: executionWorkspaces.id,
|
||||||
|
companyId: executionWorkspaces.companyId,
|
||||||
|
projectId: executionWorkspaces.projectId,
|
||||||
|
})
|
||||||
|
.from(executionWorkspaces)
|
||||||
|
.where(eq(executionWorkspaces.id, executionWorkspaceId))
|
||||||
|
.then((rows) => rows[0] ?? null);
|
||||||
|
if (!workspace) throw notFound("Execution workspace not found");
|
||||||
|
if (workspace.companyId !== companyId) throw unprocessable("Execution workspace must belong to same company");
|
||||||
|
if (projectId && workspace.projectId !== projectId) {
|
||||||
|
throw unprocessable("Execution workspace must belong to the selected project");
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
async function assertValidLabelIds(companyId: string, labelIds: string[], dbOrTx: any = db) {
|
async function assertValidLabelIds(companyId: string, labelIds: string[], dbOrTx: any = db) {
|
||||||
if (labelIds.length === 0) return;
|
if (labelIds.length === 0) return;
|
||||||
const existing = await dbOrTx
|
const existing = await dbOrTx
|
||||||
@@ -641,6 +680,12 @@ export function issueService(db: Db) {
|
|||||||
data: Omit<typeof issues.$inferInsert, "companyId"> & { labelIds?: string[] },
|
data: Omit<typeof issues.$inferInsert, "companyId"> & { labelIds?: string[] },
|
||||||
) => {
|
) => {
|
||||||
const { labelIds: inputLabelIds, ...issueData } = data;
|
const { labelIds: inputLabelIds, ...issueData } = data;
|
||||||
|
const isolatedWorkspacesEnabled = (await instanceSettings.getExperimental()).enableIsolatedWorkspaces;
|
||||||
|
if (!isolatedWorkspacesEnabled) {
|
||||||
|
delete issueData.executionWorkspaceId;
|
||||||
|
delete issueData.executionWorkspacePreference;
|
||||||
|
delete issueData.executionWorkspaceSettings;
|
||||||
|
}
|
||||||
if (data.assigneeAgentId && data.assigneeUserId) {
|
if (data.assigneeAgentId && data.assigneeUserId) {
|
||||||
throw unprocessable("Issue can only have one assignee");
|
throw unprocessable("Issue can only have one assignee");
|
||||||
}
|
}
|
||||||
@@ -650,6 +695,12 @@ export function issueService(db: Db) {
       if (data.assigneeUserId) {
        await assertAssignableUser(companyId, data.assigneeUserId);
       }
+      if (data.projectWorkspaceId) {
+        await assertValidProjectWorkspace(companyId, data.projectId, data.projectWorkspaceId);
+      }
+      if (data.executionWorkspaceId) {
+        await assertValidExecutionWorkspace(companyId, data.projectId, data.executionWorkspaceId);
+      }
       if (data.status === "in_progress" && !data.assigneeAgentId && !data.assigneeUserId) {
         throw unprocessable("in_progress issues require an assignee");
       }
@@ -665,9 +716,32 @@ export function issueService(db: Db) {
           .then((rows) => rows[0] ?? null);
         executionWorkspaceSettings =
           defaultIssueExecutionWorkspaceSettingsForProject(
-            parseProjectExecutionWorkspacePolicy(project?.executionWorkspacePolicy),
+            gateProjectExecutionWorkspacePolicy(
+              parseProjectExecutionWorkspacePolicy(project?.executionWorkspacePolicy),
+              isolatedWorkspacesEnabled,
+            ),
           ) as Record<string, unknown> | null;
       }
+      let projectWorkspaceId = issueData.projectWorkspaceId ?? null;
+      if (!projectWorkspaceId && issueData.projectId) {
+        const project = await tx
+          .select({
+            executionWorkspacePolicy: projects.executionWorkspacePolicy,
+          })
+          .from(projects)
+          .where(and(eq(projects.id, issueData.projectId), eq(projects.companyId, companyId)))
+          .then((rows) => rows[0] ?? null);
+        const projectPolicy = parseProjectExecutionWorkspacePolicy(project?.executionWorkspacePolicy);
+        projectWorkspaceId = projectPolicy?.defaultProjectWorkspaceId ?? null;
+        if (!projectWorkspaceId) {
+          projectWorkspaceId = await tx
+            .select({ id: projectWorkspaces.id })
+            .from(projectWorkspaces)
+            .where(and(eq(projectWorkspaces.projectId, issueData.projectId), eq(projectWorkspaces.companyId, companyId)))
+            .orderBy(desc(projectWorkspaces.isPrimary), asc(projectWorkspaces.createdAt), asc(projectWorkspaces.id))
+            .then((rows) => rows[0]?.id ?? null);
+        }
+      }
       const [company] = await tx
         .update(companies)
         .set({ issueCounter: sql`${companies.issueCounter} + 1` })
@@ -684,6 +758,7 @@ export function issueService(db: Db) {
             goalId: issueData.goalId,
             defaultGoalId: defaultCompanyGoal?.id ?? null,
           }),
+          ...(projectWorkspaceId ? { projectWorkspaceId } : {}),
           ...(executionWorkspaceSettings ? { executionWorkspaceSettings } : {}),
           companyId,
           issueNumber,
@@ -717,6 +792,12 @@ export function issueService(db: Db) {
       if (!existing) return null;

       const { labelIds: nextLabelIds, ...issueData } = data;
+      const isolatedWorkspacesEnabled = (await instanceSettings.getExperimental()).enableIsolatedWorkspaces;
+      if (!isolatedWorkspacesEnabled) {
+        delete issueData.executionWorkspaceId;
+        delete issueData.executionWorkspacePreference;
+        delete issueData.executionWorkspaceSettings;
+      }

       if (issueData.status) {
         assertTransition(existing.status, issueData.status);
@@ -744,6 +825,17 @@ export function issueService(db: Db) {
       if (issueData.assigneeUserId) {
         await assertAssignableUser(existing.companyId, issueData.assigneeUserId);
       }
+      const nextProjectId = issueData.projectId !== undefined ? issueData.projectId : existing.projectId;
+      const nextProjectWorkspaceId =
+        issueData.projectWorkspaceId !== undefined ? issueData.projectWorkspaceId : existing.projectWorkspaceId;
+      const nextExecutionWorkspaceId =
+        issueData.executionWorkspaceId !== undefined ? issueData.executionWorkspaceId : existing.executionWorkspaceId;
+      if (nextProjectWorkspaceId) {
+        await assertValidProjectWorkspace(existing.companyId, nextProjectId, nextProjectWorkspaceId);
+      }
+      if (nextExecutionWorkspaceId) {
+        await assertValidExecutionWorkspace(existing.companyId, nextProjectId, nextExecutionWorkspaceId);
+      }

       applyStatusSideEffects(issueData.status, patch);
       if (issueData.status && issueData.status !== "done") {
@@ -718,17 +718,16 @@ export function buildHostServices(
       const project = await projects.getById(params.projectId);
       if (!inCompany(project, companyId)) return null;
       const row = project.primaryWorkspace;
-      if (!row) return null;
-      const path = sanitizeWorkspacePath(row.cwd);
-      const name = sanitizeWorkspaceName(row.name, path);
+      const path = sanitizeWorkspacePath(project.codebase.effectiveLocalFolder);
+      const name = sanitizeWorkspaceName(row?.name ?? project.name, path);
       return {
-        id: row.id,
-        projectId: row.projectId,
+        id: row?.id ?? `${project.id}:managed`,
+        projectId: project.id,
         name,
         path,
-        isPrimary: row.isPrimary,
-        createdAt: row.createdAt.toISOString(),
-        updatedAt: row.updatedAt.toISOString(),
+        isPrimary: true,
+        createdAt: (row?.createdAt ?? project.createdAt).toISOString(),
+        updatedAt: (row?.updatedAt ?? project.updatedAt).toISOString(),
       };
     },
@@ -742,17 +741,16 @@ export function buildHostServices(
       const project = await projects.getById(projectId);
       if (!inCompany(project, companyId)) return null;
       const row = project.primaryWorkspace;
-      if (!row) return null;
-      const path = sanitizeWorkspacePath(row.cwd);
-      const name = sanitizeWorkspaceName(row.name, path);
+      const path = sanitizeWorkspacePath(project.codebase.effectiveLocalFolder);
+      const name = sanitizeWorkspaceName(row?.name ?? project.name, path);
       return {
-        id: row.id,
-        projectId: row.projectId,
+        id: row?.id ?? `${project.id}:managed`,
+        projectId: project.id,
         name,
         path,
-        isPrimary: row.isPrimary,
-        createdAt: row.createdAt.toISOString(),
-        updatedAt: row.updatedAt.toISOString(),
+        isPrimary: true,
+        createdAt: (row?.createdAt ?? project.createdAt).toISOString(),
+        updatedAt: (row?.updatedAt ?? project.updatedAt).toISOString(),
       };
     },
   },
@@ -6,6 +6,7 @@ import {
   deriveProjectUrlKey,
   isUuidLike,
   normalizeProjectUrlKey,
+  type ProjectCodebase,
   type ProjectExecutionWorkspacePolicy,
   type ProjectGoalRef,
   type ProjectWorkspace,
@@ -13,6 +14,7 @@ import {
 } from "@paperclipai/shared";
 import { listWorkspaceRuntimeServicesForProjectWorkspaces } from "./workspace-runtime.js";
 import { parseProjectExecutionWorkspacePolicy } from "./execution-workspace-policy.js";
+import { resolveManagedProjectWorkspaceDir } from "../home-paths.js";

 type ProjectRow = typeof projects.$inferSelect;
 type ProjectWorkspaceRow = typeof projectWorkspaces.$inferSelect;
@@ -20,9 +22,17 @@ type WorkspaceRuntimeServiceRow = typeof workspaceRuntimeServices.$inferSelect;
 const REPO_ONLY_CWD_SENTINEL = "/__paperclip_repo_only__";
 type CreateWorkspaceInput = {
   name?: string | null;
+  sourceType?: string | null;
   cwd?: string | null;
   repoUrl?: string | null;
   repoRef?: string | null;
+  defaultRef?: string | null;
+  visibility?: string | null;
+  setupCommand?: string | null;
+  cleanupCommand?: string | null;
+  remoteProvider?: string | null;
+  remoteWorkspaceRef?: string | null;
+  sharedWorkspaceKey?: string | null;
   metadata?: Record<string, unknown> | null;
   isPrimary?: boolean;
 };
@@ -33,6 +43,7 @@ interface ProjectWithGoals extends Omit<ProjectRow, "executionWorkspacePolicy">
   goalIds: string[];
   goals: ProjectGoalRef[];
   executionWorkspacePolicy: ProjectExecutionWorkspacePolicy | null;
+  codebase: ProjectCodebase;
   workspaces: ProjectWorkspace[];
   primaryWorkspace: ProjectWorkspace | null;
 }
@@ -91,6 +102,7 @@ function toRuntimeService(row: WorkspaceRuntimeServiceRow): WorkspaceRuntimeServ
     companyId: row.companyId,
     projectId: row.projectId ?? null,
     projectWorkspaceId: row.projectWorkspaceId ?? null,
+    executionWorkspaceId: row.executionWorkspaceId ?? null,
     issueId: row.issueId ?? null,
     scopeType: row.scopeType as WorkspaceRuntimeService["scopeType"],
     scopeId: row.scopeId ?? null,
@@ -125,9 +137,17 @@ function toWorkspace(
     companyId: row.companyId,
     projectId: row.projectId,
     name: row.name,
-    cwd: row.cwd,
+    sourceType: row.sourceType as ProjectWorkspace["sourceType"],
+    cwd: normalizeWorkspaceCwd(row.cwd),
     repoUrl: row.repoUrl ?? null,
     repoRef: row.repoRef ?? null,
+    defaultRef: row.defaultRef ?? row.repoRef ?? null,
+    visibility: row.visibility as ProjectWorkspace["visibility"],
+    setupCommand: row.setupCommand ?? null,
+    cleanupCommand: row.cleanupCommand ?? null,
+    remoteProvider: row.remoteProvider ?? null,
+    remoteWorkspaceRef: row.remoteWorkspaceRef ?? null,
+    sharedWorkspaceKey: row.sharedWorkspaceKey ?? null,
     metadata: (row.metadata as Record<string, unknown> | null) ?? null,
     isPrimary: row.isPrimary,
     runtimeServices,
@@ -136,6 +156,48 @@ function toWorkspace(
   };
 }

+function deriveRepoNameFromRepoUrl(repoUrl: string | null): string | null {
+  const raw = readNonEmptyString(repoUrl);
+  if (!raw) return null;
+  try {
+    const parsed = new URL(raw);
+    const cleanedPath = parsed.pathname.replace(/\/+$/, "");
+    const repoName = cleanedPath.split("/").filter(Boolean).pop()?.replace(/\.git$/i, "") ?? "";
+    return repoName || null;
+  } catch {
+    return null;
+  }
+}
+
+function deriveProjectCodebase(input: {
+  companyId: string;
+  projectId: string;
+  primaryWorkspace: ProjectWorkspace | null;
+  fallbackWorkspaces: ProjectWorkspace[];
+}): ProjectCodebase {
+  const primaryWorkspace = input.primaryWorkspace ?? input.fallbackWorkspaces[0] ?? null;
+  const repoUrl = primaryWorkspace?.repoUrl ?? null;
+  const repoName = deriveRepoNameFromRepoUrl(repoUrl);
+  const localFolder = primaryWorkspace?.cwd ?? null;
+  const managedFolder = resolveManagedProjectWorkspaceDir({
+    companyId: input.companyId,
+    projectId: input.projectId,
+    repoName,
+  });
+
+  return {
+    workspaceId: primaryWorkspace?.id ?? null,
+    repoUrl,
+    repoRef: primaryWorkspace?.repoRef ?? null,
+    defaultRef: primaryWorkspace?.defaultRef ?? null,
+    repoName,
+    localFolder,
+    managedFolder,
+    effectiveLocalFolder: localFolder ?? managedFolder,
+    origin: localFolder ? "local_folder" : "managed_checkout",
+  };
+}
+
 function pickPrimaryWorkspace(
   rows: ProjectWorkspaceRow[],
   runtimeServicesByWorkspaceId?: Map<string, WorkspaceRuntimeService[]>,
@@ -186,10 +248,17 @@ async function attachWorkspaces(db: Db, rows: ProjectWithGoals[]): Promise<Proje
         sharedRuntimeServicesByWorkspaceId.get(workspace.id) ?? [],
       ),
     );
+    const primaryWorkspace = pickPrimaryWorkspace(projectWorkspaceRows, sharedRuntimeServicesByWorkspaceId);
     return {
       ...row,
+      codebase: deriveProjectCodebase({
+        companyId: row.companyId,
+        projectId: row.id,
+        primaryWorkspace,
+        fallbackWorkspaces: workspaces,
+      }),
       workspaces,
-      primaryWorkspace: pickPrimaryWorkspace(projectWorkspaceRows, sharedRuntimeServicesByWorkspaceId),
+      primaryWorkspace,
     };
   });
 }
@@ -491,7 +560,13 @@ export function projectService(db: Db) {

     const cwd = normalizeWorkspaceCwd(data.cwd);
     const repoUrl = readNonEmptyString(data.repoUrl);
-    if (!cwd && !repoUrl) return null;
+    const sourceType = readNonEmptyString(data.sourceType) ?? (repoUrl ? "git_repo" : cwd ? "local_path" : "remote_managed");
+    const remoteWorkspaceRef = readNonEmptyString(data.remoteWorkspaceRef);
+    if (sourceType === "remote_managed") {
+      if (!remoteWorkspaceRef && !repoUrl) return null;
+    } else if (!cwd && !repoUrl) {
+      return null;
+    }
     const name = deriveWorkspaceName({
       name: data.name,
       cwd,
@@ -525,9 +600,17 @@ export function projectService(db: Db) {
         companyId: project.companyId,
         projectId,
         name,
+        sourceType,
         cwd: cwd ?? null,
         repoUrl: repoUrl ?? null,
         repoRef: readNonEmptyString(data.repoRef),
+        defaultRef: readNonEmptyString(data.defaultRef) ?? readNonEmptyString(data.repoRef),
+        visibility: readNonEmptyString(data.visibility) ?? "default",
+        setupCommand: readNonEmptyString(data.setupCommand),
+        cleanupCommand: readNonEmptyString(data.cleanupCommand),
+        remoteProvider: readNonEmptyString(data.remoteProvider),
+        remoteWorkspaceRef,
+        sharedWorkspaceKey: readNonEmptyString(data.sharedWorkspaceKey),
         metadata: (data.metadata as Record<string, unknown> | null | undefined) ?? null,
         isPrimary: shouldBePrimary,
       })
@@ -564,7 +647,19 @@ export function projectService(db: Db) {
       data.repoUrl !== undefined
         ? readNonEmptyString(data.repoUrl)
         : readNonEmptyString(existing.repoUrl);
-    if (!nextCwd && !nextRepoUrl) return null;
+    const nextSourceType =
+      data.sourceType !== undefined
+        ? readNonEmptyString(data.sourceType)
+        : readNonEmptyString(existing.sourceType);
+    const nextRemoteWorkspaceRef =
+      data.remoteWorkspaceRef !== undefined
+        ? readNonEmptyString(data.remoteWorkspaceRef)
+        : readNonEmptyString(existing.remoteWorkspaceRef);
+    if (nextSourceType === "remote_managed") {
+      if (!nextRemoteWorkspaceRef && !nextRepoUrl) return null;
+    } else if (!nextCwd && !nextRepoUrl) {
+      return null;
+    }

     const patch: Partial<typeof projectWorkspaces.$inferInsert> = {
       updatedAt: new Date(),
@@ -576,6 +671,16 @@ export function projectService(db: Db) {
     if (data.cwd !== undefined) patch.cwd = nextCwd ?? null;
     if (data.repoUrl !== undefined) patch.repoUrl = nextRepoUrl ?? null;
     if (data.repoRef !== undefined) patch.repoRef = readNonEmptyString(data.repoRef);
+    if (data.sourceType !== undefined && nextSourceType) patch.sourceType = nextSourceType;
+    if (data.defaultRef !== undefined) patch.defaultRef = readNonEmptyString(data.defaultRef);
+    if (data.visibility !== undefined && readNonEmptyString(data.visibility)) {
+      patch.visibility = readNonEmptyString(data.visibility)!;
+    }
+    if (data.setupCommand !== undefined) patch.setupCommand = readNonEmptyString(data.setupCommand);
+    if (data.cleanupCommand !== undefined) patch.cleanupCommand = readNonEmptyString(data.cleanupCommand);
+    if (data.remoteProvider !== undefined) patch.remoteProvider = readNonEmptyString(data.remoteProvider);
+    if (data.remoteWorkspaceRef !== undefined) patch.remoteWorkspaceRef = nextRemoteWorkspaceRef;
+    if (data.sharedWorkspaceKey !== undefined) patch.sharedWorkspaceKey = readNonEmptyString(data.sharedWorkspaceKey);
     if (data.metadata !== undefined) patch.metadata = data.metadata;

     const updated = await db.transaction(async (tx) => {
123  server/src/services/work-products.ts  Normal file
@@ -0,0 +1,123 @@
+import { and, desc, eq } from "drizzle-orm";
+import type { Db } from "@paperclipai/db";
+import { issueWorkProducts } from "@paperclipai/db";
+import type { IssueWorkProduct } from "@paperclipai/shared";
+
+type IssueWorkProductRow = typeof issueWorkProducts.$inferSelect;
+
+function toIssueWorkProduct(row: IssueWorkProductRow): IssueWorkProduct {
+  return {
+    id: row.id,
+    companyId: row.companyId,
+    projectId: row.projectId ?? null,
+    issueId: row.issueId,
+    executionWorkspaceId: row.executionWorkspaceId ?? null,
+    runtimeServiceId: row.runtimeServiceId ?? null,
+    type: row.type as IssueWorkProduct["type"],
+    provider: row.provider,
+    externalId: row.externalId ?? null,
+    title: row.title,
+    url: row.url ?? null,
+    status: row.status,
+    reviewState: row.reviewState as IssueWorkProduct["reviewState"],
+    isPrimary: row.isPrimary,
+    healthStatus: row.healthStatus as IssueWorkProduct["healthStatus"],
+    summary: row.summary ?? null,
+    metadata: (row.metadata as Record<string, unknown> | null) ?? null,
+    createdByRunId: row.createdByRunId ?? null,
+    createdAt: row.createdAt,
+    updatedAt: row.updatedAt,
+  };
+}
+
+export function workProductService(db: Db) {
+  return {
+    listForIssue: async (issueId: string) => {
+      const rows = await db
+        .select()
+        .from(issueWorkProducts)
+        .where(eq(issueWorkProducts.issueId, issueId))
+        .orderBy(desc(issueWorkProducts.isPrimary), desc(issueWorkProducts.updatedAt));
+      return rows.map(toIssueWorkProduct);
+    },
+
+    getById: async (id: string) => {
+      const row = await db
+        .select()
+        .from(issueWorkProducts)
+        .where(eq(issueWorkProducts.id, id))
+        .then((rows) => rows[0] ?? null);
+      return row ? toIssueWorkProduct(row) : null;
+    },
+
+    createForIssue: async (issueId: string, companyId: string, data: Omit<typeof issueWorkProducts.$inferInsert, "issueId" | "companyId">) => {
+      const row = await db.transaction(async (tx) => {
+        if (data.isPrimary) {
+          await tx
+            .update(issueWorkProducts)
+            .set({ isPrimary: false, updatedAt: new Date() })
+            .where(
+              and(
+                eq(issueWorkProducts.companyId, companyId),
+                eq(issueWorkProducts.issueId, issueId),
+                eq(issueWorkProducts.type, data.type),
+              ),
+            );
+        }
+        return await tx
+          .insert(issueWorkProducts)
+          .values({
+            ...data,
+            companyId,
+            issueId,
+          })
+          .returning()
+          .then((rows) => rows[0] ?? null);
+      });
+      return row ? toIssueWorkProduct(row) : null;
+    },
+
+    update: async (id: string, patch: Partial<typeof issueWorkProducts.$inferInsert>) => {
+      const row = await db.transaction(async (tx) => {
+        const existing = await tx
+          .select()
+          .from(issueWorkProducts)
+          .where(eq(issueWorkProducts.id, id))
+          .then((rows) => rows[0] ?? null);
+        if (!existing) return null;
+
+        if (patch.isPrimary === true) {
+          await tx
+            .update(issueWorkProducts)
+            .set({ isPrimary: false, updatedAt: new Date() })
+            .where(
+              and(
+                eq(issueWorkProducts.companyId, existing.companyId),
+                eq(issueWorkProducts.issueId, existing.issueId),
+                eq(issueWorkProducts.type, existing.type),
+              ),
+            );
+        }
+
+        return await tx
+          .update(issueWorkProducts)
+          .set({ ...patch, updatedAt: new Date() })
+          .where(eq(issueWorkProducts.id, id))
+          .returning()
+          .then((rows) => rows[0] ?? null);
+      });
+      return row ? toIssueWorkProduct(row) : null;
+    },
+
+    remove: async (id: string) => {
+      const row = await db
+        .delete(issueWorkProducts)
+        .where(eq(issueWorkProducts.id, id))
+        .returning()
+        .then((rows) => rows[0] ?? null);
+      return row ? toIssueWorkProduct(row) : null;
+    },
+  };
+}
+
+export { toIssueWorkProduct };
156
server/src/services/workspace-operation-log-store.ts
Normal file
156
server/src/services/workspace-operation-log-store.ts
Normal file
@@ -0,0 +1,156 @@
|
|||||||
|
import { createReadStream, promises as fs } from "node:fs";
|
||||||
|
import path from "node:path";
|
||||||
|
import { createHash } from "node:crypto";
|
||||||
|
import { notFound } from "../errors.js";
|
||||||
|
import { resolvePaperclipInstanceRoot } from "../home-paths.js";
|
||||||
|
|
||||||
|
export type WorkspaceOperationLogStoreType = "local_file";
|
||||||
|
|
||||||
|
export interface WorkspaceOperationLogHandle {
|
||||||
|
store: WorkspaceOperationLogStoreType;
|
||||||
|
logRef: string;
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface WorkspaceOperationLogReadOptions {
|
||||||
|
offset?: number;
|
||||||
|
limitBytes?: number;
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface WorkspaceOperationLogReadResult {
|
||||||
|
content: string;
|
||||||
|
nextOffset?: number;
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface WorkspaceOperationLogFinalizeSummary {
|
||||||
|
bytes: number;
|
||||||
|
sha256?: string;
|
||||||
|
compressed: boolean;
|
||||||
|
}
|
||||||
|
|
||||||
|
export interface WorkspaceOperationLogStore {
|
||||||
|
begin(input: { companyId: string; operationId: string }): Promise<WorkspaceOperationLogHandle>;
|
||||||
|
append(
|
||||||
|
handle: WorkspaceOperationLogHandle,
|
||||||
|
event: { stream: "stdout" | "stderr" | "system"; chunk: string; ts: string },
|
||||||
|
): Promise<void>;
|
||||||
|
finalize(handle: WorkspaceOperationLogHandle): Promise<WorkspaceOperationLogFinalizeSummary>;
|
||||||
|
read(handle: WorkspaceOperationLogHandle, opts?: WorkspaceOperationLogReadOptions): Promise<WorkspaceOperationLogReadResult>;
|
||||||
|
}
|
||||||
|
|
||||||
|
function safeSegments(...segments: string[]) {
|
||||||
|
return segments.map((segment) => segment.replace(/[^a-zA-Z0-9._-]/g, "_"));
|
||||||
|
}
|
||||||
|
|
||||||
|
function resolveWithin(basePath: string, relativePath: string) {
|
||||||
|
const resolved = path.resolve(basePath, relativePath);
|
||||||
|
const base = path.resolve(basePath) + path.sep;
|
||||||
|
if (!resolved.startsWith(base) && resolved !== path.resolve(basePath)) {
|
||||||
|
throw new Error("Invalid log path");
|
||||||
|
}
|
||||||
|
return resolved;
|
||||||
|
}
|
||||||
|
|
||||||
|
function createLocalFileWorkspaceOperationLogStore(basePath: string): WorkspaceOperationLogStore {
|
||||||
|
async function ensureDir(relativeDir: string) {
|
||||||
|
const dir = resolveWithin(basePath, relativeDir);
|
||||||
|
await fs.mkdir(dir, { recursive: true });
|
||||||
|
}
|
||||||
|
|
||||||
|
async function readFileRange(filePath: string, offset: number, limitBytes: number): Promise<WorkspaceOperationLogReadResult> {
|
||||||
|
const stat = await fs.stat(filePath).catch(() => null);
|
||||||
|
if (!stat) throw notFound("Workspace operation log not found");
|
||||||
|
|
||||||
|
const start = Math.max(0, Math.min(offset, stat.size));
|
||||||
|
const end = Math.max(start, Math.min(start + limitBytes - 1, stat.size - 1));
|
||||||
|
|
||||||
|
if (start > end) {
|
||||||
|
return { content: "", nextOffset: start };
|
||||||
|
}
|
||||||
|
|
||||||
|
const chunks: Buffer[] = [];
|
||||||
|
await new Promise<void>((resolve, reject) => {
|
||||||
|
const stream = createReadStream(filePath, { start, end });
|
||||||
|
stream.on("data", (chunk) => {
|
||||||
|
chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk));
|
||||||
|
});
|
||||||
|
stream.on("error", reject);
|
||||||
|
stream.on("end", () => resolve());
|
||||||
|
});
|
||||||
|
|
||||||
|
const content = Buffer.concat(chunks).toString("utf8");
|
||||||
|
const nextOffset = end + 1 < stat.size ? end + 1 : undefined;
|
||||||
|
return { content, nextOffset };
|
||||||
|
}
|
||||||
|
|
||||||
|
async function sha256File(filePath: string): Promise<string> {
|
||||||
|
return new Promise<string>((resolve, reject) => {
|
||||||
|
const hash = createHash("sha256");
|
||||||
|
const stream = createReadStream(filePath);
|
||||||
|
stream.on("data", (chunk) => hash.update(chunk));
|
||||||
|
stream.on("error", reject);
|
||||||
|
stream.on("end", () => resolve(hash.digest("hex")));
|
||||||
|
});
|
||||||
|
}
|
||||||
|
|
||||||
|
return {
|
||||||
|
async begin(input) {
|
||||||
|
const [companyId] = safeSegments(input.companyId);
|
||||||
|
const operationId = safeSegments(input.operationId)[0]!;
|
||||||
|
const relDir = companyId;
|
||||||
|
const relPath = path.join(relDir, `${operationId}.ndjson`);
|
||||||
|
await ensureDir(relDir);
|
||||||
|
|
||||||
|
const absPath = resolveWithin(basePath, relPath);
|
||||||
|
      await fs.writeFile(absPath, "", "utf8");
      return { store: "local_file", logRef: relPath };
    },

    async append(handle, event) {
      if (handle.store !== "local_file") return;
      const absPath = resolveWithin(basePath, handle.logRef);
      const line = JSON.stringify({
        ts: event.ts,
        stream: event.stream,
        chunk: event.chunk,
      });
      await fs.appendFile(absPath, `${line}\n`, "utf8");
    },

    async finalize(handle) {
      if (handle.store !== "local_file") {
        return { bytes: 0, compressed: false };
      }
      const absPath = resolveWithin(basePath, handle.logRef);
      const stat = await fs.stat(absPath).catch(() => null);
      if (!stat) throw notFound("Workspace operation log not found");

      const hash = await sha256File(absPath);
      return {
        bytes: stat.size,
        sha256: hash,
        compressed: false,
      };
    },

    async read(handle, opts) {
      if (handle.store !== "local_file") {
        throw notFound("Workspace operation log not found");
      }
      const absPath = resolveWithin(basePath, handle.logRef);
      const offset = opts?.offset ?? 0;
      const limitBytes = opts?.limitBytes ?? 256_000;
      return readFileRange(absPath, offset, limitBytes);
    },
  };
}

let cachedStore: WorkspaceOperationLogStore | null = null;

export function getWorkspaceOperationLogStore() {
  if (cachedStore) return cachedStore;
  const basePath = process.env.WORKSPACE_OPERATION_LOG_BASE_PATH
    ?? path.resolve(resolvePaperclipInstanceRoot(), "data", "workspace-operation-logs");
  cachedStore = createLocalFileWorkspaceOperationLogStore(basePath);
  return cachedStore;
}
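The `append` method above writes one JSON object per line (JSONL), so the operation log can be streamed and range-read without parsing the whole file. A minimal standalone sketch of that line format, with hypothetical helper names `formatLogLine` and `parseLogLines` introduced only for illustration:

```typescript
// Hypothetical sketch of the JSONL line format the local_file store appends.
interface LogEvent {
  ts: string;
  stream: "stdout" | "stderr" | "system";
  chunk: string;
}

function formatLogLine(event: LogEvent): string {
  // Mirrors append() above: serialize the event object, then add a newline.
  return `${JSON.stringify({ ts: event.ts, stream: event.stream, chunk: event.chunk })}\n`;
}

function parseLogLines(content: string): LogEvent[] {
  // A reader can split on newlines and parse each non-empty line back.
  return content
    .split("\n")
    .filter((line) => line.length > 0)
    .map((line) => JSON.parse(line) as LogEvent);
}
```

Because each event is self-delimiting, `read` can hand back an arbitrary byte range and the caller can discard the first and last partial lines.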
250
server/src/services/workspace-operations.ts
Normal file
@@ -0,0 +1,250 @@
import { randomUUID } from "node:crypto";
import type { Db } from "@paperclipai/db";
import { workspaceOperations } from "@paperclipai/db";
import type { WorkspaceOperation, WorkspaceOperationPhase, WorkspaceOperationStatus } from "@paperclipai/shared";
import { asc, desc, eq, inArray, isNull, or, and } from "drizzle-orm";
import { notFound } from "../errors.js";
import { redactCurrentUserText, redactCurrentUserValue } from "../log-redaction.js";
import { getWorkspaceOperationLogStore } from "./workspace-operation-log-store.js";

type WorkspaceOperationRow = typeof workspaceOperations.$inferSelect;

function toWorkspaceOperation(row: WorkspaceOperationRow): WorkspaceOperation {
  return {
    id: row.id,
    companyId: row.companyId,
    executionWorkspaceId: row.executionWorkspaceId ?? null,
    heartbeatRunId: row.heartbeatRunId ?? null,
    phase: row.phase as WorkspaceOperationPhase,
    command: row.command ?? null,
    cwd: row.cwd ?? null,
    status: row.status as WorkspaceOperationStatus,
    exitCode: row.exitCode ?? null,
    logStore: row.logStore ?? null,
    logRef: row.logRef ?? null,
    logBytes: row.logBytes ?? null,
    logSha256: row.logSha256 ?? null,
    logCompressed: row.logCompressed,
    stdoutExcerpt: row.stdoutExcerpt ?? null,
    stderrExcerpt: row.stderrExcerpt ?? null,
    metadata: (row.metadata as Record<string, unknown> | null) ?? null,
    startedAt: row.startedAt,
    finishedAt: row.finishedAt ?? null,
    createdAt: row.createdAt,
    updatedAt: row.updatedAt,
  };
}

function appendExcerpt(current: string, chunk: string) {
  return `${current}${chunk}`.slice(-4096);
}

function combineMetadata(
  base: Record<string, unknown> | null | undefined,
  patch: Record<string, unknown> | null | undefined,
) {
  if (!base && !patch) return null;
  return {
    ...(base ?? {}),
    ...(patch ?? {}),
  };
}

export interface WorkspaceOperationRecorder {
  attachExecutionWorkspaceId(executionWorkspaceId: string | null): Promise<void>;
  recordOperation(input: {
    phase: WorkspaceOperationPhase;
    command?: string | null;
    cwd?: string | null;
    metadata?: Record<string, unknown> | null;
    run: () => Promise<{
      status?: WorkspaceOperationStatus;
      exitCode?: number | null;
      stdout?: string | null;
      stderr?: string | null;
      system?: string | null;
      metadata?: Record<string, unknown> | null;
    }>;
  }): Promise<WorkspaceOperation>;
}

export function workspaceOperationService(db: Db) {
  const logStore = getWorkspaceOperationLogStore();

  async function getById(id: string) {
    const row = await db
      .select()
      .from(workspaceOperations)
      .where(eq(workspaceOperations.id, id))
      .then((rows) => rows[0] ?? null);
    return row ? toWorkspaceOperation(row) : null;
  }

  return {
    getById,

    createRecorder(input: {
      companyId: string;
      heartbeatRunId?: string | null;
      executionWorkspaceId?: string | null;
    }): WorkspaceOperationRecorder {
      let executionWorkspaceId = input.executionWorkspaceId ?? null;
      const createdIds: string[] = [];

      return {
        async attachExecutionWorkspaceId(nextExecutionWorkspaceId) {
          executionWorkspaceId = nextExecutionWorkspaceId ?? null;
          if (!executionWorkspaceId || createdIds.length === 0) return;
          await db
            .update(workspaceOperations)
            .set({
              executionWorkspaceId,
              updatedAt: new Date(),
            })
            .where(inArray(workspaceOperations.id, createdIds));
        },

        async recordOperation(recordInput) {
          const startedAt = new Date();
          const id = randomUUID();
          const handle = await logStore.begin({
            companyId: input.companyId,
            operationId: id,
          });

          let stdoutExcerpt = "";
          let stderrExcerpt = "";
          const append = async (stream: "stdout" | "stderr" | "system", chunk: string | null | undefined) => {
            if (!chunk) return;
            const sanitizedChunk = redactCurrentUserText(chunk);
            if (stream === "stdout") stdoutExcerpt = appendExcerpt(stdoutExcerpt, sanitizedChunk);
            if (stream === "stderr") stderrExcerpt = appendExcerpt(stderrExcerpt, sanitizedChunk);
            await logStore.append(handle, {
              stream,
              chunk: sanitizedChunk,
              ts: new Date().toISOString(),
            });
          };

          await db.insert(workspaceOperations).values({
            id,
            companyId: input.companyId,
            executionWorkspaceId,
            heartbeatRunId: input.heartbeatRunId ?? null,
            phase: recordInput.phase,
            command: recordInput.command ?? null,
            cwd: recordInput.cwd ?? null,
            status: "running",
            logStore: handle.store,
            logRef: handle.logRef,
            metadata: redactCurrentUserValue(recordInput.metadata ?? null) as Record<string, unknown> | null,
            startedAt,
          });
          createdIds.push(id);

          try {
            const result = await recordInput.run();
            await append("system", result.system ?? null);
            await append("stdout", result.stdout ?? null);
            await append("stderr", result.stderr ?? null);
            const finalized = await logStore.finalize(handle);
            const finishedAt = new Date();
            const row = await db
              .update(workspaceOperations)
              .set({
                executionWorkspaceId,
                status: result.status ?? "succeeded",
                exitCode: result.exitCode ?? null,
                stdoutExcerpt: stdoutExcerpt || null,
                stderrExcerpt: stderrExcerpt || null,
                logBytes: finalized.bytes,
                logSha256: finalized.sha256,
                logCompressed: finalized.compressed,
                metadata: redactCurrentUserValue(
                  combineMetadata(recordInput.metadata, result.metadata),
                ) as Record<string, unknown> | null,
                finishedAt,
                updatedAt: finishedAt,
              })
              .where(eq(workspaceOperations.id, id))
              .returning()
              .then((rows) => rows[0] ?? null);
            if (!row) throw notFound("Workspace operation not found");
            return toWorkspaceOperation(row);
          } catch (error) {
            await append("stderr", error instanceof Error ? error.message : String(error));
            const finalized = await logStore.finalize(handle).catch(() => null);
            const finishedAt = new Date();
            await db
              .update(workspaceOperations)
              .set({
                executionWorkspaceId,
                status: "failed",
                stdoutExcerpt: stdoutExcerpt || null,
                stderrExcerpt: stderrExcerpt || null,
                logBytes: finalized?.bytes ?? null,
                logSha256: finalized?.sha256 ?? null,
                logCompressed: finalized?.compressed ?? false,
                finishedAt,
                updatedAt: finishedAt,
              })
              .where(eq(workspaceOperations.id, id));
            throw error;
          }
        },
      };
    },

    listForRun: async (runId: string, executionWorkspaceId?: string | null) => {
      const conditions = [eq(workspaceOperations.heartbeatRunId, runId)];
      if (executionWorkspaceId) {
        const cleanupCondition = and(
          eq(workspaceOperations.executionWorkspaceId, executionWorkspaceId)!,
          isNull(workspaceOperations.heartbeatRunId),
        )!;
        if (cleanupCondition) conditions.push(cleanupCondition);
      }

      const rows = await db
        .select()
        .from(workspaceOperations)
        .where(conditions.length === 1 ? conditions[0]! : or(...conditions)!)
        .orderBy(asc(workspaceOperations.startedAt), asc(workspaceOperations.createdAt), asc(workspaceOperations.id));

      return rows.map(toWorkspaceOperation);
    },

    listForExecutionWorkspace: async (executionWorkspaceId: string) => {
      const rows = await db
        .select()
        .from(workspaceOperations)
        .where(eq(workspaceOperations.executionWorkspaceId, executionWorkspaceId))
        .orderBy(desc(workspaceOperations.startedAt), desc(workspaceOperations.createdAt));
      return rows.map(toWorkspaceOperation);
    },

    readLog: async (operationId: string, opts?: { offset?: number; limitBytes?: number }) => {
      const operation = await getById(operationId);
      if (!operation) throw notFound("Workspace operation not found");
      if (!operation.logStore || !operation.logRef) throw notFound("Workspace operation log not found");

      const result = await logStore.read(
        {
          store: operation.logStore as "local_file",
          logRef: operation.logRef,
        },
        opts,
      );

      return {
        operationId,
        store: operation.logStore,
        logRef: operation.logRef,
        ...result,
        content: redactCurrentUserText(result.content),
      };
    },
  };
}

export { toWorkspaceOperation };
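The two small helpers in this file, `appendExcerpt` and `combineMetadata`, carry the excerpt-capping and metadata-merge semantics the recorder relies on. Restated standalone (verbatim from the file above) so their behavior can be checked in isolation:

```typescript
// Keeps only the trailing 4096 characters of the combined string, so excerpt
// columns stay bounded no matter how much output a command produces.
function appendExcerpt(current: string, chunk: string) {
  return `${current}${chunk}`.slice(-4096);
}

// Patch keys win over base keys; null when both sides are absent, so the
// metadata column stays NULL instead of storing an empty object.
function combineMetadata(
  base: Record<string, unknown> | null | undefined,
  patch: Record<string, unknown> | null | undefined,
) {
  if (!base && !patch) return null;
  return { ...(base ?? {}), ...(patch ?? {}) };
}
```

Keeping the tail (rather than the head) of the output is deliberate: the end of a failed command's output usually contains the error.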
@@ -10,6 +10,7 @@ import { workspaceRuntimeServices } from "@paperclipai/db";
 import { and, desc, eq, inArray } from "drizzle-orm";
 import { asNumber, asString, parseObject, renderTemplate } from "../adapters/utils.js";
 import { resolveHomeAwarePath } from "../home-paths.js";
+import type { WorkspaceOperationRecorder } from "./workspace-operations.js";
 
 export interface ExecutionWorkspaceInput {
   baseCwd: string;
@@ -46,6 +47,7 @@ export interface RuntimeServiceRef {
   companyId: string;
   projectId: string | null;
   projectWorkspaceId: string | null;
+  executionWorkspaceId: string | null;
   issueId: string | null;
   serviceName: string;
   status: "starting" | "running" | "stopped" | "failed";
@@ -92,6 +94,17 @@ function stableStringify(value: unknown): string {
   return JSON.stringify(value);
 }
 
+export function sanitizeRuntimeServiceBaseEnv(baseEnv: NodeJS.ProcessEnv): NodeJS.ProcessEnv {
+  const env: NodeJS.ProcessEnv = { ...baseEnv };
+  for (const key of Object.keys(env)) {
+    if (key.startsWith("PAPERCLIP_")) {
+      delete env[key];
+    }
+  }
+  delete env.DATABASE_URL;
+  return env;
+}
+
 function stableRuntimeServiceId(input: {
   adapterType: string;
   runId: string;
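`sanitizeRuntimeServiceBaseEnv` strips platform-injected state before the environment is handed to a runtime service: every `PAPERCLIP_`-prefixed variable plus `DATABASE_URL`. The same logic, restated with a plain `Record` instead of `NodeJS.ProcessEnv` so the sketch is self-contained:

```typescript
// Verbatim logic from the hunk above; only the type is widened so the sketch
// does not depend on Node type declarations.
function sanitizeRuntimeServiceBaseEnv(
  baseEnv: Record<string, string | undefined>,
): Record<string, string | undefined> {
  const env = { ...baseEnv };
  for (const key of Object.keys(env)) {
    if (key.startsWith("PAPERCLIP_")) {
      delete env[key];
    }
  }
  // The service must never inherit the platform's database connection string.
  delete env.DATABASE_URL;
  return env;
}
```

Note that the input object is copied first, so the caller's `process.env` is never mutated.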
@@ -126,6 +139,7 @@ function toRuntimeServiceRef(record: RuntimeServiceRecord, overrides?: Partial<R
     companyId: record.companyId,
     projectId: record.projectId,
     projectWorkspaceId: record.projectWorkspaceId,
+    executionWorkspaceId: record.executionWorkspaceId,
     issueId: record.issueId,
     serviceName: record.serviceName,
     status: record.status,
@@ -208,12 +222,23 @@ function resolveConfiguredPath(value: string, baseDir: string): string {
   return path.resolve(baseDir, value);
 }
 
-async function runGit(args: string[], cwd: string): Promise<string> {
+function formatCommandForDisplay(command: string, args: string[]) {
+  return [command, ...args]
+    .map((part) => (/^[A-Za-z0-9_./:-]+$/.test(part) ? part : JSON.stringify(part)))
+    .join(" ");
+}
+
+async function executeProcess(input: {
+  command: string;
+  args: string[];
+  cwd: string;
+  env?: NodeJS.ProcessEnv;
+}): Promise<{ stdout: string; stderr: string; code: number | null }> {
   const proc = await new Promise<{ stdout: string; stderr: string; code: number | null }>((resolve, reject) => {
-    const child = spawn("git", args, {
-      cwd,
+    const child = spawn(input.command, input.args, {
+      cwd: input.cwd,
       stdio: ["ignore", "pipe", "pipe"],
-      env: process.env,
+      env: input.env ?? process.env,
     });
     let stdout = "";
     let stderr = "";
@@ -226,16 +251,45 @@ async function runGit(args: string[], cwd: string): Promise<string> {
     child.on("error", reject);
     child.on("close", (code) => resolve({ stdout, stderr, code }));
   });
+  return proc;
+}
+
+async function runGit(args: string[], cwd: string): Promise<string> {
+  const proc = await executeProcess({
+    command: "git",
+    args,
+    cwd,
+  });
   if (proc.code !== 0) {
     throw new Error(proc.stderr.trim() || proc.stdout.trim() || `git ${args.join(" ")} failed`);
   }
   return proc.stdout.trim();
 }
 
+function gitErrorIncludes(error: unknown, needle: string) {
+  const message = error instanceof Error ? error.message : String(error);
+  return message.toLowerCase().includes(needle.toLowerCase());
+}
+
 async function directoryExists(value: string) {
   return fs.stat(value).then((stats) => stats.isDirectory()).catch(() => false);
 }
 
+function terminateChildProcess(child: ChildProcess) {
+  if (!child.pid) return;
+  if (process.platform !== "win32") {
+    try {
+      process.kill(-child.pid, "SIGTERM");
+      return;
+    } catch {
+      // Fall through to the direct child kill.
+    }
+  }
+  if (!child.killed) {
+    child.kill("SIGTERM");
+  }
+}
+
 function buildWorkspaceCommandEnv(input: {
   base: ExecutionWorkspaceInput;
   repoRoot: string;
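Two of the new helpers in these hunks are pure and worth exercising directly: `formatCommandForDisplay` shell-quotes only the argv parts that need it (anything outside `[A-Za-z0-9_./:-]` gets JSON-quoted), and `gitErrorIncludes` does a case-insensitive substring match against an error message. Restated verbatim:

```typescript
// Renders an argv as a display string; safe parts stay bare, the rest are
// JSON-quoted so spaces and special characters are unambiguous.
function formatCommandForDisplay(command: string, args: string[]) {
  return [command, ...args]
    .map((part) => (/^[A-Za-z0-9_./:-]+$/.test(part) ? part : JSON.stringify(part)))
    .join(" ");
}

// Case-insensitive check used to classify git failures (e.g. "already exists").
function gitErrorIncludes(error: unknown, needle: string) {
  const message = error instanceof Error ? error.message : String(error);
  return message.toLowerCase().includes(needle.toLowerCase());
}
```

For example, `formatCommandForDisplay("git", ["worktree", "add", "my path"])` yields `git worktree add "my path"`.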
@@ -274,22 +328,11 @@ async function runWorkspaceCommand(input: {
   label: string;
 }) {
   const shell = process.env.SHELL?.trim() || "/bin/sh";
-  const proc = await new Promise<{ stdout: string; stderr: string; code: number | null }>((resolve, reject) => {
-    const child = spawn(shell, ["-c", input.command], {
-      cwd: input.cwd,
-      env: input.env,
-      stdio: ["ignore", "pipe", "pipe"],
-    });
-    let stdout = "";
-    let stderr = "";
-    child.stdout?.on("data", (chunk) => {
-      stdout += String(chunk);
-    });
-    child.stderr?.on("data", (chunk) => {
-      stderr += String(chunk);
-    });
-    child.on("error", reject);
-    child.on("close", (code) => resolve({ stdout, stderr, code }));
+  const proc = await executeProcess({
+    command: shell,
+    args: ["-c", input.command],
+    cwd: input.cwd,
+    env: input.env,
   });
   if (proc.code === 0) return;
 
@@ -301,6 +344,115 @@ async function runWorkspaceCommand(input: {
   );
 }
 
+async function recordGitOperation(
+  recorder: WorkspaceOperationRecorder | null | undefined,
+  input: {
+    phase: "worktree_prepare" | "worktree_cleanup";
+    args: string[];
+    cwd: string;
+    metadata?: Record<string, unknown> | null;
+    successMessage?: string | null;
+    failureLabel?: string | null;
+  },
+): Promise<string> {
+  if (!recorder) {
+    return runGit(input.args, input.cwd);
+  }
+
+  let stdout = "";
+  let stderr = "";
+  let code: number | null = null;
+  await recorder.recordOperation({
+    phase: input.phase,
+    command: formatCommandForDisplay("git", input.args),
+    cwd: input.cwd,
+    metadata: input.metadata ?? null,
+    run: async () => {
+      const result = await executeProcess({
+        command: "git",
+        args: input.args,
+        cwd: input.cwd,
+      });
+      stdout = result.stdout;
+      stderr = result.stderr;
+      code = result.code;
+      return {
+        status: result.code === 0 ? "succeeded" : "failed",
+        exitCode: result.code,
+        stdout: result.stdout,
+        stderr: result.stderr,
+        system: result.code === 0 ? input.successMessage ?? null : null,
+      };
+    },
+  });
+
+  if (code !== 0) {
+    const details = [stderr.trim(), stdout.trim()].filter(Boolean).join("\n");
+    throw new Error(
+      details.length > 0
+        ? `${input.failureLabel ?? `git ${input.args.join(" ")}`} failed: ${details}`
+        : `${input.failureLabel ?? `git ${input.args.join(" ")}`} failed with exit code ${code ?? -1}`,
+    );
+  }
+  return stdout.trim();
+}
+
+async function recordWorkspaceCommandOperation(
+  recorder: WorkspaceOperationRecorder | null | undefined,
+  input: {
+    phase: "workspace_provision" | "workspace_teardown";
+    command: string;
+    cwd: string;
+    env: NodeJS.ProcessEnv;
+    label: string;
+    metadata?: Record<string, unknown> | null;
+    successMessage?: string | null;
+  },
+) {
+  if (!recorder) {
+    await runWorkspaceCommand(input);
+    return;
+  }
+
+  let stdout = "";
+  let stderr = "";
+  let code: number | null = null;
+  await recorder.recordOperation({
+    phase: input.phase,
+    command: input.command,
+    cwd: input.cwd,
+    metadata: input.metadata ?? null,
+    run: async () => {
+      const shell = process.env.SHELL?.trim() || "/bin/sh";
+      const result = await executeProcess({
+        command: shell,
+        args: ["-c", input.command],
+        cwd: input.cwd,
+        env: input.env,
+      });
+      stdout = result.stdout;
+      stderr = result.stderr;
+      code = result.code;
+      return {
+        status: result.code === 0 ? "succeeded" : "failed",
+        exitCode: result.code,
+        stdout: result.stdout,
+        stderr: result.stderr,
+        system: result.code === 0 ? input.successMessage ?? null : null,
+      };
+    },
+  });
+
+  if (code === 0) return;
+
+  const details = [stderr.trim(), stdout.trim()].filter(Boolean).join("\n");
+  throw new Error(
+    details.length > 0
+      ? `${input.label} failed: ${details}`
+      : `${input.label} failed with exit code ${code ?? -1}`,
+  );
+}
+
 async function provisionExecutionWorktree(input: {
   strategy: Record<string, unknown>;
   base: ExecutionWorkspaceInput;
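Both recorder wrappers above translate a raw process result into a workspace-operation outcome the same way: exit code 0 means `succeeded`, anything else `failed`, and the optional success message is only attached to the `system` stream on success. That mapping can be isolated as a small pure helper (the name `toOperationResult` is hypothetical, introduced here only for illustration):

```typescript
// Mirrors the run() callbacks in recordGitOperation and
// recordWorkspaceCommandOperation above.
function toOperationResult(
  result: { stdout: string; stderr: string; code: number | null },
  successMessage: string | null,
) {
  return {
    status: result.code === 0 ? "succeeded" : "failed",
    exitCode: result.code,
    stdout: result.stdout,
    stderr: result.stderr,
    // The system stream carries the human-readable summary, success only.
    system: result.code === 0 ? successMessage : null,
  };
}
```

Keeping the mapping identical in both wrappers means the operations timeline renders git commands and user-supplied shell commands uniformly.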
@@ -310,11 +462,13 @@ async function provisionExecutionWorktree(input: {
   issue: ExecutionWorkspaceIssueRef | null;
   agent: ExecutionWorkspaceAgentRef;
   created: boolean;
+  recorder?: WorkspaceOperationRecorder | null;
 }) {
   const provisionCommand = asString(input.strategy.provisionCommand, "").trim();
   if (!provisionCommand) return;
 
-  await runWorkspaceCommand({
+  await recordWorkspaceCommandOperation(input.recorder, {
+    phase: "workspace_provision",
     command: provisionCommand,
     cwd: input.worktreePath,
     env: buildWorkspaceCommandEnv({
@@ -327,14 +481,71 @@ async function provisionExecutionWorktree(input: {
       created: input.created,
     }),
     label: `Execution workspace provision command "${provisionCommand}"`,
+    metadata: {
+      repoRoot: input.repoRoot,
+      worktreePath: input.worktreePath,
+      branchName: input.branchName,
+      created: input.created,
+    },
+    successMessage: `Provisioned workspace at ${input.worktreePath}\n`,
   });
 }
 
+function buildExecutionWorkspaceCleanupEnv(input: {
+  workspace: {
+    cwd: string | null;
+    providerRef: string | null;
+    branchName: string | null;
+    repoUrl: string | null;
+    baseRef: string | null;
+    projectId: string | null;
+    projectWorkspaceId: string | null;
+    sourceIssueId: string | null;
+  };
+  projectWorkspaceCwd?: string | null;
+}) {
+  const env: NodeJS.ProcessEnv = sanitizeRuntimeServiceBaseEnv(process.env);
+  env.PAPERCLIP_WORKSPACE_CWD = input.workspace.cwd ?? "";
+  env.PAPERCLIP_WORKSPACE_PATH = input.workspace.cwd ?? "";
+  env.PAPERCLIP_WORKSPACE_WORKTREE_PATH =
+    input.workspace.providerRef ?? input.workspace.cwd ?? "";
+  env.PAPERCLIP_WORKSPACE_BRANCH = input.workspace.branchName ?? "";
+  env.PAPERCLIP_WORKSPACE_BASE_CWD = input.projectWorkspaceCwd ?? "";
+  env.PAPERCLIP_WORKSPACE_REPO_ROOT = input.projectWorkspaceCwd ?? "";
+  env.PAPERCLIP_WORKSPACE_REPO_URL = input.workspace.repoUrl ?? "";
+  env.PAPERCLIP_WORKSPACE_REPO_REF = input.workspace.baseRef ?? "";
+  env.PAPERCLIP_PROJECT_ID = input.workspace.projectId ?? "";
+  env.PAPERCLIP_PROJECT_WORKSPACE_ID = input.workspace.projectWorkspaceId ?? "";
+  env.PAPERCLIP_ISSUE_ID = input.workspace.sourceIssueId ?? "";
+  return env;
+}
+
+async function resolveGitRepoRootForWorkspaceCleanup(
+  worktreePath: string,
+  projectWorkspaceCwd: string | null,
+): Promise<string | null> {
+  if (projectWorkspaceCwd) {
+    const resolvedProjectWorkspaceCwd = path.resolve(projectWorkspaceCwd);
+    const gitDir = await runGit(["rev-parse", "--git-common-dir"], resolvedProjectWorkspaceCwd)
+      .catch(() => null);
+    if (gitDir) {
+      const resolvedGitDir = path.resolve(resolvedProjectWorkspaceCwd, gitDir);
+      return path.dirname(resolvedGitDir);
+    }
+  }
+
+  const gitDir = await runGit(["rev-parse", "--git-common-dir"], worktreePath).catch(() => null);
+  if (!gitDir) return null;
+  const resolvedGitDir = path.resolve(worktreePath, gitDir);
+  return path.dirname(resolvedGitDir);
+}
+
 export async function realizeExecutionWorkspace(input: {
   base: ExecutionWorkspaceInput;
   config: Record<string, unknown>;
   issue: ExecutionWorkspaceIssueRef | null;
   agent: ExecutionWorkspaceAgentRef;
+  recorder?: WorkspaceOperationRecorder | null;
 }): Promise<RealizedExecutionWorkspace> {
   const rawStrategy = parseObject(input.config.workspaceStrategy);
   const strategyType = asString(rawStrategy.type, "project_primary");
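`buildExecutionWorkspaceCleanupEnv` establishes a simple contract for cleanup shell commands: every `PAPERCLIP_*` variable is always defined, with missing fields collapsing to the empty string. A reduced sketch of that contract, using a plain `Record` instead of `NodeJS.ProcessEnv` to stay self-contained (only three of the variables are shown):

```typescript
// Reduced sketch of the cleanup env contract: stringly-typed, always present,
// "" when the underlying field is null.
function buildCleanupEnvSketch(workspace: {
  cwd: string | null;
  branchName: string | null;
  sourceIssueId: string | null;
}): Record<string, string> {
  return {
    PAPERCLIP_WORKSPACE_CWD: workspace.cwd ?? "",
    PAPERCLIP_WORKSPACE_BRANCH: workspace.branchName ?? "",
    PAPERCLIP_ISSUE_ID: workspace.sourceIssueId ?? "",
  };
}
```

Defining every variable, even as `""`, lets cleanup commands use `"$PAPERCLIP_WORKSPACE_BRANCH"` unconditionally instead of probing for unset variables.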
@@ -372,6 +583,25 @@ export async function realizeExecutionWorkspace(input: {
   if (existingWorktree) {
     const existingGitDir = await runGit(["rev-parse", "--git-dir"], worktreePath).catch(() => null);
     if (existingGitDir) {
+      if (input.recorder) {
+        await input.recorder.recordOperation({
+          phase: "worktree_prepare",
+          cwd: repoRoot,
+          metadata: {
+            repoRoot,
+            worktreePath,
+            branchName,
+            baseRef,
+            created: false,
+            reused: true,
+          },
+          run: async () => ({
+            status: "succeeded",
+            exitCode: 0,
+            system: `Reused existing git worktree at ${worktreePath}\n`,
+          }),
+        });
+      }
       await provisionExecutionWorktree({
         strategy: rawStrategy,
         base: input.base,
@@ -381,6 +611,7 @@ export async function realizeExecutionWorkspace(input: {
         issue: input.issue,
         agent: input.agent,
         created: false,
+        recorder: input.recorder ?? null,
       });
       return {
         ...input.base,
@@ -395,7 +626,41 @@ export async function realizeExecutionWorkspace(input: {
     throw new Error(`Configured worktree path "${worktreePath}" already exists and is not a git worktree.`);
   }
 
-  await runGit(["worktree", "add", "-B", branchName, worktreePath, baseRef], repoRoot);
+  try {
+    await recordGitOperation(input.recorder, {
+      phase: "worktree_prepare",
+      args: ["worktree", "add", "-b", branchName, worktreePath, baseRef],
+      cwd: repoRoot,
+      metadata: {
+        repoRoot,
+        worktreePath,
+        branchName,
+        baseRef,
+        created: true,
+      },
+      successMessage: `Created git worktree at ${worktreePath}\n`,
+      failureLabel: `git worktree add ${worktreePath}`,
+    });
+  } catch (error) {
+    if (!gitErrorIncludes(error, "already exists")) {
+      throw error;
+    }
+    await recordGitOperation(input.recorder, {
+      phase: "worktree_prepare",
+      args: ["worktree", "add", worktreePath, branchName],
+      cwd: repoRoot,
+      metadata: {
+        repoRoot,
+        worktreePath,
+        branchName,
+        baseRef,
+        created: false,
+        reusedExistingBranch: true,
+      },
+      successMessage: `Attached existing branch ${branchName} at ${worktreePath}\n`,
+      failureLabel: `git worktree add ${worktreePath}`,
+    });
+  }
   await provisionExecutionWorktree({
     strategy: rawStrategy,
     base: input.base,
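The try/catch pattern in this hunk first attempts `git worktree add -b` (create the branch), and on an "already exists" failure retries without `-b` to attach the existing branch instead. The retry decision can be isolated as a pure helper, sketched here under the hypothetical name `worktreeAddRetryArgs` (not part of the diff; for illustration only):

```typescript
// Given the error from the first `git worktree add -b` attempt, return the
// argv for a retry that attaches the existing branch, or null to rethrow.
function worktreeAddRetryArgs(
  error: unknown,
  branchName: string,
  worktreePath: string,
): string[] | null {
  const message = error instanceof Error ? error.message : String(error);
  // Mirrors gitErrorIncludes(error, "already exists") above.
  if (!message.toLowerCase().includes("already exists")) return null;
  return ["worktree", "add", worktreePath, branchName];
}
```

Matching on the error message is admittedly fragile (git's wording can vary by version), which is why the code above scopes the check to a lowercase substring.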
@@ -405,6 +670,7 @@ export async function realizeExecutionWorkspace(input: {
     issue: input.issue,
     agent: input.agent,
     created: true,
+    recorder: input.recorder ?? null,
   });
 
   return {
@@ -418,6 +684,158 @@ export async function realizeExecutionWorkspace(input: {
   };
 }
 
+export async function cleanupExecutionWorkspaceArtifacts(input: {
+  workspace: {
+    id: string;
+    cwd: string | null;
+    providerType: string;
+    providerRef: string | null;
+    branchName: string | null;
+    repoUrl: string | null;
+    baseRef: string | null;
+    projectId: string | null;
+    projectWorkspaceId: string | null;
+    sourceIssueId: string | null;
+    metadata?: Record<string, unknown> | null;
+  };
+  projectWorkspace?: {
+    cwd: string | null;
+    cleanupCommand: string | null;
+  } | null;
+  teardownCommand?: string | null;
+  recorder?: WorkspaceOperationRecorder | null;
+}) {
+  const warnings: string[] = [];
+  const workspacePath = input.workspace.providerRef ?? input.workspace.cwd;
+  const cleanupEnv = buildExecutionWorkspaceCleanupEnv({
+    workspace: input.workspace,
+    projectWorkspaceCwd: input.projectWorkspace?.cwd ?? null,
+  });
+  const createdByRuntime = input.workspace.metadata?.createdByRuntime === true;
+  const cleanupCommands = [
+    input.projectWorkspace?.cleanupCommand ?? null,
+    input.teardownCommand ?? null,
+  ]
+    .map((value) => asString(value, "").trim())
+    .filter(Boolean);
+
+  for (const command of cleanupCommands) {
+    try {
+      await recordWorkspaceCommandOperation(input.recorder, {
+        phase: "workspace_teardown",
+        command,
+        cwd: workspacePath ?? input.projectWorkspace?.cwd ?? process.cwd(),
+        env: cleanupEnv,
+        label: `Execution workspace cleanup command "${command}"`,
+        metadata: {
+          workspaceId: input.workspace.id,
+          workspacePath,
+          branchName: input.workspace.branchName,
+          providerType: input.workspace.providerType,
+        },
+        successMessage: `Completed cleanup command "${command}"\n`,
+      });
+    } catch (err) {
+      warnings.push(err instanceof Error ? err.message : String(err));
+    }
+  }
+
+  if (input.workspace.providerType === "git_worktree" && workspacePath) {
+    const repoRoot = await resolveGitRepoRootForWorkspaceCleanup(
+      workspacePath,
+      input.projectWorkspace?.cwd ?? null,
+    );
+    const worktreeExists = await directoryExists(workspacePath);
+    if (worktreeExists) {
+      if (!repoRoot) {
+        warnings.push(`Could not resolve git repo root for "${workspacePath}".`);
+      } else {
+        try {
+          await recordGitOperation(input.recorder, {
+            phase: "worktree_cleanup",
+            args: ["worktree", "remove", "--force", workspacePath],
+            cwd: repoRoot,
+            metadata: {
+              workspaceId: input.workspace.id,
+              workspacePath,
+              branchName: input.workspace.branchName,
+              cleanupAction: "worktree_remove",
+            },
+            successMessage: `Removed git worktree ${workspacePath}\n`,
+            failureLabel: `git worktree remove ${workspacePath}`,
+          });
+        } catch (err) {
+          warnings.push(err instanceof Error ? err.message : String(err));
+        }
+      }
+    }
+    if (createdByRuntime && input.workspace.branchName) {
+      if (!repoRoot) {
+        warnings.push(`Could not resolve git repo root to delete branch "${input.workspace.branchName}".`);
|
||||||
|
} else {
|
||||||
|
try {
|
||||||
|
await recordGitOperation(input.recorder, {
|
||||||
|
phase: "worktree_cleanup",
|
||||||
|
args: ["branch", "-d", input.workspace.branchName],
|
||||||
|
cwd: repoRoot,
|
||||||
|
metadata: {
|
||||||
|
workspaceId: input.workspace.id,
|
||||||
|
workspacePath,
|
||||||
|
branchName: input.workspace.branchName,
|
||||||
|
cleanupAction: "branch_delete",
|
||||||
|
},
|
||||||
|
successMessage: `Deleted branch ${input.workspace.branchName}\n`,
|
||||||
|
failureLabel: `git branch -d ${input.workspace.branchName}`,
|
||||||
|
});
|
||||||
|
} catch (err) {
|
||||||
|
const message = err instanceof Error ? err.message : String(err);
|
||||||
|
warnings.push(`Skipped deleting branch "${input.workspace.branchName}": ${message}`);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
} else if (input.workspace.providerType === "local_fs" && createdByRuntime && workspacePath) {
|
||||||
|
const projectWorkspaceCwd = input.projectWorkspace?.cwd ? path.resolve(input.projectWorkspace.cwd) : null;
|
||||||
|
const resolvedWorkspacePath = path.resolve(workspacePath);
|
||||||
|
const containsProjectWorkspace = projectWorkspaceCwd
|
||||||
|
? (
|
||||||
|
resolvedWorkspacePath === projectWorkspaceCwd ||
|
||||||
|
projectWorkspaceCwd.startsWith(`${resolvedWorkspacePath}${path.sep}`)
|
||||||
|
)
|
||||||
|
: false;
|
||||||
|
if (containsProjectWorkspace) {
|
||||||
|
warnings.push(`Refusing to remove path "${workspacePath}" because it contains the project workspace.`);
|
||||||
|
} else {
|
||||||
|
await fs.rm(resolvedWorkspacePath, { recursive: true, force: true });
|
||||||
|
if (input.recorder) {
|
||||||
|
await input.recorder.recordOperation({
|
||||||
|
phase: "workspace_teardown",
|
||||||
|
cwd: projectWorkspaceCwd ?? process.cwd(),
|
||||||
|
metadata: {
|
||||||
|
workspaceId: input.workspace.id,
|
||||||
|
workspacePath: resolvedWorkspacePath,
|
||||||
|
cleanupAction: "remove_local_fs",
|
||||||
|
},
|
||||||
|
run: async () => ({
|
||||||
|
status: "succeeded",
|
||||||
|
exitCode: 0,
|
||||||
|
system: `Removed local workspace directory ${resolvedWorkspacePath}\n`,
|
||||||
|
}),
|
||||||
|
});
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
const cleaned =
|
||||||
|
!workspacePath ||
|
||||||
|
!(await directoryExists(workspacePath));
|
||||||
|
|
||||||
|
return {
|
||||||
|
cleanedPath: workspacePath,
|
||||||
|
cleaned,
|
||||||
|
warnings,
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
async function allocatePort(): Promise<number> {
|
async function allocatePort(): Promise<number> {
|
||||||
return await new Promise<number>((resolve, reject) => {
|
return await new Promise<number>((resolve, reject) => {
|
||||||
const server = net.createServer();
|
const server = net.createServer();
|
||||||
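The `local_fs` branch above refuses to `rm -rf` a path that contains the project workspace. That containment check can be isolated as a small pure helper; this is a standalone sketch (the function name is illustrative, not part of the diff), using only Node's `path` module:

```typescript
import * as path from "node:path";

// Returns true when deleting `candidate` would also delete `protectedDir`,
// i.e. the two resolve to the same path or protectedDir lives inside candidate.
function containsProtectedPath(candidate: string, protectedDir: string | null): boolean {
  if (!protectedDir) return false;
  const resolvedCandidate = path.resolve(candidate);
  const resolvedProtected = path.resolve(protectedDir);
  return (
    resolvedCandidate === resolvedProtected ||
    // Append path.sep so "/tmp/ws2" is not mistaken for a child of "/tmp/ws".
    resolvedProtected.startsWith(`${resolvedCandidate}${path.sep}`)
  );
}
```

Note the `path.sep` suffix: a bare `startsWith` would wrongly treat sibling directories with a shared prefix as nested.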
@@ -471,6 +889,7 @@ function buildTemplateData(input: {
 function resolveServiceScopeId(input: {
   service: Record<string, unknown>;
   workspace: RealizedExecutionWorkspace;
+  executionWorkspaceId?: string | null;
   issue: ExecutionWorkspaceIssueRef | null;
   runId: string;
   agent: ExecutionWorkspaceAgentRef;
@@ -486,7 +905,9 @@ function resolveServiceScopeId(input: {
     ? scopeTypeRaw
     : "run";
   if (scopeType === "project_workspace") return { scopeType, scopeId: input.workspace.workspaceId ?? input.workspace.projectId };
-  if (scopeType === "execution_workspace") return { scopeType, scopeId: input.workspace.cwd };
+  if (scopeType === "execution_workspace") {
+    return { scopeType, scopeId: input.executionWorkspaceId ?? input.workspace.cwd };
+  }
   if (scopeType === "agent") return { scopeType, scopeId: input.agent.id };
   return { scopeType: "run" as const, scopeId: input.runId };
 }
@@ -521,6 +942,7 @@ function toPersistedWorkspaceRuntimeService(record: RuntimeServiceRecord): typeo
   companyId: record.companyId,
   projectId: record.projectId,
   projectWorkspaceId: record.projectWorkspaceId,
+  executionWorkspaceId: record.executionWorkspaceId,
   issueId: record.issueId,
   scopeType: record.scopeType,
   scopeId: record.scopeId,
@@ -556,6 +978,7 @@ async function persistRuntimeServiceRecord(db: Db | undefined, record: RuntimeSe
     set: {
       projectId: values.projectId,
       projectWorkspaceId: values.projectWorkspaceId,
+      executionWorkspaceId: values.executionWorkspaceId,
      issueId: values.issueId,
      scopeType: values.scopeType,
      scopeId: values.scopeId,
@@ -593,6 +1016,7 @@ export function normalizeAdapterManagedRuntimeServices(input: {
   agent: ExecutionWorkspaceAgentRef;
   issue: ExecutionWorkspaceIssueRef | null;
   workspace: RealizedExecutionWorkspace;
+  executionWorkspaceId?: string | null;
   reports: AdapterRuntimeServiceReport[];
   now?: Date;
 }): RuntimeServiceRef[] {
@@ -604,7 +1028,7 @@ export function normalizeAdapterManagedRuntimeServices(input: {
     (scopeType === "project_workspace"
       ? input.workspace.workspaceId
       : scopeType === "execution_workspace"
-        ? input.workspace.cwd
+        ? input.executionWorkspaceId ?? input.workspace.cwd
        : scopeType === "agent"
          ? input.agent.id
          : input.runId) ??
@@ -629,6 +1053,7 @@ export function normalizeAdapterManagedRuntimeServices(input: {
       companyId: input.agent.companyId,
       projectId: report.projectId ?? input.workspace.projectId,
       projectWorkspaceId: report.projectWorkspaceId ?? input.workspace.workspaceId,
+      executionWorkspaceId: input.executionWorkspaceId ?? null,
       issueId: report.issueId ?? input.issue?.id ?? null,
       serviceName,
       status,
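The change in these hunks is a consistent fallback order: a durable `executionWorkspaceId` becomes the preferred scope key, with the workspace `cwd` kept only as a compatibility fallback and the run id as the last resort. A hedged standalone sketch of that chain (types and the helper name are illustrative, simplified from the diff's `resolveServiceScopeId`):

```typescript
type ScopeType = "project_workspace" | "execution_workspace" | "agent" | "run";

// Picks the identifier a runtime service is keyed on, mirroring the diff's
// fallback order for each scope type. runId is the terminal fallback.
function pickScopeId(input: {
  scopeType: ScopeType;
  workspaceId: string | null; // project workspace id
  executionWorkspaceId: string | null; // durable execution_workspaces row id
  cwd: string | null; // legacy fallback key
  agentId: string;
  runId: string;
}): string {
  switch (input.scopeType) {
    case "project_workspace":
      return input.workspaceId ?? input.runId;
    case "execution_workspace":
      return input.executionWorkspaceId ?? input.cwd ?? input.runId;
    case "agent":
      return input.agentId;
    default:
      return input.runId;
  }
}
```

Keying on the durable id rather than the cwd means a reused workspace keeps the same scope even if it is realized at a different path later.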
@@ -660,6 +1085,7 @@ async function startLocalRuntimeService(input: {
   agent: ExecutionWorkspaceAgentRef;
   issue: ExecutionWorkspaceIssueRef | null;
   workspace: RealizedExecutionWorkspace;
+  executionWorkspaceId?: string | null;
   adapterEnv: Record<string, string>;
   service: Record<string, unknown>;
   onLog?: (stream: "stdout" | "stderr", chunk: string) => Promise<void>;
@@ -683,7 +1109,10 @@ async function startLocalRuntimeService(input: {
     port,
   });
   const serviceCwd = resolveConfiguredPath(renderTemplate(serviceCwdTemplate, templateData), input.workspace.cwd);
-  const env: Record<string, string> = { ...process.env, ...input.adapterEnv } as Record<string, string>;
+  const env: Record<string, string> = {
+    ...sanitizeRuntimeServiceBaseEnv(process.env),
+    ...input.adapterEnv,
+  } as Record<string, string>;
   for (const [key, value] of Object.entries(envConfig)) {
     if (typeof value === "string") {
       env[key] = renderTemplate(value, templateData);
@@ -697,7 +1126,7 @@ async function startLocalRuntimeService(input: {
   const child = spawn(shell, ["-lc", command], {
     cwd: serviceCwd,
     env,
-    detached: false,
+    detached: process.platform !== "win32",
     stdio: ["ignore", "pipe", "pipe"],
   });
   let stderrExcerpt = "";
@@ -723,7 +1152,7 @@ async function startLocalRuntimeService(input: {
   try {
     await waitForReadiness({ service: input.service, url });
   } catch (err) {
-    child.kill("SIGTERM");
+    terminateChildProcess(child);
     throw new Error(
       `Failed to start runtime service "${serviceName}": ${err instanceof Error ? err.message : String(err)}${stderrExcerpt ? ` | stderr: ${stderrExcerpt.trim()}` : ""}`,
     );
@@ -735,6 +1164,7 @@ async function startLocalRuntimeService(input: {
     companyId: input.agent.companyId,
     projectId: input.workspace.projectId,
     projectWorkspaceId: input.workspace.workspaceId,
+    executionWorkspaceId: input.executionWorkspaceId ?? null,
    issueId: input.issue?.id ?? null,
    serviceName,
    status: "running",
@@ -781,8 +1211,8 @@ async function stopRuntimeService(serviceId: string) {
   record.status = "stopped";
   record.lastUsedAt = new Date().toISOString();
   record.stoppedAt = new Date().toISOString();
-  if (record.child && !record.child.killed) {
-    record.child.kill("SIGTERM");
+  if (record.child && record.child.pid) {
+    terminateChildProcess(record.child);
   }
   runtimeServicesById.delete(serviceId);
   if (record.reuseKey) {
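These hunks spawn the service `detached` on non-Windows platforms and replace the plain `child.kill("SIGTERM")` with `terminateChildProcess(child)`, whose body is outside this diff. One plausible reading, stated as an assumption: detaching gives the shell-spawned service its own POSIX process group, so termination can signal the negative pid and reach grandchildren the shell forked. A sketch of just the signal-target selection (the helper name and shape are hypothetical, not taken from the diff):

```typescript
// Hypothetical helper: choose what to pass to process.kill() when tearing down
// a runtime service child. On POSIX, kill(-pid) signals the whole process
// group, which only exists when the child was spawned with detached: true.
// Windows has no process groups in this sense, so the plain pid is used.
function signalTarget(pid: number, detached: boolean, platform: NodeJS.Platform): number {
  return detached && platform !== "win32" ? -pid : pid;
}
```

Under that assumption, `detached: process.platform !== "win32"` and the group-signalling teardown are two halves of the same fix: without the group, killing the `sh -lc` wrapper can orphan the actual server process it started.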
@@ -791,6 +1221,28 @@
   await persistRuntimeServiceRecord(record.db, record);
 }
+
+async function markPersistedRuntimeServicesStoppedForExecutionWorkspace(input: {
+  db: Db;
+  executionWorkspaceId: string;
+}) {
+  const now = new Date();
+  await input.db
+    .update(workspaceRuntimeServices)
+    .set({
+      status: "stopped",
+      healthStatus: "unknown",
+      stoppedAt: now,
+      lastUsedAt: now,
+      updatedAt: now,
+    })
+    .where(
+      and(
+        eq(workspaceRuntimeServices.executionWorkspaceId, input.executionWorkspaceId),
+        inArray(workspaceRuntimeServices.status, ["starting", "running"]),
+      ),
+    );
+}
+
 function registerRuntimeService(db: Db | undefined, record: RuntimeServiceRecord) {
   record.db = db;
   runtimeServicesById.set(record.id, record);
@@ -820,6 +1272,7 @@ export async function ensureRuntimeServicesForRun(input: {
   agent: ExecutionWorkspaceAgentRef;
   issue: ExecutionWorkspaceIssueRef | null;
   workspace: RealizedExecutionWorkspace;
+  executionWorkspaceId?: string | null;
   config: Record<string, unknown>;
   adapterEnv: Record<string, string>;
   onLog?: (stream: "stdout" | "stderr", chunk: string) => Promise<void>;
@@ -838,6 +1291,7 @@ export async function ensureRuntimeServicesForRun(input: {
     const { scopeType, scopeId } = resolveServiceScopeId({
       service,
       workspace: input.workspace,
+      executionWorkspaceId: input.executionWorkspaceId,
       issue: input.issue,
       runId: input.runId,
       agent: input.agent,
@@ -871,6 +1325,7 @@ export async function ensureRuntimeServicesForRun(input: {
       agent: input.agent,
       issue: input.issue,
       workspace: input.workspace,
+      executionWorkspaceId: input.executionWorkspaceId,
      adapterEnv: input.adapterEnv,
      service,
      onLog: input.onLog,
@@ -911,6 +1366,36 @@ export async function releaseRuntimeServicesForRun(runId: string) {
   }
 }
+
+export async function stopRuntimeServicesForExecutionWorkspace(input: {
+  db?: Db;
+  executionWorkspaceId: string;
+  workspaceCwd?: string | null;
+}) {
+  const normalizedWorkspaceCwd = input.workspaceCwd ? path.resolve(input.workspaceCwd) : null;
+  const matchingServiceIds = Array.from(runtimeServicesById.values())
+    .filter((record) => {
+      if (record.executionWorkspaceId === input.executionWorkspaceId) return true;
+      if (!normalizedWorkspaceCwd || !record.cwd) return false;
+      const resolvedCwd = path.resolve(record.cwd);
+      return (
+        resolvedCwd === normalizedWorkspaceCwd ||
+        resolvedCwd.startsWith(`${normalizedWorkspaceCwd}${path.sep}`)
+      );
+    })
+    .map((record) => record.id);
+
+  for (const serviceId of matchingServiceIds) {
+    await stopRuntimeService(serviceId);
+  }
+
+  if (input.db) {
+    await markPersistedRuntimeServicesStoppedForExecutionWorkspace({
+      db: input.db,
+      executionWorkspaceId: input.executionWorkspaceId,
+    });
+  }
+}
+
 export async function listWorkspaceRuntimeServicesForProjectWorkspaces(
   db: Db,
   companyId: string,
@@ -978,6 +1463,7 @@ export async function persistAdapterManagedRuntimeServices(input: {
   agent: ExecutionWorkspaceAgentRef;
   issue: ExecutionWorkspaceIssueRef | null;
   workspace: RealizedExecutionWorkspace;
+  executionWorkspaceId?: string | null;
   reports: AdapterRuntimeServiceReport[];
 }) {
   const refs = normalizeAdapterManagedRuntimeServices(input);
@@ -1000,6 +1486,7 @@ export async function persistAdapterManagedRuntimeServices(input: {
       companyId: ref.companyId,
       projectId: ref.projectId,
       projectWorkspaceId: ref.projectWorkspaceId,
+      executionWorkspaceId: ref.executionWorkspaceId,
       issueId: ref.issueId,
       scopeType: ref.scopeType,
       scopeId: ref.scopeId,
@@ -1028,6 +1515,7 @@ export async function persistAdapterManagedRuntimeServices(input: {
       set: {
         projectId: ref.projectId,
         projectWorkspaceId: ref.projectWorkspaceId,
+        executionWorkspaceId: ref.executionWorkspaceId,
        issueId: ref.issueId,
        scopeType: ref.scopeType,
        scopeId: ref.scopeId,
@@ -14,6 +14,7 @@ import { Projects } from "./pages/Projects";
 import { ProjectDetail } from "./pages/ProjectDetail";
 import { Issues } from "./pages/Issues";
 import { IssueDetail } from "./pages/IssueDetail";
+import { ExecutionWorkspaceDetail } from "./pages/ExecutionWorkspaceDetail";
 import { Goals } from "./pages/Goals";
 import { GoalDetail } from "./pages/GoalDetail";
 import { Approvals } from "./pages/Approvals";
@@ -24,6 +25,7 @@ import { Inbox } from "./pages/Inbox";
 import { CompanySettings } from "./pages/CompanySettings";
 import { DesignGuide } from "./pages/DesignGuide";
 import { InstanceSettings } from "./pages/InstanceSettings";
+import { InstanceExperimentalSettings } from "./pages/InstanceExperimentalSettings";
 import { PluginManager } from "./pages/PluginManager";
 import { PluginSettings } from "./pages/PluginSettings";
 import { PluginPage } from "./pages/PluginPage";
@@ -141,6 +143,7 @@ function boardRoutes() {
       <Route path="issues/done" element={<Navigate to="/issues" replace />} />
       <Route path="issues/recent" element={<Navigate to="/issues" replace />} />
       <Route path="issues/:issueId" element={<IssueDetail />} />
+      <Route path="execution-workspaces/:workspaceId" element={<ExecutionWorkspaceDetail />} />
       <Route path="goals" element={<Goals />} />
       <Route path="goals/:goalId" element={<GoalDetail />} />
       <Route path="approvals" element={<Navigate to="/approvals/pending" replace />} />
@@ -305,6 +308,7 @@ export function App() {
       <Route path="instance/settings" element={<Layout />}>
         <Route index element={<Navigate to="heartbeats" replace />} />
         <Route path="heartbeats" element={<InstanceSettings />} />
+        <Route path="experimental" element={<InstanceExperimentalSettings />} />
         <Route path="plugins" element={<PluginManager />} />
         <Route path="plugins/:pluginId" element={<PluginSettings />} />
       </Route>
26	ui/src/api/execution-workspaces.ts	Normal file
@@ -0,0 +1,26 @@
+import type { ExecutionWorkspace } from "@paperclipai/shared";
+import { api } from "./client";
+
+export const executionWorkspacesApi = {
+  list: (
+    companyId: string,
+    filters?: {
+      projectId?: string;
+      projectWorkspaceId?: string;
+      issueId?: string;
+      status?: string;
+      reuseEligible?: boolean;
+    },
+  ) => {
+    const params = new URLSearchParams();
+    if (filters?.projectId) params.set("projectId", filters.projectId);
+    if (filters?.projectWorkspaceId) params.set("projectWorkspaceId", filters.projectWorkspaceId);
+    if (filters?.issueId) params.set("issueId", filters.issueId);
+    if (filters?.status) params.set("status", filters.status);
+    if (filters?.reuseEligible) params.set("reuseEligible", "true");
+    const qs = params.toString();
+    return api.get<ExecutionWorkspace[]>(`/companies/${companyId}/execution-workspaces${qs ? `?${qs}` : ""}`);
+  },
+  get: (id: string) => api.get<ExecutionWorkspace>(`/execution-workspaces/${id}`),
+  update: (id: string, data: Record<string, unknown>) => api.patch<ExecutionWorkspace>(`/execution-workspaces/${id}`, data),
+};
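The `list` helper above serializes only the filters that are present, and omits the `?` entirely when no filter is set. That behavior can be exercised in isolation; a standalone sketch of just the query-string construction (the wrapper function name is illustrative):

```typescript
// Builds the query suffix exactly as the api client above does: boolean
// reuseEligible is stringified as "true", absent filters produce no param,
// and an empty filter set yields an empty suffix rather than a bare "?".
function buildExecutionWorkspaceQuery(filters?: {
  projectId?: string;
  projectWorkspaceId?: string;
  issueId?: string;
  status?: string;
  reuseEligible?: boolean;
}): string {
  const params = new URLSearchParams();
  if (filters?.projectId) params.set("projectId", filters.projectId);
  if (filters?.projectWorkspaceId) params.set("projectWorkspaceId", filters.projectWorkspaceId);
  if (filters?.issueId) params.set("issueId", filters.issueId);
  if (filters?.status) params.set("status", filters.status);
  if (filters?.reuseEligible) params.set("reuseEligible", "true");
  const qs = params.toString();
  return qs ? `?${qs}` : "";
}
```

Note that `reuseEligible: false` is indistinguishable from omitting the filter, since the truthiness check skips it; callers who need an explicit "not reuse-eligible" filter would have to pass it differently.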
@@ -2,6 +2,7 @@ import type {
   HeartbeatRun,
   HeartbeatRunEvent,
   InstanceSchedulerHeartbeatAgent,
+  WorkspaceOperation,
 } from "@paperclipai/shared";
 import { api } from "./client";
 
@@ -42,6 +43,12 @@ export const heartbeatsApi = {
     api.get<{ runId: string; store: string; logRef: string; content: string; nextOffset?: number }>(
       `/heartbeat-runs/${runId}/log?offset=${encodeURIComponent(String(offset))}&limitBytes=${encodeURIComponent(String(limitBytes))}`,
     ),
+  workspaceOperations: (runId: string) =>
+    api.get<WorkspaceOperation[]>(`/heartbeat-runs/${runId}/workspace-operations`),
+  workspaceOperationLog: (operationId: string, offset = 0, limitBytes = 256000) =>
+    api.get<{ operationId: string; store: string; logRef: string; content: string; nextOffset?: number }>(
+      `/workspace-operations/${operationId}/log?offset=${encodeURIComponent(String(offset))}&limitBytes=${encodeURIComponent(String(limitBytes))}`,
+    ),
   cancel: (runId: string) => api.post<void>(`/heartbeat-runs/${runId}/cancel`, {}),
   liveRunsForIssue: (issueId: string) =>
     api.get<LiveRunForIssue[]>(`/issues/${issueId}/live-runs`),
@@ -12,4 +12,5 @@ export { costsApi } from "./costs";
 export { activityApi } from "./activity";
 export { dashboardApi } from "./dashboard";
 export { heartbeatsApi } from "./heartbeats";
+export { instanceSettingsApi } from "./instanceSettings";
 export { sidebarBadgesApi } from "./sidebarBadges";
12	ui/src/api/instanceSettings.ts	Normal file
@@ -0,0 +1,12 @@
+import type {
+  InstanceExperimentalSettings,
+  PatchInstanceExperimentalSettings,
+} from "@paperclipai/shared";
+import { api } from "./client";
+
+export const instanceSettingsApi = {
+  getExperimental: () =>
+    api.get<InstanceExperimentalSettings>("/instance/settings/experimental"),
+  updateExperimental: (patch: PatchInstanceExperimentalSettings) =>
+    api.patch<InstanceExperimentalSettings>("/instance/settings/experimental", patch),
+};
@@ -6,6 +6,7 @@ import type {
   IssueComment,
   IssueDocument,
   IssueLabel,
+  IssueWorkProduct,
   UpsertIssueDocument,
 } from "@paperclipai/shared";
 import { api } from "./client";
@@ -90,4 +91,10 @@ export const issuesApi = {
     api.post<Approval[]>(`/issues/${id}/approvals`, { approvalId }),
   unlinkApproval: (id: string, approvalId: string) =>
     api.delete<{ ok: true }>(`/issues/${id}/approvals/${approvalId}`),
+  listWorkProducts: (id: string) => api.get<IssueWorkProduct[]>(`/issues/${id}/work-products`),
+  createWorkProduct: (id: string, data: Record<string, unknown>) =>
+    api.post<IssueWorkProduct>(`/issues/${id}/work-products`, data),
+  updateWorkProduct: (id: string, data: Record<string, unknown>) =>
+    api.patch<IssueWorkProduct>(`/work-products/${id}`, data),
+  deleteWorkProduct: (id: string) => api.delete<IssueWorkProduct>(`/work-products/${id}`),
 };
@@ -1,5 +1,5 @@
 import { useQuery } from "@tanstack/react-query";
-import { Clock3, Puzzle, Settings } from "lucide-react";
+import { Clock3, FlaskConical, Puzzle, Settings } from "lucide-react";
 import { NavLink } from "@/lib/router";
 import { pluginsApi } from "@/api/plugins";
 import { queryKeys } from "@/lib/queryKeys";
@@ -23,6 +23,7 @@ export function InstanceSidebar() {
       <nav className="flex-1 min-h-0 overflow-y-auto scrollbar-auto-hide flex flex-col gap-4 px-3 py-2">
         <div className="flex flex-col gap-0.5">
           <SidebarNavItem to="/instance/settings/heartbeats" label="Heartbeats" icon={Clock3} end />
+          <SidebarNavItem to="/instance/settings/experimental" label="Experimental" icon={FlaskConical} />
           <SidebarNavItem to="/instance/settings/plugins" label="Plugins" icon={Puzzle} />
           {(plugins ?? []).length > 0 ? (
             <div className="ml-4 mt-1 flex flex-col gap-0.5 border-l border-border/70 pl-3">
@@ -1,9 +1,11 @@
|
|||||||
import { useMemo, useState } from "react";
|
import { useCallback, useMemo, useRef, useState } from "react";
|
||||||
import { Link } from "@/lib/router";
|
import { Link } from "@/lib/router";
|
||||||
import type { Issue } from "@paperclipai/shared";
|
import type { Issue } from "@paperclipai/shared";
|
||||||
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
|
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
|
||||||
import { agentsApi } from "../api/agents";
|
import { agentsApi } from "../api/agents";
|
||||||
import { authApi } from "../api/auth";
|
import { authApi } from "../api/auth";
|
||||||
|
import { executionWorkspacesApi } from "../api/execution-workspaces";
|
||||||
|
import { instanceSettingsApi } from "../api/instanceSettings";
|
||||||
import { issuesApi } from "../api/issues";
|
import { issuesApi } from "../api/issues";
|
||||||
import { projectsApi } from "../api/projects";
|
import { projectsApi } from "../api/projects";
|
||||||
import { useCompany } from "../context/CompanyContext";
|
import { useCompany } from "../context/CompanyContext";
|
||||||
@@ -18,11 +20,38 @@ import { formatDate, cn, projectUrl } from "../lib/utils";
 import { timeAgo } from "../lib/timeAgo";
 import { Separator } from "@/components/ui/separator";
 import { Popover, PopoverContent, PopoverTrigger } from "@/components/ui/popover";
-import { User, Hexagon, ArrowUpRight, Tag, Plus, Trash2 } from "lucide-react";
+import { User, Hexagon, ArrowUpRight, Tag, Plus, Trash2, Copy, Check } from "lucide-react";
 import { AgentIcon } from "./AgentIconPicker";

-// TODO(issue-worktree-support): re-enable this UI once the workflow is ready to ship.
-const SHOW_EXPERIMENTAL_ISSUE_WORKTREE_UI = false;
+const EXECUTION_WORKSPACE_OPTIONS = [
+  { value: "shared_workspace", label: "Project default" },
+  { value: "isolated_workspace", label: "New isolated workspace" },
+  { value: "reuse_existing", label: "Reuse existing workspace" },
+] as const;
+
+function defaultProjectWorkspaceIdForProject(project: {
+  workspaces?: Array<{ id: string; isPrimary: boolean }>;
+  executionWorkspacePolicy?: { defaultProjectWorkspaceId?: string | null } | null;
+} | null | undefined) {
+  if (!project) return null;
+  return project.executionWorkspacePolicy?.defaultProjectWorkspaceId
+    ?? project.workspaces?.find((workspace) => workspace.isPrimary)?.id
+    ?? project.workspaces?.[0]?.id
+    ?? null;
+}
+
+function defaultExecutionWorkspaceModeForProject(project: { executionWorkspacePolicy?: { enabled?: boolean; defaultMode?: string | null } | null } | null | undefined) {
+  const defaultMode = project?.executionWorkspacePolicy?.enabled ? project.executionWorkspacePolicy.defaultMode : null;
+  if (defaultMode === "isolated_workspace" || defaultMode === "operator_branch") return defaultMode;
+  if (defaultMode === "adapter_default") return "agent_default";
+  return "shared_workspace";
+}
+
+function issueModeForExistingWorkspace(mode: string | null | undefined) {
+  if (mode === "isolated_workspace" || mode === "operator_branch" || mode === "shared_workspace") return mode;
+  if (mode === "adapter_managed" || mode === "cloud_sandbox") return "agent_default";
+  return "shared_workspace";
+}
+
 interface IssuePropertiesProps {
   issue: Issue;
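The mode-normalization helper added in this hunk funnels every policy value the backend can emit down to the small set of options the issue UI offers. A standalone sketch of the same normalization, with the policy type reduced to the fields the helper actually reads (sample inputs here are made up):

```typescript
// Sketch of defaultExecutionWorkspaceModeForProject from the diff above.
// Policy shape is taken from the diff; the sample projects are hypothetical.
type Policy = { enabled?: boolean; defaultMode?: string | null } | null | undefined;

function defaultExecutionWorkspaceModeForProject(
  project: { executionWorkspacePolicy?: Policy } | null | undefined,
): string {
  // A disabled (or missing) policy contributes no default mode at all.
  const defaultMode = project?.executionWorkspacePolicy?.enabled
    ? project.executionWorkspacePolicy.defaultMode
    : null;
  if (defaultMode === "isolated_workspace" || defaultMode === "operator_branch") return defaultMode;
  // "adapter_default" is a policy-side spelling; the UI surfaces it as "agent_default".
  if (defaultMode === "adapter_default") return "agent_default";
  // Everything else (unknown modes, disabled policy, no project) falls back.
  return "shared_workspace";
}

console.log(defaultExecutionWorkspaceModeForProject({
  executionWorkspacePolicy: { enabled: true, defaultMode: "adapter_default" },
})); // "agent_default"
console.log(defaultExecutionWorkspaceModeForProject({
  executionWorkspacePolicy: { enabled: false, defaultMode: "isolated_workspace" },
})); // "shared_workspace" — the policy is disabled, so its default is ignored
```

Note the asymmetry: an *enabled* policy can still normalize to `shared_workspace` if its `defaultMode` is unrecognized, which keeps the UI safe against future policy values.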
@@ -101,6 +130,49 @@ function PropertyPicker({
   );
 }

+/** Splits a string at `/` and `-` boundaries, inserting <wbr> for natural line breaks. */
+function BreakablePath({ text }: { text: string }) {
+  const parts: React.ReactNode[] = [];
+  // Split on path separators and hyphens, keeping them in the output
+  const segments = text.split(/(?<=[\/-])/);
+  for (let i = 0; i < segments.length; i++) {
+    if (i > 0) parts.push(<wbr key={i} />);
+    parts.push(segments[i]);
+  }
+  return <>{parts}</>;
+}
+
+/** Displays a value with a copy-to-clipboard icon and "Copied!" feedback. */
+function CopyableValue({ value, label, mono, className }: { value: string; label?: string; mono?: boolean; className?: string }) {
+  const [copied, setCopied] = useState(false);
+  const timerRef = useRef<ReturnType<typeof setTimeout>>(undefined);
+  const handleCopy = useCallback(async () => {
+    try {
+      await navigator.clipboard.writeText(value);
+      setCopied(true);
+      clearTimeout(timerRef.current);
+      timerRef.current = setTimeout(() => setCopied(false), 1500);
+    } catch { /* noop */ }
+  }, [value]);
+
+  return (
+    <div className={cn("flex items-start gap-1 group", className)}>
+      <span className="min-w-0" style={{ overflowWrap: "anywhere" }}>
+        {label && <span className="text-muted-foreground">{label} </span>}
+        <span className={mono ? "font-mono" : undefined}><BreakablePath text={value} /></span>
+      </span>
+      <button
+        type="button"
+        className="shrink-0 mt-0.5 p-0.5 rounded hover:bg-accent/50 transition-colors text-muted-foreground hover:text-foreground opacity-0 group-hover:opacity-100 focus:opacity-100"
+        onClick={handleCopy}
+        title={copied ? "Copied!" : "Copy to clipboard"}
+      >
+        {copied ? <Check className="h-3 w-3 text-green-500" /> : <Copy className="h-3 w-3" />}
+      </button>
+    </div>
+  );
+}
+
 export function IssueProperties({ issue, onUpdate, inline }: IssuePropertiesProps) {
   const { selectedCompanyId } = useCompany();
   const queryClient = useQueryClient();
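The `<wbr>`-insertion in `BreakablePath` relies on a lookbehind split that keeps each separator attached to its preceding segment, so break opportunities land exactly after every `/` and `-`. The split itself can be checked in isolation (sample path is hypothetical):

```typescript
// Lookbehind split: break AFTER each "/" or "-", keeping the separator
// attached to the segment it ends. This mirrors the regex in BreakablePath.
const segments = "doc/plans/workspace-product-model.md".split(/(?<=[\/-])/);
console.log(segments);
// ["doc/", "plans/", "workspace-", "product-", "model.md"]
```

Because the separators stay in the output, rejoining the segments reproduces the original string, so the component can interleave `<wbr>` elements without losing any characters.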
@@ -118,6 +190,10 @@ export function IssueProperties({ issue, onUpdate, inline }: IssuePropertiesProp
     queryKey: queryKeys.auth.session,
     queryFn: () => authApi.getSession(),
   });
+  const { data: experimentalSettings } = useQuery({
+    queryKey: queryKeys.instance.experimentalSettings,
+    queryFn: () => instanceSettingsApi.getExperimental(),
+  });
   const currentUserId = session?.user?.id ?? session?.session?.userId;

   const { data: agents } = useQuery({
@@ -187,15 +263,44 @@ export function IssueProperties({ issue, onUpdate, inline }: IssuePropertiesProp
   const currentProject = issue.projectId
     ? orderedProjects.find((project) => project.id === issue.projectId) ?? null
    : null;
-  const currentProjectExecutionWorkspacePolicy = SHOW_EXPERIMENTAL_ISSUE_WORKTREE_UI
-    ? currentProject?.executionWorkspacePolicy ?? null
-    : null;
+  const currentProjectExecutionWorkspacePolicy =
+    experimentalSettings?.enableIsolatedWorkspaces === true
+      ? currentProject?.executionWorkspacePolicy ?? null
+      : null;
   const currentProjectSupportsExecutionWorkspace = Boolean(currentProjectExecutionWorkspacePolicy?.enabled);
-  const usesIsolatedExecutionWorkspace = issue.executionWorkspaceSettings?.mode === "isolated"
-    ? true
-    : issue.executionWorkspaceSettings?.mode === "project_primary"
-      ? false
-      : currentProjectExecutionWorkspacePolicy?.defaultMode === "isolated";
+  const currentExecutionWorkspaceSelection =
+    issue.executionWorkspacePreference
+    ?? issue.executionWorkspaceSettings?.mode
+    ?? defaultExecutionWorkspaceModeForProject(currentProject);
+  const { data: reusableExecutionWorkspaces } = useQuery({
+    queryKey: queryKeys.executionWorkspaces.list(companyId!, {
+      projectId: issue.projectId ?? undefined,
+      projectWorkspaceId: issue.projectWorkspaceId ?? undefined,
+      reuseEligible: true,
+    }),
+    queryFn: () =>
+      executionWorkspacesApi.list(companyId!, {
+        projectId: issue.projectId ?? undefined,
+        projectWorkspaceId: issue.projectWorkspaceId ?? undefined,
+        reuseEligible: true,
+      }),
+    enabled: Boolean(companyId) && Boolean(issue.projectId),
+  });
+  const deduplicatedReusableWorkspaces = useMemo(() => {
+    const workspaces = reusableExecutionWorkspaces ?? [];
+    const seen = new Map<string, typeof workspaces[number]>();
+    for (const ws of workspaces) {
+      const key = ws.cwd ?? ws.id;
+      const existing = seen.get(key);
+      if (!existing || new Date(ws.lastUsedAt) > new Date(existing.lastUsedAt)) {
+        seen.set(key, ws);
+      }
+    }
+    return Array.from(seen.values());
+  }, [reusableExecutionWorkspaces]);
+  const selectedReusableExecutionWorkspace = deduplicatedReusableWorkspaces.find(
+    (workspace) => workspace.id === issue.executionWorkspaceId,
+  );
   const projectLink = (id: string | null) => {
     if (!id) return null;
     const project = projects?.find((p) => p.id === id) ?? null;
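The `useMemo` pass above collapses reuse candidates that point at the same working directory, keeping only the most recently used entry per `cwd` (workspaces with no `cwd` key on their own id and never collide). The core logic can be exercised outside React; the workspace type is trimmed to the fields the dedup reads and the sample data is made up:

```typescript
// Sketch of the deduplicatedReusableWorkspaces memo from the diff above,
// extracted as a pure function. Sample workspaces are hypothetical.
type Workspace = { id: string; cwd?: string | null; lastUsedAt: string };

function dedupeByCwd(workspaces: Workspace[]): Workspace[] {
  const seen = new Map<string, Workspace>();
  for (const ws of workspaces) {
    const key = ws.cwd ?? ws.id; // no cwd → key on id, so it never collides
    const existing = seen.get(key);
    // Keep whichever entry for this directory was used most recently.
    if (!existing || new Date(ws.lastUsedAt) > new Date(existing.lastUsedAt)) {
      seen.set(key, ws);
    }
  }
  // Map preserves first-insertion order per key, so relative order is stable.
  return Array.from(seen.values());
}

const result = dedupeByCwd([
  { id: "a", cwd: "/repo", lastUsedAt: "2024-01-01T00:00:00Z" },
  { id: "b", cwd: "/repo", lastUsedAt: "2024-02-01T00:00:00Z" },
  { id: "c", lastUsedAt: "2024-01-15T00:00:00Z" },
]);
console.log(result.map((ws) => ws.id)); // ["b", "c"]
```

One consequence of keying on `cwd`: two workspace rows that recorded the same directory under different ids surface as a single option in the reuse dropdown, which is what keeps stale duplicates out of the picker.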
@@ -431,7 +536,13 @@ export function IssueProperties({ issue, onUpdate, inline }: IssuePropertiesProp
     !issue.projectId && "bg-accent"
   )}
   onClick={() => {
-    onUpdate({ projectId: null, executionWorkspaceSettings: null });
+    onUpdate({
+      projectId: null,
+      projectWorkspaceId: null,
+      executionWorkspaceId: null,
+      executionWorkspacePreference: null,
+      executionWorkspaceSettings: null,
+    });
     setProjectOpen(false);
   }}
 >
@@ -451,10 +562,14 @@ export function IssueProperties({ issue, onUpdate, inline }: IssuePropertiesProp
     p.id === issue.projectId && "bg-accent"
   )}
   onClick={() => {
+    const defaultMode = defaultExecutionWorkspaceModeForProject(p);
     onUpdate({
       projectId: p.id,
-      executionWorkspaceSettings: SHOW_EXPERIMENTAL_ISSUE_WORKTREE_UI && p.executionWorkspacePolicy?.enabled
-        ? { mode: p.executionWorkspacePolicy.defaultMode === "isolated" ? "isolated" : "project_primary" }
+      projectWorkspaceId: defaultProjectWorkspaceIdForProject(p),
+      executionWorkspaceId: null,
+      executionWorkspacePreference: defaultMode,
+      executionWorkspaceSettings: p.executionWorkspacePolicy?.enabled
+        ? { mode: defaultMode }
         : null,
     });
     setProjectOpen(false);
@@ -545,36 +660,85 @@ export function IssueProperties({ issue, onUpdate, inline }: IssuePropertiesProp

       {currentProjectSupportsExecutionWorkspace && (
         <PropertyRow label="Workspace">
-          <div className="flex items-center justify-between gap-3 w-full">
-            <div className="min-w-0">
-              <div className="text-sm">
-                {usesIsolatedExecutionWorkspace ? "Isolated issue checkout" : "Project primary checkout"}
-              </div>
-              <div className="text-[11px] text-muted-foreground">
-                Toggle whether this issue runs in its own execution workspace.
-              </div>
-            </div>
-            <button
-              className={cn(
-                "relative inline-flex h-5 w-9 items-center rounded-full transition-colors",
-                usesIsolatedExecutionWorkspace ? "bg-green-600" : "bg-muted",
-              )}
-              type="button"
-              onClick={() =>
-                onUpdate({
-                  executionWorkspaceSettings: {
-                    mode: usesIsolatedExecutionWorkspace ? "project_primary" : "isolated",
-                  },
-                })
-              }
-            >
-              <span
-                className={cn(
-                  "inline-block h-3.5 w-3.5 rounded-full bg-white transition-transform",
-                  usesIsolatedExecutionWorkspace ? "translate-x-4.5" : "translate-x-0.5",
-                )}
-              />
-            </button>
-          </div>
+          <div className="w-full space-y-2">
+            <select
+              className="w-full rounded border border-border bg-transparent px-2 py-1.5 text-xs outline-none"
+              value={currentExecutionWorkspaceSelection}
+              onChange={(e) => {
+                const nextMode = e.target.value;
+                onUpdate({
+                  executionWorkspacePreference: nextMode,
+                  executionWorkspaceId: nextMode === "reuse_existing" ? issue.executionWorkspaceId : null,
+                  executionWorkspaceSettings: {
+                    mode:
+                      nextMode === "reuse_existing"
+                        ? issueModeForExistingWorkspace(selectedReusableExecutionWorkspace?.mode)
+                        : nextMode,
+                  },
+                });
+              }}
+            >
+              {EXECUTION_WORKSPACE_OPTIONS.map((option) => (
+                <option key={option.value} value={option.value}>
+                  {option.label}
+                </option>
+              ))}
+            </select>
+
+            {currentExecutionWorkspaceSelection === "reuse_existing" && (
+              <select
+                className="w-full rounded border border-border bg-transparent px-2 py-1.5 text-xs outline-none"
+                value={issue.executionWorkspaceId ?? ""}
+                onChange={(e) => {
+                  const nextExecutionWorkspaceId = e.target.value || null;
+                  const nextExecutionWorkspace = deduplicatedReusableWorkspaces.find(
+                    (workspace) => workspace.id === nextExecutionWorkspaceId,
+                  );
+                  onUpdate({
+                    executionWorkspacePreference: "reuse_existing",
+                    executionWorkspaceId: nextExecutionWorkspaceId,
+                    executionWorkspaceSettings: {
+                      mode: issueModeForExistingWorkspace(nextExecutionWorkspace?.mode),
+                    },
+                  });
+                }}
+              >
+                <option value="">Choose an existing workspace</option>
+                {deduplicatedReusableWorkspaces.map((workspace) => (
+                  <option key={workspace.id} value={workspace.id}>
+                    {workspace.name} · {workspace.status} · {workspace.branchName ?? workspace.cwd ?? workspace.id.slice(0, 8)}
+                  </option>
+                ))}
+              </select>
+            )}
+
+            {issue.currentExecutionWorkspace && (
+              <div className="text-[11px] text-muted-foreground space-y-0.5">
+                <div style={{ overflowWrap: "anywhere" }}>
+                  Current:{" "}
+                  <Link
+                    to={`/execution-workspaces/${issue.currentExecutionWorkspace.id}`}
+                    className="hover:text-foreground hover:underline"
+                  >
+                    <BreakablePath text={issue.currentExecutionWorkspace.name} />
+                  </Link>
+                  {" · "}
+                  {issue.currentExecutionWorkspace.status}
+                </div>
+                {issue.currentExecutionWorkspace.cwd && (
+                  <CopyableValue value={issue.currentExecutionWorkspace.cwd} mono className="text-[11px]" />
+                )}
+                {issue.currentExecutionWorkspace.branchName && (
+                  <CopyableValue value={issue.currentExecutionWorkspace.branchName} label="Branch:" className="text-[11px]" />
+                )}
+                {issue.currentExecutionWorkspace.repoUrl && (
+                  <CopyableValue value={issue.currentExecutionWorkspace.repoUrl} label="Repo:" mono className="text-[11px]" />
+                )}
+              </div>
+            )}
+            {!issue.currentExecutionWorkspace && currentProject?.primaryWorkspace?.cwd && (
+              <CopyableValue value={currentProject.primaryWorkspace.cwd} mono className="text-[11px] text-muted-foreground" />
+            )}
+          </div>
         </PropertyRow>
       )}
@@ -24,32 +24,16 @@ import { useKeyboardShortcuts } from "../hooks/useKeyboardShortcuts";
 import { useCompanyPageMemory } from "../hooks/useCompanyPageMemory";
 import { healthApi } from "../api/health";
 import { shouldSyncCompanySelectionFromRoute } from "../lib/company-selection";
+import {
+  DEFAULT_INSTANCE_SETTINGS_PATH,
+  normalizeRememberedInstanceSettingsPath,
+} from "../lib/instance-settings";
 import { queryKeys } from "../lib/queryKeys";
 import { cn } from "../lib/utils";
 import { NotFoundPage } from "../pages/NotFound";
 import { Button } from "@/components/ui/button";

 const INSTANCE_SETTINGS_MEMORY_KEY = "paperclip.lastInstanceSettingsPath";
-const DEFAULT_INSTANCE_SETTINGS_PATH = "/instance/settings/heartbeats";
-
-function normalizeRememberedInstanceSettingsPath(rawPath: string | null): string {
-  if (!rawPath) return DEFAULT_INSTANCE_SETTINGS_PATH;
-
-  const match = rawPath.match(/^([^?#]*)(\?[^#]*)?(#.*)?$/);
-  const pathname = match?.[1] ?? rawPath;
-  const search = match?.[2] ?? "";
-  const hash = match?.[3] ?? "";
-
-  if (pathname === "/instance/settings/heartbeats" || pathname === "/instance/settings/plugins") {
-    return `${pathname}${search}${hash}`;
-  }
-
-  if (/^\/instance\/settings\/plugins\/[^/?#]+$/.test(pathname)) {
-    return `${pathname}${search}${hash}`;
-  }
-
-  return DEFAULT_INSTANCE_SETTINGS_PATH;
-}

 function readRememberedInstanceSettingsPath(): string {
   if (typeof window === "undefined") return DEFAULT_INSTANCE_SETTINGS_PATH;
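The removed helper is extracted, not deleted: it now ships from `../lib/instance-settings`. The removed-side logic — split a remembered path into pathname, search, and hash, allow only known settings routes, and fall back to the default otherwise — runs standalone as written:

```typescript
// The normalizeRememberedInstanceSettingsPath helper exactly as it appears on
// the removed side of this hunk (the extracted module presumably also admits
// the new "experimental" route, but that is not shown in this diff).
const DEFAULT_INSTANCE_SETTINGS_PATH = "/instance/settings/heartbeats";

function normalizeRememberedInstanceSettingsPath(rawPath: string | null): string {
  if (!rawPath) return DEFAULT_INSTANCE_SETTINGS_PATH;

  // Capture pathname, optional "?search", optional "#hash".
  const match = rawPath.match(/^([^?#]*)(\?[^#]*)?(#.*)?$/);
  const pathname = match?.[1] ?? rawPath;
  const search = match?.[2] ?? "";
  const hash = match?.[3] ?? "";

  // Exact allow-list of top-level settings routes.
  if (pathname === "/instance/settings/heartbeats" || pathname === "/instance/settings/plugins") {
    return `${pathname}${search}${hash}`;
  }

  // Single-segment plugin detail routes are also allowed.
  if (/^\/instance\/settings\/plugins\/[^/?#]+$/.test(pathname)) {
    return `${pathname}${search}${hash}`;
  }

  return DEFAULT_INSTANCE_SETTINGS_PATH;
}

console.log(normalizeRememberedInstanceSettingsPath("/instance/settings/plugins/foo?tab=1"));
// "/instance/settings/plugins/foo?tab=1"
console.log(normalizeRememberedInstanceSettingsPath("/somewhere/else"));
// "/instance/settings/heartbeats"
```

The allow-list approach matters because the path comes from `localStorage`; a stale or tampered value degrades to the default route instead of navigating anywhere the stored string says.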
@@ -2,7 +2,9 @@ import { useState, useEffect, useRef, useCallback, useMemo, type ChangeEvent, ty
 import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
 import { useDialog } from "../context/DialogContext";
 import { useCompany } from "../context/CompanyContext";
+import { executionWorkspacesApi } from "../api/execution-workspaces";
 import { issuesApi } from "../api/issues";
+import { instanceSettingsApi } from "../api/instanceSettings";
 import { projectsApi } from "../api/projects";
 import { agentsApi } from "../api/agents";
 import { authApi } from "../api/auth";
@@ -53,8 +55,6 @@ import { InlineEntitySelector, type InlineEntityOption } from "./InlineEntitySel

 const DRAFT_KEY = "paperclip:issue-draft";
 const DEBOUNCE_MS = 800;
-// TODO(issue-worktree-support): re-enable this UI once the workflow is ready to ship.
-const SHOW_EXPERIMENTAL_ISSUE_WORKTREE_UI = false;

 /** Return black or white hex based on background luminance (WCAG perceptual weights). */
 function getContrastTextColor(hexColor: string): string {
@@ -74,10 +74,13 @@ interface IssueDraft {
   assigneeValue: string;
   assigneeId?: string;
   projectId: string;
+  projectWorkspaceId?: string;
   assigneeModelOverride: string;
   assigneeThinkingEffort: string;
   assigneeChrome: boolean;
-  useIsolatedExecutionWorkspace: boolean;
+  executionWorkspaceMode?: string;
+  selectedExecutionWorkspaceId?: string;
+  useIsolatedExecutionWorkspace?: boolean;
 }

 type StagedIssueFile = {
@@ -236,6 +239,42 @@ const priorities = [
   { value: "low", label: "Low", icon: ArrowDown, color: priorityColor.low ?? priorityColorDefault },
 ];

+const EXECUTION_WORKSPACE_MODES = [
+  { value: "shared_workspace", label: "Project default" },
+  { value: "isolated_workspace", label: "New isolated workspace" },
+  { value: "reuse_existing", label: "Reuse existing workspace" },
+] as const;
+
+function defaultProjectWorkspaceIdForProject(project: { workspaces?: Array<{ id: string; isPrimary: boolean }>; executionWorkspacePolicy?: { defaultProjectWorkspaceId?: string | null } | null } | null | undefined) {
+  if (!project) return "";
+  return project.executionWorkspacePolicy?.defaultProjectWorkspaceId
+    ?? project.workspaces?.find((workspace) => workspace.isPrimary)?.id
+    ?? project.workspaces?.[0]?.id
+    ?? "";
+}
+
+function defaultExecutionWorkspaceModeForProject(project: { executionWorkspacePolicy?: { enabled?: boolean; defaultMode?: string | null } | null } | null | undefined) {
+  const defaultMode = project?.executionWorkspacePolicy?.enabled ? project.executionWorkspacePolicy.defaultMode : null;
+  if (
+    defaultMode === "isolated_workspace" ||
+    defaultMode === "operator_branch" ||
+    defaultMode === "adapter_default"
+  ) {
+    return defaultMode === "adapter_default" ? "agent_default" : defaultMode;
+  }
+  return "shared_workspace";
+}
+
+function issueExecutionWorkspaceModeForExistingWorkspace(mode: string | null | undefined) {
+  if (mode === "isolated_workspace" || mode === "operator_branch" || mode === "shared_workspace") {
+    return mode;
+  }
+  if (mode === "adapter_managed" || mode === "cloud_sandbox") {
+    return "agent_default";
+  }
+  return "shared_workspace";
+}
+
 export function NewIssueDialog() {
   const { newIssueOpen, newIssueDefaults, closeNewIssue } = useDialog();
   const { companies, selectedCompanyId, selectedCompany } = useCompany();
@@ -247,11 +286,13 @@ export function NewIssueDialog() {
   const [priority, setPriority] = useState("");
   const [assigneeValue, setAssigneeValue] = useState("");
   const [projectId, setProjectId] = useState("");
+  const [projectWorkspaceId, setProjectWorkspaceId] = useState("");
   const [assigneeOptionsOpen, setAssigneeOptionsOpen] = useState(false);
   const [assigneeModelOverride, setAssigneeModelOverride] = useState("");
   const [assigneeThinkingEffort, setAssigneeThinkingEffort] = useState("");
   const [assigneeChrome, setAssigneeChrome] = useState(false);
-  const [useIsolatedExecutionWorkspace, setUseIsolatedExecutionWorkspace] = useState(false);
+  const [executionWorkspaceMode, setExecutionWorkspaceMode] = useState<string>("shared_workspace");
+  const [selectedExecutionWorkspaceId, setSelectedExecutionWorkspaceId] = useState("");
   const [expanded, setExpanded] = useState(false);
   const [dialogCompanyId, setDialogCompanyId] = useState<string | null>(null);
   const [stagedFiles, setStagedFiles] = useState<StagedIssueFile[]>([]);
@@ -283,10 +324,29 @@ export function NewIssueDialog() {
     queryFn: () => projectsApi.list(effectiveCompanyId!),
     enabled: !!effectiveCompanyId && newIssueOpen,
   });
+  const { data: reusableExecutionWorkspaces } = useQuery({
+    queryKey: queryKeys.executionWorkspaces.list(effectiveCompanyId!, {
+      projectId,
+      projectWorkspaceId: projectWorkspaceId || undefined,
+      reuseEligible: true,
+    }),
+    queryFn: () =>
+      executionWorkspacesApi.list(effectiveCompanyId!, {
+        projectId,
+        projectWorkspaceId: projectWorkspaceId || undefined,
+        reuseEligible: true,
+      }),
+    enabled: Boolean(effectiveCompanyId) && newIssueOpen && Boolean(projectId),
+  });
   const { data: session } = useQuery({
     queryKey: queryKeys.auth.session,
     queryFn: () => authApi.getSession(),
   });
+  const { data: experimentalSettings } = useQuery({
+    queryKey: queryKeys.instance.experimentalSettings,
+    queryFn: () => instanceSettingsApi.getExperimental(),
+    enabled: newIssueOpen,
+  });
   const currentUserId = session?.user?.id ?? session?.session?.userId ?? null;
   const activeProjects = useMemo(
     () => (projects ?? []).filter((p) => !p.archivedAt),
@@ -417,10 +477,12 @@ export function NewIssueDialog() {
       priority,
       assigneeValue,
       projectId,
+      projectWorkspaceId,
       assigneeModelOverride,
       assigneeThinkingEffort,
       assigneeChrome,
-      useIsolatedExecutionWorkspace,
+      executionWorkspaceMode,
+      selectedExecutionWorkspaceId,
     });
   }, [
     title,
@@ -429,10 +491,12 @@ export function NewIssueDialog() {
     priority,
     assigneeValue,
     projectId,
+    projectWorkspaceId,
    assigneeModelOverride,
     assigneeThinkingEffort,
     assigneeChrome,
-    useIsolatedExecutionWorkspace,
+    executionWorkspaceMode,
+    selectedExecutionWorkspaceId,
     newIssueOpen,
     scheduleSave,
   ]);
@@ -449,13 +513,20 @@ export function NewIssueDialog() {
       setDescription(newIssueDefaults.description ?? "");
       setStatus(newIssueDefaults.status ?? "todo");
       setPriority(newIssueDefaults.priority ?? "");
-      setProjectId(newIssueDefaults.projectId ?? "");
+      const defaultProjectId = newIssueDefaults.projectId ?? "";
+      const defaultProject = orderedProjects.find((project) => project.id === defaultProjectId);
+      setProjectId(defaultProjectId);
+      setProjectWorkspaceId(defaultProjectWorkspaceIdForProject(defaultProject));
       setAssigneeValue(assigneeValueFromSelection(newIssueDefaults));
       setAssigneeModelOverride("");
       setAssigneeThinkingEffort("");
       setAssigneeChrome(false);
-      setUseIsolatedExecutionWorkspace(false);
+      setExecutionWorkspaceMode(defaultExecutionWorkspaceModeForProject(defaultProject));
+      setSelectedExecutionWorkspaceId("");
+      executionWorkspaceDefaultProjectId.current = defaultProjectId || null;
     } else if (draft && draft.title.trim()) {
+      const restoredProjectId = newIssueDefaults.projectId ?? draft.projectId;
+      const restoredProject = orderedProjects.find((project) => project.id === restoredProjectId);
       setTitle(draft.title);
       setDescription(draft.description);
       setStatus(draft.status || "todo");
@@ -465,22 +536,33 @@ export function NewIssueDialog() {
           ? assigneeValueFromSelection(newIssueDefaults)
           : (draft.assigneeValue ?? draft.assigneeId ?? ""),
       );
-      setProjectId(newIssueDefaults.projectId ?? draft.projectId);
+      setProjectId(restoredProjectId);
+      setProjectWorkspaceId(draft.projectWorkspaceId ?? defaultProjectWorkspaceIdForProject(restoredProject));
       setAssigneeModelOverride(draft.assigneeModelOverride ?? "");
       setAssigneeThinkingEffort(draft.assigneeThinkingEffort ?? "");
       setAssigneeChrome(draft.assigneeChrome ?? false);
-      setUseIsolatedExecutionWorkspace(draft.useIsolatedExecutionWorkspace ?? false);
+      setExecutionWorkspaceMode(
+        draft.executionWorkspaceMode
+        ?? (draft.useIsolatedExecutionWorkspace ? "isolated_workspace" : defaultExecutionWorkspaceModeForProject(restoredProject)),
+      );
+      setSelectedExecutionWorkspaceId(draft.selectedExecutionWorkspaceId ?? "");
+      executionWorkspaceDefaultProjectId.current = restoredProjectId || null;
     } else {
+      const defaultProjectId = newIssueDefaults.projectId ?? "";
+      const defaultProject = orderedProjects.find((project) => project.id === defaultProjectId);
       setStatus(newIssueDefaults.status ?? "todo");
       setPriority(newIssueDefaults.priority ?? "");
-      setProjectId(newIssueDefaults.projectId ?? "");
+      setProjectId(defaultProjectId);
+      setProjectWorkspaceId(defaultProjectWorkspaceIdForProject(defaultProject));
       setAssigneeValue(assigneeValueFromSelection(newIssueDefaults));
       setAssigneeModelOverride("");
       setAssigneeThinkingEffort("");
       setAssigneeChrome(false);
-      setUseIsolatedExecutionWorkspace(false);
+      setExecutionWorkspaceMode(defaultExecutionWorkspaceModeForProject(defaultProject));
+      setSelectedExecutionWorkspaceId("");
+      executionWorkspaceDefaultProjectId.current = defaultProjectId || null;
     }
-  }, [newIssueOpen, newIssueDefaults]);
+  }, [newIssueOpen, newIssueDefaults, orderedProjects]);

   useEffect(() => {
     if (!supportsAssigneeOverrides) {
@@ -516,11 +598,13 @@ export function NewIssueDialog() {
|
|||||||
setPriority("");
|
setPriority("");
|
||||||
setAssigneeValue("");
|
setAssigneeValue("");
|
||||||
setProjectId("");
|
setProjectId("");
|
||||||
|
setProjectWorkspaceId("");
|
||||||
setAssigneeOptionsOpen(false);
|
setAssigneeOptionsOpen(false);
|
||||||
setAssigneeModelOverride("");
|
setAssigneeModelOverride("");
|
||||||
setAssigneeThinkingEffort("");
|
setAssigneeThinkingEffort("");
|
||||||
setAssigneeChrome(false);
|
setAssigneeChrome(false);
|
||||||
setUseIsolatedExecutionWorkspace(false);
|
setExecutionWorkspaceMode("shared_workspace");
|
||||||
|
setSelectedExecutionWorkspaceId("");
|
||||||
setExpanded(false);
|
setExpanded(false);
|
||||||
setDialogCompanyId(null);
|
setDialogCompanyId(null);
|
||||||
setStagedFiles([]);
|
setStagedFiles([]);
|
||||||
@@ -534,10 +618,12 @@ export function NewIssueDialog() {
|
|||||||
setDialogCompanyId(companyId);
|
setDialogCompanyId(companyId);
|
||||||
setAssigneeValue("");
|
setAssigneeValue("");
|
||||||
setProjectId("");
|
setProjectId("");
|
||||||
|
setProjectWorkspaceId("");
|
||||||
setAssigneeModelOverride("");
|
setAssigneeModelOverride("");
|
||||||
setAssigneeThinkingEffort("");
|
setAssigneeThinkingEffort("");
|
||||||
setAssigneeChrome(false);
|
setAssigneeChrome(false);
|
||||||
setUseIsolatedExecutionWorkspace(false);
|
setExecutionWorkspaceMode("shared_workspace");
|
||||||
|
setSelectedExecutionWorkspaceId("");
|
||||||
}
|
}
|
||||||
|
|
||||||
function discardDraft() {
|
function discardDraft() {
|
||||||
@@ -555,13 +641,19 @@ export function NewIssueDialog() {
|
|||||||
chrome: assigneeChrome,
|
chrome: assigneeChrome,
|
||||||
});
|
});
|
||||||
const selectedProject = orderedProjects.find((project) => project.id === projectId);
|
const selectedProject = orderedProjects.find((project) => project.id === projectId);
|
||||||
const executionWorkspacePolicy = SHOW_EXPERIMENTAL_ISSUE_WORKTREE_UI
|
const executionWorkspacePolicy =
|
||||||
? selectedProject?.executionWorkspacePolicy
|
experimentalSettings?.enableIsolatedWorkspaces === true
|
||||||
: null;
|
? selectedProject?.executionWorkspacePolicy ?? null
|
||||||
|
: null;
|
||||||
|
const selectedReusableExecutionWorkspace = deduplicatedReusableWorkspaces.find(
|
||||||
|
(workspace) => workspace.id === selectedExecutionWorkspaceId,
|
||||||
|
);
|
||||||
|
const requestedExecutionWorkspaceMode =
|
||||||
|
executionWorkspaceMode === "reuse_existing"
|
||||||
|
? issueExecutionWorkspaceModeForExistingWorkspace(selectedReusableExecutionWorkspace?.mode)
|
||||||
|
: executionWorkspaceMode;
|
||||||
const executionWorkspaceSettings = executionWorkspacePolicy?.enabled
|
const executionWorkspaceSettings = executionWorkspacePolicy?.enabled
|
||||||
? {
|
? { mode: requestedExecutionWorkspaceMode }
|
||||||
mode: useIsolatedExecutionWorkspace ? "isolated" : "project_primary",
|
|
||||||
}
|
|
||||||
: null;
|
: null;
|
||||||
createIssue.mutate({
|
createIssue.mutate({
|
||||||
companyId: effectiveCompanyId,
|
companyId: effectiveCompanyId,
|
||||||
@@ -573,7 +665,12 @@ export function NewIssueDialog() {
|
|||||||
...(selectedAssigneeAgentId ? { assigneeAgentId: selectedAssigneeAgentId } : {}),
|
...(selectedAssigneeAgentId ? { assigneeAgentId: selectedAssigneeAgentId } : {}),
|
||||||
...(selectedAssigneeUserId ? { assigneeUserId: selectedAssigneeUserId } : {}),
|
...(selectedAssigneeUserId ? { assigneeUserId: selectedAssigneeUserId } : {}),
|
||||||
...(projectId ? { projectId } : {}),
|
...(projectId ? { projectId } : {}),
|
||||||
|
...(projectWorkspaceId ? { projectWorkspaceId } : {}),
|
||||||
...(assigneeAdapterOverrides ? { assigneeAdapterOverrides } : {}),
|
...(assigneeAdapterOverrides ? { assigneeAdapterOverrides } : {}),
|
||||||
|
...(executionWorkspacePolicy?.enabled ? { executionWorkspacePreference: executionWorkspaceMode } : {}),
|
||||||
|
...(executionWorkspaceMode === "reuse_existing" && selectedExecutionWorkspaceId
|
||||||
|
? { executionWorkspaceId: selectedExecutionWorkspaceId }
|
||||||
|
: {}),
|
||||||
...(executionWorkspaceSettings ? { executionWorkspaceSettings } : {}),
|
...(executionWorkspaceSettings ? { executionWorkspaceSettings } : {}),
|
||||||
});
|
});
|
||||||
}
|
}
|
||||||
@@ -655,10 +752,26 @@ export function NewIssueDialog() {
|
|||||||
? (agents ?? []).find((a) => a.id === selectedAssigneeAgentId)
|
? (agents ?? []).find((a) => a.id === selectedAssigneeAgentId)
|
||||||
: null;
|
: null;
|
||||||
const currentProject = orderedProjects.find((project) => project.id === projectId);
|
const currentProject = orderedProjects.find((project) => project.id === projectId);
|
||||||
const currentProjectExecutionWorkspacePolicy = SHOW_EXPERIMENTAL_ISSUE_WORKTREE_UI
|
const currentProjectExecutionWorkspacePolicy =
|
||||||
? currentProject?.executionWorkspacePolicy ?? null
|
experimentalSettings?.enableIsolatedWorkspaces === true
|
||||||
: null;
|
? currentProject?.executionWorkspacePolicy ?? null
|
||||||
|
: null;
|
||||||
const currentProjectSupportsExecutionWorkspace = Boolean(currentProjectExecutionWorkspacePolicy?.enabled);
|
const currentProjectSupportsExecutionWorkspace = Boolean(currentProjectExecutionWorkspacePolicy?.enabled);
|
||||||
|
const deduplicatedReusableWorkspaces = useMemo(() => {
|
||||||
|
const workspaces = reusableExecutionWorkspaces ?? [];
|
||||||
|
const seen = new Map<string, typeof workspaces[number]>();
|
||||||
|
for (const ws of workspaces) {
|
||||||
|
const key = ws.cwd ?? ws.id;
|
||||||
|
const existing = seen.get(key);
|
||||||
|
if (!existing || new Date(ws.lastUsedAt) > new Date(existing.lastUsedAt)) {
|
||||||
|
seen.set(key, ws);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return Array.from(seen.values());
|
||||||
|
}, [reusableExecutionWorkspaces]);
|
||||||
|
const selectedReusableExecutionWorkspace = deduplicatedReusableWorkspaces.find(
|
||||||
|
(workspace) => workspace.id === selectedExecutionWorkspaceId,
|
||||||
|
);
|
||||||
const assigneeOptionsTitle =
|
const assigneeOptionsTitle =
|
||||||
assigneeAdapterType === "claude_local"
|
assigneeAdapterType === "claude_local"
|
||||||
? "Claude options"
|
? "Claude options"
|
||||||
@@ -708,9 +821,10 @@ export function NewIssueDialog() {
|
|||||||
const handleProjectChange = useCallback((nextProjectId: string) => {
|
const handleProjectChange = useCallback((nextProjectId: string) => {
|
||||||
setProjectId(nextProjectId);
|
setProjectId(nextProjectId);
|
||||||
const nextProject = orderedProjects.find((project) => project.id === nextProjectId);
|
const nextProject = orderedProjects.find((project) => project.id === nextProjectId);
|
||||||
const policy = SHOW_EXPERIMENTAL_ISSUE_WORKTREE_UI ? nextProject?.executionWorkspacePolicy : null;
|
|
||||||
executionWorkspaceDefaultProjectId.current = nextProjectId || null;
|
executionWorkspaceDefaultProjectId.current = nextProjectId || null;
|
||||||
setUseIsolatedExecutionWorkspace(Boolean(policy?.enabled && policy.defaultMode === "isolated"));
|
setProjectWorkspaceId(defaultProjectWorkspaceIdForProject(nextProject));
|
||||||
|
setExecutionWorkspaceMode(defaultExecutionWorkspaceModeForProject(nextProject));
|
||||||
|
setSelectedExecutionWorkspaceId("");
|
||||||
}, [orderedProjects]);
|
}, [orderedProjects]);
|
||||||
|
|
||||||
useEffect(() => {
|
useEffect(() => {
|
||||||
@@ -720,13 +834,9 @@ export function NewIssueDialog() {
|
|||||||
const project = orderedProjects.find((entry) => entry.id === projectId);
|
const project = orderedProjects.find((entry) => entry.id === projectId);
|
||||||
if (!project) return;
|
if (!project) return;
|
||||||
executionWorkspaceDefaultProjectId.current = projectId;
|
executionWorkspaceDefaultProjectId.current = projectId;
|
||||||
setUseIsolatedExecutionWorkspace(
|
setProjectWorkspaceId(defaultProjectWorkspaceIdForProject(project));
|
||||||
Boolean(
|
setExecutionWorkspaceMode(defaultExecutionWorkspaceModeForProject(project));
|
||||||
SHOW_EXPERIMENTAL_ISSUE_WORKTREE_UI &&
|
setSelectedExecutionWorkspaceId("");
|
||||||
project.executionWorkspacePolicy?.enabled &&
|
|
||||||
project.executionWorkspacePolicy.defaultMode === "isolated",
|
|
||||||
),
|
|
||||||
);
|
|
||||||
}, [newIssueOpen, orderedProjects, projectId]);
|
}, [newIssueOpen, orderedProjects, projectId]);
|
||||||
const modelOverrideOptions = useMemo<InlineEntityOption[]>(
|
const modelOverrideOptions = useMemo<InlineEntityOption[]>(
|
||||||
() => {
|
() => {
|
||||||
@@ -1007,30 +1117,48 @@ export function NewIssueDialog() {
|
|||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
{currentProjectSupportsExecutionWorkspace && (
|
{currentProject && currentProjectSupportsExecutionWorkspace && (
|
||||||
<div className="px-4 py-3 shrink-0">
|
<div className="px-4 py-3 shrink-0 space-y-2">
|
||||||
<div className="flex items-center justify-between">
|
<div className="space-y-1.5">
|
||||||
<div className="space-y-0.5">
|
<div className="text-xs font-medium">Execution workspace</div>
|
||||||
<div className="text-xs font-medium">Use isolated issue checkout</div>
|
<div className="text-[11px] text-muted-foreground">
|
||||||
<div className="text-[11px] text-muted-foreground">
|
Control whether this issue runs in the shared workspace, a new isolated workspace, or an existing one.
|
||||||
Create an issue-specific execution workspace instead of using the project's primary checkout.
|
|
||||||
</div>
|
|
||||||
</div>
|
</div>
|
||||||
<button
|
<select
|
||||||
className={cn(
|
className="w-full rounded border border-border bg-transparent px-2 py-1.5 text-xs outline-none"
|
||||||
"relative inline-flex h-5 w-9 items-center rounded-full transition-colors",
|
value={executionWorkspaceMode}
|
||||||
useIsolatedExecutionWorkspace ? "bg-green-600" : "bg-muted",
|
onChange={(e) => {
|
||||||
)}
|
setExecutionWorkspaceMode(e.target.value);
|
||||||
onClick={() => setUseIsolatedExecutionWorkspace((value) => !value)}
|
if (e.target.value !== "reuse_existing") {
|
||||||
type="button"
|
setSelectedExecutionWorkspaceId("");
|
||||||
|
}
|
||||||
|
}}
|
||||||
>
|
>
|
||||||
<span
|
{EXECUTION_WORKSPACE_MODES.map((option) => (
|
||||||
className={cn(
|
<option key={option.value} value={option.value}>
|
||||||
"inline-block h-3.5 w-3.5 rounded-full bg-white transition-transform",
|
{option.label}
|
||||||
useIsolatedExecutionWorkspace ? "translate-x-4.5" : "translate-x-0.5",
|
</option>
|
||||||
)}
|
))}
|
||||||
/>
|
</select>
|
||||||
</button>
|
{executionWorkspaceMode === "reuse_existing" && (
|
||||||
|
<select
|
||||||
|
className="w-full rounded border border-border bg-transparent px-2 py-1.5 text-xs outline-none"
|
||||||
|
value={selectedExecutionWorkspaceId}
|
||||||
|
onChange={(e) => setSelectedExecutionWorkspaceId(e.target.value)}
|
||||||
|
>
|
||||||
|
<option value="">Choose an existing workspace</option>
|
||||||
|
{deduplicatedReusableWorkspaces.map((workspace) => (
|
||||||
|
<option key={workspace.id} value={workspace.id}>
|
||||||
|
{workspace.name} · {workspace.status} · {workspace.branchName ?? workspace.cwd ?? workspace.id.slice(0, 8)}
|
||||||
|
</option>
|
||||||
|
))}
|
||||||
|
</select>
|
||||||
|
)}
|
||||||
|
{executionWorkspaceMode === "reuse_existing" && selectedReusableExecutionWorkspace && (
|
||||||
|
<div className="text-[11px] text-muted-foreground">
|
||||||
|
Reusing {selectedReusableExecutionWorkspace.name} from {selectedReusableExecutionWorkspace.branchName ?? selectedReusableExecutionWorkspace.cwd ?? "existing execution workspace"}.
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
)}
|
)}
|
||||||
|
|||||||
@@ -42,7 +42,6 @@ const projectStatuses = [
|
|||||||
];
|
];
|
||||||
|
|
||||||
type WorkspaceSetup = "none" | "local" | "repo" | "both";
|
type WorkspaceSetup = "none" | "local" | "repo" | "both";
|
||||||
const REPO_ONLY_CWD_SENTINEL = "/__paperclip_repo_only__";
|
|
||||||
|
|
||||||
export function NewProjectDialog() {
|
export function NewProjectDialog() {
|
||||||
const { newProjectOpen, closeNewProject } = useDialog();
|
const { newProjectOpen, closeNewProject } = useDialog();
|
||||||
@@ -142,7 +141,7 @@ export function NewProjectDialog() {
|
|||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
if (repoRequired && !isGitHubRepoUrl(repoUrl)) {
|
if (repoRequired && !isGitHubRepoUrl(repoUrl)) {
|
||||||
setWorkspaceError("Repo workspace must use a valid GitHub repo URL.");
|
setWorkspaceError("Repo must use a valid GitHub repo URL.");
|
||||||
return;
|
return;
|
||||||
}
|
}
|
||||||
|
|
||||||
@@ -173,7 +172,6 @@ export function NewProjectDialog() {
|
|||||||
} else if (repoRequired) {
|
} else if (repoRequired) {
|
||||||
workspacePayloads.push({
|
workspacePayloads.push({
|
||||||
name: deriveWorkspaceNameFromRepo(repoUrl),
|
name: deriveWorkspaceNameFromRepo(repoUrl),
|
||||||
cwd: REPO_ONLY_CWD_SENTINEL,
|
|
||||||
repoUrl,
|
repoUrl,
|
||||||
});
|
});
|
||||||
}
|
}
|
||||||
@@ -284,7 +282,7 @@ export function NewProjectDialog() {
|
|||||||
<div className="px-4 pb-3 space-y-3 border-t border-border">
|
<div className="px-4 pb-3 space-y-3 border-t border-border">
|
||||||
<div className="pt-3">
|
<div className="pt-3">
|
||||||
<p className="text-sm font-medium">Where will work be done on this project?</p>
|
<p className="text-sm font-medium">Where will work be done on this project?</p>
|
||||||
<p className="text-xs text-muted-foreground">Add local folder and/or GitHub repo workspace hints.</p>
|
<p className="text-xs text-muted-foreground">Add a repo and/or local folder for this project.</p>
|
||||||
</div>
|
</div>
|
||||||
<div className="grid gap-2 sm:grid-cols-3">
|
<div className="grid gap-2 sm:grid-cols-3">
|
||||||
<button
|
<button
|
||||||
@@ -311,7 +309,7 @@ export function NewProjectDialog() {
|
|||||||
>
|
>
|
||||||
<div className="flex items-center gap-2 text-sm font-medium">
|
<div className="flex items-center gap-2 text-sm font-medium">
|
||||||
<Github className="h-4 w-4" />
|
<Github className="h-4 w-4" />
|
||||||
A github repo
|
A repo
|
||||||
</div>
|
</div>
|
||||||
<p className="mt-1 text-xs text-muted-foreground">Paste a GitHub URL.</p>
|
<p className="mt-1 text-xs text-muted-foreground">Paste a GitHub URL.</p>
|
||||||
</button>
|
</button>
|
||||||
@@ -327,7 +325,7 @@ export function NewProjectDialog() {
|
|||||||
<GitBranch className="h-4 w-4" />
|
<GitBranch className="h-4 w-4" />
|
||||||
Both
|
Both
|
||||||
</div>
|
</div>
|
||||||
<p className="mt-1 text-xs text-muted-foreground">Configure local + repo hints.</p>
|
<p className="mt-1 text-xs text-muted-foreground">Configure both repo and local folder.</p>
|
||||||
</button>
|
</button>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
@@ -347,7 +345,7 @@ export function NewProjectDialog() {
|
|||||||
)}
|
)}
|
||||||
{(workspaceSetup === "repo" || workspaceSetup === "both") && (
|
{(workspaceSetup === "repo" || workspaceSetup === "both") && (
|
||||||
<div className="rounded-md border border-border p-2">
|
<div className="rounded-md border border-border p-2">
|
||||||
<label className="mb-1 block text-xs text-muted-foreground">GitHub repo URL</label>
|
<label className="mb-1 block text-xs text-muted-foreground">Repo URL</label>
|
||||||
<input
|
<input
|
||||||
className="w-full rounded border border-border bg-transparent px-2 py-1 text-xs outline-none"
|
className="w-full rounded border border-border bg-transparent px-2 py-1 text-xs outline-none"
|
||||||
value={workspaceRepoUrl}
|
value={workspaceRepoUrl}
|
||||||
|
|||||||
(File diff suppressed because it is too large)
@@ -110,6 +110,7 @@ function makeIssue(id: string, isUnreadForMe: boolean): Issue {
     id,
     companyId: "company-1",
     projectId: null,
+    projectWorkspaceId: null,
     goalId: null,
     parentId: null,
     title: `Issue ${id}`,
@@ -125,6 +126,8 @@ function makeIssue(id: string, isUnreadForMe: boolean): Issue {
     requestDepth: 0,
     billingCode: null,
     assigneeAdapterOverrides: null,
+    executionWorkspaceId: null,
+    executionWorkspacePreference: null,
     executionWorkspaceSettings: null,
     checkoutRunId: null,
     executionRunId: null,
ui/src/lib/instance-settings.test.ts (new file, 23 lines)
@@ -0,0 +1,23 @@
+import { describe, expect, it } from "vitest";
+import {
+  DEFAULT_INSTANCE_SETTINGS_PATH,
+  normalizeRememberedInstanceSettingsPath,
+} from "./instance-settings";
+
+describe("normalizeRememberedInstanceSettingsPath", () => {
+  it("keeps known instance settings pages", () => {
+    expect(normalizeRememberedInstanceSettingsPath("/instance/settings/experimental")).toBe(
+      "/instance/settings/experimental",
+    );
+    expect(normalizeRememberedInstanceSettingsPath("/instance/settings/plugins/example?tab=config#logs")).toBe(
+      "/instance/settings/plugins/example?tab=config#logs",
+    );
+  });
+
+  it("falls back to the default page for unknown paths", () => {
+    expect(normalizeRememberedInstanceSettingsPath("/instance/settings/nope")).toBe(
+      DEFAULT_INSTANCE_SETTINGS_PATH,
+    );
+    expect(normalizeRememberedInstanceSettingsPath(null)).toBe(DEFAULT_INSTANCE_SETTINGS_PATH);
+  });
+});
ui/src/lib/instance-settings.ts (new file, 24 lines)
@@ -0,0 +1,24 @@
+export const DEFAULT_INSTANCE_SETTINGS_PATH = "/instance/settings/heartbeats";
+
+export function normalizeRememberedInstanceSettingsPath(rawPath: string | null): string {
+  if (!rawPath) return DEFAULT_INSTANCE_SETTINGS_PATH;
+
+  const match = rawPath.match(/^([^?#]*)(\?[^#]*)?(#.*)?$/);
+  const pathname = match?.[1] ?? rawPath;
+  const search = match?.[2] ?? "";
+  const hash = match?.[3] ?? "";
+
+  if (
+    pathname === "/instance/settings/heartbeats" ||
+    pathname === "/instance/settings/plugins" ||
+    pathname === "/instance/settings/experimental"
+  ) {
+    return `${pathname}${search}${hash}`;
+  }
+
+  if (/^\/instance\/settings\/plugins\/[^/?#]+$/.test(pathname)) {
+    return `${pathname}${search}${hash}`;
+  }
+
+  return DEFAULT_INSTANCE_SETTINGS_PATH;
+}
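The normalizer in `instance-settings.ts` above is a pure function, so its behavior can be sketched outside the UI. The following is a standalone copy of that function for illustration; only the inline comments are added.

```typescript
// Standalone copy of the path normalizer introduced above, for illustration.
const DEFAULT_INSTANCE_SETTINGS_PATH = "/instance/settings/heartbeats";

function normalizeRememberedInstanceSettingsPath(rawPath: string | null): string {
  if (!rawPath) return DEFAULT_INSTANCE_SETTINGS_PATH;

  // Split the remembered path into pathname, query string, and hash.
  const match = rawPath.match(/^([^?#]*)(\?[^#]*)?(#.*)?$/);
  const pathname = match?.[1] ?? rawPath;
  const search = match?.[2] ?? "";
  const hash = match?.[3] ?? "";

  // Allowlisted top-level settings pages keep their query and hash.
  if (
    pathname === "/instance/settings/heartbeats" ||
    pathname === "/instance/settings/plugins" ||
    pathname === "/instance/settings/experimental"
  ) {
    return `${pathname}${search}${hash}`;
  }

  // Plugin detail pages (exactly one extra path segment) are also allowed.
  if (/^\/instance\/settings\/plugins\/[^/?#]+$/.test(pathname)) {
    return `${pathname}${search}${hash}`;
  }

  // Anything else falls back to the default settings page.
  return DEFAULT_INSTANCE_SETTINGS_PATH;
}

console.log(normalizeRememberedInstanceSettingsPath("/instance/settings/plugins/example?tab=config#logs"));
// "/instance/settings/plugins/example?tab=config#logs"
console.log(normalizeRememberedInstanceSettingsPath("/instance/settings/nope"));
// "/instance/settings/heartbeats"
```

Because unknown paths collapse to the default page, a stale or malicious remembered path can never navigate outside the settings area.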
@@ -34,6 +34,12 @@ export const queryKeys = {
     approvals: (issueId: string) => ["issues", "approvals", issueId] as const,
     liveRuns: (issueId: string) => ["issues", "live-runs", issueId] as const,
     activeRun: (issueId: string) => ["issues", "active-run", issueId] as const,
+    workProducts: (issueId: string) => ["issues", "work-products", issueId] as const,
+  },
+  executionWorkspaces: {
+    list: (companyId: string, filters?: Record<string, string | boolean | undefined>) =>
+      ["execution-workspaces", companyId, filters ?? {}] as const,
+    detail: (id: string) => ["execution-workspaces", "detail", id] as const,
   },
   projects: {
     list: (companyId: string) => ["projects", companyId] as const,
@@ -63,6 +69,7 @@ export const queryKeys = {
   },
   instance: {
     schedulerHeartbeats: ["instance", "scheduler-heartbeats"] as const,
+    experimentalSettings: ["instance", "experimental-settings"] as const,
   },
   health: ["health"] as const,
   secrets: {
@@ -93,6 +100,7 @@ export const queryKeys = {
     heartbeats: (companyId: string, agentId?: string) =>
       ["heartbeats", companyId, agentId] as const,
     runDetail: (runId: string) => ["heartbeat-run", runId] as const,
+    runWorkspaceOperations: (runId: string) => ["heartbeat-run", runId, "workspace-operations"] as const,
     liveRuns: (companyId: string) => ["live-runs", companyId] as const,
     runIssues: (runId: string) => ["run-issues", runId] as const,
     org: (companyId: string) => ["org", companyId] as const,
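The new cache keys added above are plain tuple factories, which makes their shape easy to check in isolation. The fragment below copies only the keys introduced in this diff; the `reusable` filter key and the sample ids are hypothetical examples, not values from the codebase.

```typescript
// Minimal standalone fragment of the query-key helpers added in this diff.
// The `reusable` filter and the "company-1"/"run-42" ids are illustrative only.
const queryKeys = {
  executionWorkspaces: {
    // List keys include the filter object so distinct filters cache separately.
    list: (companyId: string, filters?: Record<string, string | boolean | undefined>) =>
      ["execution-workspaces", companyId, filters ?? {}] as const,
    detail: (id: string) => ["execution-workspaces", "detail", id] as const,
  },
  heartbeats: {
    // Workspace operations nest under the run-detail key prefix, so
    // invalidating ["heartbeat-run", runId] also covers them.
    runWorkspaceOperations: (runId: string) =>
      ["heartbeat-run", runId, "workspace-operations"] as const,
  },
};

console.log(queryKeys.executionWorkspaces.list("company-1", { reusable: true }));
console.log(queryKeys.heartbeats.runWorkspaceOperations("run-42"));
```

Sharing a prefix with `runDetail` is what lets a single `invalidateQueries({ queryKey: ["heartbeat-run", runId] })` refresh both the run and its workspace operations.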
@@ -69,6 +69,7 @@ import {
|
|||||||
type HeartbeatRunEvent,
|
type HeartbeatRunEvent,
|
||||||
type AgentRuntimeState,
|
type AgentRuntimeState,
|
||||||
type LiveEvent,
|
type LiveEvent,
|
||||||
|
type WorkspaceOperation,
|
||||||
} from "@paperclipai/shared";
|
} from "@paperclipai/shared";
|
||||||
import { redactHomePathUserSegments, redactHomePathUserSegmentsInValue } from "@paperclipai/adapter-utils";
|
import { redactHomePathUserSegments, redactHomePathUserSegmentsInValue } from "@paperclipai/adapter-utils";
|
||||||
import { agentRouteRef } from "../lib/utils";
|
import { agentRouteRef } from "../lib/utils";
|
||||||
@@ -240,6 +241,219 @@ function asNonEmptyString(value: unknown): string | null {
|
|||||||
return trimmed.length > 0 ? trimmed : null;
|
return trimmed.length > 0 ? trimmed : null;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
function parseStoredLogContent(content: string): RunLogChunk[] {
|
||||||
|
const parsed: RunLogChunk[] = [];
|
||||||
|
for (const line of content.split("\n")) {
|
||||||
|
const trimmed = line.trim();
|
||||||
|
if (!trimmed) continue;
|
||||||
|
try {
|
||||||
|
const raw = JSON.parse(trimmed) as { ts?: unknown; stream?: unknown; chunk?: unknown };
|
||||||
|
const stream =
|
||||||
|
raw.stream === "stderr" || raw.stream === "system" ? raw.stream : "stdout";
|
||||||
|
const chunk = typeof raw.chunk === "string" ? raw.chunk : "";
|
||||||
|
const ts = typeof raw.ts === "string" ? raw.ts : new Date().toISOString();
|
||||||
|
if (!chunk) continue;
|
||||||
|
parsed.push({ ts, stream, chunk });
|
||||||
|
} catch {
|
||||||
|
// Ignore malformed log lines.
|
||||||
|
}
|
||||||
|
}
|
||||||
|
return parsed;
|
||||||
|
}
|
||||||
|
|
||||||
|
function workspaceOperationPhaseLabel(phase: WorkspaceOperation["phase"]) {
|
||||||
|
switch (phase) {
|
||||||
|
case "worktree_prepare":
|
||||||
|
return "Worktree setup";
|
||||||
|
case "workspace_provision":
|
||||||
|
return "Provision";
|
||||||
|
case "workspace_teardown":
|
||||||
|
return "Teardown";
|
||||||
|
case "worktree_cleanup":
|
||||||
|
return "Worktree cleanup";
|
||||||
|
default:
|
||||||
|
return phase;
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
function workspaceOperationStatusTone(status: WorkspaceOperation["status"]) {
|
||||||
|
switch (status) {
|
||||||
|
case "succeeded":
|
||||||
|
return "border-green-500/20 bg-green-500/10 text-green-700 dark:text-green-300";
|
||||||
|
case "failed":
|
||||||
|
return "border-red-500/20 bg-red-500/10 text-red-700 dark:text-red-300";
|
||||||
|
case "running":
|
||||||
|
return "border-cyan-500/20 bg-cyan-500/10 text-cyan-700 dark:text-cyan-300";
|
||||||
|
case "skipped":
|
||||||
|
return "border-yellow-500/20 bg-yellow-500/10 text-yellow-700 dark:text-yellow-300";
|
||||||
|
default:
|
||||||
|
return "border-border bg-muted/40 text-muted-foreground";
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
function WorkspaceOperationStatusBadge({ status }: { status: WorkspaceOperation["status"] }) {
|
||||||
|
return (
|
||||||
|
<span
|
||||||
|
className={cn(
|
||||||
|
"inline-flex items-center rounded-full border px-2 py-0.5 text-[11px] font-medium capitalize",
|
||||||
|
workspaceOperationStatusTone(status),
|
||||||
|
)}
|
||||||
|
>
|
||||||
|
{status.replace("_", " ")}
|
||||||
|
</span>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
function WorkspaceOperationLogViewer({ operation }: { operation: WorkspaceOperation }) {
|
||||||
|
const [open, setOpen] = useState(false);
|
||||||
|
const { data: logData, isLoading, error } = useQuery({
|
||||||
|
queryKey: ["workspace-operation-log", operation.id],
|
||||||
|
queryFn: () => heartbeatsApi.workspaceOperationLog(operation.id),
|
||||||
|
enabled: open && Boolean(operation.logRef),
|
||||||
|
refetchInterval: open && operation.status === "running" ? 2000 : false,
|
||||||
|
});
|
||||||
|
|
||||||
|
const chunks = useMemo(
|
||||||
|
() => (logData?.content ? parseStoredLogContent(logData.content) : []),
|
||||||
|
[logData?.content],
|
||||||
|
);
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="space-y-2">
|
||||||
|
<button
|
||||||
|
type="button"
|
||||||
|
className="text-[11px] text-muted-foreground underline underline-offset-2 hover:text-foreground"
|
||||||
|
onClick={() => setOpen((value) => !value)}
|
||||||
|
>
|
||||||
|
{open ? "Hide full log" : "Show full log"}
|
||||||
|
</button>
|
||||||
|
{open && (
|
||||||
|
<div className="rounded-md border border-border bg-background/70 p-2">
|
||||||
|
{isLoading && <div className="text-xs text-muted-foreground">Loading log...</div>}
|
||||||
|
{error && (
|
||||||
|
<div className="text-xs text-destructive">
|
||||||
|
{error instanceof Error ? error.message : "Failed to load workspace operation log"}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
{!isLoading && !error && chunks.length === 0 && (
|
||||||
|
<div className="text-xs text-muted-foreground">No persisted log lines.</div>
|
||||||
|
)}
|
||||||
|
{chunks.length > 0 && (
|
||||||
|
<div className="max-h-64 overflow-y-auto rounded bg-neutral-100 p-2 font-mono text-xs dark:bg-neutral-950">
|
||||||
|
{chunks.map((chunk, index) => (
|
||||||
|
<div key={`${chunk.ts}-${index}`} className="flex gap-2">
|
||||||
|
<span className="shrink-0 text-neutral-500">
|
||||||
|
{new Date(chunk.ts).toLocaleTimeString("en-US", { hour12: false })}
|
||||||
|
</span>
|
||||||
|
<span
|
||||||
|
className={cn(
|
||||||
|
"shrink-0 w-14",
|
||||||
|
chunk.stream === "stderr"
|
||||||
|
? "text-red-600 dark:text-red-300"
|
||||||
|
: chunk.stream === "system"
|
||||||
|
? "text-blue-600 dark:text-blue-300"
|
||||||
|
: "text-muted-foreground",
|
||||||
|
)}
|
||||||
|
>
|
||||||
|
[{chunk.stream}]
|
||||||
|
</span>
|
||||||
|
<span className="whitespace-pre-wrap break-all">{redactHomePathUserSegments(chunk.chunk)}</span>
|
||||||
|
</div>
|
||||||
|
))}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
</div>
|
||||||
|
);
|
||||||
|
}
|
||||||
|
|
||||||
|
function WorkspaceOperationsSection({ operations }: { operations: WorkspaceOperation[] }) {
|
||||||
|
if (operations.length === 0) return null;
|
||||||
|
|
||||||
|
return (
|
||||||
|
<div className="rounded-lg border border-border bg-background/60 p-3 space-y-3">
|
||||||
|
<div className="text-xs font-medium text-muted-foreground">
|
||||||
|
Workspace ({operations.length})
|
||||||
|
</div>
|
||||||
|
<div className="space-y-3">
|
||||||
|
{operations.map((operation) => {
|
||||||
|
const metadata = asRecord(operation.metadata);
|
||||||
|
return (
|
||||||
|
<div key={operation.id} className="rounded-md border border-border/70 bg-background/70 p-3 space-y-2">
|
||||||
|
<div className="flex flex-wrap items-center gap-2">
|
||||||
|
<div className="text-sm font-medium">{workspaceOperationPhaseLabel(operation.phase)}</div>
|
||||||
|
<WorkspaceOperationStatusBadge status={operation.status} />
|
||||||
|
<div className="text-[11px] text-muted-foreground">
|
||||||
|
{relativeTime(operation.startedAt)}
|
||||||
|
{operation.finishedAt && ` to ${relativeTime(operation.finishedAt)}`}
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
{operation.command && (
|
||||||
|
<div className="text-xs break-all">
|
||||||
|
<span className="text-muted-foreground">Command: </span>
|
||||||
|
<span className="font-mono">{operation.command}</span>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
{operation.cwd && (
|
||||||
|
<div className="text-xs break-all">
|
||||||
|
<span className="text-muted-foreground">Working dir: </span>
|
||||||
|
<span className="font-mono">{operation.cwd}</span>
|
||||||
|
</div>
|
||||||
|
)}
|
||||||
|
{(asNonEmptyString(metadata?.branchName)
|
||||||
|
|| asNonEmptyString(metadata?.baseRef)
|
||||||
|
|| asNonEmptyString(metadata?.worktreePath)
|
||||||
|
|| asNonEmptyString(metadata?.repoRoot)
|
||||||
|
                || asNonEmptyString(metadata?.cleanupAction)) && (
                <div className="grid gap-1 text-xs sm:grid-cols-2">
                  {asNonEmptyString(metadata?.branchName) && (
                    <div><span className="text-muted-foreground">Branch: </span><span className="font-mono">{metadata?.branchName as string}</span></div>
                  )}
                  {asNonEmptyString(metadata?.baseRef) && (
                    <div><span className="text-muted-foreground">Base ref: </span><span className="font-mono">{metadata?.baseRef as string}</span></div>
                  )}
                  {asNonEmptyString(metadata?.worktreePath) && (
                    <div className="break-all"><span className="text-muted-foreground">Worktree: </span><span className="font-mono">{metadata?.worktreePath as string}</span></div>
                  )}
                  {asNonEmptyString(metadata?.repoRoot) && (
                    <div className="break-all"><span className="text-muted-foreground">Repo root: </span><span className="font-mono">{metadata?.repoRoot as string}</span></div>
                  )}
                  {asNonEmptyString(metadata?.cleanupAction) && (
                    <div><span className="text-muted-foreground">Cleanup: </span><span className="font-mono">{metadata?.cleanupAction as string}</span></div>
                  )}
                </div>
              )}
              {typeof metadata?.created === "boolean" && (
                <div className="text-xs text-muted-foreground">
                  {metadata.created ? "Created by this run" : "Reused existing workspace"}
                </div>
              )}
              {operation.stderrExcerpt && operation.stderrExcerpt.trim() && (
                <div>
                  <div className="mb-1 text-xs text-red-700 dark:text-red-300">stderr excerpt</div>
                  <pre className="rounded-md bg-red-50 p-2 text-xs whitespace-pre-wrap break-all text-red-800 dark:bg-neutral-950 dark:text-red-100">
                    {redactHomePathUserSegments(operation.stderrExcerpt)}
                  </pre>
                </div>
              )}
              {operation.stdoutExcerpt && operation.stdoutExcerpt.trim() && (
                <div>
                  <div className="mb-1 text-xs text-muted-foreground">stdout excerpt</div>
                  <pre className="rounded-md bg-neutral-100 p-2 text-xs whitespace-pre-wrap break-all dark:bg-neutral-950">
                    {redactHomePathUserSegments(operation.stdoutExcerpt)}
                  </pre>
                </div>
              )}
              {operation.logRef && <WorkspaceOperationLogViewer operation={operation} />}
            </div>
          );
        })}
      </div>
    </div>
  );
}
||||||
export function AgentDetail() {
  const { companyPrefix, agentId, tab: urlTab, runId: urlRunId } = useParams<{
    companyPrefix?: string;
@@ -1849,6 +2063,11 @@ function LogViewer({ run, adapterType }: { run: HeartbeatRun; adapterType: strin
    distanceFromBottom: Number.POSITIVE_INFINITY,
  });
  const isLive = run.status === "running" || run.status === "queued";

  const { data: workspaceOperations = [] } = useQuery({
    queryKey: queryKeys.runWorkspaceOperations(run.id),
    queryFn: () => heartbeatsApi.workspaceOperations(run.id),
    refetchInterval: isLive ? 2000 : false,
  });

function isRunLogUnavailable(err: unknown): boolean {
  return err instanceof ApiError && err.status === 404;
@@ -2219,6 +2438,7 @@ function LogViewer({ run, adapterType }: { run: HeartbeatRun; adapterType: strin

  return (
    <div className="space-y-3">
      <WorkspaceOperationsSection operations={workspaceOperations} />
      {adapterInvokePayload && (
        <div className="rounded-lg border border-border bg-background/60 p-3 space-y-2">
          <div className="text-xs font-medium text-muted-foreground">Invocation</div>
82
ui/src/pages/ExecutionWorkspaceDetail.tsx
Normal file
@@ -0,0 +1,82 @@
import { Link, useParams } from "@/lib/router";
import { useQuery } from "@tanstack/react-query";
import { ExternalLink } from "lucide-react";
import { executionWorkspacesApi } from "../api/execution-workspaces";
import { queryKeys } from "../lib/queryKeys";

function isSafeExternalUrl(value: string | null | undefined) {
  if (!value) return false;
  try {
    const parsed = new URL(value);
    return parsed.protocol === "http:" || parsed.protocol === "https:";
  } catch {
    return false;
  }
}

function DetailRow({ label, children }: { label: string; children: React.ReactNode }) {
  return (
    <div className="flex items-start gap-3 py-1.5">
      <div className="w-28 shrink-0 text-xs text-muted-foreground">{label}</div>
      <div className="min-w-0 flex-1 text-sm">{children}</div>
    </div>
  );
}

export function ExecutionWorkspaceDetail() {
  const { workspaceId } = useParams<{ workspaceId: string }>();

  const { data: workspace, isLoading, error } = useQuery({
    queryKey: queryKeys.executionWorkspaces.detail(workspaceId!),
    queryFn: () => executionWorkspacesApi.get(workspaceId!),
    enabled: Boolean(workspaceId),
  });

  if (isLoading) return <p className="text-sm text-muted-foreground">Loading...</p>;
  if (error) return <p className="text-sm text-destructive">{error instanceof Error ? error.message : "Failed to load workspace"}</p>;
  if (!workspace) return null;

  return (
    <div className="max-w-2xl space-y-4">
      <div className="space-y-1">
        <div className="text-xs text-muted-foreground">Execution workspace</div>
        <h1 className="text-2xl font-semibold">{workspace.name}</h1>
        <div className="text-sm text-muted-foreground">
          {workspace.status} · {workspace.mode} · {workspace.providerType}
        </div>
      </div>

      <div className="rounded-lg border border-border p-4">
        <DetailRow label="Project">
          {workspace.projectId ? <Link to={`/projects/${workspace.projectId}`} className="hover:underline">{workspace.projectId}</Link> : "None"}
        </DetailRow>
        <DetailRow label="Source issue">
          {workspace.sourceIssueId ? <Link to={`/issues/${workspace.sourceIssueId}`} className="hover:underline">{workspace.sourceIssueId}</Link> : "None"}
        </DetailRow>
        <DetailRow label="Branch">{workspace.branchName ?? "None"}</DetailRow>
        <DetailRow label="Base ref">{workspace.baseRef ?? "None"}</DetailRow>
        <DetailRow label="Working dir">
          <span className="break-all font-mono text-xs">{workspace.cwd ?? "None"}</span>
        </DetailRow>
        <DetailRow label="Provider ref">
          <span className="break-all font-mono text-xs">{workspace.providerRef ?? "None"}</span>
        </DetailRow>
        <DetailRow label="Repo URL">
          {workspace.repoUrl && isSafeExternalUrl(workspace.repoUrl) ? (
            <a href={workspace.repoUrl} target="_blank" rel="noreferrer" className="inline-flex items-center gap-1 hover:underline">
              {workspace.repoUrl}
              <ExternalLink className="h-3 w-3" />
            </a>
          ) : workspace.repoUrl ? (
            <span className="break-all font-mono text-xs">{workspace.repoUrl}</span>
          ) : "None"}
        </DetailRow>
        <DetailRow label="Opened">{new Date(workspace.openedAt).toLocaleString()}</DetailRow>
        <DetailRow label="Last used">{new Date(workspace.lastUsedAt).toLocaleString()}</DetailRow>
        <DetailRow label="Cleanup">
          {workspace.cleanupEligibleAt ? `${new Date(workspace.cleanupEligibleAt).toLocaleString()}${workspace.cleanupReason ? ` · ${workspace.cleanupReason}` : ""}` : "Not scheduled"}
        </DetailRow>
      </div>
    </div>
  );
}
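The `isSafeExternalUrl` guard in the new detail page only renders a live link when the repo URL parses as `http:` or `https:`, so `javascript:` and other schemes fall back to plain monospace text. A standalone restatement of that check (extracted here for illustration; the component itself imports nothing extra for it):

```typescript
// Standalone restatement of the isSafeExternalUrl guard:
// only http: and https: URLs are considered safe to render as anchors.
function isSafeExternalUrl(value: string | null | undefined): boolean {
  if (!value) return false;
  try {
    const parsed = new URL(value); // throws on unparseable input
    return parsed.protocol === "http:" || parsed.protocol === "https:";
  } catch {
    return false;
  }
}

console.log(isSafeExternalUrl("https://example.com/repo.git")); // true
console.log(isSafeExternalUrl("javascript:alert(1)"));          // false
console.log(isSafeExternalUrl("not a url"));                    // false
```

Note the deliberate allowlist shape: checking `protocol` against known-good schemes is safer than trying to denylist dangerous ones.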
102
ui/src/pages/InstanceExperimentalSettings.tsx
Normal file
@@ -0,0 +1,102 @@
import { useEffect, useState } from "react";
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import { FlaskConical } from "lucide-react";
import { instanceSettingsApi } from "@/api/instanceSettings";
import { useBreadcrumbs } from "../context/BreadcrumbContext";
import { queryKeys } from "../lib/queryKeys";
import { cn } from "../lib/utils";

export function InstanceExperimentalSettings() {
  const { setBreadcrumbs } = useBreadcrumbs();
  const queryClient = useQueryClient();
  const [actionError, setActionError] = useState<string | null>(null);

  useEffect(() => {
    setBreadcrumbs([
      { label: "Instance Settings" },
      { label: "Experimental" },
    ]);
  }, [setBreadcrumbs]);

  const experimentalQuery = useQuery({
    queryKey: queryKeys.instance.experimentalSettings,
    queryFn: () => instanceSettingsApi.getExperimental(),
  });

  const toggleMutation = useMutation({
    mutationFn: async (enabled: boolean) =>
      instanceSettingsApi.updateExperimental({ enableIsolatedWorkspaces: enabled }),
    onSuccess: async () => {
      setActionError(null);
      await queryClient.invalidateQueries({ queryKey: queryKeys.instance.experimentalSettings });
    },
    onError: (error) => {
      setActionError(error instanceof Error ? error.message : "Failed to update experimental settings.");
    },
  });

  if (experimentalQuery.isLoading) {
    return <div className="text-sm text-muted-foreground">Loading experimental settings...</div>;
  }

  if (experimentalQuery.error) {
    return (
      <div className="text-sm text-destructive">
        {experimentalQuery.error instanceof Error
          ? experimentalQuery.error.message
          : "Failed to load experimental settings."}
      </div>
    );
  }

  const enableIsolatedWorkspaces = experimentalQuery.data?.enableIsolatedWorkspaces === true;

  return (
    <div className="max-w-4xl space-y-6">
      <div className="space-y-2">
        <div className="flex items-center gap-2">
          <FlaskConical className="h-5 w-5 text-muted-foreground" />
          <h1 className="text-lg font-semibold">Experimental</h1>
        </div>
        <p className="text-sm text-muted-foreground">
          Opt into features that are still being evaluated before they become default behavior.
        </p>
      </div>

      {actionError && (
        <div className="rounded-md border border-destructive/40 bg-destructive/5 px-3 py-2 text-sm text-destructive">
          {actionError}
        </div>
      )}

      <section className="rounded-xl border border-border bg-card p-5">
        <div className="flex items-start justify-between gap-4">
          <div className="space-y-1.5">
            <h2 className="text-sm font-semibold">Enable Isolated Workspaces</h2>
            <p className="max-w-2xl text-sm text-muted-foreground">
              Show execution workspace controls in project configuration and allow isolated workspace behavior for new
              and existing issue runs.
            </p>
          </div>
          <button
            type="button"
            aria-label="Toggle isolated workspaces experimental setting"
            disabled={toggleMutation.isPending}
            className={cn(
              "relative inline-flex h-6 w-11 items-center rounded-full transition-colors disabled:cursor-not-allowed disabled:opacity-60",
              enableIsolatedWorkspaces ? "bg-green-600" : "bg-muted",
            )}
            onClick={() => toggleMutation.mutate(!enableIsolatedWorkspaces)}
          >
            <span
              className={cn(
                "inline-block h-4.5 w-4.5 rounded-full bg-white transition-transform",
                enableIsolatedWorkspaces ? "translate-x-6" : "translate-x-0.5",
              )}
            />
          </button>
        </div>
      </section>
    </div>
  );
}
@@ -591,7 +591,6 @@ export function IssueDetail() {

  // Ancestors are returned nearest-first from the server (immediate parent at start, root at end)
  const ancestors = issue.ancestors ?? [];

  const handleFilePicked = async (evt: ChangeEvent<HTMLInputElement>) => {
    const files = evt.target.files;
    if (!files || files.length === 0) return;
@@ -296,6 +296,12 @@ export function ProjectDetail() {
        pushToast({ title: `"${name}" has been unarchived`, tone: "success" });
      }
    },
    onError: (_, archived) => {
      pushToast({
        title: archived ? "Failed to archive project" : "Failed to unarchive project",
        tone: "error",
      });
    },
  });

  const uploadImage = useMutation({