v2: Forge Console + Open WebUI artifacts + Docker
- web/: Local chat UI (Express + WS → Codex bridge)
- openwebui/: Preset, pipelines, knowledge manifest
- Dockerfile + docker-compose.yml
- Updated README with 3 frontend options
- CLI-agnostic: works with Codex, Claude Code, Kiro, Gemini
.dockerignore (new file, 6 lines)
@@ -0,0 +1,6 @@
web/node_modules
npm-debug.log
.env
bmad/
.standards/
.git

.gitignore (new file, vendored, 15 lines)
@@ -0,0 +1,15 @@
# Dependencies
node_modules/

# Environment
.env

# OS
.DS_Store
Thumbs.db

# BMad repos (cloned at runtime)
bmad/

# Factory standards (cloned at runtime)
.standards/

AGENTS.md (new file, 71 lines)
@@ -0,0 +1,71 @@
Always read README.md before answering any queries or writing code.

# Agent Operating Instructions

You are an AI assistant operating inside the **Reltio PM Factory**. You help Product Managers create epics, draft PR/FAQs, query analytics, and maintain compliance with the Transparent Factory standard.

---

## 0. First Interaction / Startup Hook

When first initialized, perform a silent check:

1. Verify if `.standards/factory/` and `bmad/` directories exist.
2. If they do **not** exist, output:

> 👋 **Welcome to the Reltio PM Factory!**
> It looks like your environment hasn't been initialized yet. Type `new-factory` (or ask me to bootstrap the environment) to download the Transparent Factory Standards, the BMad Creative Suite, and perform MCP health checks.

3. If they **do** exist, output:

> 👋 **Welcome back to the Reltio PM Factory!**
> The Transparent Factory Standards and BMad Suite are loaded. Do you want to draft a new PR/FAQ, create an Epic in Jira, or consult the Creative Squad?

---

## 1. The Transparent Factory

You must adhere to the Reltio Transparent Factory tenets.

- Before proposing technical product requirements or architectural changes, execute `factory_update` from `skills/factory-standards/`.
- Read the PR/FAQ and tenets at `.standards/factory/content/`.
- **Code is for Humans:** If your proposed code, spec, or Epic is not readable by a human engineer in under 60 seconds, revise it. If it violates Elastic Schema rules, reject it.

---

## 2. Available Skills

All skills are in the `/skills/` directory. Each has a `SKILL.md` with capabilities, required env vars, and entry points.

| Skill | Purpose | Entry Point |
|-------|---------|-------------|
| `epics-standards` | Epic creation & Aha! workflows | `scripts/aha_create_epic.py` |
| `factory-standards` | Transparent Factory sync & compliance | `manager.py update` |
| `bmad-suite` | Creative Intelligence Suite | `manager.py list\|update` |
| `gainsight-px` | Product analytics | `gainsight_px.py` |

Before interacting with external APIs, read the relevant `SKILL.md` first.

---

## 3. The `new-factory` Bootstrap Command

When the user asks to initialize, bootstrap, or types `new-factory`:

1. **Check Environment:** Verify `.env` exists. If not, copy from `env.example` and STOP until the user fills it in.
2. **Sync BMad:** `python3 skills/bmad-suite/manager.py update`
3. **Sync Factory Standards:** `python3 skills/factory-standards/manager.py update`
4. **Health Check Jira:** `mcporter --config config/mcporter.json list atlassian`
5. **Health Check Aha!:** `set -a && source .env && set +a && mcporter --config config/mcporter.json list aha`

---

## 4. MCP Servers

Configuration: `config/mcporter.json`

| Server | Purpose |
|--------|---------|
| `atlassian` | Jira issue management |
| `aha` | Aha! roadmap & epics |
| `context7` | Library documentation lookup |

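The silent startup check in section 0 of AGENTS.md reduces to two directory-existence tests. A minimal Python sketch of that logic (only the directory names come from the doc; the function name and the idea of returning the greeting as a string are illustrative assumptions):

```python
from pathlib import Path

def startup_check(repo_root: str = ".") -> str:
    """Return the startup greeting per AGENTS.md section 0.

    Checks whether `.standards/factory/` and `bmad/` exist under repo_root.
    """
    root = Path(repo_root)
    initialized = (root / ".standards" / "factory").is_dir() and (root / "bmad").is_dir()
    if initialized:
        return "Welcome back to the Reltio PM Factory!"
    return "Welcome to the Reltio PM Factory! Type `new-factory` to bootstrap."
```

A real agent would print these messages verbatim (with the emoji and follow-up lines) rather than return them.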
CLAUDE.md (new file, 1 line)
@@ -0,0 +1 @@
Always read README.md before answering any queries or writing code.

Dockerfile (new file, 28 lines)
@@ -0,0 +1,28 @@
FROM node:22-slim

# Install system deps for mcporter and codex
RUN apt-get update && apt-get install -y --no-install-recommends \
    git \
    python3 \
    python3-pip \
    curl \
    && rm -rf /var/lib/apt/lists/*

# Install mcporter and codex globally
RUN npm install -g @anthropic/mcporter @openai/codex

WORKDIR /app

# Copy web deps first for layer caching
COPY web/package.json web/package-lock.json* web/
RUN cd web && npm install --production

# Copy everything else
COPY . .

# Make start script executable
RUN chmod +x start.sh

EXPOSE 3000

CMD ["node", "web/server.js"]

GEMINI.md (new file, 1 line)
@@ -0,0 +1 @@
Always read README.md before answering any queries or writing code.

README.md (new file, 100 lines)
@@ -0,0 +1,100 @@
# Reltio PM Factory

A browser-based AI workspace for Product Managers. No terminal required after initial setup.

**Forge Console** gives you a familiar chat interface backed by [Codex](https://github.com/openai/codex), pre-loaded with Reltio's Transparent Factory standards, BMad Creative Suite, Jira/Aha!/Gainsight integrations, and epic-writing skills.

---

## Quick Start

### Option 1: Docker (recommended)
```bash
git clone <this-repo> pm-template
cd pm-template
cp env.example .env   # fill in your API keys
docker compose up -d
```
Open http://localhost:3000.

### Option 2: Local
```bash
git clone <this-repo> pm-template
cd pm-template
./start.sh
```

On first run, the script checks Node.js, creates `.env`, installs deps, and opens the browser.

---

## Prerequisites (One-Time Setup)

Before the AI agent can connect to your tools, you need:

1. **Git & Bitbucket Access** — A working SSH key configured with Bitbucket.
2. **Jira Authentication** — Run `mcporter auth atlassian` in a terminal once to complete the OAuth flow.
3. **API Keys** — Edit `.env` with your Aha! and Gainsight PX credentials.

Once authenticated, the Forge Console handles everything else.

---

## What's Inside

| Component | Purpose |
|-----------|---------|
| `web/` | Forge Console — local chat UI + Express server |
| `skills/epics-standards/` | Epic creation templates & Aha! workflows |
| `skills/factory-standards/` | Transparent Factory tenet sync & compliance |
| `skills/bmad-suite/` | BMad Creative Intelligence Suite (brainstorming, design thinking, storytelling) |
| `skills/gainsight-px/` | Product analytics queries |
| `config/mcporter.json` | MCP server definitions (Jira, Aha!, Context7) |

---

## For AI Agents

If you are an AI assistant reading this repo, see `AGENTS.md` for your operating instructions.

---

## Frontend Options

This repo is CLI-agnostic. Use whichever interface fits your team:

### Option A: Forge Console (built-in)
A lightweight local chat UI that wraps the Codex CLI.
```bash
./start.sh   # opens http://localhost:3000
```

### Option B: Open WebUI (recommended)
Use the pre-built artifacts in `openwebui/` to connect any Open WebUI instance.
Includes model presets, RAG knowledge manifests, and tool pipelines.
See [`openwebui/SETUP.md`](openwebui/SETUP.md) for details.

### Option C: Any CLI Agent
Open the repo in Cursor, Claude Code, Kiro, Gemini CLI, or any agent that reads `AGENTS.md`.

---

## Architecture

```
┌─────────────────────────────────────────┐
│  Frontend (pick one)                    │
│  • Forge Console (web/)                 │
│  • Open WebUI (openwebui/)              │
│  • CLI Agent (Codex, Claude, Kiro...)   │
└──────────────┬──────────────────────────┘
               ↕
┌──────────────┴──────────────────────────┐
│  LLM (any OpenAI-compatible provider)   │
│  + System Prompt (AGENTS.md)            │
│  + RAG Knowledge (skills/, standards/)  │
│  + Tool Pipelines (mcporter, aha, etc.) │
└──────────────┬──────────────────────────┘
               ↕ MCP
     Jira · Aha! · Gainsight PX
```

config/mcporter.json (new file, 20 lines)
@@ -0,0 +1,20 @@
{
  "mcpServers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    },
    "atlassian": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://mcp.atlassian.com/v1/mcp"]
    },
    "aha": {
      "command": "npx",
      "args": ["-y", "aha-mcp@latest"],
      "env": {
        "AHA_DOMAIN": "${AHA_DOMAIN}",
        "AHA_API_TOKEN": "${AHA_API_KEY}"
      }
    }
  }
}

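The `${AHA_DOMAIN}` and `${AHA_API_KEY}` placeholders in the config above are resolved from the environment at runtime, which is why `.env` must be loaded before mcporter starts. A hypothetical sketch of that interpolation step (mcporter's actual resolution logic may differ; this only illustrates the mechanism):

```python
import os
import re
import json

def interpolate_env(text: str) -> str:
    """Replace ${VAR} placeholders with values from the environment.

    Unset variables become empty strings — a simplifying assumption,
    not necessarily what mcporter does.
    """
    return re.sub(r"\$\{(\w+)\}", lambda m: os.environ.get(m.group(1), ""), text)

# Fragment mirroring the "aha" server's env block above.
config = '{"env": {"AHA_DOMAIN": "${AHA_DOMAIN}", "AHA_API_TOKEN": "${AHA_API_KEY}"}}'
os.environ.setdefault("AHA_DOMAIN", "example.aha.io")
os.environ.setdefault("AHA_API_KEY", "secret")
resolved = json.loads(interpolate_env(config))
```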
docker-compose.yml (new file, 20 lines)
@@ -0,0 +1,20 @@
services:
  forge:
    build: .
    container_name: forge-console
    ports:
      - "3000:3000"
    env_file:
      - .env
    volumes:
      # Mount skills and config so changes persist without rebuild
      - ./skills:/app/skills
      - ./config:/app/config
      - ./bmad:/app/bmad
      - ./.standards:/app/.standards
      # Mount mcporter auth cache so OAuth tokens persist
      - mcporter-auth:/root/.mcporter
    restart: unless-stopped

volumes:
  mcporter-auth:

docs/agents/README.md (new file, 31 lines)
@@ -0,0 +1,31 @@
# Forge Console — Agent Docs

## Jira Agent (`rp-ticket-ops`)

Manages Jira issues via the Atlassian MCP server.

### Capabilities
- Create, update, and transition Jira issues
- Search with JQL
- Add comments and attachments
- Link issues

### Authentication
Uses `mcporter` with OAuth. Run `mcporter auth atlassian` once to authenticate.

### Usage
The agent reads `config/mcporter.json` to connect. No API tokens needed — OAuth handles it.

## MCPorter

MCP client that bridges AI agents to external tool servers.

### Configured Servers
| Server | Purpose |
|--------|---------|
| `atlassian` | Jira via Atlassian's official MCP |
| `aha` | Aha! roadmap management |
| `context7` | Library documentation lookup |

### Config Location
`config/mcporter.json`

docs/agents/jira.md (new file, 327 lines)
@@ -0,0 +1,327 @@
---
name: jira
description: Atlassian/Jira reference for reltio.jira.com, covering cloud ID, projects, issue types, required fields, allowed select values, ADF templates, and mcporter call examples.
---

# Atlassian / JIRA Reference

## Domain
`reltio.jira.com` — use for all browse links, e.g. `https://reltio.jira.com/browse/RP-XXXXX`

## Cloud ID
`444c13e0-0faa-4055-b053-501700bae7b0`

## Current User (Brian Galura)
- account_id: `712020:f70452cf-df7e-4ee3-b65b-66c83566fc3b`
- email: brian.galura@reltio.com

## Projects
- key: `RP` (Reltio Platform), id: `10041`
- key: `ICR` (Infrastructure Change Request), id: `12890`

## Issue Types (RP project)
| Name | ID | Hierarchy | Notes |
|----------------|-------|-----------|----------------------------|
| Epic | 5 | 1 | Parent of stories |
| Story | 6 | 0 | Standard work item |
| Rollout Story | 10535 | 0 | Feature enablement per env |
| Task | 3 | 0 | |
| Bug | 1 | 0 | |
| Sub-Task | 9 | -1 | Child of other issue types |

## Required Fields by Issue Type

### Epic (required)
| Field | Key | Type |
|------------------------|--------------------|-----------------|
| Assignee | `assignee` | user |
| Type of Effort | `customfield_15815`| option (select) |

### Story (required)
| Field | Key | Type |
|------------------------|--------------------|-----------------|
| Assignee | `assignee` | user |
| Found/Requested By | `customfield_11922`| option (select) |
| Affected Documentation | `customfield_12429`| array of options |
| Acceptance Criteria | `customfield_15956`| **ADF** (rich text, must use Atlassian Document Format) |
| Type of Effort | `customfield_15815`| option (select) |

### Task (required)
| Field | Key | Type |
|------------------------|--------------------|-----------------|
| Assignee | `assignee` | user |
| Components | `components` | array of component (`[{"id": "..."}]`) |
| Found/Requested By | `customfield_11922`| option (select) |
| Type of Effort | `customfield_15815`| option (select) |

### Rollout Story (required)
| Field | Key | Type |
|------------------------|--------------------|-----------------|
| Assignee | `assignee` | user |
| Found/Requested By | `customfield_11922`| option (select) |
| Affected Documentation | `customfield_12429`| array of options |
| Execution team | `customfield_13020`| option (select) |
| Type of Effort | `customfield_15815`| option (select) |

### Change Request (ICR project, required)
| Field | Key | Type |
|------------------------|--------------------|-----------------|
| Assignee | `assignee` | user (has default; still set explicitly when possible) |
| Change Start Date/Time | `customfield_15856`| datetime (`YYYY-MM-DDTHH:mm:ss.SSS-0800`) |
| Change End Date/Time | `customfield_15857`| datetime (`YYYY-MM-DDTHH:mm:ss.SSS-0800`) |
| Manager/Peer Reviewer | `customfield_15862`| user |
| Procedure | `customfield_15863`| option (select) |
| Change Category | `customfield_15864`| option (select) |
| Can it be rolled back? | `customfield_16072`| array of options (checkbox) |

## Common Custom Fields (optional but useful)
| Field | Key | Type |
|--------------------|--------------------|-------------------|
| Fix Version | `fixVersions` | array of version |
| Reporter | `reporter` | user (`{"accountId": "..."}`) |
| Story Points | `customfield_10013`| number |
| Confidence Level | `customfield_12520`| option (select) |
| Product Lead | `customfield_15755`| user |
| Engineering Lead | `customfield_15756`| user |
| Start date | `customfield_15541`| date (YYYY-MM-DD) |
| End date | `customfield_15535`| date (YYYY-MM-DD) |
| Sprint | `customfield_10320`| sprint |
| Aha! Reference | `customfield_11820`| string (URL) |
| Security Review | `customfield_15826`| option (select) |

## Allowed Values for Required Select Fields

### Type of Effort (`customfield_15815`)
| Value | ID |
|-------------------------------------------|-------|
| Customer Feature | 18153 |
| Customer Support | 18156 |
| Innovation | 19074 |
| Platform Excellence | 18922 |
| Security Issues / Tech Debt / Maintenance | 18155 |

### Found/Requested By (`customfield_11922`)
| Value | ID |
|----------------------|-------|
| Alert | 18931 |
| Customer Engineering | 13609 |
| Engineering | 10114 |
| FDE | 24812 |
| Other | 17542 |
| Product Management | 10115 |

### Affected Documentation (`customfield_12429`)
| Value | ID |
|--------------------|-------|
| No | 16302 |
| Deprecation Notice | 11710 |
| Help Portal | 10438 |
| Internal Only | 12912 |
| Release Notes | 10437 |

### Execution team (`customfield_13020`) — partial list
| Value | ID |
|---------------------------|-------|
| Cloud Platform (Portugal) | 19040 |
| DevOps | 16211 |
| Documentation | 18367 |
| Persistence | 16700 |
| Performance | 17629 |
| Data Unification | 12811 |
| Match | 15400 |
| IDP | 17906 |
| Console | 17845 |
| Architecture | 18930 |
| CI | 17656 |

### Procedure (`customfield_15863`) — ICR Change Request
| Value | ID |
|-------------|-------|
| Manual Step | 18226 |
| Automated | 18227 |

### Change Category (`customfield_15864`) — ICR Change Request
Default for ICR templates in this skill: `High Risk` (`18229`).

| Value | ID |
|-----------|-------|
| Low Risk | 18228 |
| High Risk | 18229 |
| Emergency | 18230 |

### Can it be rolled back? (`customfield_16072`) — ICR Change Request
| Value | ID |
|-------|-------|
| Yes | 18678 |
| No | 18679 |

### Risk Level (`customfield_16070`) — required when closing
| Value | ID |
|-------------------|-------|
| High-Risk Changes | 18676 |
| Low-Risk Changes | 18677 |
| Not applicable | 19049 |

### Confidence Level (`customfield_12520`)
| Value | ID |
|-------|-------|
| High | 10510 |

### Components (partial list)
| Value | ID |
|---------------|-------|
| Documentation | 10222 |
| QA | 11511 |
| DevOps | 11111 |
| DNS | 20020 |

## Fix Versions (known)
| Version | ID | Release Date |
|------------|-------|--------------|
| 2026.1.0.0 | 28439 | 2026-03-23 |

## ADF (Atlassian Document Format) Templates

### Acceptance Criteria — ordered list
```json
{
  "type": "doc",
  "version": 1,
  "content": [
    {"type": "orderedList", "attrs": {"order": 1}, "content": [
      {"type": "listItem", "content": [{"type": "paragraph", "content": [{"type": "text", "text": "Criteria item 1"}]}]},
      {"type": "listItem", "content": [{"type": "paragraph", "content": [{"type": "text", "text": "Criteria item 2"}]}]}
    ]}
  ]
}
```

### Generic paragraph ADF (use for rich-text custom fields)
```json
{
  "type": "doc",
  "version": 1,
  "content": [
    {
      "type": "paragraph",
      "content": [
        {"type": "text", "text": "No expected customer impact."}
      ]
    }
  ]
}
```
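The Acceptance Criteria template above is mechanical enough to generate. A sketch of a small helper (the function name is an assumption; the emitted structure mirrors the template) that produces the same ADF shape for any list of plain-text items:

```python
def adf_ordered_list(items):
    """Build an ADF doc containing one ordered list of plain-text items,
    matching the Acceptance Criteria template above."""
    return {
        "type": "doc",
        "version": 1,
        "content": [{
            "type": "orderedList",
            "attrs": {"order": 1},
            "content": [
                {"type": "listItem",
                 "content": [{"type": "paragraph",
                              "content": [{"type": "text", "text": item}]}]}
                for item in items
            ],
        }],
    }
```

The result can be passed directly as the value of `customfield_15956` in a `--args` JSON payload.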
## Common Transitions
| Name | ID | Target Status |
|--------------------------|-----|---------------|
| Selected for Development | 501 | Prioritized |
| Start Progress | 11 | In Progress |
| Resolve | 31 | Resolved |
| In Design | 471 | In Design |
| Close | 351 | Closed |

**Close transition requires:** `fixVersions` and `customfield_16070` (Risk Level).

### Example: Close a ticket
```bash
mcporter call atlassian.transitionJiraIssue --args '{
  "cloudId": "444c13e0-0faa-4055-b053-501700bae7b0",
  "issueIdOrKey": "RP-XXXXX",
  "transition": {"id": "351"},
  "fields": {
    "fixVersions": [{"id": "28439"}],
    "customfield_16070": {"id": "19049"}
  }
}'
```

## Example: Create an Epic
```bash
mcporter call atlassian.createJiraIssue --args '{
  "cloudId": "444c13e0-0faa-4055-b053-501700bae7b0",
  "projectKey": "RP",
  "issueTypeName": "Epic",
  "summary": "Epic Title",
  "description": "Markdown description",
  "assignee_account_id": "712020:f70452cf-df7e-4ee3-b65b-66c83566fc3b",
  "additional_fields": {
    "reporter": {"accountId": "712020:f70452cf-df7e-4ee3-b65b-66c83566fc3b"},
    "fixVersions": [{"id": "28439"}],
    "customfield_15815": {"id": "18153"}
  }
}'
```

## Example: Create a Task
```bash
mcporter call atlassian.createJiraIssue --args '{
  "cloudId": "444c13e0-0faa-4055-b053-501700bae7b0",
  "projectKey": "RP",
  "issueTypeName": "Task",
  "summary": "Task Title",
  "description": "Task description",
  "assignee_account_id": "712020:f70452cf-df7e-4ee3-b65b-66c83566fc3b",
  "additional_fields": {
    "components": [{"id": "10222"}],
    "customfield_15815": {"id": "18922"},
    "customfield_11922": {"id": "10115"}
  }
}'
```

## Example: Create a Story under an Epic
```bash
mcporter call atlassian.createJiraIssue --args '{
  "cloudId": "444c13e0-0faa-4055-b053-501700bae7b0",
  "projectKey": "RP",
  "issueTypeName": "Story",
  "summary": "Story Title",
  "description": "Markdown description with ## Why / ## What / ## How",
  "assignee_account_id": "712020:f70452cf-df7e-4ee3-b65b-66c83566fc3b",
  "parent": "RP-XXXXX",
  "additional_fields": {
    "reporter": {"accountId": "712020:f70452cf-df7e-4ee3-b65b-66c83566fc3b"},
    "fixVersions": [{"id": "28439"}],
    "customfield_15815": {"id": "18153"},
    "customfield_11922": {"id": "10114"},
    "customfield_12429": [{"id": "16302"}],
    "customfield_15956": {"type":"doc","version":1,"content":[{"type":"orderedList","attrs":{"order":1},"content":[{"type":"listItem","content":[{"type":"paragraph","content":[{"type":"text","text":"AC item"}]}]}]}]}
  }
}'
```

## Example: Create an ICR Change Request
```bash
mcporter call atlassian.createJiraIssue --args '{
  "cloudId": "444c13e0-0faa-4055-b053-501700bae7b0",
  "projectKey": "ICR",
  "issueTypeName": "Change Request",
  "summary": "Re-add na11 IP 34.36.175.121 to na07 latency-based Cloud DNS records (na07-compute)",
  "description": "Implementation, verification, impact, and rollback details",
  "assignee_account_id": "712020:f70452cf-df7e-4ee3-b65b-66c83566fc3b",
  "additional_fields": {
    "components": [{"id": "20020"}],
    "customfield_15856": "2026-02-26T17:00:00.000-0800",
    "customfield_15857": "2026-02-26T18:00:00.000-0800",
    "customfield_15862": {"accountId": "712020:f70452cf-df7e-4ee3-b65b-66c83566fc3b"},
    "customfield_15863": {"id": "18226"},
    "customfield_15864": {"id": "18229"},
    "customfield_16072": [{"id": "18678"}],
    "customfield_15858": {
      "type": "doc",
      "version": 1,
      "content": [
        {"type": "paragraph", "content": [{"type": "text", "text": "No expected customer impact."}]}
      ]
    },
    "customfield_15859": {
      "type": "doc",
      "version": 1,
      "content": [
        {"type": "paragraph", "content": [{"type": "text", "text": "Remove na11 IP 34.36.175.121 from affected latency-based record sets."}]}
      ]
    }
  }
}'
```

docs/agents/mcporter.md (new file, 44 lines)
@@ -0,0 +1,44 @@
---
name: mcporter
description: Use the mcporter CLI to list, configure, auth, and call MCP servers/tools directly (HTTP or stdio), including ad-hoc servers, config edits, and CLI/type generation.
homepage: http://mcporter.dev
---

# mcporter

Use `mcporter` to work with MCP servers directly.

⚠️ **CRITICAL EXECUTION RULE:**
Whenever you run `mcporter`, you MUST source the local `.env` file first so `mcporter.json` can interpolate the required API keys. Always run it as a chained command like this:

`set -a && source .env && set +a && mcporter --config config/mcporter.json call <server.tool> key=value`

Quick start
- `mcporter list`
- `mcporter list <server> --schema`
- `mcporter call <server.tool> key=value`

Call tools
- Selector: `mcporter call linear.list_issues team=ENG limit:5`
- Function syntax: `mcporter call "linear.create_issue(title: \"Bug\")"`
- Full URL: `mcporter call https://api.example.com/mcp.fetch url:https://example.com`
- Stdio: `mcporter call --stdio "bun run ./server.ts" scrape url=https://example.com`
- JSON payload: `mcporter call <server.tool> --args '{"limit":5}'`

Auth + config
- OAuth: `mcporter auth <server | url> [--reset]`
- Config: `mcporter config list|get|add|remove|import|login|logout`

Daemon
- `mcporter daemon start|status|stop|restart`

Codegen
- CLI: `mcporter generate-cli --server <name>` or `--command <url>`
- Inspect: `mcporter inspect-cli <path> [--json]`
- TS: `mcporter emit-ts <server> --mode client|types`

Notes
- The config file is at `config/mcporter.json`; you may need a different relative path depending on your working directory.
- Prefer `--output json` for machine-readable results.
- Always use `--args '{...}'` (JSON payload) for Atlassian calls with complex/nested fields.

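The chained `set -a && source .env && set +a` pattern above can be mirrored in Python when a script needs to hand the keys to an `mcporter` subprocess. A deliberately minimal sketch for simple `KEY="value"` lines (not a full dotenv parser — no `export` handling, no multi-line values; the function name is an assumption):

```python
import os

def load_env(text: str) -> dict:
    """Parse simple KEY="value" lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip().strip('"')
    return env

# Load the keys into the current process so a subprocess.run(["mcporter", ...])
# call would inherit them, like the shell chain does.
loaded = load_env('AHA_API_KEY="abc123"\n# comment\nAHA_DOMAIN="x.aha.io"\n')
os.environ.update(loaded)
```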
docs/agents/rp-ticket-ops.md (new file, 38 lines)
@@ -0,0 +1,38 @@
---
name: rp-ticket-ops
description: Create and update RP Jira tickets in reltio.jira.com with the standard IaC setup. Use when you need to batch-create or normalize RP tasks to match the reference ticket (labels, sprint, status, assignee, and verification).
---

# RP Ticket Ops

## Jira Context
- Domain: `https://reltio.jira.com`
- Cloud ID: `444c13e0-0faa-4055-b053-501700bae7b0`
- Project: `RP`
- Reference ticket: `RP-175518`

## Standard RP IaC Values
- Sprint field key: `customfield_10320`
- Sprint value shape: numeric sprint id in `fields` (example: `4936`)
- Sprint in this workflow: `4936` (`IAC Sprint 8`)
- Labels (the 3 standard labels, matching the verification JQL):
  - `Project_Cloud_Platform_IaC`
  - `Project_Cloud_Platform_IaC_v0.4`
  - `qa-foxtrot`

## Update Pattern
1. Find all target issues by key or summary.
2. Ensure labels are set exactly to the 3 standard labels.
3. Set sprint with:
   - `fields: { "customfield_10320": 4936 }`
4. Verify with JQL.

## Verification JQL
- Sprint check:
  - `key in (...) AND sprint = 4936`
- Assignee check:
  - `key in (...) AND assignee = "Yevhen Fesyk"`
- Status + labels check:
  - `key in (...) AND status = "Prioritized" AND labels in ("Project_Cloud_Platform_IaC","Project_Cloud_Platform_IaC_v0.4","qa-foxtrot")`

## Notes
- For sprint assignment, prefer the numeric `fields.customfield_10320` payload.
- If a sprint is not visible on the board after an update, re-check via JQL first; board views can lag.

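The `key in (...)` verification checks above are easy to build programmatically when batch-verifying many tickets. A sketch (the function name and defaulting to sprint `4936` are assumptions for illustration):

```python
def sprint_check_jql(keys, sprint_id=4936):
    """Build the sprint-verification JQL for a batch of issue keys."""
    key_list = ", ".join(keys)
    return f"key in ({key_list}) AND sprint = {sprint_id}"
```

The resulting string can be passed to a JQL search via the Atlassian MCP server.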
27
env.example
Normal file
@@ -0,0 +1,27 @@
# ==========================================
# PM Template - Environment Variables
# ==========================================
# Copy this file to .env and fill in the values below.
# Do NOT commit your actual .env file to version control.

# ------------------------------------------
# Aha! Integration
# ------------------------------------------
# Generate this at: [Aha! Settings URL or instructions]
AHA_API_KEY="your_aha_api_key_here"
AHA_DOMAIN="your_company.aha.io"

# ------------------------------------------
# Gainsight PX Integration
# ------------------------------------------
# Generate this in Gainsight PX: Administration -> REST API
GAINSIGHT_PX_API_KEY="your_gainsight_px_api_key_here"
GAINSIGHT_PX_REGION="US" # Set to 'EU' if hosted in Europe

# ------------------------------------------
# Jira / Atlassian Integration
# ------------------------------------------
# We use the 'mcporter' CLI with the Atlassian MCP server for Jira.
# You do NOT need a static API token here.
# Instead, run the following command in your terminal to authenticate:
# mcporter auth atlassian
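The file uses plain `KEY="value"` lines with `#` comments; a naive loader sketch (not part of the repo, and deliberately simpler than real dotenv libraries) shows how such a file is consumed:

```python
def parse_env(text):
    """Parse simple KEY="value" lines, skipping comments and blanks.

    Naive: assumes values never contain ' #', which holds for the
    env.example shown above but not for arbitrary dotenv files.
    """
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        key, _, value = line.partition("=")
        # Drop an inline comment, then surrounding quotes
        value = value.split(" #")[0].strip().strip('"').strip("'")
        env[key.strip()] = value
    return env

sample = 'AHA_DOMAIN="your_company.aha.io"\n# comment\nGAINSIGHT_PX_REGION="US"  # Set to EU if needed'
print(parse_env(sample)["AHA_DOMAIN"])
# → your_company.aha.io
```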
61
openwebui/SETUP.md
Normal file
@@ -0,0 +1,61 @@
# Open WebUI Integration

Drop-in configuration to use the PM Factory repo with [Open WebUI](https://github.com/open-webui/open-webui).

## Quick Setup

### 1. Connect Your Model Provider

In Open WebUI → Settings → Connections, add:

| Field | Value |
|-------|-------|
| URL | `http://<your-kiro-gateway>:8000/v1` |
| API Key | Your gateway API key |
| Model | `claude-opus-4.6` |

Any OpenAI-compatible provider works (Kiro Gateway, LiteLLM, Ollama, etc.).

### 2. Import the Preset

Go to Workspace → Models → Import, and upload `preset.json`.

This creates a "Reltio PM Factory" model preset with the full system prompt from AGENTS.md baked in.

### 3. Upload Knowledge (Optional)

Go to Workspace → Knowledge → Create Collection called "PM Factory".

Upload these directories as documents:
- `skills/epics-standards/references/`
- `skills/factory-standards/` (after running `manager.py update`)
- `skills/bmad-suite/` (after running `manager.py update`)

Then attach the collection to your preset in Model settings → Knowledge.

### 4. Install Pipelines (Optional)

Pipelines let the model execute tools (Jira, Aha!, Gainsight) directly.

Copy the files from `pipelines/` into your Open WebUI pipelines directory, or upload them via the Pipelines UI.

Required env vars (set in Open WebUI → Settings → Pipelines):
- `AHA_API_KEY`
- `AHA_DOMAIN`
- `GAINSIGHT_PX_API_KEY`
- `MCPORTER_CONFIG` — path to `config/mcporter.json`

---

## Architecture

```
Open WebUI (browser)
        ↕ OpenAI-compatible API
Any LLM Provider (Kiro / Ollama / LiteLLM / OpenAI)
        + System Prompt (preset.json ← AGENTS.md)
        + RAG Knowledge (skills docs)
        + Pipelines (mcporter, aha, gainsight, bmad)
```

The repo remains CLI-agnostic. This is just one frontend option.
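Step 1 can be sanity-checked outside the UI: every OpenAI-compatible gateway accepts a `POST` to `/v1/chat/completions` with a body like the one below. A minimal sketch, assuming the placeholder gateway URL and model name from the table above (not verified endpoints):

```python
import json

def chat_payload(model, user_message, system=None):
    """Build a minimal OpenAI-compatible chat completion request body."""
    messages = []
    if system:
        messages.append({"role": "system", "content": system})
    messages.append({"role": "user", "content": user_message})
    return {"model": model, "messages": messages}

body = chat_payload("claude-opus-4.6", "ping")
print(json.dumps(body))
# POST this to http://<your-kiro-gateway>:8000/v1/chat/completions
# with an "Authorization: Bearer <your gateway API key>" header.
```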
35
openwebui/knowledge.json
Normal file
@@ -0,0 +1,35 @@
{
  "name": "PM Factory Knowledge Base",
  "description": "Skills documentation and reference materials for the Reltio PM Factory.",
  "collections": [
    {
      "name": "Transparent Factory Standards",
      "description": "The five tenets of the Transparent Factory engineering standard.",
      "source": ".standards/factory/content/",
      "note": "Run 'python3 skills/factory-standards/manager.py update' first to populate this directory."
    },
    {
      "name": "Epic Standards",
      "description": "Templates and checklists for creating well-structured epics.",
      "files": [
        "skills/epics-standards/references/aha-epic-workflow.md",
        "skills/epics-standards/references/epic-fields-checklist.md",
        "skills/epics-standards/SKILL.md"
      ]
    },
    {
      "name": "BMad Creative Suite",
      "description": "Agent definitions and workflows for brainstorming, design thinking, and storytelling.",
      "source": "bmad/creative-intelligence-suite/docs/",
      "note": "Run 'python3 skills/bmad-suite/manager.py update' first to populate this directory."
    },
    {
      "name": "Gainsight PX",
      "description": "Product analytics skill documentation.",
      "files": [
        "skills/gainsight-px/SKILL.md"
      ]
    }
  ],
  "upload_instructions": "In Open WebUI: Workspace → Knowledge → Create Collection. Upload the files listed for each collection. Then attach the collections to the 'Reltio PM Factory' model preset."
}
87
openwebui/pipelines/aha_pipeline.py
Normal file
@@ -0,0 +1,87 @@
"""
title: Aha! Pipeline
author: PM Factory
version: 0.1.0
description: Create epics and features in Aha! via the MCP server.
requirements: subprocess
"""

import subprocess
import json
import os
from typing import Optional

MCPORTER_CONFIG = os.environ.get("MCPORTER_CONFIG", "config/mcporter.json")


class Tools:
    def __init__(self):
        self.valves = self.Valves()

    class Valves:
        MCPORTER_CONFIG: str = MCPORTER_CONFIG
        AHA_DOMAIN: str = os.environ.get("AHA_DOMAIN", "")
        AHA_API_KEY: str = os.environ.get("AHA_API_KEY", "")

    def aha_create_epic(self, product: str, name: str, description: str, workflow_status: str = "New", __user__: dict = {}) -> str:
        """
        Create an epic in Aha!

        :param product: Aha! product key (e.g. 'PLAT')
        :param name: Epic name
        :param description: Epic description (supports HTML)
        :param workflow_status: Initial status. Default: New
        :return: Created epic reference and URL
        """
        env = os.environ.copy()
        env["AHA_DOMAIN"] = self.valves.AHA_DOMAIN
        env["AHA_API_TOKEN"] = self.valves.AHA_API_KEY

        params = {
            "product": product,
            "name": name,
            "description": description,
            "workflow_status": workflow_status
        }
        try:
            result = subprocess.run(
                [
                    "mcporter",
                    "--config", self.valves.MCPORTER_CONFIG,
                    "call", "aha", "create_epic",
                    "--params", json.dumps(params)
                ],
                capture_output=True, text=True, timeout=30, env=env
            )
            if result.returncode != 0:
                return f"Error: {result.stderr.strip()}"
            return result.stdout.strip()
        except Exception as e:
            return f"Error: {str(e)}"

    def aha_list_features(self, product: str, __user__: dict = {}) -> str:
        """
        List features for an Aha! product.

        :param product: Aha! product key
        :return: JSON list of features
        """
        env = os.environ.copy()
        env["AHA_DOMAIN"] = self.valves.AHA_DOMAIN
        env["AHA_API_TOKEN"] = self.valves.AHA_API_KEY

        try:
            result = subprocess.run(
                [
                    "mcporter",
                    "--config", self.valves.MCPORTER_CONFIG,
                    "call", "aha", "list_features",
                    "--params", json.dumps({"product": product})
                ],
                capture_output=True, text=True, timeout=30, env=env
            )
            if result.returncode != 0:
                return f"Error: {result.stderr.strip()}"
            return result.stdout.strip()
        except Exception as e:
            return f"Error: {str(e)}"
117
openwebui/pipelines/bmad_factory_pipeline.py
Normal file
@@ -0,0 +1,117 @@
"""
title: BMad & Factory Pipeline
author: PM Factory
version: 0.1.0
description: Brainstorm with BMad Creative Suite and validate against Transparent Factory tenets.
requirements: subprocess
"""

import subprocess
import json
import os
from typing import Optional

BMAD_PATH = os.environ.get("BMAD_PATH", "bmad")
FACTORY_PATH = os.environ.get("FACTORY_PATH", ".standards/factory/content")


class Tools:
    def __init__(self):
        self.valves = self.Valves()

    class Valves:
        BMAD_PATH: str = BMAD_PATH
        FACTORY_PATH: str = FACTORY_PATH

    def bmad_list_agents(self, __user__: dict = {}) -> str:
        """
        List available BMad Creative Intelligence Suite agents and their capabilities.

        :return: List of agents with descriptions
        """
        agents = {
            "Carson (Brainstorming Coach)": {
                "command": "/cis-brainstorm",
                "capabilities": "36 ideation techniques, group dynamics, 'Yes, and...' methodology"
            },
            "Maya (Design Thinking Coach)": {
                "command": "/cis-design-thinking",
                "capabilities": "Five-phase design thinking, empathy mapping, rapid prototyping"
            },
            "Victor (Innovation Strategist)": {
                "command": "/cis-innovation-strategy",
                "capabilities": "Jobs-to-be-Done, Blue Ocean Strategy, Business Model Canvas"
            },
            "Dr. Quinn (Creative Problem Solver)": {
                "command": "/cis-problem-solve",
                "capabilities": "Root cause analysis, systematic diagnosis, solution frameworks"
            },
            "Storyteller": {
                "command": "/cis-story",
                "capabilities": "PR/FAQ drafting, narrative structure, stakeholder communication"
            }
        }
        return json.dumps(agents, indent=2)

    def bmad_brainstorm(self, topic: str, technique: str = "auto", __user__: dict = {}) -> str:
        """
        Run a brainstorming session using BMad's Carson agent.

        :param topic: The topic or problem to brainstorm about
        :param technique: Brainstorming technique (auto, scamper, reverse, starbursting, six-hats, etc). Default: auto
        :return: Brainstorming session output
        """
        prompt = f"""You are Carson, the Brainstorming Coach from the BMad Creative Intelligence Suite.

Run a brainstorming session on this topic: {topic}

Technique: {technique if technique != 'auto' else 'Choose the best technique for this topic.'}

Generate:
1. 8-12 diverse ideas using the selected technique
2. For each idea: one sentence description + feasibility rating (1-5)
3. Top 3 recommendations with brief rationale
4. One wild/moonshot idea that breaks assumptions

Use "Yes, and..." methodology. Celebrate bold ideas."""

        return prompt

    def factory_check(self, spec_text: str, __user__: dict = {}) -> str:
        """
        Validate a specification or epic against the Transparent Factory tenets.

        :param spec_text: The spec, epic, or requirement text to validate
        :return: Compliance report with pass/fail per tenet and recommendations
        """
        tenets = {
            "Atomic Flagging": {
                "rule": "All new features must be behind feature flags with 14-day TTL. Must use OpenFeature SDK.",
                "check": "Does the spec mention feature flags? Is there a TTL or rollout plan?"
            },
            "Elastic Schema": {
                "rule": "Schema changes must be additive-only. Breaking changes require sync dual-write with 30-day migration SLA.",
                "check": "Does the spec propose schema changes? Are they additive? Is there a migration plan?"
            },
            "Cognitive Durability": {
                "rule": "All architectural decisions must have ADR logs. Code must be readable in 60 seconds.",
                "check": "Does the spec reference ADRs? Is the proposed design simple enough to explain quickly?"
            },
            "Semantic Observability": {
                "rule": "AI reasoning must emit structured telemetry spans. Decisions must be traceable.",
                "check": "Does the spec include observability requirements? Are reasoning spans defined?"
            },
            "Configurable Autonomy": {
                "rule": "AI agent actions must have governance guardrails. Human-in-the-loop for destructive operations.",
                "check": "Does the spec define autonomy boundaries? Are there approval gates?"
            }
        }

        report = "# Transparent Factory Compliance Check\n\n"
        report += f"**Input:** {spec_text[:200]}{'...' if len(spec_text) > 200 else ''}\n\n"
        report += "| Tenet | Rule | Question |\n|-------|------|----------|\n"
        for name, t in tenets.items():
            report += f"| {name} | {t['rule']} | {t['check']} |\n"
        report += "\n*Review each tenet against the spec and flag violations.*"

        return report
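The report built by `factory_check` is just a markdown table assembled row by row; the same assembly logic, isolated as a standalone sketch so it can be exercised without the Open WebUI `Tools` class:

```python
def compliance_table(tenets):
    """Render the tenet/rule/question markdown table like factory_check does."""
    rows = ["| Tenet | Rule | Question |", "|-------|------|----------|"]
    for name, t in tenets.items():
        rows.append(f"| {name} | {t['rule']} | {t['check']} |")
    return "\n".join(rows)

# One tenet, abbreviated, just to show the shape
demo = {"Atomic Flagging": {"rule": "Flags with 14-day TTL.", "check": "Are flags mentioned?"}}
print(compliance_table(demo))
```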
76
openwebui/pipelines/gainsight_pipeline.py
Normal file
@@ -0,0 +1,76 @@
"""
title: Gainsight PX Pipeline
author: PM Factory
version: 0.1.0
description: Query Gainsight PX product analytics.
requirements: requests
"""

import os
import json
import requests
from typing import Optional


class Tools:
    def __init__(self):
        self.valves = self.Valves()

    class Valves:
        GAINSIGHT_PX_API_KEY: str = os.environ.get("GAINSIGHT_PX_API_KEY", "")
        GAINSIGHT_PX_REGION: str = os.environ.get("GAINSIGHT_PX_REGION", "US")

    def _base_url(self) -> str:
        if self.valves.GAINSIGHT_PX_REGION.upper() == "EU":
            return "https://api-eu.aptrinsic.com/v1"
        return "https://api.aptrinsic.com/v1"

    def _headers(self) -> dict:
        return {
            "X-APTRINSIC-API-KEY": self.valves.GAINSIGHT_PX_API_KEY,
            "Content-Type": "application/json"
        }

    def gainsight_query_users(self, filter_expression: str = "", page_size: int = 20, __user__: dict = {}) -> str:
        """
        Query Gainsight PX users/accounts.

        :param filter_expression: Optional filter (e.g. 'propertyName=value')
        :param page_size: Number of results to return. Default: 20
        :return: JSON string of user data
        """
        try:
            params = {"pageSize": page_size}
            if filter_expression:
                params["filter"] = filter_expression

            resp = requests.get(
                f"{self._base_url()}/users",
                headers=self._headers(),
                params=params,
                timeout=15
            )
            resp.raise_for_status()
            return json.dumps(resp.json(), indent=2)
        except Exception as e:
            return f"Error: {str(e)}"

    def gainsight_feature_usage(self, feature_id: str, days: int = 30, __user__: dict = {}) -> str:
        """
        Get feature usage stats from Gainsight PX.

        :param feature_id: The feature ID to query
        :param days: Lookback period in days. Default: 30
        :return: JSON string of usage data
        """
        try:
            resp = requests.get(
                f"{self._base_url()}/feature/{feature_id}/usage",
                headers=self._headers(),
                params={"days": days},
                timeout=15
            )
            resp.raise_for_status()
            return json.dumps(resp.json(), indent=2)
        except Exception as e:
            return f"Error: {str(e)}"
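The region switch in `_base_url` is the only Gainsight-specific routing in the pipeline; extracted as a pure function (hostnames copied from the file above) it is trivial to verify:

```python
def gainsight_base_url(region: str) -> str:
    """Map a Gainsight PX region code to its API base URL (case-insensitive)."""
    if region.upper() == "EU":
        return "https://api-eu.aptrinsic.com/v1"
    return "https://api.aptrinsic.com/v1"

print(gainsight_base_url("eu"))
# → https://api-eu.aptrinsic.com/v1
```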
77
openwebui/pipelines/jira_pipeline.py
Normal file
@@ -0,0 +1,77 @@
"""
title: Jira Pipeline
author: PM Factory
version: 0.1.0
description: Search and create Jira issues via mcporter + Atlassian MCP.
requirements: subprocess
"""

import subprocess
import json
import os
from typing import Optional

MCPORTER_CONFIG = os.environ.get("MCPORTER_CONFIG", "config/mcporter.json")


class Tools:
    def __init__(self):
        self.valves = self.Valves()

    class Valves:
        MCPORTER_CONFIG: str = MCPORTER_CONFIG

    def jira_search(self, jql: str, __user__: dict = {}) -> str:
        """
        Search Jira issues using JQL.

        :param jql: JQL query string (e.g. 'project = PLAT AND status = "In Progress"')
        :return: JSON string of matching issues
        """
        try:
            result = subprocess.run(
                [
                    "mcporter",
                    "--config", self.valves.MCPORTER_CONFIG,
                    "call", "atlassian", "jira_search",
                    "--params", json.dumps({"jql": jql})
                ],
                capture_output=True, text=True, timeout=30
            )
            if result.returncode != 0:
                return f"Error: {result.stderr.strip()}"
            return result.stdout.strip()
        except Exception as e:
            return f"Error: {str(e)}"

    def jira_create(self, project: str, summary: str, description: str, issue_type: str = "Story", __user__: dict = {}) -> str:
        """
        Create a Jira issue.

        :param project: Jira project key (e.g. 'PLAT')
        :param summary: Issue title
        :param description: Issue description (markdown supported)
        :param issue_type: Issue type (Story, Bug, Epic, Task). Default: Story
        :return: Created issue key and URL
        """
        params = {
            "project": project,
            "summary": summary,
            "description": description,
            "issueType": issue_type
        }
        try:
            result = subprocess.run(
                [
                    "mcporter",
                    "--config", self.valves.MCPORTER_CONFIG,
                    "call", "atlassian", "jira_create_issue",
                    "--params", json.dumps(params)
                ],
                capture_output=True, text=True, timeout=30
            )
            if result.returncode != 0:
                return f"Error: {result.stderr.strip()}"
            return result.stdout.strip()
        except Exception as e:
            return f"Error: {str(e)}"
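Both Jira tools (and the Aha! pipeline) shell out to the same `mcporter call` command shape; factoring the argv construction into one helper makes that contract visible and testable without invoking mcporter itself. A sketch, assuming the CLI flags used in the file above:

```python
import json

def mcporter_argv(config, server, tool, params):
    """Build the mcporter command line used by the pipeline tools."""
    return [
        "mcporter",
        "--config", config,
        "call", server, tool,
        "--params", json.dumps(params),
    ]

argv = mcporter_argv("config/mcporter.json", "atlassian", "jira_search",
                     {"jql": "project = PLAT"})
print(argv[4])
# → atlassian
```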
17
openwebui/preset.json
Normal file
@@ -0,0 +1,17 @@
{
  "name": "Reltio PM Factory",
  "description": "AI workspace for Product Managers — epics, PR/FAQs, analytics, and Transparent Factory compliance.",
  "base_model_id": "claude-opus-4.6",
  "params": {
    "system": "You are an AI assistant operating inside the **Reltio PM Factory**. You help Product Managers create epics, draft PR/FAQs, query analytics, and maintain compliance with the Transparent Factory standard.\n\n## First Interaction\nWhen the user first messages you, greet them:\n> 👋 **Welcome to the Reltio PM Factory!**\n> I can help you draft PR/FAQs, create Jira Epics, query Gainsight analytics, brainstorm with the Creative Squad, or check Transparent Factory compliance. What would you like to do?\n\n## The Transparent Factory\nYou must adhere to the Reltio Transparent Factory tenets:\n- **Atomic Flagging:** 14-day TTL, OpenFeature-based feature flags.\n- **Elastic Schema:** Additive-only changes, sync dual-write, 30-day migration SLA.\n- **Cognitive Durability:** ADR decision logs, code readable in 60 seconds.\n- **Semantic Observability:** Reasoning spans, structured telemetry.\n- **Configurable Autonomy:** Governance guardrails for AI agents.\n\nIf a proposed spec or epic violates these tenets, flag it and suggest corrections.\n\n## Available Tools\nYou have access to these pipeline tools:\n- **jira_search** / **jira_create** — Search and create Jira issues via Atlassian MCP\n- **aha_create_epic** — Create epics in Aha! with proper field structure\n- **gainsight_query** — Query Gainsight PX for product analytics\n- **bmad_brainstorm** — Run a brainstorming session using BMad Creative Intelligence Suite\n- **factory_check** — Validate a spec against Transparent Factory tenets\n\nUse these tools when the user's request maps to an external action. Always confirm before creating or modifying external resources.\n\n## Style\n- Be concise and direct.\n- Use tables and bullet points for structured data.\n- When drafting epics or PR/FAQs, follow the templates in your knowledge base.\n- If unsure about a tenet, say so rather than guessing.",
    "temperature": 0.7,
    "max_tokens": 4096
  },
  "meta": {
    "profile_image_url": "🏭",
    "capabilities": {
      "vision": false
    },
    "tags": ["pm", "reltio", "factory"]
  }
}
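The `params.system` value is a JSON-escaped copy of AGENTS.md. Rather than hand-escaping newlines after every AGENTS.md edit, the preset can be regenerated; a minimal sketch (the file path and output name are assumptions, not part of the repo):

```python
import json

def build_preset(system_prompt, model="claude-opus-4.6"):
    """Assemble a minimal Open WebUI preset with the prompt baked in.

    json.dumps handles the newline/quote escaping that would otherwise
    be done by hand when pasting AGENTS.md into preset.json.
    """
    return {
        "name": "Reltio PM Factory",
        "base_model_id": model,
        "params": {"system": system_prompt, "temperature": 0.7, "max_tokens": 4096},
    }

# In the repo you would read AGENTS.md; a stub prompt stands in here.
preset = build_preset("You are an AI assistant...")
print(json.dumps(preset)[:30])
```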
47
skills/bmad-suite/SKILL.md
Normal file
@@ -0,0 +1,47 @@
---
name: bmad-suite
description: Manage, update, and deploy BMad workflows/agents.
tools:
  - name: bmad_update
    description: Pull latest updates or clone missing repositories for BMad Suite.
    entry:
      type: python
      path: manager.py
      args: ["update"]
  - name: bmad_list
    description: List available workflows/agents in the suite.
    entry:
      type: python
      path: manager.py
      args: ["list"]
---

# BMad Creative Suite Manager

This skill manages the **BMad Suite** ecosystem, handling installation (git clone) and updates (git pull).

## Capabilities
- **Update/Install:** Automatically clones repositories if missing, or pulls latest changes if present.
- **List:** Enumerates available agents and workflows across all modules.

## Documentation Sources
Refer to these files for detailed usage, architecture, and agent definitions:

### 1. Framework
- **Core Documentation:** `framework/README.md`
- **Agent Definitions:** `framework/src/agents/`

### 2. Creative Intelligence Suite (CIS)
- **Agent Catalog:** `creative-intelligence-suite/docs/reference/agents.md`
- **Main Documentation:** `creative-intelligence-suite/README.md`
- **Agent Definitions:** `creative-intelligence-suite/src/agents/*.agent.yaml`

### 3. Test Architecture Enterprise (TEA)
- **Main Documentation:** `test-architecture-enterprise/README.md`
- **Workflows:** `test-architecture-enterprise/src/workflows/testarch/README.md`

## Repositories
Managed repositories (auto-cloned to `../../bmad/` relative to this skill, or `$BMAD_PATH`):
1. **Framework:** `bmad-code-org/BMAD-METHOD`
2. **Creative Intelligence Suite:** `bmad-code-org/bmad-module-creative-intelligence-suite`
3. **Test Architecture Enterprise (TEA):** `bmad-code-org/bmad-method-test-architecture-enterprise`
93
skills/bmad-suite/manager.py
Normal file
@@ -0,0 +1,93 @@
#!/usr/bin/env python3
import os
import sys
import subprocess

# Determine base path: use BMAD_PATH env var, or default to ../../bmad relative to this script
# transparent_factory_site/skills/bmad-suite/manager.py -> transparent_factory_site/bmad
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
DEFAULT_BMAD_PATH = os.path.abspath(os.path.join(SCRIPT_DIR, "../../bmad"))
BASE_PATH = os.environ.get("BMAD_PATH", DEFAULT_BMAD_PATH)

REPOS = {
    "framework": {
        "url": "https://github.com/bmad-code-org/BMAD-METHOD.git",
        "path": os.path.join(BASE_PATH, "framework")
    },
    "creative-suite": {
        "url": "https://github.com/bmad-code-org/bmad-module-creative-intelligence-suite.git",
        "path": os.path.join(BASE_PATH, "creative-intelligence-suite")
    },
    "tea-module": {
        "url": "https://github.com/bmad-code-org/bmad-method-test-architecture-enterprise.git",
        "path": os.path.join(BASE_PATH, "test-architecture-enterprise")
    }
}

def update_or_clone(name, config):
    """Clone if missing, pull if present."""
    repo_path = config["path"]
    repo_url = config["url"]

    # Ensure parent directory exists so we can clone into it if needed
    parent_dir = os.path.dirname(repo_path)
    if not os.path.exists(parent_dir):
        os.makedirs(parent_dir, exist_ok=True)

    # Check if it's already a git repo
    if os.path.exists(os.path.join(repo_path, ".git")):
        print(f"🔄 Updating {name}...")
        try:
            subprocess.run(["git", "pull"], cwd=repo_path, check=True)
            print(f"✅ {name} updated.")
        except subprocess.CalledProcessError as e:
            print(f"❌ {name} update failed: {e}")

    # Check if directory exists but is empty (safe to clone into)
    elif os.path.exists(repo_path) and not os.listdir(repo_path):
        print(f"📥 Cloning {name} into empty directory...")
        try:
            subprocess.run(["git", "clone", repo_url, "."], cwd=repo_path, check=True)
            print(f"✅ {name} cloned.")
        except subprocess.CalledProcessError as e:
            print(f"❌ {name} clone failed: {e}")

    # Directory doesn't exist at all
    elif not os.path.exists(repo_path):
        print(f"📥 Cloning {name}...")
        try:
            subprocess.run(["git", "clone", repo_url, repo_path], check=True)
            print(f"✅ {name} cloned.")
        except subprocess.CalledProcessError as e:
            print(f"❌ {name} clone failed: {e}")

    else:
        print(f"⚠️ Target directory {repo_path} exists and is not empty (and not a git repo). Skipping.")

def list_workflows(suite_path):
    """List available workflows/agents in the suite."""
    src_path = os.path.join(suite_path, "src")
    if not os.path.exists(src_path):
        # Fallback to listing root if src doesn't exist (e.g. some repos might differ)
        if os.path.exists(suite_path):
            return subprocess.getoutput(f"find {suite_path} -maxdepth 3 -name '*.md' -o -name '*.ts' -o -name '*.js' | grep -v 'node_modules' | sort")
        return "Directory not found."

    return subprocess.getoutput(f"find {src_path} -name '*.md' -o -name '*.ts' -o -name '*.js' -o -name '*.yaml' | sort")

if __name__ == "__main__":
    action = sys.argv[1] if len(sys.argv) > 1 else "list"

    if action == "update":
        print("--- Checking BMad Suite Repositories ---")
        for name, config in REPOS.items():
            update_or_clone(name, config)
            print("")

    elif action == "list":
        for name, config in REPOS.items():
            print(f"--- {name} Workflows ---")
            print(list_workflows(config["path"]))
            print("")
    else:
        print(f"Unknown action: {action}")
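`update_or_clone` branches on four possible states of the target directory. The decision itself can be expressed as a pure function, which is the easiest part to unit-test separately from the git side effects; a sketch mirroring the logic above (the action names are illustrative, not part of manager.py):

```python
def repo_action(is_git_repo: bool, exists: bool, is_empty: bool) -> str:
    """Decide what update_or_clone should do for a target directory."""
    if is_git_repo:
        return "pull"
    if exists and is_empty:
        return "clone-into"   # git clone <url> .
    if not exists:
        return "clone"        # git clone <url> <path>
    return "skip"             # non-empty, not a repo: do not touch it

print(repo_action(False, True, True))
# → clone-into
```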
74
skills/epics-standards/SKILL.md
Normal file
@@ -0,0 +1,74 @@
---
name: epics-standards
description: Create or audit RP Jira epics and linked Aha epics against PM standards. Use when creating new RP epics, creating Aha epics for a target release, checking compliance gaps, or updating epic fields/content to align with the PM standards and Aha workflow.
---

# RP Epic Standards

Use this skill for `RP` epic creation, Aha epic creation, and compliance audits.

Primary standard source:
`https://reltio.jira.com/wiki/spaces/PM/pages/2688385025/PM+Standards+Epics`

Workflow source and mapping:
- `references/aha-epic-workflow.md`
- `references/epic-fields-checklist.md`

## Workflow

1. Open the standard page and use it as source of truth.
2. Validate/collect required inputs (see Intake below).
3. Create/update Jira epic and Aha epic per workflow in `references/aha-epic-workflow.md`.
4. Link Jira and Aha (`Aha! Reference` in Jira).
5. Audit compliance against `references/epic-fields-checklist.md`.
6. If fields are not editable in Jira/Aha, document exact gaps and owner.

## Intake (Interactive)

If required fields are not provided in the prompt, ask concise follow-up questions before creating records.

Minimum required fields to ask for:
- Jira epic key (if already created) or request to create one
- Aha release (for example `2026.2.0.0`)
- Epic short name (Jira/Aha title)
- Problem Statement / Why
- Solution / What
- Persona / Who
- Value Statement
- Confidence Level
- Product, Engineering, and UX leads
- Execution Team
- Required flags: Doc, Provisioning, UI/UX, Security, Training

Ask only for missing items. Do not proceed with creation until minimum fields are available.

## Epic Content Requirements

Ensure description includes:
- Problem Statement / Why
- Solution / What
- Persona / Who
- Value Statement

Keep summary short and clear for Jira readability.

## Aha Creation Code

Use the bundled script:
- `scripts/aha_create_epic.py`

Example:
```bash
python3 skills/epics-standards/scripts/aha_create_epic.py \
  --release MDM-R-889 \
  --name "RDM PrivateLink on AWS" \
  --description "Tracks Jira epic RP-176273 (https://reltio.jira.com/browse/RP-176273)." \
  --jira-key RP-176273
```

The script reads Aha credentials from `~/.mcporter/mcporter.json` (`mcpServers.aha.env`).

## Integration Guardrails

If the epic is created in Jira first, verify that Aha linkage is present (`Aha! Reference`).
If missing, update Jira with the created Aha URL.
4
skills/epics-standards/agents/openai.yaml
Normal file
@@ -0,0 +1,4 @@
interface:
  display_name: "RP Epic Standards"
  short_description: "Create and audit RP epics for PM standards"
  default_prompt: "Use $epics-standards to create or audit linked Jira/Aha epics and ask me for any missing required fields before creating records."
55
skills/epics-standards/references/aha-epic-workflow.md
Normal file
@@ -0,0 +1,55 @@
# Aha Epic Workflow

This file captures the expected Aha epic lifecycle and downstream triggers.

## Idea Management

1. Customer submits idea in Aha.
2. Check voting threshold: `5` unique orgs.
3. PM review required (`Needs Review` -> `Reviewed`).
4. PM decision:
   - Accept current release -> `Planned`
   - Accept next release -> `Future Consideration`
   - Exists -> `Already Exists`
   - Need more info -> `Needs More Info` + comment
   - Reject/later -> `Future Consideration` + comment
5. Always add public customer comment after decision.
6. Ensure epic relation:
   - Promote idea to new epic, or
   - Link idea to existing epic.

## Epic Creation and Management

When epic exists in Aha, fill mandatory fields:

- Summary with PRD linkage
- Release and availability (`Preview`/`GA`)
- Confidence level (`High`/`Med`/`Low`)
- Product type and persona
- Execution team and initiative
- Product lead, engineering lead, UX lead

Set required flags and trigger follow-ups:

- `Doc Required = Yes` -> specify doc type and doc-team flow
- `Provisioning = Yes` -> PCC/Olga flow
- `UI/UX Required = Yes` -> engage UX
- `Security Review = Yes` -> engage security
- `Training Required = Yes` -> engage training

Integration:

- Ensure Aha -> Jira Webhooks 2.0 integration path is respected.
- Ensure Jira epic has `Aha! Reference`.

## Enablement Outputs

If applicable, drive enablement:

- Technical enablement session (config)
- GTM/sales enablement artifacts
- Webinar for major highlights

Template links from workflow:

- Technical Enablement Session template:
  `https://docs.google.com/presentation/d/1fCZhOUSV7McX1edmYoKBHYtKnYajykbm1U2N5aJ1j-M/edit?slide=id.g39258ed0d71_0_442`
- Value Statements input template:
  `https://docs.google.com/document/d/1YEquYIjt8gMtGLf8EJFfvwS0f_ij1KuIfQFOjlcOEjI/edit`
- Sales Enablement deck example:
  `https://docs.google.com/presentation/d/1mIlC3OhhQgdwcFPgJ328pm1oQl5W6y-w/edit`
53
skills/epics-standards/references/epic-fields-checklist.md
Normal file
@@ -0,0 +1,53 @@
# RP Epic Field Checklist

Standard page:
`https://reltio.jira.com/wiki/spaces/PM/pages/2688385025/PM+Standards+Epics`

Use this checklist during epic create/update.

## Core fields

- `Summary` (short Jira shorthand)
- `Description` (business outcome + impact)
- `Type of Effort`
- `Fix Version` (or `N/A` when not yet planned)
- `Status`

## Product narrative fields

- Problem Statement / Why
- Solution / What
- Persona / Who
- Value Statement

## Planning and delivery fields

- Confidence Level
- Path to Green (required when Confidence is Medium/Low for must-have epics)
- Availability in Release
- Planned Release
- T-Shirt Size
- Tier
- Initiative

## Ownership and dependencies

- Product Lead
- Engineering Lead
- UX Lead
- Execution Team
- Execution Team Dependency

## Go-to-market and governance

- For documentation tickets: set `Affected Documentation` to a documentation target (for example `Help Portal`) and never `No`
- Doc Required
- Pricing Required
- Security Review Required
- Does this change PCC?
- Tags (including `Must Have` when applicable)
- Demo Link (when available)

## Integration check

- `Aha! Reference` is present and linked through the RP integration path.
103
skills/epics-standards/scripts/aha_create_epic.py
Normal file
@@ -0,0 +1,103 @@
#!/usr/bin/env python3
"""
Create an Aha epic in a target release using credentials from ~/.mcporter/mcporter.json.

Usage:
    python3 skills/epics-standards/scripts/aha_create_epic.py \
        --release MDM-R-889 \
        --name "RDM PrivateLink on AWS" \
        --description "Tracks Jira epic RP-176273 (...)" \
        --jira-key RP-176273
"""

from __future__ import annotations

import argparse
import json
import pathlib
import sys
import urllib.error
import urllib.parse
import urllib.request


MCPORTER_CONFIG = pathlib.Path.home() / ".mcporter" / "mcporter.json"


def load_aha_env() -> dict[str, str]:
    cfg = json.loads(MCPORTER_CONFIG.read_text())
    env = cfg["mcpServers"]["aha"]["env"]
    return {"domain": env["AHA_DOMAIN"], "token": env["AHA_API_TOKEN"]}


def request(method: str, url: str, token: str, payload: dict | None = None) -> dict:
    headers = {
        "Authorization": f"Bearer {token}",
        "Accept": "application/json",
        "Content-Type": "application/json",
    }
    body = None if payload is None else json.dumps(payload).encode("utf-8")
    req = urllib.request.Request(url, data=body, headers=headers, method=method)
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))


def release_ref_from_name(domain: str, token: str, release_name: str) -> str:
    q = urllib.parse.quote(release_name)
    url = f"https://{domain}.aha.io/api/v1/releases?q={q}&per_page=200"
    data = request("GET", url, token)
    for rel in data.get("releases", []):
        if rel.get("name") == release_name:
            return rel["reference_num"]
    raise ValueError(f"Release not found by name: {release_name}")


def main() -> int:
    parser = argparse.ArgumentParser()
    parser.add_argument("--release", required=True, help="Release reference (e.g. MDM-R-889) or exact release name (e.g. 2026.2.0.0)")
    parser.add_argument("--name", required=True, help="Aha epic name")
    parser.add_argument("--description", required=True, help="Aha epic description/body")
    parser.add_argument("--jira-key", required=False, help="Optional Jira key to print linking reminder")
    args = parser.parse_args()

    try:
        aha = load_aha_env()
        release = args.release
        if not release.startswith("MDM-R-"):
            release = release_ref_from_name(aha["domain"], aha["token"], release)

        url = f"https://{aha['domain']}.aha.io/api/v1/releases/{release}/epics"
        payload = {"epic": {"name": args.name, "description": args.description}}
        data = request("POST", url, aha["token"], payload)
        epic = data.get("epic", {})

        print(json.dumps(
            {
                "aha_reference": epic.get("reference_num"),
                "aha_url": epic.get("url"),
                "release": release,
                "jira_key": args.jira_key,
            },
            ensure_ascii=True,
        ))

        if args.jira_key and epic.get("url"):
            print(
                f"Next: set Jira {args.jira_key} Aha! Reference = {epic['url']}",
                file=sys.stderr,
            )
        return 0
    except (KeyError, FileNotFoundError, ValueError) as exc:
        print(f"Config/input error: {exc}", file=sys.stderr)
        return 2
    except urllib.error.HTTPError as exc:
        detail = exc.read().decode("utf-8", errors="replace")
        print(f"Aha API HTTP {exc.code}: {detail}", file=sys.stderr)
        return 3
    except Exception as exc:  # noqa: BLE001
        print(f"Unexpected error: {exc}", file=sys.stderr)
        return 1


if __name__ == "__main__":
    raise SystemExit(main())
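The script's release-argument handling (use `MDM-R-` reference numbers directly, resolve anything else by name) can be isolated as a small predicate; a sketch for illustration, not part of the script:

```python
def needs_name_lookup(release: str) -> bool:
    # Mirrors the dispatch above: explicit "MDM-R-" reference numbers are
    # passed straight to the epics endpoint; any other value is treated as
    # a release name and resolved via the releases API first.
    return not release.startswith("MDM-R-")

print(needs_name_lookup("MDM-R-889"))   # False: already a reference
print(needs_name_lookup("2026.2.0.0"))  # True: resolve by name first
```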
31
skills/factory-standards/SKILL.md
Normal file
@@ -0,0 +1,31 @@
---
name: factory-standards
description: Manage and access the Transparent Factory engineering and product tenets from Bitbucket.
tools:
  - name: factory_update
    description: Clone or pull the latest Transparent Factory rules and PR/FAQ from the central repository.
    entry:
      type: python
      path: manager.py
      args: ["update"]
  - name: factory_list
    description: List available tenets, standards, and guidelines in the Transparent Factory repository.
    entry:
      type: python
      path: manager.py
      args: ["list"]
---

# Transparent Factory Standards Manager

This skill synchronizes the definitive **Transparent Factory** architectural and product rules into your local project from the central Bitbucket repository.

## Capabilities

- **Update/Install:** Automatically clones `reltio-ondemand/transparent-factory.git` if missing, or pulls latest changes if present.
- **List:** Enumerates available tenets (`content/`), PR/FAQ documents, and exported factory skills.

## Architecture

Managed files are auto-cloned to `.standards/factory/` relative to the root of your project workspace (e.g., if this skill is in `skills/factory-standards/`, it clones to `../../.standards/factory/`).

## How to use in AI Prompts

If you are an AI agent, you must run `factory_update` to ensure the local `.standards/factory/` directory is present and up-to-date before writing any Product Requirements Documents, Epics, or technical specifications.
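As a quick illustration of that default path resolution, the sketch below walks the same `../../` hop that `manager.py` performs; the workspace path is hypothetical:

```python
import os

# Hypothetical location of this skill inside a workspace.
script_dir = "/workspace/skills/factory-standards"

# manager.py derives its default clone target two levels up from the skill.
default = os.path.normpath(os.path.join(script_dir, "../../.standards/factory"))
print(default)  # /workspace/.standards/factory
```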
89
skills/factory-standards/manager.py
Normal file
@@ -0,0 +1,89 @@
#!/usr/bin/env python3
import os
import sys
import subprocess

# Determine base path: use FACTORY_PATH env var, or default to ../../.standards/factory relative to this script
SCRIPT_DIR = os.path.dirname(os.path.abspath(__file__))
DEFAULT_FACTORY_PATH = os.path.abspath(os.path.join(SCRIPT_DIR, "../../.standards/factory"))
FACTORY_PATH = os.environ.get("FACTORY_PATH", DEFAULT_FACTORY_PATH)

REPO_URL = "git@bitbucket.org:reltio-ondemand/transparent-factory.git"

def ensure_directory(path):
    """Ensure the parent directory exists."""
    parent = os.path.dirname(path)
    if not os.path.exists(parent):
        print(f"📂 Creating directory: {parent}")
        os.makedirs(parent, exist_ok=True)

def update_or_clone():
    """Clone the Transparent Factory repo if missing, pull if present."""
    ensure_directory(FACTORY_PATH)

    # Check if it's already a git repo
    if os.path.exists(os.path.join(FACTORY_PATH, ".git")):
        print("🔄 Updating Transparent Factory Standards...")
        try:
            subprocess.run(["git", "pull"], cwd=FACTORY_PATH, check=True)
            print(f"✅ Standards updated at {FACTORY_PATH}.")
        except subprocess.CalledProcessError as e:
            print(f"❌ Update failed: {e}")

    # Check if directory exists but is empty (safe to clone into)
    elif os.path.exists(FACTORY_PATH) and not os.listdir(FACTORY_PATH):
        print("📥 Cloning Transparent Factory Standards into empty directory...")
        try:
            subprocess.run(["git", "clone", REPO_URL, "."], cwd=FACTORY_PATH, check=True)
            print(f"✅ Standards cloned to {FACTORY_PATH}.")
        except subprocess.CalledProcessError as e:
            print(f"❌ Clone failed: {e}")

    # Directory doesn't exist at all
    elif not os.path.exists(FACTORY_PATH):
        print("📥 Cloning Transparent Factory Standards...")
        try:
            subprocess.run(["git", "clone", REPO_URL, FACTORY_PATH], check=True)
            print(f"✅ Standards cloned to {FACTORY_PATH}.")
        except subprocess.CalledProcessError as e:
            print(f"❌ Clone failed: {e}")

    else:
        print(f"⚠️ Target directory {FACTORY_PATH} exists and is not empty (and not a git repo). Skipping.")

def list_standards():
    """List available standards and PR/FAQ documents."""
    if not os.path.exists(FACTORY_PATH):
        return "Standards not found. Run 'factory_update' first."

    print("--- Core Documents ---")
    try:
        # Look for the primary PR/FAQ or README
        print(subprocess.getoutput(f"find {FACTORY_PATH} -maxdepth 1 -name '*.md' | sort"))

        # Look inside the content folder
        content_path = os.path.join(FACTORY_PATH, "content")
        if os.path.exists(content_path):
            print("\n--- Tenets & Content ---")
            print(subprocess.getoutput(f"find {content_path} -name '*.md' | sort"))

        # Look inside skills folder
        skills_path = os.path.join(FACTORY_PATH, "skills")
        if os.path.exists(skills_path):
            print("\n--- Available Factory Skills ---")
            print(subprocess.getoutput(f"find {skills_path} -name 'SKILL.md' | sort"))
    except Exception as e:
        return f"Error listing files: {e}"

    return ""

if __name__ == "__main__":
    action = sys.argv[1] if len(sys.argv) > 1 else "list"

    if action == "update":
        update_or_clone()
    elif action == "list":
        print(list_standards())
    else:
        print(f"Unknown action: {action}")
        sys.exit(1)
54
skills/gainsight-px/SKILL.md
Normal file
@@ -0,0 +1,54 @@
---
name: gainsight-px
description: Interact directly with the Gainsight PX REST API to fetch user/account data or track events.
tools:
  - name: px_get_user
    description: Fetch a specific user by their unique identity ID.
    entry:
      type: python
      path: gainsight_px.py
      args: ["get_user"]
  - name: px_get_account
    description: Fetch a specific account by its ID.
    entry:
      type: python
      path: gainsight_px.py
      args: ["get_account"]
  - name: px_search_user
    description: Search for a user in Gainsight PX by their email address.
    entry:
      type: python
      path: gainsight_px.py
      args: ["search_user"]
  - name: px_track_event
    description: Track a custom event for a user in Gainsight PX. Requires user_id, event_name, and optional JSON properties.
    entry:
      type: python
      path: gainsight_px.py
      args: ["track_event"]
---

# Gainsight PX REST API Skill

This skill allows agents to natively interface with your Gainsight PX instance without needing an intermediate MCP server like Pipedream or Zapier.

## Setup

You must export your API key before using the tools. You can generate an API key from your Gainsight PX Administration -> REST API section.

```bash
# Add this to your environment
export GAINSIGHT_PX_API_KEY="your-api-key-here"

# Optional: If you are in the EU region, set this flag. Default is US.
export GAINSIGHT_PX_REGION="EU"
```

## How It Works

It uses a lightweight Python script (`gainsight_px.py`) that implements standard REST endpoints documented by Apiary (`https://api.aptrinsic.com/v1/...`).

### Capabilities

- **Lookups:** Find exactly who a user is by ID or email.
- **Account Context:** Pull account metadata.
- **Event Injection:** Push arbitrary telemetry events natively.
108
skills/gainsight-px/gainsight_px.py
Normal file
@@ -0,0 +1,108 @@
#!/usr/bin/env python3
import os
import sys
import json
import urllib.request
import urllib.error

# Gainsight PX API configuration
# Region determines the base URL (US or EU)
PX_REGION = os.environ.get("GAINSIGHT_PX_REGION", "US").upper()
PX_API_KEY = os.environ.get("GAINSIGHT_PX_API_KEY")

if PX_REGION == "EU":
    BASE_URL = "https://eu-api.aptrinsic.com/v1"
else:
    BASE_URL = "https://api.aptrinsic.com/v1"

def make_request(method, endpoint, data=None):
    if not PX_API_KEY:
        print(json.dumps({"error": "GAINSIGHT_PX_API_KEY environment variable is missing."}))
        sys.exit(1)

    url = f"{BASE_URL}{endpoint}"
    headers = {
        "X-APITOKEN": PX_API_KEY,
        "Content-Type": "application/json",
        "Accept": "application/json"
    }

    req_data = None
    if data:
        req_data = json.dumps(data).encode("utf-8")

    req = urllib.request.Request(url, data=req_data, headers=headers, method=method)

    try:
        with urllib.request.urlopen(req) as response:
            return json.loads(response.read().decode("utf-8"))
    except urllib.error.HTTPError as e:
        err_msg = e.read().decode("utf-8")
        try:
            parsed_err = json.loads(err_msg)
            return {"error": f"HTTP {e.code}", "details": parsed_err}
        except json.JSONDecodeError:
            return {"error": f"HTTP {e.code}", "details": err_msg}
    except Exception as e:
        return {"error": str(e)}

def get_user(user_id):
    """Retrieve a specific user by their identifyId."""
    return make_request("GET", f"/users/{user_id}")

def get_account(account_id):
    """Retrieve a specific account by its id."""
    return make_request("GET", f"/accounts/{account_id}")

def search_users(email):
    """Search for users by email (requires query payload)."""
    payload = {
        "filter": {
            "operator": "AND",
            "conditions": [
                {
                    "name": "email",
                    "operator": "EQ",
                    "value": email
                }
            ]
        }
    }
    return make_request("POST", "/users/query", data=payload)

def track_event(user_id, event_name, properties=None):
    """Track a custom event for a specific user."""
    payload = {
        "identifyId": user_id,
        "eventName": event_name,
        "properties": properties or {}
    }
    # Note: tracking usually happens via a different endpoint or batch API,
    # but for simplicity assuming a standard REST event ingestion if available.
    return make_request("POST", "/events/custom", data=payload)

if __name__ == "__main__":
    if len(sys.argv) < 2:
        print(json.dumps({"error": "Missing action. Use: get_user, get_account, search_user, track_event"}))
        sys.exit(1)

    action = sys.argv[1]

    if action == "get_user" and len(sys.argv) == 3:
        print(json.dumps(get_user(sys.argv[2]), indent=2))

    elif action == "get_account" and len(sys.argv) == 3:
        print(json.dumps(get_account(sys.argv[2]), indent=2))

    elif action == "search_user" and len(sys.argv) == 3:
        print(json.dumps(search_users(sys.argv[2]), indent=2))

    elif action == "track_event" and len(sys.argv) >= 4:
        user_id = sys.argv[2]
        event_name = sys.argv[3]
        props = json.loads(sys.argv[4]) if len(sys.argv) > 4 else {}
        print(json.dumps(track_event(user_id, event_name, props), indent=2))

    else:
        print(json.dumps({"error": f"Unknown action or missing arguments: {action}"}))
        sys.exit(1)
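To show what the email search actually sends, here is the filter payload that `search_users()` builds, reproduced as a standalone sketch (the email value is a placeholder):

```python
import json

def user_query_payload(email):
    # The same filter shape search_users() POSTs to /users/query.
    return {
        "filter": {
            "operator": "AND",
            "conditions": [{"name": "email", "operator": "EQ", "value": email}],
        }
    }

print(json.dumps(user_query_payload("pm@example.com"), indent=2))
```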
54
start.sh
Executable file
@@ -0,0 +1,54 @@
#!/usr/bin/env bash
set -e

SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
cd "$SCRIPT_DIR"

# --- Check Node.js ---
if ! command -v node &>/dev/null; then
  echo "❌ Node.js is not installed."
  echo "   Install it from https://nodejs.org (LTS recommended)"
  exit 1
fi

echo "✅ Node.js $(node -v)"

# --- Check .env ---
if [ ! -f .env ]; then
  if [ -f env.example ]; then
    cp env.example .env
    echo "⚠️  Created .env from template. Please edit it with your API keys, then re-run this script."
    exit 1
  else
    echo "⚠️  No .env or env.example found. Continuing without environment config."
  fi
fi

# --- Install web dependencies ---
if [ ! -d web/node_modules ]; then
  echo "📦 Installing dependencies..."
  cd web && npm install && cd ..
fi

# --- Launch ---
echo ""
echo "🔥 Starting Forge Console..."
echo ""

# Open browser after a short delay
(sleep 2 && {
  URL="http://localhost:${PORT:-3000}"
  if command -v xdg-open &>/dev/null; then
    xdg-open "$URL" 2>/dev/null
  elif command -v open &>/dev/null; then
    open "$URL"
  elif command -v start &>/dev/null; then
    start "$URL"
  else
    echo "   Open $URL in your browser"
  fi
}) &

# Run server (trap for clean exit)
trap 'echo ""; echo "👋 Shutting down..."; kill %1 2>/dev/null; exit 0' INT TERM
cd web && node server.js
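The browser URL in start.sh relies on Bash default-value expansion; a minimal sketch of how `${PORT:-3000}` resolves:

```shell
# With PORT set, its value wins; when unset, the default 3000 is used.
PORT=8080
echo "http://localhost:${PORT:-3000}"   # http://localhost:8080
unset PORT
echo "http://localhost:${PORT:-3000}"   # http://localhost:3000
```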
849
web/package-lock.json
generated
Normal file
@@ -0,0 +1,849 @@
{
  "name": "pm-template-web",
  "version": "1.0.0",
  "lockfileVersion": 3,
  "requires": true,
  "packages": {
    "": {
      "name": "pm-template-web",
      "version": "1.0.0",
      "dependencies": {
        "express": "^4.18.2",
        "ws": "^8.16.0"
      }
    },
    "node_modules/accepts": {
      "version": "1.3.8",
      "resolved": "https://registry.npmjs.org/accepts/-/accepts-1.3.8.tgz",
      "integrity": "sha512-PYAthTa2m2VKxuvSD3DPC/Gy+U+sOA1LAuT8mkmRuvw+NACSaeXEQ+NHcVF7rONl6qcaxV3Uuemwawk+7+SJLw==",
      "license": "MIT",
      "dependencies": {
        "mime-types": "~2.1.34",
        "negotiator": "0.6.3"
      },
      "engines": {
        "node": ">= 0.6"
      }
    },
    "node_modules/array-flatten": {
      "version": "1.1.1",
      "resolved": "https://registry.npmjs.org/array-flatten/-/array-flatten-1.1.1.tgz",
      "integrity": "sha512-PCVAQswWemu6UdxsDFFX/+gVeYqKAod3D3UVm91jHwynguOwAvYPhx8nNlM++NqRcK6CxxpUafjmhIdKiHibqg==",
      "license": "MIT"
    },
    "node_modules/body-parser": {
      "version": "1.20.4",
      "resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.20.4.tgz",
      "integrity": "sha512-ZTgYYLMOXY9qKU/57FAo8F+HA2dGX7bqGc71txDRC1rS4frdFI5R7NhluHxH6M0YItAP0sHB4uqAOcYKxO6uGA==",
      "license": "MIT",
      "dependencies": {
        "bytes": "~3.1.2",
        "content-type": "~1.0.5",
        "debug": "2.6.9",
        "depd": "2.0.0",
        "destroy": "~1.2.0",
        "http-errors": "~2.0.1",
        "iconv-lite": "~0.4.24",
        "on-finished": "~2.4.1",
        "qs": "~6.14.0",
        "raw-body": "~2.5.3",
        "type-is": "~1.6.18",
        "unpipe": "~1.0.0"
      },
      "engines": {
        "node": ">= 0.8",
        "npm": "1.2.8000 || >= 1.4.16"
      }
    },
    "node_modules/bytes": {
      "version": "3.1.2",
      "resolved": "https://registry.npmjs.org/bytes/-/bytes-3.1.2.tgz",
      "integrity": "sha512-/Nf7TyzTx6S3yRJObOAV7956r8cr2+Oj8AC5dt8wSP3BQAoeX58NoHyCU8P8zGkNXStjTSi6fzO6F0pBdcYbEg==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.8"
      }
    },
    "node_modules/call-bind-apply-helpers": {
      "version": "1.0.2",
      "resolved": "https://registry.npmjs.org/call-bind-apply-helpers/-/call-bind-apply-helpers-1.0.2.tgz",
      "integrity": "sha512-Sp1ablJ0ivDkSzjcaJdxEunN5/XvksFJ2sMBFfq6x0ryhQV/2b/KwFe21cMpmHtPOSij8K99/wSfoEuTObmuMQ==",
      "license": "MIT",
      "dependencies": {
        "es-errors": "^1.3.0",
        "function-bind": "^1.1.2"
      },
      "engines": {
        "node": ">= 0.4"
      }
    },
    "node_modules/call-bound": {
      "version": "1.0.4",
      "resolved": "https://registry.npmjs.org/call-bound/-/call-bound-1.0.4.tgz",
      "integrity": "sha512-+ys997U96po4Kx/ABpBCqhA9EuxJaQWDQg7295H4hBphv3IZg0boBKuwYpt4YXp6MZ5AmZQnU/tyMTlRpaSejg==",
      "license": "MIT",
      "dependencies": {
        "call-bind-apply-helpers": "^1.0.2",
        "get-intrinsic": "^1.3.0"
      },
      "engines": {
        "node": ">= 0.4"
      },
      "funding": {
        "url": "https://github.com/sponsors/ljharb"
      }
    },
    "node_modules/content-disposition": {
      "version": "0.5.4",
      "resolved": "https://registry.npmjs.org/content-disposition/-/content-disposition-0.5.4.tgz",
      "integrity": "sha512-FveZTNuGw04cxlAiWbzi6zTAL/lhehaWbTtgluJh4/E95DqMwTmha3KZN1aAWA8cFIhHzMZUvLevkw5Rqk+tSQ==",
      "license": "MIT",
      "dependencies": {
        "safe-buffer": "5.2.1"
      },
      "engines": {
        "node": ">= 0.6"
      }
    },
    "node_modules/content-type": {
      "version": "1.0.5",
      "resolved": "https://registry.npmjs.org/content-type/-/content-type-1.0.5.tgz",
      "integrity": "sha512-nTjqfcBFEipKdXCv4YDQWCfmcLZKm81ldF0pAopTvyrFGVbcR6P/VAAd5G7N+0tTr8QqiU0tFadD6FK4NtJwOA==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.6"
      }
    },
    "node_modules/cookie": {
      "version": "0.7.2",
      "resolved": "https://registry.npmjs.org/cookie/-/cookie-0.7.2.tgz",
      "integrity": "sha512-yki5XnKuf750l50uGTllt6kKILY4nQ1eNIQatoXEByZ5dWgnKqbnqmTrBE5B4N7lrMJKQ2ytWMiTO2o0v6Ew/w==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.6"
      }
    },
    "node_modules/cookie-signature": {
      "version": "1.0.7",
      "resolved": "https://registry.npmjs.org/cookie-signature/-/cookie-signature-1.0.7.tgz",
      "integrity": "sha512-NXdYc3dLr47pBkpUCHtKSwIOQXLVn8dZEuywboCOJY/osA0wFSLlSawr3KN8qXJEyX66FcONTH8EIlVuK0yyFA==",
      "license": "MIT"
    },
    "node_modules/debug": {
      "version": "2.6.9",
      "resolved": "https://registry.npmjs.org/debug/-/debug-2.6.9.tgz",
      "integrity": "sha512-bC7ElrdJaJnPbAP+1EotYvqZsb3ecl5wi6Bfi6BJTUcNowp6cvspg0jXznRTKDjm/E7AdgFBVeAPVMNcKGsHMA==",
      "license": "MIT",
      "dependencies": {
        "ms": "2.0.0"
      }
    },
    "node_modules/depd": {
      "version": "2.0.0",
      "resolved": "https://registry.npmjs.org/depd/-/depd-2.0.0.tgz",
      "integrity": "sha512-g7nH6P6dyDioJogAAGprGpCtVImJhpPk/roCzdb3fIh61/s/nPsfR6onyMwkCAR/OlC3yBC0lESvUoQEAssIrw==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.8"
      }
    },
    "node_modules/destroy": {
      "version": "1.2.0",
      "resolved": "https://registry.npmjs.org/destroy/-/destroy-1.2.0.tgz",
      "integrity": "sha512-2sJGJTaXIIaR1w4iJSNoN0hnMY7Gpc/n8D4qSCJw8QqFWXf7cuAgnEHxBpweaVcPevC2l3KpjYCx3NypQQgaJg==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.8",
        "npm": "1.2.8000 || >= 1.4.16"
      }
    },
    "node_modules/dunder-proto": {
      "version": "1.0.1",
      "resolved": "https://registry.npmjs.org/dunder-proto/-/dunder-proto-1.0.1.tgz",
      "integrity": "sha512-KIN/nDJBQRcXw0MLVhZE9iQHmG68qAVIBg9CqmUYjmQIhgij9U5MFvrqkUL5FbtyyzZuOeOt0zdeRe4UY7ct+A==",
      "license": "MIT",
      "dependencies": {
        "call-bind-apply-helpers": "^1.0.1",
        "es-errors": "^1.3.0",
        "gopd": "^1.2.0"
      },
      "engines": {
        "node": ">= 0.4"
      }
    },
    "node_modules/ee-first": {
      "version": "1.1.1",
      "resolved": "https://registry.npmjs.org/ee-first/-/ee-first-1.1.1.tgz",
      "integrity": "sha512-WMwm9LhRUo+WUaRN+vRuETqG89IgZphVSNkdFgeb6sS/E4OrDIN7t48CAewSHXc6C8lefD8KKfr5vY61brQlow==",
      "license": "MIT"
    },
    "node_modules/encodeurl": {
      "version": "2.0.0",
      "resolved": "https://registry.npmjs.org/encodeurl/-/encodeurl-2.0.0.tgz",
      "integrity": "sha512-Q0n9HRi4m6JuGIV1eFlmvJB7ZEVxu93IrMyiMsGC0lrMJMWzRgx6WGquyfQgZVb31vhGgXnfmPNNXmxnOkRBrg==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.8"
      }
    },
    "node_modules/es-define-property": {
      "version": "1.0.1",
      "resolved": "https://registry.npmjs.org/es-define-property/-/es-define-property-1.0.1.tgz",
      "integrity": "sha512-e3nRfgfUZ4rNGL232gUgX06QNyyez04KdjFrF+LTRoOXmrOgFKDg4BCdsjW8EnT69eqdYGmRpJwiPVYNrCaW3g==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.4"
      }
    },
    "node_modules/es-errors": {
      "version": "1.3.0",
      "resolved": "https://registry.npmjs.org/es-errors/-/es-errors-1.3.0.tgz",
      "integrity": "sha512-Zf5H2Kxt2xjTvbJvP2ZWLEICxA6j+hAmMzIlypy4xcBg1vKVnx89Wy0GbS+kf5cwCVFFzdCFh2XSCFNULS6csw==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.4"
      }
    },
    "node_modules/es-object-atoms": {
      "version": "1.1.1",
      "resolved": "https://registry.npmjs.org/es-object-atoms/-/es-object-atoms-1.1.1.tgz",
      "integrity": "sha512-FGgH2h8zKNim9ljj7dankFPcICIK9Cp5bm+c2gQSYePhpaG5+esrLODihIorn+Pe6FGJzWhXQotPv73jTaldXA==",
      "license": "MIT",
      "dependencies": {
        "es-errors": "^1.3.0"
      },
      "engines": {
        "node": ">= 0.4"
      }
    },
    "node_modules/escape-html": {
      "version": "1.0.3",
      "resolved": "https://registry.npmjs.org/escape-html/-/escape-html-1.0.3.tgz",
      "integrity": "sha512-NiSupZ4OeuGwr68lGIeym/ksIZMJodUGOSCZ/FSnTxcrekbvqrgdUxlJOMpijaKZVjAJrWrGs/6Jy8OMuyj9ow==",
      "license": "MIT"
    },
    "node_modules/etag": {
      "version": "1.8.1",
      "resolved": "https://registry.npmjs.org/etag/-/etag-1.8.1.tgz",
      "integrity": "sha512-aIL5Fx7mawVa300al2BnEE4iNvo1qETxLrPI/o05L7z6go7fCw1J6EQmbK4FmJ2AS7kgVF/KEZWufBfdClMcPg==",
      "license": "MIT",
      "engines": {
        "node": ">= 0.6"
      }
    },
    "node_modules/express": {
      "version": "4.22.1",
      "resolved": "https://registry.npmjs.org/express/-/express-4.22.1.tgz",
      "integrity": "sha512-F2X8g9P1X7uCPZMA3MVf9wcTqlyNp7IhH5qPCI0izhaOIYXaW9L535tGA3qmjRzpH+bZczqq7hVKxTR4NWnu+g==",
      "license": "MIT",
      "dependencies": {
        "accepts": "~1.3.8",
        "array-flatten": "1.1.1",
        "body-parser": "~1.20.3",
        "content-disposition": "~0.5.4",
        "content-type": "~1.0.4",
        "cookie": "~0.7.1",
        "cookie-signature": "~1.0.6",
        "debug": "2.6.9",
        "depd": "2.0.0",
        "encodeurl": "~2.0.0",
        "escape-html": "~1.0.3",
        "etag": "~1.8.1",
        "finalhandler": "~1.3.1",
        "fresh": "~0.5.2",
|
||||||
|
"http-errors": "~2.0.0",
|
||||||
|
"merge-descriptors": "1.0.3",
|
||||||
|
"methods": "~1.1.2",
|
||||||
|
"on-finished": "~2.4.1",
|
||||||
|
"parseurl": "~1.3.3",
|
||||||
|
"path-to-regexp": "~0.1.12",
|
||||||
|
"proxy-addr": "~2.0.7",
|
||||||
|
"qs": "~6.14.0",
|
||||||
|
"range-parser": "~1.2.1",
|
||||||
|
"safe-buffer": "5.2.1",
|
||||||
|
"send": "~0.19.0",
|
||||||
|
"serve-static": "~1.16.2",
|
||||||
|
"setprototypeof": "1.2.0",
|
||||||
|
"statuses": "~2.0.1",
|
||||||
|
"type-is": "~1.6.18",
|
||||||
|
"utils-merge": "1.0.1",
|
||||||
|
"vary": "~1.1.2"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.10.0"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"type": "opencollective",
|
||||||
|
"url": "https://opencollective.com/express"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/finalhandler": {
|
||||||
|
"version": "1.3.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/finalhandler/-/finalhandler-1.3.2.tgz",
|
||||||
|
"integrity": "sha512-aA4RyPcd3badbdABGDuTXCMTtOneUCAYH/gxoYRTZlIJdF0YPWuGqiAsIrhNnnqdXGswYk6dGujem4w80UJFhg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"debug": "2.6.9",
|
||||||
|
"encodeurl": "~2.0.0",
|
||||||
|
"escape-html": "~1.0.3",
|
||||||
|
"on-finished": "~2.4.1",
|
||||||
|
"parseurl": "~1.3.3",
|
||||||
|
"statuses": "~2.0.2",
|
||||||
|
"unpipe": "~1.0.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/forwarded": {
|
||||||
|
"version": "0.2.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz",
|
||||||
|
"integrity": "sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/fresh": {
|
||||||
|
"version": "0.5.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/fresh/-/fresh-0.5.2.tgz",
|
||||||
|
"integrity": "sha512-zJ2mQYM18rEFOudeV4GShTGIQ7RbzA7ozbU9I/XBpm7kqgMywgmylMwXHxZJmkVoYkna9d2pVXVXPdYTP9ej8Q==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/function-bind": {
|
||||||
|
"version": "1.1.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/function-bind/-/function-bind-1.1.2.tgz",
|
||||||
|
"integrity": "sha512-7XHNxH7qX9xG5mIwxkhumTox/MIRNcOgDrxWsMt2pAr23WHp6MrRlN7FBSFpCpr+oVO0F744iUgR82nJMfG2SA==",
|
||||||
|
"license": "MIT",
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/ljharb"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/get-intrinsic": {
|
||||||
|
"version": "1.3.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/get-intrinsic/-/get-intrinsic-1.3.0.tgz",
|
||||||
|
"integrity": "sha512-9fSjSaos/fRIVIp+xSJlE6lfwhES7LNtKaCBIamHsjr2na1BiABJPo0mOjjz8GJDURarmCPGqaiVg5mfjb98CQ==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"call-bind-apply-helpers": "^1.0.2",
|
||||||
|
"es-define-property": "^1.0.1",
|
||||||
|
"es-errors": "^1.3.0",
|
||||||
|
"es-object-atoms": "^1.1.1",
|
||||||
|
"function-bind": "^1.1.2",
|
||||||
|
"get-proto": "^1.0.1",
|
||||||
|
"gopd": "^1.2.0",
|
||||||
|
"has-symbols": "^1.1.0",
|
||||||
|
"hasown": "^2.0.2",
|
||||||
|
"math-intrinsics": "^1.1.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/ljharb"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/get-proto": {
|
||||||
|
"version": "1.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/get-proto/-/get-proto-1.0.1.tgz",
|
||||||
|
"integrity": "sha512-sTSfBjoXBp89JvIKIefqw7U2CCebsc74kiY6awiGogKtoSGbgjYE/G/+l9sF3MWFPNc9IcoOC4ODfKHfxFmp0g==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"dunder-proto": "^1.0.1",
|
||||||
|
"es-object-atoms": "^1.0.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/gopd": {
|
||||||
|
"version": "1.2.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/gopd/-/gopd-1.2.0.tgz",
|
||||||
|
"integrity": "sha512-ZUKRh6/kUFoAiTAtTYPZJ3hw9wNxx+BIBOijnlG9PnrJsCcSjs1wyyD6vJpaYtgnzDrKYRSqf3OO6Rfa93xsRg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/ljharb"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/has-symbols": {
|
||||||
|
"version": "1.1.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/has-symbols/-/has-symbols-1.1.0.tgz",
|
||||||
|
"integrity": "sha512-1cDNdwJ2Jaohmb3sg4OmKaMBwuC48sYni5HUw2DvsC8LjGTLK9h+eb1X6RyuOHe4hT0ULCW68iomhjUoKUqlPQ==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/ljharb"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/hasown": {
|
||||||
|
"version": "2.0.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/hasown/-/hasown-2.0.2.tgz",
|
||||||
|
"integrity": "sha512-0hJU9SCPvmMzIBdZFqNPXWa6dqh7WdH0cII9y+CyS8rG3nL48Bclra9HmKhVVUHyPWNH5Y7xDwAB7bfgSjkUMQ==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"function-bind": "^1.1.2"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/http-errors": {
|
||||||
|
"version": "2.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/http-errors/-/http-errors-2.0.1.tgz",
|
||||||
|
"integrity": "sha512-4FbRdAX+bSdmo4AUFuS0WNiPz8NgFt+r8ThgNWmlrjQjt1Q7ZR9+zTlce2859x4KSXrwIsaeTqDoKQmtP8pLmQ==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"depd": "~2.0.0",
|
||||||
|
"inherits": "~2.0.4",
|
||||||
|
"setprototypeof": "~1.2.0",
|
||||||
|
"statuses": "~2.0.2",
|
||||||
|
"toidentifier": "~1.0.1"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"type": "opencollective",
|
||||||
|
"url": "https://opencollective.com/express"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/iconv-lite": {
|
||||||
|
"version": "0.4.24",
|
||||||
|
"resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.4.24.tgz",
|
||||||
|
"integrity": "sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"safer-buffer": ">= 2.1.2 < 3"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=0.10.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/inherits": {
|
||||||
|
"version": "2.0.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.4.tgz",
|
||||||
|
"integrity": "sha512-k/vGaX4/Yla3WzyMCvTQOXYeIHvqOKtnqBduzTHpzpQZzAskKMhZ2K+EnBiSM9zGSoIFeMpXKxa4dYeZIQqewQ==",
|
||||||
|
"license": "ISC"
|
||||||
|
},
|
||||||
|
"node_modules/ipaddr.js": {
|
||||||
|
"version": "1.9.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/ipaddr.js/-/ipaddr.js-1.9.1.tgz",
|
||||||
|
"integrity": "sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.10"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/math-intrinsics": {
|
||||||
|
"version": "1.1.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/math-intrinsics/-/math-intrinsics-1.1.0.tgz",
|
||||||
|
"integrity": "sha512-/IXtbwEk5HTPyEwyKX6hGkYXxM9nbj64B+ilVJnC/R6B0pH5G4V3b0pVbL7DBj4tkhBAppbQUlf6F6Xl9LHu1g==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/media-typer": {
|
||||||
|
"version": "0.3.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/media-typer/-/media-typer-0.3.0.tgz",
|
||||||
|
"integrity": "sha512-dq+qelQ9akHpcOl/gUVRTxVIOkAJ1wR3QAvb4RsVjS8oVoFjDGTc679wJYmUmknUF5HwMLOgb5O+a3KxfWapPQ==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/merge-descriptors": {
|
||||||
|
"version": "1.0.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/merge-descriptors/-/merge-descriptors-1.0.3.tgz",
|
||||||
|
"integrity": "sha512-gaNvAS7TZ897/rVaZ0nMtAyxNyi/pdbjbAwUpFQpN70GqnVfOiXpeUUMKRBmzXaSQ8DdTX4/0ms62r2K+hE6mQ==",
|
||||||
|
"license": "MIT",
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/sindresorhus"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/methods": {
|
||||||
|
"version": "1.1.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/methods/-/methods-1.1.2.tgz",
|
||||||
|
"integrity": "sha512-iclAHeNqNm68zFtnZ0e+1L2yUIdvzNoauKU4WBA3VvH/vPFieF7qfRlwUZU+DA9P9bPXIS90ulxoUoCH23sV2w==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/mime": {
|
||||||
|
"version": "1.6.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/mime/-/mime-1.6.0.tgz",
|
||||||
|
"integrity": "sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"bin": {
|
||||||
|
"mime": "cli.js"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=4"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/mime-db": {
|
||||||
|
"version": "1.52.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.52.0.tgz",
|
||||||
|
"integrity": "sha512-sPU4uV7dYlvtWJxwwxHD0PuihVNiE7TyAbQ5SWxDCB9mUYvOgroQOwYQQOKPJ8CIbE+1ETVlOoK1UC2nU3gYvg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/mime-types": {
|
||||||
|
"version": "2.1.35",
|
||||||
|
"resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.35.tgz",
|
||||||
|
"integrity": "sha512-ZDY+bPm5zTTF+YpCrAU9nK0UgICYPT0QtT1NZWFv4s++TNkcgVaT0g6+4R2uI4MjQjzysHB1zxuWL50hzaeXiw==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"mime-db": "1.52.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/ms": {
|
||||||
|
"version": "2.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz",
|
||||||
|
"integrity": "sha512-Tpp60P6IUJDTuOq/5Z8cdskzJujfwqfOTkrwIwj7IRISpnkJnT6SyJ4PCPnGMoFjC9ddhal5KVIYtAt97ix05A==",
|
||||||
|
"license": "MIT"
|
||||||
|
},
|
||||||
|
"node_modules/negotiator": {
|
||||||
|
"version": "0.6.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.3.tgz",
|
||||||
|
"integrity": "sha512-+EUsqGPLsM+j/zdChZjsnX51g4XrHFOIXwfnCVPGlQk/k5giakcKsuxCObBRu6DSm9opw/O6slWbJdghQM4bBg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/object-inspect": {
|
||||||
|
"version": "1.13.4",
|
||||||
|
"resolved": "https://registry.npmjs.org/object-inspect/-/object-inspect-1.13.4.tgz",
|
||||||
|
"integrity": "sha512-W67iLl4J2EXEGTbfeHCffrjDfitvLANg0UlX3wFUUSTx92KXRFegMHUVgSqE+wvhAbi4WqjGg9czysTV2Epbew==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/ljharb"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/on-finished": {
|
||||||
|
"version": "2.4.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/on-finished/-/on-finished-2.4.1.tgz",
|
||||||
|
"integrity": "sha512-oVlzkg3ENAhCk2zdv7IJwd/QUD4z2RxRwpkcGY8psCVcCYZNq4wYnVWALHM+brtuJjePWiYF/ClmuDr8Ch5+kg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"ee-first": "1.1.1"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/parseurl": {
|
||||||
|
"version": "1.3.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/parseurl/-/parseurl-1.3.3.tgz",
|
||||||
|
"integrity": "sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/path-to-regexp": {
|
||||||
|
"version": "0.1.12",
|
||||||
|
"resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.12.tgz",
|
||||||
|
"integrity": "sha512-RA1GjUVMnvYFxuqovrEqZoxxW5NUZqbwKtYz/Tt7nXerk0LbLblQmrsgdeOxV5SFHf0UDggjS/bSeOZwt1pmEQ==",
|
||||||
|
"license": "MIT"
|
||||||
|
},
|
||||||
|
"node_modules/proxy-addr": {
|
||||||
|
"version": "2.0.7",
|
||||||
|
"resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.7.tgz",
|
||||||
|
"integrity": "sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"forwarded": "0.2.0",
|
||||||
|
"ipaddr.js": "1.9.1"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.10"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/qs": {
|
||||||
|
"version": "6.14.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/qs/-/qs-6.14.2.tgz",
|
||||||
|
"integrity": "sha512-V/yCWTTF7VJ9hIh18Ugr2zhJMP01MY7c5kh4J870L7imm6/DIzBsNLTXzMwUA3yZ5b/KBqLx8Kp3uRvd7xSe3Q==",
|
||||||
|
"license": "BSD-3-Clause",
|
||||||
|
"dependencies": {
|
||||||
|
"side-channel": "^1.1.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">=0.6"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/ljharb"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/range-parser": {
|
||||||
|
"version": "1.2.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz",
|
||||||
|
"integrity": "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/raw-body": {
|
||||||
|
"version": "2.5.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.5.3.tgz",
|
||||||
|
"integrity": "sha512-s4VSOf6yN0rvbRZGxs8Om5CWj6seneMwK3oDb4lWDH0UPhWcxwOWw5+qk24bxq87szX1ydrwylIOp2uG1ojUpA==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"bytes": "~3.1.2",
|
||||||
|
"http-errors": "~2.0.1",
|
||||||
|
"iconv-lite": "~0.4.24",
|
||||||
|
"unpipe": "~1.0.0"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/safe-buffer": {
|
||||||
|
"version": "5.2.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.2.1.tgz",
|
||||||
|
"integrity": "sha512-rp3So07KcdmmKbGvgaNxQSJr7bGVSVk5S9Eq1F+ppbRo70+YeaDxkw5Dd8NPN+GD6bjnYm2VuPuCXmpuYvmCXQ==",
|
||||||
|
"funding": [
|
||||||
|
{
|
||||||
|
"type": "github",
|
||||||
|
"url": "https://github.com/sponsors/feross"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"type": "patreon",
|
||||||
|
"url": "https://www.patreon.com/feross"
|
||||||
|
},
|
||||||
|
{
|
||||||
|
"type": "consulting",
|
||||||
|
"url": "https://feross.org/support"
|
||||||
|
}
|
||||||
|
],
|
||||||
|
"license": "MIT"
|
||||||
|
},
|
||||||
|
"node_modules/safer-buffer": {
|
||||||
|
"version": "2.1.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz",
|
||||||
|
"integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==",
|
||||||
|
"license": "MIT"
|
||||||
|
},
|
||||||
|
"node_modules/send": {
|
||||||
|
"version": "0.19.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/send/-/send-0.19.2.tgz",
|
||||||
|
"integrity": "sha512-VMbMxbDeehAxpOtWJXlcUS5E8iXh6QmN+BkRX1GARS3wRaXEEgzCcB10gTQazO42tpNIya8xIyNx8fll1OFPrg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"debug": "2.6.9",
|
||||||
|
"depd": "2.0.0",
|
||||||
|
"destroy": "1.2.0",
|
||||||
|
"encodeurl": "~2.0.0",
|
||||||
|
"escape-html": "~1.0.3",
|
||||||
|
"etag": "~1.8.1",
|
||||||
|
"fresh": "~0.5.2",
|
||||||
|
"http-errors": "~2.0.1",
|
||||||
|
"mime": "1.6.0",
|
||||||
|
"ms": "2.1.3",
|
||||||
|
"on-finished": "~2.4.1",
|
||||||
|
"range-parser": "~1.2.1",
|
||||||
|
"statuses": "~2.0.2"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/send/node_modules/ms": {
|
||||||
|
"version": "2.1.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz",
|
||||||
|
"integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==",
|
||||||
|
"license": "MIT"
|
||||||
|
},
|
||||||
|
"node_modules/serve-static": {
|
||||||
|
"version": "1.16.3",
|
||||||
|
"resolved": "https://registry.npmjs.org/serve-static/-/serve-static-1.16.3.tgz",
|
||||||
|
"integrity": "sha512-x0RTqQel6g5SY7Lg6ZreMmsOzncHFU7nhnRWkKgWuMTu5NN0DR5oruckMqRvacAN9d5w6ARnRBXl9xhDCgfMeA==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"encodeurl": "~2.0.0",
|
||||||
|
"escape-html": "~1.0.3",
|
||||||
|
"parseurl": "~1.3.3",
|
||||||
|
"send": "~0.19.1"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/setprototypeof": {
|
||||||
|
"version": "1.2.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.2.0.tgz",
|
||||||
|
"integrity": "sha512-E5LDX7Wrp85Kil5bhZv46j8jOeboKq5JMmYM3gVGdGH8xFpPWXUMsNrlODCrkoxMEeNi/XZIwuRvY4XNwYMJpw==",
|
||||||
|
"license": "ISC"
|
||||||
|
},
|
||||||
|
"node_modules/side-channel": {
|
||||||
|
"version": "1.1.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/side-channel/-/side-channel-1.1.0.tgz",
|
||||||
|
"integrity": "sha512-ZX99e6tRweoUXqR+VBrslhda51Nh5MTQwou5tnUDgbtyM0dBgmhEDtWGP/xbKn6hqfPRHujUNwz5fy/wbbhnpw==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"es-errors": "^1.3.0",
|
||||||
|
"object-inspect": "^1.13.3",
|
||||||
|
"side-channel-list": "^1.0.0",
|
||||||
|
"side-channel-map": "^1.0.1",
|
||||||
|
"side-channel-weakmap": "^1.0.2"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/ljharb"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/side-channel-list": {
|
||||||
|
"version": "1.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/side-channel-list/-/side-channel-list-1.0.0.tgz",
|
||||||
|
"integrity": "sha512-FCLHtRD/gnpCiCHEiJLOwdmFP+wzCmDEkc9y7NsYxeF4u7Btsn1ZuwgwJGxImImHicJArLP4R0yX4c2KCrMrTA==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"es-errors": "^1.3.0",
|
||||||
|
"object-inspect": "^1.13.3"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/ljharb"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/side-channel-map": {
|
||||||
|
"version": "1.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/side-channel-map/-/side-channel-map-1.0.1.tgz",
|
||||||
|
"integrity": "sha512-VCjCNfgMsby3tTdo02nbjtM/ewra6jPHmpThenkTYh8pG9ucZ/1P8So4u4FGBek/BjpOVsDCMoLA/iuBKIFXRA==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"call-bound": "^1.0.2",
|
||||||
|
"es-errors": "^1.3.0",
|
||||||
|
"get-intrinsic": "^1.2.5",
|
||||||
|
"object-inspect": "^1.13.3"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/ljharb"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/side-channel-weakmap": {
|
||||||
|
"version": "1.0.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/side-channel-weakmap/-/side-channel-weakmap-1.0.2.tgz",
|
||||||
|
"integrity": "sha512-WPS/HvHQTYnHisLo9McqBHOJk2FkHO/tlpvldyrnem4aeQp4hai3gythswg6p01oSoTl58rcpiFAjF2br2Ak2A==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"call-bound": "^1.0.2",
|
||||||
|
"es-errors": "^1.3.0",
|
||||||
|
"get-intrinsic": "^1.2.5",
|
||||||
|
"object-inspect": "^1.13.3",
|
||||||
|
"side-channel-map": "^1.0.1"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4"
|
||||||
|
},
|
||||||
|
"funding": {
|
||||||
|
"url": "https://github.com/sponsors/ljharb"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/statuses": {
|
||||||
|
"version": "2.0.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/statuses/-/statuses-2.0.2.tgz",
|
||||||
|
"integrity": "sha512-DvEy55V3DB7uknRo+4iOGT5fP1slR8wQohVdknigZPMpMstaKJQWhwiYBACJE3Ul2pTnATihhBYnRhZQHGBiRw==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/toidentifier": {
|
||||||
|
"version": "1.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.1.tgz",
|
||||||
|
"integrity": "sha512-o5sSPKEkg/DIQNmH43V0/uerLrpzVedkUh8tGNvaeXpfpuwjKenlSox/2O/BTlZUtEe+JG7s5YhEz608PlAHRA==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">=0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/type-is": {
|
||||||
|
"version": "1.6.18",
|
||||||
|
"resolved": "https://registry.npmjs.org/type-is/-/type-is-1.6.18.tgz",
|
||||||
|
"integrity": "sha512-TkRKr9sUTxEH8MdfuCSP7VizJyzRNMjj2J2do2Jr3Kym598JVdEksuzPQCnlFPW4ky9Q+iA+ma9BGm06XQBy8g==",
|
||||||
|
"license": "MIT",
|
||||||
|
"dependencies": {
|
||||||
|
"media-typer": "0.3.0",
|
||||||
|
"mime-types": "~2.1.24"
|
||||||
|
},
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.6"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/unpipe": {
|
||||||
|
"version": "1.0.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/unpipe/-/unpipe-1.0.0.tgz",
|
||||||
|
"integrity": "sha512-pjy2bYhSsufwWlKwPc+l3cN7+wuJlK6uz0YdJEOlQDbl6jo/YlPi4mb8agUkVC8BF7V8NuzeyPNqRksA3hztKQ==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/utils-merge": {
|
||||||
|
"version": "1.0.1",
|
||||||
|
"resolved": "https://registry.npmjs.org/utils-merge/-/utils-merge-1.0.1.tgz",
|
||||||
|
"integrity": "sha512-pMZTvIkT1d+TFGvDOqodOclx0QWkkgi6Tdoa8gC8ffGAAqz9pzPTZWAybbsHHoED/ztMtkv/VoYTYyShUn81hA==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.4.0"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/vary": {
|
||||||
|
"version": "1.1.2",
|
||||||
|
"resolved": "https://registry.npmjs.org/vary/-/vary-1.1.2.tgz",
|
||||||
|
"integrity": "sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">= 0.8"
|
||||||
|
}
|
||||||
|
},
|
||||||
|
"node_modules/ws": {
|
||||||
|
"version": "8.19.0",
|
||||||
|
"resolved": "https://registry.npmjs.org/ws/-/ws-8.19.0.tgz",
|
||||||
|
"integrity": "sha512-blAT2mjOEIi0ZzruJfIhb3nps74PRWTCz1IjglWEEpQl5XS/UNama6u2/rjFkDDouqr4L67ry+1aGIALViWjDg==",
|
||||||
|
"license": "MIT",
|
||||||
|
"engines": {
|
||||||
|
"node": ">=10.0.0"
|
||||||
|
},
|
||||||
|
"peerDependencies": {
|
||||||
|
"bufferutil": "^4.0.1",
|
||||||
|
"utf-8-validate": ">=5.0.2"
|
||||||
|
},
|
||||||
|
"peerDependenciesMeta": {
|
||||||
|
"bufferutil": {
|
||||||
|
"optional": true
|
||||||
|
},
|
||||||
|
"utf-8-validate": {
|
||||||
|
"optional": true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
13
web/package.json
Normal file
@@ -0,0 +1,13 @@
{
  "name": "pm-template-web",
  "version": "1.0.0",
  "description": "Minimal local web chat UI",
  "main": "server.js",
  "scripts": {
    "start": "node server.js"
  },
  "dependencies": {
    "express": "^4.18.2",
    "ws": "^8.16.0"
  }
}
127
web/public/index.html
Normal file
@@ -0,0 +1,127 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Forge Console</title>
  <link rel="stylesheet" href="style.css">
</head>
<body>
  <div id="app">
    <header id="status-bar">
      <span class="logo">🔥 Forge Console</span>
      <span id="conn-status"><span class="dot red"></span> Disconnected</span>
    </header>
    <main id="messages"></main>
    <footer id="input-bar">
      <textarea id="input" placeholder="Ask anything..." rows="1"></textarea>
      <button id="send" title="Send">➤</button>
    </footer>
  </div>
  <script>
    const messages = document.getElementById('messages');
    const input = document.getElementById('input');
    const sendBtn = document.getElementById('send');
    const connStatus = document.getElementById('conn-status');

    let ws;
    let currentAssistant = null;

    function setStatus(connected) {
      connStatus.innerHTML = connected
        ? '<span class="dot green"></span> Connected'
        : '<span class="dot red"></span> Disconnected';
    }

    function scrollBottom() {
      messages.scrollTop = messages.scrollHeight;
    }

    function addMessage(role, text) {
      const div = document.createElement('div');
      div.className = `msg ${role}`;
      div.textContent = text;
      messages.appendChild(div);
      scrollBottom();
      return div;
    }

    function addThinking() {
      const div = document.createElement('div');
      div.className = 'msg assistant thinking';
      div.innerHTML = '<span class="dots"><span>.</span><span>.</span><span>.</span></span>';
      messages.appendChild(div);
      scrollBottom();
      return div;
    }

    function connect() {
      const proto = location.protocol === 'https:' ? 'wss' : 'ws';
      ws = new WebSocket(`${proto}://${location.host}/ws`);

      ws.onopen = () => {
        setStatus(true);
        currentAssistant = null;
      };

      ws.onclose = () => {
        setStatus(false);
        currentAssistant = null;
        setTimeout(connect, 2000);
      };

      ws.onmessage = (e) => {
        let msg;
        try { msg = JSON.parse(e.data); } catch { return; }

        if (msg.type === 'stdout' || msg.type === 'stderr') {
          // Remove thinking indicator if present
          const thinking = messages.querySelector('.thinking');
          if (thinking) thinking.remove();

          if (!currentAssistant) {
            currentAssistant = addMessage('assistant', '');
          }
          currentAssistant.textContent += msg.data;
          scrollBottom();
        } else if (msg.type === 'exit') {
          addMessage('system', `Session ended (code ${msg.code}). Refresh to restart.`);
          currentAssistant = null;
        } else if (msg.type === 'error') {
          addMessage('system', `Error: ${msg.data}`);
          currentAssistant = null;
        }
      };
    }

    function send() {
      const text = input.value.trim();
      if (!text || !ws || ws.readyState !== 1) return;

      addMessage('user', text);
      currentAssistant = null;
      addThinking();
      ws.send(text);
      input.value = '';
      input.style.height = 'auto';
    }

    sendBtn.addEventListener('click', send);

    input.addEventListener('keydown', (e) => {
      if (e.key === 'Enter' && !e.shiftKey) {
        e.preventDefault();
        send();
      }
    });

    // Auto-resize textarea
    input.addEventListener('input', () => {
      input.style.height = 'auto';
      input.style.height = Math.min(input.scrollHeight, 120) + 'px';
    });

    connect();
  </script>
</body>
</html>
176
web/public/style.css
Normal file
@@ -0,0 +1,176 @@
* { margin: 0; padding: 0; box-sizing: border-box; }

:root {
  --bg: #1a1a2e;
  --surface: #16213e;
  --input-bg: #0f3460;
  --user-bg: #1a3a5c;
  --assistant-bg: #2a2a3e;
  --system-bg: #2e1a1a;
  --text: #e0e0e0;
  --text-dim: #8888aa;
  --accent: #4a9eff;
  --green: #4caf50;
  --red: #f44336;
}

html, body { height: 100%; }

body {
  font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', sans-serif;
  background: var(--bg);
  color: var(--text);
}

#app {
  display: flex;
  flex-direction: column;
  height: 100vh;
  max-width: 800px;
  margin: 0 auto;
}

/* Status bar */
#status-bar {
  display: flex;
  justify-content: space-between;
  align-items: center;
  padding: 10px 16px;
  background: var(--surface);
  border-bottom: 1px solid rgba(255,255,255,0.05);
  font-size: 13px;
}

.logo {
  font-weight: 600;
  font-size: 15px;
  letter-spacing: 0.5px;
}

.dot {
  display: inline-block;
  width: 8px;
  height: 8px;
  border-radius: 50%;
  margin-right: 6px;
  vertical-align: middle;
}

.dot.green { background: var(--green); }
.dot.red { background: var(--red); }

#conn-status { color: var(--text-dim); font-size: 12px; }

/* Messages */
#messages {
  flex: 1;
  overflow-y: auto;
  padding: 16px;
  display: flex;
  flex-direction: column;
  gap: 10px;
}

.msg {
  max-width: 85%;
  padding: 10px 14px;
  border-radius: 12px;
  font-size: 14px;
  line-height: 1.5;
  white-space: pre-wrap;
  word-wrap: break-word;
  animation: fadeIn 0.2s ease;
}

.msg.user {
  align-self: flex-end;
  background: var(--user-bg);
  border-bottom-right-radius: 4px;
}

.msg.assistant {
  align-self: flex-start;
  background: var(--assistant-bg);
  border-bottom-left-radius: 4px;
  font-family: 'SF Mono', 'Fira Code', 'Consolas', monospace;
  font-size: 13px;
}

.msg.system {
  align-self: center;
  background: var(--system-bg);
  color: var(--text-dim);
  font-size: 12px;
  border-radius: 8px;
}

/* Thinking dots */
.thinking .dots span {
  animation: blink 1.4s infinite;
  font-size: 24px;
  line-height: 1;
}
.thinking .dots span:nth-child(2) { animation-delay: 0.2s; }
.thinking .dots span:nth-child(3) { animation-delay: 0.4s; }

@keyframes blink {
  0%, 20% { opacity: 0.2; }
  50% { opacity: 1; }
  80%, 100% { opacity: 0.2; }
}

@keyframes fadeIn {
  from { opacity: 0; transform: translateY(6px); }
  to { opacity: 1; transform: translateY(0); }
}

/* Input bar */
#input-bar {
  display: flex;
  gap: 8px;
  padding: 12px 16px;
  background: var(--surface);
  border-top: 1px solid rgba(255,255,255,0.05);
}

#input {
  flex: 1;
  background: var(--input-bg);
  border: 1px solid rgba(255,255,255,0.1);
  border-radius: 10px;
  padding: 10px 14px;
  color: var(--text);
  font-size: 14px;
  font-family: inherit;
  resize: none;
  outline: none;
  transition: border-color 0.2s;
}

#input:focus { border-color: var(--accent); }

#input::placeholder { color: var(--text-dim); }

#send {
  background: var(--accent);
  border: none;
  border-radius: 10px;
  width: 44px;
  color: white;
  font-size: 18px;
  cursor: pointer;
  transition: opacity 0.2s;
}

#send:hover { opacity: 0.85; }
#send:active { opacity: 0.7; }

/* Scrollbar */
#messages::-webkit-scrollbar { width: 6px; }
#messages::-webkit-scrollbar-track { background: transparent; }
#messages::-webkit-scrollbar-thumb { background: rgba(255,255,255,0.1); border-radius: 3px; }

/* Mobile */
@media (max-width: 600px) {
  #app { max-width: 100%; }
  .msg { max-width: 92%; }
}
77	web/server.js	Normal file
@@ -0,0 +1,77 @@
const express = require('express');
const { WebSocketServer } = require('ws');
const { spawn } = require('child_process');
const path = require('path');
const http = require('http');

const PORT = process.env.PORT || 3000;
const app = express();
const server = http.createServer(app);
const wss = new WebSocketServer({ server, path: '/ws' });

app.use(express.static(path.join(__dirname, 'public')));
app.get('/health', (_, res) => res.json({ status: 'ok' }));

wss.on('connection', (ws) => {
  console.log('[forge] client connected');

  // Spawn codex in interactive mode, working dir = repo root
  const repoRoot = path.resolve(__dirname, '..');
  const codex = spawn('codex', ['--quiet'], {
    cwd: repoRoot,
    shell: true,
    env: { ...process.env, FORCE_COLOR: '0' },
    stdio: ['pipe', 'pipe', 'pipe']
  });

  let alive = true;

  codex.stdout.on('data', (data) => {
    if (ws.readyState === 1) {
      ws.send(JSON.stringify({ type: 'stdout', data: data.toString() }));
    }
  });

  codex.stderr.on('data', (data) => {
    if (ws.readyState === 1) {
      ws.send(JSON.stringify({ type: 'stderr', data: data.toString() }));
    }
  });

  codex.on('close', (code) => {
    alive = false;
    console.log(`[forge] codex exited (code ${code})`);
    if (ws.readyState === 1) {
      ws.send(JSON.stringify({ type: 'exit', code }));
    }
  });

  codex.on('error', (err) => {
    alive = false;
    console.error('[forge] codex spawn error:', err.message);
    if (ws.readyState === 1) {
      ws.send(JSON.stringify({ type: 'error', data: err.message }));
    }
  });

  ws.on('message', (msg) => {
    if (alive && codex.stdin.writable) {
      codex.stdin.write(msg.toString() + '\n');
    }
  });

  ws.on('close', () => {
    console.log('[forge] client disconnected');
    if (alive) codex.kill('SIGTERM');
  });
});

server.listen(PORT, () => {
  console.log(`\n 🔥 Forge Console running at http://localhost:${PORT}\n`);
});

process.on('SIGINT', () => {
  console.log('\n[forge] shutting down...');
  wss.clients.forEach((ws) => ws.close());
  server.close(() => process.exit(0));
});
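The server above relays the Codex CLI's stdout/stderr to the browser as JSON frames over the `/ws` WebSocket (`{ type: 'stdout' | 'stderr' | 'exit' | 'error', ... }`). As a minimal sketch of that frame protocol, decoupled from any live connection, the helpers below encode and route frames the way the server and a browser client would. The helper names (`encodeFrame`, `decodeFrame`) are illustrative, not part of the committed code:

```javascript
// Encode an outbound frame the way server.js does with ws.send(...):
// a JSON object whose `type` field identifies the payload.
function encodeFrame(type, payload) {
  return JSON.stringify({ type, ...payload });
}

// Decode an inbound frame and route it by type, mirroring what the
// browser client's ws.onmessage handler would do.
function decodeFrame(raw) {
  const frame = JSON.parse(raw);
  switch (frame.type) {
    case 'stdout':
    case 'stderr':
      return { kind: 'output', text: frame.data };
    case 'exit':
      return { kind: 'exit', code: frame.code };
    case 'error':
      return { kind: 'error', text: frame.data };
    default:
      return { kind: 'unknown' };
  }
}

const raw = encodeFrame('stdout', { data: 'hello from codex' });
console.log(decodeFrame(raw));
```

A real client would feed each `ws.onmessage` event's data through `decodeFrame` and append `output` frames to the message list, which is what the `index.html` script above does inline.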