Prepared by: Neo, Chief Code Architect
Date: 2026-03-28
Status: For Will's review
The mainframe (mainframe.py) is a SQLite database with seven tables:
| Table | Purpose | Records |
|---|---|---|
| team_members | PKA team roster | 4 (founding team) |
| projects | Project tracking | Empty |
| tasks | Task management | Empty |
| task_notes | Activity log per task | Empty |
| deliverables | Work product tracking | Empty |
| skills | Team member capabilities | 17 (Neo's skills) |
| comms_log | Inter-team communications | Empty |
The mainframe is clean, well-structured, and ready for extension. It has no knowledge of vessels, routes, positions, activity zones, or any Shore Party operational data.
What exists today:
- RWGPS credentials (x-rwgps-api-key + x-rwgps-auth-token) under ~/.claude/skills/activity-intelligence/. The API has no direct bounding-box query; proximity search requires a full pull plus local filtering.
- activity_zones.json with center coordinates, radii, season windows, tier classifications, and vessel associations.
- MarineTraffic skill at ~/Documents/Claude/Projects/Vessel Intelligence/marinetraffic-skill/ (associated IDs: 9d408e67-c1bf-4f24-90c2-32a489ba689a, ca454294-5538-466a-b457-1c47f1d5a83c, 0489bd0b-4cc6-4ece-b761-37ab28373e98, c8381d37-f9e5-4255-8368-e4203badd5d1).
- ~/shore-party-proxy/worker.js -- a Cloudflare Worker that takes a ?url= param, fetches the target URL, and returns it as application/zip with CORS headers; used for downloading route files or other cross-origin resources. It also proxies api.datalastic.com (vessel tracking/AIS data API), adding CORS headers for browser-side consumption.
- ~/Desktop/COWORK/Shore Party/SHORE_PARTY_MASTER.md (reference document).
- Notion MCP tools (notion-search, notion-query-database-view, notion-update-page, etc.).

```
                      PKA MAINFRAME (SQLite)
                                 |
     +--------------+------------+-----------+-------------+
     |              |            |           |             |
[RWGPS API]  [MarineTraffic]  [Notion]  [Open-Meteo]  [Datalastic]
     |              |            |           |             |
 File sync     Chrome MCP   MCP tools   Direct API  Cloudflare proxy
 + API pull  + Dense Track               (free)     (CORS bridge)
     |              |            |
 routes.db   positions.db  Two-way sync
 (cached)    (time-series) (vessel registry,
                            positions, captain intel)
```
| System | Method | Frequency | Direction |
|---|---|---|---|
| Activity Zones | File parse (activity_zones.json -> SQLite) | On init + on change | One-way in |
| Fleet Registry | File parse (fleet-registry.md -> SQLite) | On init + on change | One-way in |
| RWGPS Routes | API pull -> SQLite cache | Daily or on-demand | One-way in |
| Vessel Positions | Chrome MCP scrape -> SQLite | Daily (morning pull) | One-way in |
| Notion Vessel Registry | MCP read/write | On-demand sync | Two-way |
| Notion Daily Positions | MCP write | After each position pull | One-way out |
| Open-Meteo Weather | Direct API -> SQLite | On-demand per zone/position | One-way in |
| Datalastic/AIS | Via Cloudflare proxy -> SQLite | On-demand | One-way in |
| Dense Track (MT) | Chrome MCP -> SQLite | On-demand per vessel | One-way in |
| Shore Party Master | Reference only (not ingested) | N/A | Read-only |
These tables would be added to mainframe.py alongside the existing schema.
CREATE TABLE IF NOT EXISTS vessels (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL UNIQUE,
mmsi TEXT,
imo TEXT,
call_sign TEXT,
flag TEXT,
loa_m REAL,
builder TEXT,
build_year INTEGER,
owner TEXT,
charter_rate TEXT,
ice_class TEXT,
market_class TEXT CHECK (market_class IN ('CHARTER', 'PRIVATE')),
tier INTEGER DEFAULT 3,
mt_search_note TEXT,
mt_ship_id TEXT,
notion_page_id TEXT,
philosophy TEXT,
status TEXT DEFAULT 'active',
created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%SZ', 'now')),
updated_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%SZ', 'now'))
);
CREATE TABLE IF NOT EXISTS fleet_membership (
vessel_id INTEGER NOT NULL REFERENCES vessels(id),
fleet TEXT NOT NULL CHECK (fleet IN ('CHARTER', 'PRIVATE', 'DATA SET', 'TIER ONE')),
PRIMARY KEY (vessel_id, fleet)
);
CREATE TABLE IF NOT EXISTS vessel_positions (
id INTEGER PRIMARY KEY AUTOINCREMENT,
vessel_id INTEGER NOT NULL REFERENCES vessels(id),
lat REAL NOT NULL,
lon REAL NOT NULL,
speed_kn REAL,
course REAL,
current_port TEXT,
destination TEXT,
nav_status TEXT,
source TEXT DEFAULT 'marinetraffic',
recorded_at TEXT NOT NULL,
created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%SZ', 'now'))
);
CREATE INDEX IF NOT EXISTS idx_positions_vessel ON vessel_positions(vessel_id);
CREATE INDEX IF NOT EXISTS idx_positions_time ON vessel_positions(recorded_at);
CREATE INDEX IF NOT EXISTS idx_positions_vessel_time ON vessel_positions(vessel_id, recorded_at DESC);
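With the (vessel_id, recorded_at DESC) index above, "latest fix per vessel" is a single grouped query. A minimal sketch against an in-memory copy of the table (coordinates and timestamps are invented sample data):

```python
import sqlite3

# Minimal in-memory copy of vessel_positions for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE vessel_positions (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        vessel_id INTEGER NOT NULL,
        lat REAL NOT NULL,
        lon REAL NOT NULL,
        recorded_at TEXT NOT NULL
    )""")
conn.executemany(
    "INSERT INTO vessel_positions (vessel_id, lat, lon, recorded_at) VALUES (?, ?, ?, ?)",
    [
        (1, 39.57, 2.65, "2026-03-27T08:00:00Z"),
        (1, 39.60, 2.70, "2026-03-28T08:00:00Z"),   # newer fix for vessel 1
        (2, 36.14, -5.35, "2026-03-28T07:30:00Z"),
    ],
)

def latest_positions(conn):
    """Most recent fix per vessel. SQLite's bare-column-with-MAX rule
    returns lat/lon from the row holding the max recorded_at; in the
    real schema, idx_positions_vessel_time serves this query."""
    return conn.execute("""
        SELECT vessel_id, lat, lon, MAX(recorded_at) AS recorded_at
        FROM vessel_positions
        GROUP BY vessel_id
    """).fetchall()

for row in latest_positions(conn):
    print(row)
```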
CREATE TABLE IF NOT EXISTS activity_zones (
id INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL UNIQUE,
tier TEXT NOT NULL,
center_lat REAL NOT NULL,
center_lng REAL NOT NULL,
radius_nm REAL NOT NULL,
season TEXT,
notes TEXT,
created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%SZ', 'now'))
);
CREATE TABLE IF NOT EXISTS routes (
id INTEGER PRIMARY KEY, -- RWGPS route ID (natural key)
name TEXT NOT NULL,
description TEXT,
locality TEXT,
admin_area TEXT,
country_code TEXT,
distance_m REAL,
elevation_gain_m REAL,
elevation_loss_m REAL,
first_lat REAL,
first_lng REAL,
last_lat REAL,
last_lng REAL,
track_type TEXT,
terrain TEXT,
difficulty TEXT,
surface TEXT,
unpaved_pct REAL,
rwgps_url TEXT,
collection TEXT,
archived INTEGER DEFAULT 0,
synced_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%SZ', 'now'))
);
CREATE TABLE IF NOT EXISTS proximity_alerts (
id INTEGER PRIMARY KEY AUTOINCREMENT,
vessel_id INTEGER NOT NULL REFERENCES vessels(id),
zone_id INTEGER REFERENCES activity_zones(id),
alert_type TEXT NOT NULL CHECK (alert_type IN ('zone_entry', 'zone_exit', 'vessel_proximity', 'repositioning')),
distance_nm REAL,
details TEXT,
acknowledged INTEGER DEFAULT 0,
created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%SZ', 'now'))
);
CREATE TABLE IF NOT EXISTS weather_snapshots (
id INTEGER PRIMARY KEY AUTOINCREMENT,
lat REAL NOT NULL,
lon REAL NOT NULL,
zone_id INTEGER REFERENCES activity_zones(id),
temp_c REAL,
wind_mph REAL,
precip_mm REAL,
wave_m REAL,
cycling_go INTEGER,
kayaking_go INTEGER,
hiking_go INTEGER,
source TEXT DEFAULT 'open-meteo',
recorded_at TEXT NOT NULL,
created_at TEXT NOT NULL DEFAULT (strftime('%Y-%m-%dT%H:%M:%SZ', 'now'))
);
CREATE TABLE IF NOT EXISTS condition_gates (
activity TEXT PRIMARY KEY,
min_temp_f REAL,
max_temp_f REAL,
max_wind_mph REAL,
max_precip_mm_hr REAL,
max_wave_m REAL,
notes TEXT
);
Fleet Registry + Activity Zones -> Mainframe
Why first: This is pure data ingestion from files we already control. No API keys needed, no network calls, no browser context. It gives the mainframe situational awareness -- it knows what vessels exist and what zones matter.
Scope:
- Parse fleet-registry.md into vessels + fleet_membership tables (54 vessels)
- Parse activity_zones.json into activity_zones + zone sub-tables (24 zones)
- Load condition_gates from the JSON
- Implement sync_fleet_registry() and sync_activity_zones()
- Add proximity-calculation helpers to mainframe.py

Deliverable: mainframe.py can answer "What vessels are in TIER ONE?", "What zones support kayaking?", and "What's the season window for Mallorca?"
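A sketch of the sync_activity_zones() ingestion step. The JSON field names used here (name, tier, center.lat/lng, radius_nm, season) are assumptions about activity_zones.json's shape, not confirmed:

```python
import json
import sqlite3

# Hypothetical sample; the real file holds 24 zones.
SAMPLE = json.loads("""[
  {"name": "Mallorca", "tier": "TIER ONE",
   "center": {"lat": 39.61, "lng": 2.88},
   "radius_nm": 30, "season": "Apr-Oct"}
]""")

def sync_activity_zones(conn, zones):
    """Idempotent upsert: re-running on a changed file updates in place."""
    conn.execute("""
        CREATE TABLE IF NOT EXISTS activity_zones (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            name TEXT NOT NULL UNIQUE,
            tier TEXT NOT NULL,
            center_lat REAL NOT NULL,
            center_lng REAL NOT NULL,
            radius_nm REAL NOT NULL,
            season TEXT
        )""")
    for z in zones:
        conn.execute("""
            INSERT INTO activity_zones (name, tier, center_lat, center_lng, radius_nm, season)
            VALUES (:name, :tier, :lat, :lng, :radius_nm, :season)
            ON CONFLICT(name) DO UPDATE SET
                tier = excluded.tier,
                center_lat = excluded.center_lat,
                center_lng = excluded.center_lng,
                radius_nm = excluded.radius_nm,
                season = excluded.season""",
            {"name": z["name"], "tier": z["tier"],
             "lat": z["center"]["lat"], "lng": z["center"]["lng"],
             "radius_nm": z["radius_nm"], "season": z["season"]})
    conn.commit()

conn = sqlite3.connect(":memory:")
sync_activity_zones(conn, SAMPLE)
print(conn.execute("SELECT name, tier, radius_nm FROM activity_zones").fetchall())
```

Running the sync twice leaves one row per zone, which is what makes "on init + on change" safe.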
RWGPS -> Mainframe
Why second: The route library is Shore Party's core IP. Caching 1,088 routes locally means any PKA team member can query routes by proximity, country, difficulty, or surface without needing browser context or API calls.
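Because RWGPS offers no bounding-box query, route proximity has to be computed locally against the cached table. A sketch of what that lookup could look like, assuming routes are anchored by their first_lat/first_lng start point (route data below is invented):

```python
import math
import sqlite3

NM_PER_DEG_LAT = 60.0  # one degree of latitude is ~60 nautical miles

def haversine_nm(lat1, lng1, lat2, lng2):
    """Great-circle distance in nautical miles (Earth radius ~3440 nm)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lng2 - lng1) / 2) ** 2)
    return 2 * 3440.065 * math.asin(math.sqrt(a))

def find_routes_near(conn, lat, lng, radius_nm):
    # Cheap SQL bounding box first (longitude widened by 1/cos(lat))...
    dlat = radius_nm / NM_PER_DEG_LAT
    dlng = dlat / max(math.cos(math.radians(lat)), 0.01)
    candidates = conn.execute("""
        SELECT id, name, first_lat, first_lng FROM routes
        WHERE first_lat BETWEEN ? AND ? AND first_lng BETWEEN ? AND ?""",
        (lat - dlat, lat + dlat, lng - dlng, lng + dlng)).fetchall()
    # ...then exact Haversine refinement on the survivors.
    return [(rid, name, haversine_nm(lat, lng, rlat, rlng))
            for rid, name, rlat, rlng in candidates
            if haversine_nm(lat, lng, rlat, rlng) <= radius_nm]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE routes (id INTEGER PRIMARY KEY, name TEXT, first_lat REAL, first_lng REAL)")
conn.executemany("INSERT INTO routes VALUES (?, ?, ?, ?)",
                 [(1, "Sa Calobra", 39.85, 2.80),   # ~15 nm from the query point
                  (2, "Far away",   55.00, -3.00)]) # outside any reasonable radius
print(find_routes_near(conn, 39.61, 2.88, 30))
```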
- sync_routes() that pulls all 6 pages from the RWGPS API and upserts into the routes table
- find_routes_near(lat, lng, radius_nm) query function
- routes_in_zone(zone_id) that cross-references routes against zone bounding boxes

MarineTraffic -> Mainframe
Why third: This is the highest-value operational data but also the hardest to automate. Building the persistence layer first means that when positions are pulled, they go straight into the time-series table.
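The zone-proximity check at the heart of this phase is a Haversine pass over the active zones. A self-contained sketch (zone coordinates and the test fix are illustrative):

```python
import math

def haversine_nm(lat1, lng1, lat2, lng2):
    """Great-circle distance in nautical miles (Earth radius ~3440 nm)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(math.radians(lng2 - lng1) / 2) ** 2)
    return 2 * 3440.065 * math.asin(math.sqrt(a))

def check_zone_proximity(position, zones):
    """Return (zone_name, distance_nm) for every zone whose radius contains
    the fix. `position` is (lat, lon); `zones` is rows of
    (name, center_lat, center_lng, radius_nm) as in activity_zones."""
    lat, lon = position
    hits = []
    for name, clat, clng, radius in zones:
        d = haversine_nm(lat, lon, clat, clng)
        if d <= radius:
            hits.append((name, round(d, 1)))
    return hits

zones = [("Mallorca", 39.61, 2.88, 30.0), ("Gibraltar", 36.14, -5.35, 20.0)]
print(check_zone_proximity((39.70, 2.95), zones))
```

In the real mainframe, a hit here would also write a zone_entry row into proximity_alerts when the previous fix was outside the zone.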
- record_positions(position_data)
- latest_positions() that returns the most recent position per vessel
- check_zone_proximity() that runs Haversine against all active zones
- movement_classifier(vessel_id, hours=24)

Open-Meteo -> Mainframe
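A sketch of the evaluate_conditions() logic this phase builds toward. Note the unit conversion: weather_snapshots stores temp_c while condition_gates uses Fahrenheit bounds. Gate thresholds here are placeholders, not real Shore Party values:

```python
def evaluate_conditions(weather, gate):
    """GO/NO-GO for one activity. `weather` uses weather_snapshots units
    (temp_c, wind_mph, precip_mm, wave_m); `gate` mirrors a condition_gates
    row. A None threshold (or a missing wave reading) skips that check."""
    temp_f = weather["temp_c"] * 9 / 5 + 32  # gates are in Fahrenheit
    checks = [
        gate["min_temp_f"] is None or temp_f >= gate["min_temp_f"],
        gate["max_temp_f"] is None or temp_f <= gate["max_temp_f"],
        gate["max_wind_mph"] is None or weather["wind_mph"] <= gate["max_wind_mph"],
        gate["max_precip_mm_hr"] is None or weather["precip_mm"] <= gate["max_precip_mm_hr"],
        gate["max_wave_m"] is None or weather.get("wave_m") is None
            or weather["wave_m"] <= gate["max_wave_m"],
    ]
    return all(checks)

# Placeholder gate and an invented weather snapshot (21 C, light wind, dry).
cycling_gate = {"min_temp_f": 45, "max_temp_f": 95, "max_wind_mph": 20,
                "max_precip_mm_hr": 1.0, "max_wave_m": None}
weather = {"temp_c": 21.0, "wind_mph": 12.0, "precip_mm": 0.0, "wave_m": None}
print(evaluate_conditions(weather, cycling_gate))  # True
```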
- fetch_weather(lat, lng) using the Open-Meteo free API
- evaluate_conditions(lat, lng) that checks weather against condition_gates
- activity_briefing(vessel_id) -- the full automated briefing

Mainframe <-> Notion
- sync_to_notion_positions()
- sync_from_notion_vessels()
- sync_captain_intel()

Open questions:
- x-rwgps-api-key and x-rwgps-auth-token are needed for the route library sync. Are these stored somewhere accessible, or do I need to extract them from the browser session?
- shore-party-proxy forwards requests to api.datalastic.com. What is the API key, and what is the rate limit / monthly quota? Is this an active subscription?

The ecosystem is rich but fragmented. The core intelligence -- routes, vessels, zones, positions -- lives across markdown files, JSON files, browser scraping sessions, and Notion databases, with no persistent local query layer. The mainframe is perfectly positioned to become that layer.
The integration is five phases over roughly five weeks. Phase 1 requires zero external dependencies. Each subsequent phase builds on the last. By the end, a single call to activity_briefing("CLOUDBREAK") would return: current position, matched activity zone, nearby routes with distances, weather conditions, GO/NO-GO status per activity, and any proximity alerts -- all from local data.
The ten questions above are ordered by urgency. The first two unlock Phase 2. The next two unlock Phase 3. The rest can be answered anytime.
Standing by.
-- Neo