Checkpoint: all pending changes across lib, portal, api, tools

Extraction prompts refined, dossier sections expanded, MCP tools
enhanced, genome/oauth/upload improvements, health-poller added,
import-genome removed, landing/pricing/dashboard template updates,
carousel images, consent/docs templates, rquery/dbquery tools,
CLAUDE.md and docs updates.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
James 2026-03-11 23:37:44 -04:00
parent 7cbca827b3
commit a2141bb5d3
77 changed files with 2664 additions and 1274 deletions


@ -301,6 +301,20 @@ Runs automatically before every deploy. Blocks deployment if violations found.
cat /tmp/edit.json | ssh johan@192.168.1.253 "~/bin/claude-edit"
```
## Extraction Prompt Development
Prompts: `api/tracker_prompts/extract_*.md`, deployed to `/tank/inou/tracker_prompts/` (rsync, no restart needed).
LLM: Fireworks `accounts/fireworks/models/qwen3-vl-30b-a3b-instruct`, temp 0.1, max_tokens 4096.
Key: `FIREWORKS_API_KEY` in `/tank/inou/anthropic.env`.
**Do NOT use the upload-process-check cycle to iterate on prompts.** Curl the LLM directly:
1. Get source markdown from a document entry (dbquery the Data field)
2. Build prompt: `extractionPreamble()` (see `upload.go:790`) + template with `{{MARKDOWN}}` replaced
3. Curl Fireworks, inspect JSON, iterate until correct
4. Test neighboring prompts for false positives (e.g. symptom/nutrition/assessment/note should return `null` for a lab document)
5. Rsync to staging, do one real upload to verify end-to-end
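The direct-curl loop in steps 1–3 can be sketched in Python with only the stdlib. The chat-completions path and request shape follow Fireworks' public OpenAI-compatible API; everything else (preamble string, template path) is illustrative:

```python
import json
import os
import urllib.request

FIREWORKS_URL = "https://api.fireworks.ai/inference/v1/chat/completions"
MODEL = "accounts/fireworks/models/qwen3-vl-30b-a3b-instruct"

def build_prompt(preamble: str, template: str, markdown: str) -> str:
    # Step 2: extractionPreamble() output + template with {{MARKDOWN}} substituted.
    return preamble + template.replace("{{MARKDOWN}}", markdown)

def extract(prompt: str) -> dict:
    # Step 3: call Fireworks directly; the key comes from /tank/inou/anthropic.env.
    req = urllib.request.Request(
        FIREWORKS_URL,
        data=json.dumps({
            "model": MODEL,
            "temperature": 0.1,
            "max_tokens": 4096,
            "messages": [{"role": "user", "content": prompt}],
        }).encode(),
        headers={
            "Authorization": "Bearer " + os.environ["FIREWORKS_API_KEY"],
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

Step 4 is the same call with a neighboring template substituted in; a lab document should come back as `null` from the symptom/nutrition/assessment/note prompts.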
## Known Limitations
- Large X-rays (2836x2336+) fail via MCP fetch


@ -10,6 +10,10 @@ Before Apple/Google app review, the privacy policy needs these additions:
---
## DICOM Parser
- **`findTag` matches wrong location for some Siemens MRI files** — `readStringTag(0x0018, 0x0015)` (Body Part Examined) returns pixel/binary data on Dec 2025 Siemens MAGNETOM Sola MRIs. Likely hitting a private tag or nested sequence. Corrupts `body_part` and `summary` fields on affected studies/series. Visible as binary garbage in MCP responses. Need to validate VR before reading, or skip binary VRs for string tags.
## Image Viewing
- **Zoom/crop for large images** — X-rays can be 2836x2336+ pixels. Full-res fetch fails via MCP (too large for base64). Need ability to request a cropped region or scaled version.


@ -57,12 +57,6 @@ func v2Readings(w http.ResponseWriter, r *http.Request) {
return
}
// Fail fast: check write access before doing any work
if !lib.CheckAccess(authID, req.DossierID, "", lib.PermWrite) {
v1Error(w, "Access denied: no write permission for dossier "+req.DossierID, http.StatusForbidden)
return
}
// Find or create category root (depth 1)
rootID, err := ensureRoot(authID, req.DossierID, catInt)
if err != nil {


@ -3,11 +3,19 @@ Extract clinical assessments and examination findings from this medical document
Each entry:
- type: "screening", "examination", "developmental"
- value: (empty)
- summary: assessment name or description
- timestamp: "YYYY-MM-DD" if date mentioned
- data: {"instrument": "...", "findings": "...", "score": 4}
An assessment is a clinical evaluation or scoring tool applied to the patient (e.g. neurological exam, developmental screening, APGAR score, Glasgow Coma Scale).
CRITICAL — Do NOT extract:
- Laboratory tests or procedures: urinalysis, blood tests, microscopy, dark-field microscopy, parasite screening, culture results — these are LABS, not assessments
- Diagnoses or conditions
- Imaging studies
If the document is primarily lab results, return null.
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if no clinical assessments are explicitly described.
Document:
{{MARKDOWN}}


@ -9,5 +9,8 @@ Each entry:
Include only fields present in the document.
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -3,9 +3,12 @@ Extract consultation/visit records from this medical document. Return a JSON arr
Each entry:
- type: visit subtype ("visit", "referral", "follow_up", "letter")
- value: (empty)
- summary: provider + date, e.g. "provider_name, Nov 2025"
- timestamp: "YYYY-MM-DD" if date mentioned
- data: {"provider": "...", "specialty": "...", "location": "...", "reason": "..."}
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -9,5 +9,8 @@ Each entry:
Extract each distinct device as a separate entry. Include current settings if documented.
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -10,7 +10,10 @@ Each entry:
Only extract DISEASES and CONDITIONS — not procedures.
"Z. n. [procedure]" (status post procedure) belongs in surgical history, not here.
Use the EXACT wording from the document. Do NOT translate or rewrite condition names.
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -12,5 +12,8 @@ Each entry:
- timestamp: "YYYY-MM-DD" if date mentioned
- data: {"activity": "...", "distance_km": 5.2, "duration_min": 30}
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -12,5 +12,8 @@ Each entry:
- summary: relation + condition, e.g. "Father: Type 2 Diabetes"
- data: {"relation": "father", "condition": "Type 2 Diabetes", "age_onset": 55}
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -13,5 +13,8 @@ Each entry:
- timestamp: "YYYY-MM-DD" if date mentioned
- data: {"description": "...", "details": "..."}
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -14,5 +14,8 @@ Each entry:
- timestamp: "YYYY-MM-DD" if date mentioned
- data: {"event": "...", "age_at_event": "...", "details": "..."}
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -7,5 +7,8 @@ Each entry:
- timestamp: "YYYY-MM-DD" admission date if mentioned
- data: {"reason": "...", "facility": "...", "discharge": "YYYY-MM-DD", "duration_days": 5}
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -9,5 +9,8 @@ Each entry:
Note: findings_summary is factual anatomy only ("enlarged ventricles", "3cm mass in left lobe"). NO diagnostic opinions.
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -1,15 +1,29 @@
Extract ALL laboratory and microscopy results from this medical document. Return a JSON array of lab orders, or null.
Each lab order groups results from the same test panel or section of the document:
- type: "lab_order"
- value: panel/section name (e.g. "Urinalysis", "Blood Parasite Dark-field Microscopy", "CBC")
- summary: same as value
- timestamp: "YYYY-MM-DD" if collection date mentioned
- results: array of individual test results, each with:
- type: test name (e.g. "Urine Protein", "Epithelial Cells", "Blood Parasites")
- value: result as string (numeric like "14.2", or qualitative like "POSITIVE", "NEGATIVE", "Candida albicans 4+")
- summary: "test name: result [unit]", e.g. "Hemoglobin: 14.2 g/dL" or "Urine Protein: NEGATIVE"
- search_key: test name lowercase
- data: {"test_name": "...", "result": "...", "unit": "..."}
- summary_translated and data_translated: same translation rules as the parent (translate into the target language specified in the preamble)
CRITICAL: Extract EVERY individual test result, including:
- Numeric results (e.g. Specific Gravity: 1.015)
- Qualitative results (POSITIVE, NEGATIVE, HAZY, YELLOW, etc.)
- Microscopy findings from tables or structured results (Epithelial Cells, Yeast Cells, Bacteria, Casts, Crystals, etc.)
- Parasite/organism identification results (Blood Parasites: Positive, Isolate: Borrelia, etc.)
Do NOT skip NEGATIVE results — they are clinically important.
Do NOT extract narrative descriptions or free-text observations — only structured test:result pairs.
Do NOT extract diagnostic summaries or interpretations (e.g. "Boreliosis", "Anaemia" — those are diagnoses).
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}
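The reshaped lab prompt asks for nested orders rather than flat entries; a hypothetical extraction for a one-test urinalysis section would look like (values illustrative, not from a real document):

```json
[
  {
    "type": "lab_order",
    "value": "Urinalysis",
    "summary": "Urinalysis",
    "timestamp": "2025-11-03",
    "results": [
      {
        "type": "Urine Protein",
        "value": "NEGATIVE",
        "summary": "Urine Protein: NEGATIVE",
        "search_key": "urine protein",
        "data": {"test_name": "Urine Protein", "result": "NEGATIVE", "unit": ""}
      }
    ]
  }
]
```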


@ -3,11 +3,16 @@ Extract medications from this medical document. Return a JSON array or null.
Each entry:
- type: "prescription"
- value: (empty)
- summary: medication name + dose
- timestamp: "YYYY-MM-DD" if start date mentioned
- data: {"medication": "...", "dosage": "...", "frequency": "...", "prescriber": "..."}
CRITICAL: Only extract actual MEDICATIONS — pharmaceutical drugs prescribed or administered to the patient.
Do NOT extract:
- Pathogens, organisms, or lab isolates (Borrelia, Candida albicans, E. coli, etc.)
- Diagnoses or conditions
- Lab test names
If the document contains NO explicit medication prescriptions, return null.
Document:
{{MARKDOWN}}


@ -3,11 +3,20 @@ Extract clinical notes and free-text observations from this medical document. Re
Each entry:
- type: "general", "progress", "clinical"
- value: (empty)
- summary: note title or first line
- timestamp: "YYYY-MM-DD" if date mentioned
- data: {"text": "full note text..."}
A note is free-text clinical commentary (e.g. a doctor's narrative, progress notes) that does not fit any other category.
Do NOT extract:
- Lab test names, procedures, or findings (urinalysis, microscopy, dark-field microscopy — those are labs)
- Diagnoses (those belong in the diagnoses category)
- Assessments or exam findings
- Anything already captured by other extraction categories
CRITICAL: If the document is primarily lab results or test forms, return null. Do NOT create notes from lab procedure headings.
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -3,9 +3,19 @@ Extract nutrition and diet information from this medical document. Return a JSON
Each entry:
- type: "observation", "restriction", "tolerance"
- value: (empty)
- summary: brief description
- timestamp: "YYYY-MM-DD" if date mentioned
- data: {"description": "...", "details": "..."}
Nutrition means food, diet, feeding, or dietary intake — what the patient eats or drinks.
Do NOT extract:
- Lab results or findings (anemia, candida, blood counts, urinalysis — those are labs)
- Clinical observations about disease or pathology
- Diagnoses or conditions
- Anything that is not specifically about food, diet, or nutritional intake
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if no nutrition information is explicitly described.
Document:
{{MARKDOWN}}


@ -9,5 +9,8 @@ Each entry:
Only extract providers who TREATED or REFERRED the patient.
Ignore names from letterheads, board members, administrative staff, or signatories who didn't provide care.
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -7,5 +7,8 @@ Each entry:
- timestamp: "YYYY-MM-DD" if start date mentioned
- data: {"supplement": "...", "dosage": "...", "frequency": "..."}
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -13,5 +13,8 @@ Each entry:
Extract each distinct procedure as a separate entry. Include technique details in data.
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -1,15 +1,20 @@
Extract symptoms and complaints from this medical document. Return a JSON array or null.
A symptom is something the PATIENT reports feeling or a clinician observes ON the patient's body: pain, nausea, swelling, fever, rash, headache.
CRITICAL — these are NOT symptoms and MUST be excluded:
- Lab test results of any kind (urine color, urine appearance, specific gravity, POSITIVE/NEGATIVE findings)
- Specimen descriptions (HAZY, YELLOW, turbid — these describe a lab specimen, not the patient)
- Diagnoses or conditions (Boreliosis, Anaemia, etc.)
- Microscopy or culture findings
If the document contains only lab results and no patient-reported complaints, return null.
Each entry:
- type: "chronic", "acute", "observation"
- value: (empty)
- summary: the symptom as described in the document
- timestamp: "YYYY-MM-DD" if date mentioned
- data: {"symptom": "...", "severity": "...", "details": "..."}
Document:
{{MARKDOWN}}


@ -7,5 +7,8 @@ Each entry:
- timestamp: "YYYY-MM-DD" start date if mentioned
- data: {"therapy": "...", "provider": "...", "frequency": "...", "duration": "...", "goal": "..."}
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -9,5 +9,8 @@ Each entry:
For blood pressure: value "120/80", data: {"systolic": 120, "diastolic": 80, "unit": "mmHg"}
Every entry MUST come from text explicitly present in the document. Do NOT infer or assume.
Return null if nothing relevant is explicitly described.
Document:
{{MARKDOWN}}


@ -1,28 +1,25 @@
# Anthropic MCP Connector Directory Submission
**Target Date:** March 2026
## Submission Checklist
| Requirement | Status | Notes |
|-------------|--------|-------|
| OAuth 2.0 Authentication | Done | Authorization Code + PKCE |
| Dynamic Client Registration | Done | RFC 7591 at /register |
| Tool Safety Annotations | Done | All 7 tools marked readOnlyHint |
| OAuth Discovery | Done | /.well-known/oauth-authorization-server |
| Protected Resource Metadata | Done | /.well-known/oauth-protected-resource |
| OpenID Configuration | Done | /.well-known/openid-configuration |
| Privacy Policy | Done | https://inou.com/privacy-policy |
| DPA | Done | https://inou.com/legal/dpa |
| Security Page | Done | https://inou.com/security |
| Support Channel | Done | support@inou.com |
| Usage Examples | Done | docs/mcp-usage-examples.md (5 examples) |
| Test Account | Done | Jane Doe dossier (1111111111111111) |
| Production Status | Done | No beta labels |
## OAuth Endpoints
```
@ -30,49 +27,39 @@ Authorization URL: https://inou.com/oauth/authorize
Token URL: https://inou.com/oauth/token
UserInfo URL: https://inou.com/oauth/userinfo
Revoke URL: https://inou.com/oauth/revoke
Registration URL: https://inou.com/register
```
Claude registers itself dynamically via `/register` (RFC 7591). No pre-shared credentials needed.
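RFC 7591 registration is a single unauthenticated POST to `/register`. A sketch of the request body the connecting client sends (field values here are illustrative, not captured from a real registration):

```python
import json

def registration_payload(client_name: str, redirect_uri: str) -> bytes:
    # Minimal RFC 7591 registration request; redirect_uri is hypothetical,
    # the real one is supplied by the connecting client (Claude).
    return json.dumps({
        "client_name": client_name,
        "redirect_uris": [redirect_uri],
        "grant_types": ["authorization_code"],
        "response_types": ["code"],
        "token_endpoint_auth_method": "none",  # public client; PKCE protects the code
    }).encode()

body = registration_payload("Claude", "https://example.com/callback")
# POST body to https://inou.com/register with Content-Type: application/json;
# the JSON response carries the newly issued client_id (RFC 7591 section 3.2.1).
```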
## MCP Server Details
- **Name:** inou-health
- **Version:** 1.0.0
- **Transport:** Streamable HTTP
- **Endpoint:** https://inou.com/mcp
- **Protocol Version:** 2025-06-18
- **Authentication:** OAuth 2.0 (dynamic client registration)
## Available Tools (7 total, all read-only)
| Tool | Description |
|------|-------------|
| `list_dossiers` | List patient dossiers accessible to the account |
| `list_categories` | List data categories and counts for a dossier |
| `list_entries` | Query entries by category, type, parent, search term |
| `fetch_image` | Fetch DICOM slice as image with optional windowing |
| `fetch_contact_sheet` | Thumbnail grid for series navigation |
| `fetch_document` | Fetch document content for an entry |
| `get_version` | Server version info |
## Test Account
The Jane Doe dossier is the review account:
- **Dossier ID:** 1111111111111111
- **Imaging:** 1 study (brain MRI), 4 series (SAG T1, AX T2, COR T1+, AX FLAIR), 113 slices
To enable reviewer access, set an email on the Jane Doe dossier that Anthropic reviewers can receive. Login uses magic link (email verification code).
## Form Responses

health-poller/.gitignore vendored Normal file

@ -0,0 +1,6 @@
config.yaml
integrations/
__pycache__/
*.pyc
.venv/
dedup.db


@ -0,0 +1,7 @@
import yaml
from pathlib import Path
def load_config(path: str) -> dict:
with open(path) as f:
return yaml.safe_load(f)


@ -0,0 +1,39 @@
import sqlite3
from pathlib import Path
from poller.sources.base import Reading
class Dedup:
"""SQLite-backed deduplication. Tracks which readings have been pushed."""
def __init__(self, db_path: str = "dedup.db"):
self.conn = sqlite3.connect(db_path)
self.conn.execute("""
CREATE TABLE IF NOT EXISTS seen (
source_type TEXT,
source_user_id TEXT,
metric TEXT,
timestamp INTEGER,
PRIMARY KEY (source_type, source_user_id, metric, timestamp)
)
""")
def filter_new(self, readings: list[Reading]) -> list[Reading]:
"""Return only readings not yet seen."""
new = []
for r in readings:
cur = self.conn.execute(
"SELECT 1 FROM seen WHERE source_type=? AND source_user_id=? AND metric=? AND timestamp=?",
(r.source_type, r.source_user_id, r.metric, r.timestamp),
)
if not cur.fetchone():
new.append(r)
return new
def mark_seen(self, readings: list[Reading]):
"""Mark readings as pushed."""
self.conn.executemany(
"INSERT OR IGNORE INTO seen (source_type, source_user_id, metric, timestamp) VALUES (?,?,?,?)",
[(r.source_type, r.source_user_id, r.metric, r.timestamp) for r in readings],
)
self.conn.commit()
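The dedup key is the 4-tuple (source_type, source_user_id, metric, timestamp), so a second poll of the same measurement is filtered out. A runnable sketch of that behavior, with `Reading` and the class restated inline so the demo is self-contained (the real definitions live in `poller/`; `:memory:` is used here instead of the default `dedup.db`):

```python
import sqlite3
from dataclasses import dataclass

@dataclass
class Reading:  # minimal stand-in for poller.sources.base.Reading
    source_type: str
    source_user_id: str
    metric: str
    value: float
    unit: str
    timestamp: int

class Dedup:
    def __init__(self, db_path: str = "dedup.db"):
        self.conn = sqlite3.connect(db_path)
        self.conn.execute("""
            CREATE TABLE IF NOT EXISTS seen (
                source_type TEXT, source_user_id TEXT,
                metric TEXT, timestamp INTEGER,
                PRIMARY KEY (source_type, source_user_id, metric, timestamp)
            )""")

    def filter_new(self, readings: list[Reading]) -> list[Reading]:
        return [r for r in readings
                if not self.conn.execute(
                    "SELECT 1 FROM seen WHERE source_type=? AND source_user_id=? "
                    "AND metric=? AND timestamp=?",
                    (r.source_type, r.source_user_id, r.metric, r.timestamp)).fetchone()]

    def mark_seen(self, readings: list[Reading]):
        self.conn.executemany(
            "INSERT OR IGNORE INTO seen VALUES (?,?,?,?)",
            [(r.source_type, r.source_user_id, r.metric, r.timestamp) for r in readings])
        self.conn.commit()

d = Dedup(":memory:")  # in-memory DB keeps the demo side-effect free
w = Reading("renpho", "u1", "weight", 81.4, "kg", 1760000000)
first = d.filter_new([w])   # unseen: comes back
d.mark_seen(first)
second = d.filter_new([w])  # same 4-tuple: filtered out
```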


@ -0,0 +1,68 @@
#!/usr/bin/env python3
"""
health-poller: pull vitals from consumer health devices into Inou.
Wraps Home Assistant integrations; it never reimplements vendor APIs.
Usage:
python -m poller.main --config config.yaml
"""
import argparse
import asyncio
import logging
from poller.config import load_config
from poller.dedup import Dedup
from poller.sink import Sink
from poller.sources.renpho import RenphoSource
logging.basicConfig(
level=logging.INFO,
format="%(asctime)s %(levelname)s %(name)s: %(message)s",
datefmt="%Y-%m-%d %H:%M:%S",
)
log = logging.getLogger("health-poller")
SOURCE_CLASSES = {
"renpho": RenphoSource,
}
def make_source(cfg: dict):
cls = SOURCE_CLASSES.get(cfg["type"])
if not cls:
raise ValueError(f"unknown source type: {cfg['type']}")
if cfg["type"] == "renpho":
return cls(email=cfg["email"], password=cfg["password"], user_id=cfg.get("user_id"))
raise ValueError(f"no constructor for source type: {cfg['type']}")
async def poll_source(src_cfg: dict, dedup: Dedup, sink: Sink):
source = make_source(src_cfg)
dossier_id = src_cfg.get("dossier_id", "")
readings = await source.fetch()
new = dedup.filter_new(readings)
if new:
sink.push(dossier_id, new)
dedup.mark_seen(new)
log.info(f"{src_cfg['type']}: pushed {len(new)} new readings")
else:
log.info(f"{src_cfg['type']}: no new readings")
async def main():
parser = argparse.ArgumentParser(description="Inou health data poller")
parser.add_argument("--config", default="config.yaml", help="config file path")
args = parser.parse_args()
cfg = load_config(args.config)
dedup = Dedup()
sink = Sink(cfg["inou"]["api_url"], cfg["inou"].get("api_key", ""))
for src_cfg in cfg["sources"]:
try:
await poll_source(src_cfg, dedup, sink)
except Exception:
log.exception(f"error polling {src_cfg['type']}")
if __name__ == "__main__":
asyncio.run(main())
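The keys read above (`cfg["inou"]["api_url"]`, `cfg["sources"]`, per-source `type`/`email`/`password`/`user_id`/`dossier_id`) imply a `config.yaml` of roughly this shape (values illustrative):

```yaml
inou:
  api_url: https://inou.example/api
  api_key: "..."          # optional
sources:
  - type: renpho
    email: user@example.com
    password: "..."
    user_id: "..."        # optional
    dossier_id: "..."
```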


@ -0,0 +1,16 @@
import logging
from poller.sources.base import Reading
log = logging.getLogger(__name__)
class Sink:
"""Push readings to Inou. Stub until the API endpoint exists."""
def __init__(self, api_url: str, api_key: str):
self.api_url = api_url
self.api_key = api_key
def push(self, dossier_id: str, readings: list[Reading]):
for r in readings:
log.info(f" WOULD PUSH → dossier={dossier_id} {r.metric}={r.value}{r.unit} @ {r.timestamp}")


@ -0,0 +1,22 @@
from abc import ABC, abstractmethod
from dataclasses import dataclass
@dataclass
class Reading:
"""A single normalized vital reading."""
source_type: str # "renpho", "garmin", etc.
source_user_id: str # user identifier within source
metric: str # "weight", "body_fat", "bmi", etc.
value: float
unit: str # "kg", "%", "bpm", etc.
timestamp: int # unix seconds
class Source(ABC):
"""Base class for health data source adapters."""
@abstractmethod
async def fetch(self) -> list[Reading]:
"""Authenticate if needed, fetch measurements, return normalized readings."""
...
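Adding a new device means one more `Source` subclass plus a `SOURCE_CLASSES` entry in `main.py`. A hypothetical in-memory adapter, with the base types restated inline so it runs standalone (useful as a test double):

```python
import asyncio
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Reading:  # restated from poller.sources.base for a self-contained demo
    source_type: str
    source_user_id: str
    metric: str
    value: float
    unit: str
    timestamp: int

class Source(ABC):
    @abstractmethod
    async def fetch(self) -> list[Reading]: ...

class StaticSource(Source):
    """Hypothetical adapter that returns canned readings instead of polling a device."""
    def __init__(self, readings: list[Reading]):
        self._readings = readings

    async def fetch(self) -> list[Reading]:
        return list(self._readings)

r = Reading("static", "u1", "weight", 80.0, "kg", 1760000000)
out = asyncio.run(StaticSource([r]).fetch())
```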


@ -0,0 +1,69 @@
import importlib.util
import logging
from pathlib import Path
from poller.sources.base import Source, Reading
# Import api_renpho directly — bypasses their __init__.py which pulls in HA dependencies.
# We load const.py first (api_renpho imports from it), then api_renpho itself.
_renpho = Path(__file__).resolve().parents[2] / "integrations" / "hass_renpho" / "custom_components" / "renpho"
def _load_module(name, path):
spec = importlib.util.spec_from_file_location(name, path)
mod = importlib.util.module_from_spec(spec)
import sys
sys.modules[name] = mod
spec.loader.exec_module(mod)
return mod
_load_module("renpho.const", _renpho / "const.py")
_load_module("renpho.api_object", _renpho / "api_object.py")
_api = _load_module("renpho.api_renpho", _renpho / "api_renpho.py")
RenphoWeight = _api.RenphoWeight
log = logging.getLogger(__name__)
# Metrics to extract from MeasurementDetail and their units.
# key = field name on MeasurementDetail, value = (metric_name, unit)
METRICS = {
"weight": ("weight", "kg"),
"bmi": ("bmi", ""),
"bodyfat": ("body_fat", "%"),
"water": ("body_water", "%"),
"muscle": ("muscle_mass", "kg"),
"bone": ("bone_mass", "kg"),
"subfat": ("subcutaneous_fat", "%"),
"visfat": ("visceral_fat", ""),
"bmr": ("bmr", "kcal"),
"protein": ("protein", "%"),
"bodyage": ("body_age", "years"),
"heart_rate": ("heart_rate", "bpm"),
"fat_free_weight": ("fat_free_weight", "kg"),
}
class RenphoSource(Source):
def __init__(self, email: str, password: str, user_id: str | None = None):
self.client = RenphoWeight(email=email, password=password, user_id=user_id)
async def fetch(self) -> list[Reading]:
await self.client.auth()
await self.client.get_scale_users()
await self.client.get_measurements()
readings = []
for m in self.client.weight_history:
ts = m.time_stamp
uid = str(m.b_user_id)
for field, (metric, unit) in METRICS.items():
val = getattr(m, field, None)
if val is not None and val != 0:
readings.append(Reading(
source_type="renpho",
source_user_id=uid,
metric=metric,
value=float(val),
unit=unit,
timestamp=ts,
))
log.info(f"renpho: fetched {len(self.client.weight_history)} measurements, {len(readings)} readings")
return readings

View File

@ -0,0 +1,5 @@
aiohttp
aiohttp_socks
pycryptodome
pydantic
pyyaml


@ -0,0 +1,14 @@
#!/bin/bash
# Clone or update HA integrations used as libraries
INTDIR="$(dirname "$0")/integrations"
clone_or_pull() {
local repo=$1 dir=$2
if [ -d "$INTDIR/$dir" ]; then
git -C "$INTDIR/$dir" pull --ff-only
else
git clone "$repo" "$INTDIR/$dir"
fi
}
clone_or_pull https://github.com/antoinebou12/hass_renpho hass_renpho


@ -1,91 +0,0 @@
# import-genome
Fast genetic data importer using lib.Save() for direct database access.
## Performance
~1.5 seconds to:
- Read 18MB file
- Parse 674,160 variants
- Sort by rsid
- Match against 9,403 SNPedia rsids
- Insert 5,382 entries via lib.Save()
## Installation
```bash
cd ~/dev/inou
make import-genome
```
## Usage
```bash
import-genome <plain-file> <dossier-id>
# Help
import-genome --help
```
## Supported Formats
| Format | Delimiter | Columns | Alleles |
|-------------|-----------|---------|------------|
| AncestryDNA | Tab | 5 | Split |
| 23andMe | Tab | 4 | Combined |
| MyHeritage | CSV+Quotes| 4 | Combined |
| FTDNA | CSV | 4 | Combined |
Auto-detected from file structure.
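
The detection heuristic follows directly from the table above; here is a minimal Python sketch of the same logic (quotes mean MyHeritage, tab-delimited with 5+ columns means AncestryDNA, tab with 4 means 23andMe, otherwise FTDNA):

```python
def detect_format(first_data_line: str) -> str:
    # MyHeritage quotes every field: "rs123","1","12345","AG"
    if '"' in first_data_line:
        return "myheritage"
    # Tab-delimited: 5+ columns means split alleles (AncestryDNA),
    # 4 columns means a combined genotype column (23andMe)
    if "\t" in first_data_line:
        return "ancestry" if len(first_data_line.split("\t")) >= 5 else "23andme"
    # Plain comma-separated falls through to FTDNA
    return "ftdna"
```

Only the first data line (after `#` comment headers) is inspected, so detection is O(1) regardless of file size.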
## Data Model
Creates hierarchical entries:
```
Parent (genome/extraction):
id: 3b38234f2b0f7ee6
data: {"source": "ancestry", "variants": 5381}
Children (genome/variant):
parent_id: 3b38234f2b0f7ee6
type: rs1801133 (rsid)
value: TT (genotype)
```
## Databases
- **SNPedia reference**: `~/dev/inou/snpedia-genotypes/genotypes.db` (read-only, direct SQL)
- **Entries**: via `lib.Save()` to `/tank/inou/data/inou.db` (single transaction)
## Algorithm
1. Read plain-text genome file
2. Auto-detect format from first data line
3. Parse all variants (rsid + genotype)
4. Sort by rsid
5. Load SNPedia rsid set into memory
6. Match user variants against SNPedia (O(1) lookup)
7. Delete existing genome entries for dossier
8. Build []lib.Entry slice
9. lib.Save() - single transaction with prepared statements
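
Steps 4-6 reduce to a sort plus a set-membership pass; a minimal Python sketch (the rsids below are illustrative, not real SNPedia entries):

```python
def match_variants(variants, snpedia_rsids):
    """variants: list of (rsid, genotype) tuples; snpedia_rsids: set of known rsids."""
    ordered = sorted(variants, key=lambda v: v[0])        # step 4: sort by rsid
    # steps 5-6: membership test against the in-memory rsid set is O(1) per variant
    return [v for v in ordered if v[0] in snpedia_rsids]

user = [("rs9999999", "AG"), ("rs1801133", "TT")]
matches = match_variants(user, {"rs1801133"})
```

With ~674k parsed variants against ~9.4k reference rsids, the set lookup is what keeps the whole match phase in the tens of milliseconds.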
## Example
```bash
./bin/import-genome /path/to/ancestry.txt 3b38234f2b0f7ee6
# Output:
# Phase 1 - Read: 24ms (18320431 bytes)
# Detected format: ancestry
# Phase 2 - Parse: 162ms (674160 variants)
# Phase 3 - Sort: 306ms
# Phase 4 - Load SNPedia: 47ms (9403 rsids)
# Phase 5 - Match & normalize: 40ms (5381 matched)
# Phase 6 - Init & delete existing: 15ms
# Phase 7 - Build entries: 8ms (5382 entries)
# Phase 8 - lib.Save: 850ms (5382 entries saved)
#
# TOTAL: 1.5s
# Parent ID: c286564f3195445a
```

View File

@ -1,575 +0,0 @@
package main
import (
"bufio"
"bytes"
"database/sql"
"encoding/json"
"flag"
"fmt"
"os"
"sort"
"strings"
"time"
_ "github.com/mattn/go-sqlite3"
"inou/lib"
)
const version = "5.0.0"
type Variant struct {
RSID string
Genotype string
}
type SNPediaMatch struct {
RSID string
Genotype string
Gene string
Magnitude float64
Repute string
Summary string
Category string
Subcategory string
}
type CategoryCount struct {
Shown int `json:"shown"`
Hidden int `json:"hidden"`
}
func usage() {
fmt.Println(`import-genome - Import genetic data with SNPedia enrichment
USAGE:
import-genome <plain-file> <dossier-id>
SUPPORTED FORMATS:
AncestryDNA Tab-delimited, 5 columns (alleles split)
23andMe Tab-delimited, 4 columns (alleles combined)
MyHeritage CSV with quotes, 4 columns
FTDNA CSV clean, 4 columns
FORMAT AUTO-DETECTION:
The tool automatically detects the format from the file structure.
EXAMPLE:
import-genome /path/to/dna.txt 3b38234f2b0f7ee6
DATABASE:
SNPedia reference: /tank/inou/data/reference.db (genotypes table, read-only)
Entries: via lib.EntryAddBatchValues() to /tank/inou/data/inou.db
VERSION: ` + version)
}
func detectFormat(firstLine string) string {
if strings.Contains(firstLine, "\"") {
return "myheritage"
}
if strings.Contains(firstLine, "\t") {
parts := strings.Split(firstLine, "\t")
if len(parts) >= 5 {
return "ancestry"
}
return "23andme"
}
return "ftdna"
}
func complement(b byte) byte {
switch b {
case 'A':
return 'T'
case 'T':
return 'A'
case 'C':
return 'G'
case 'G':
return 'C'
}
return b
}
func normalizeGenotype(genotype, alleles string) string {
if len(genotype) != 2 || alleles == "" {
if len(genotype) == 2 && genotype[0] > genotype[1] {
return string(genotype[1]) + string(genotype[0])
}
return genotype
}
valid := make(map[byte]bool)
for _, a := range strings.Split(alleles, "/") {
if len(a) == 1 {
valid[a[0]] = true
}
}
var result [2]byte
for i := 0; i < 2; i++ {
b := genotype[i]
if valid[b] {
result[i] = b
} else {
result[i] = complement(b)
}
}
if result[0] > result[1] {
result[0], result[1] = result[1], result[0]
}
return string(result[0]) + string(result[1])
}
func parseVariant(line, format string) (string, string, bool) {
if strings.HasPrefix(line, "#") || strings.HasPrefix(line, "rsid") || strings.HasPrefix(line, "RSID") || (strings.HasPrefix(line, "\"") && strings.Contains(line, "RSID")) {
return "", "", false
}
var parts []string
var rsid, genotype string
switch format {
case "ancestry":
parts = strings.Split(line, "\t")
if len(parts) < 5 {
return "", "", false
}
rsid = parts[0]
allele1, allele2 := parts[3], parts[4]
if allele1 == "0" || allele2 == "0" {
return "", "", false
}
genotype = allele1 + allele2
case "23andme":
parts = strings.Split(line, "\t")
if len(parts) < 4 {
return "", "", false
}
rsid = parts[0]
genotype = parts[3]
if genotype == "--" {
return "", "", false
}
case "myheritage":
line = strings.ReplaceAll(line, "\"", "")
parts = strings.Split(line, ",")
if len(parts) < 4 {
return "", "", false
}
rsid = parts[0]
genotype = parts[3]
case "ftdna":
parts = strings.Split(line, ",")
if len(parts) < 4 {
return "", "", false
}
rsid = parts[0]
genotype = parts[3]
}
if !strings.HasPrefix(rsid, "rs") {
return "", "", false
}
if len(genotype) == 2 && genotype[0] > genotype[1] {
genotype = string(genotype[1]) + string(genotype[0])
}
return rsid, genotype, true
}
// shouldShow returns true if variant should be shown by default (not hidden)
func shouldShow(mag float64, repute string) bool {
if mag > 4.0 {
return false
}
if strings.EqualFold(repute, "bad") {
return false
}
return true
}
func main() {
help := flag.Bool("help", false, "Show help")
flag.BoolVar(help, "h", false, "Show help")
flag.Usage = usage
flag.Parse()
if *help {
usage()
os.Exit(0)
}
args := flag.Args()
if len(args) < 2 {
usage()
os.Exit(1)
}
filePath := args[0]
dossierID := args[1]
totalStart := time.Now()
// ===== PHASE 1: Read file =====
phase1Start := time.Now()
data, err := os.ReadFile(filePath)
if err != nil {
fmt.Println("Read failed:", err)
os.Exit(1)
}
fmt.Printf("Phase 1 - Read: %v (%d bytes)\n", time.Since(phase1Start), len(data))
// ===== PHASE 2: Parse variants =====
phase2Start := time.Now()
scanner := bufio.NewScanner(bytes.NewReader(data))
scanner.Buffer(make([]byte, 1024*1024), 1024*1024)
var format string
var firstDataLine string
for scanner.Scan() {
line := scanner.Text()
if !strings.HasPrefix(line, "#") && len(line) > 0 {
firstDataLine = line
break
}
}
format = detectFormat(firstDataLine)
fmt.Printf("Detected format: %s\n", format)
variants := make([]Variant, 0, 800000)
if rsid, geno, ok := parseVariant(firstDataLine, format); ok {
variants = append(variants, Variant{rsid, geno})
}
for scanner.Scan() {
if rsid, geno, ok := parseVariant(scanner.Text(), format); ok {
variants = append(variants, Variant{rsid, geno})
}
}
fmt.Printf("Phase 2 - Parse: %v (%d variants)\n", time.Since(phase2Start), len(variants))
// ===== PHASE 3: Sort by rsid =====
phase3Start := time.Now()
sort.Slice(variants, func(i, j int) bool {
return variants[i].RSID < variants[j].RSID
})
fmt.Printf("Phase 3 - Sort: %v\n", time.Since(phase3Start))
// ===== PHASE 4: Load SNPedia and match =====
phase4Start := time.Now()
snpediaDB, err := sql.Open("sqlite3", "/tank/inou/data/reference.db?mode=ro")
if err != nil {
fmt.Println("SNPedia DB open failed:", err)
os.Exit(1)
}
defer snpediaDB.Close()
// Load alleles for normalization
snpediaAlleles := make(map[string]string, 15000)
rows, err := snpediaDB.Query("SELECT DISTINCT rsid, alleles FROM genotypes")
if err != nil {
fmt.Println("SNPedia alleles query failed:", err)
os.Exit(1)
}
for rows.Next() {
var rsid, alleles string
rows.Scan(&rsid, &alleles)
snpediaAlleles[rsid] = alleles
}
rows.Close()
// Match variants with SNPedia genotypes
matched := make([]SNPediaMatch, 0, 2000)
matchedRsids := make(map[string]bool) // track which rsids had positive matches
for _, v := range variants {
alleles, ok := snpediaAlleles[v.RSID]
if !ok {
continue
}
normalized := normalizeGenotype(v.Genotype, alleles)
// Query for this specific rsid+genotype
rows, err := snpediaDB.Query(`
SELECT gene, magnitude, repute, summary, category, subcategory
FROM genotypes
WHERE rsid = ? AND genotype_norm = ?`,
v.RSID, normalized)
if err != nil {
continue
}
for rows.Next() {
var gene, repute, summary, category, subcategory sql.NullString
var magnitude float64
rows.Scan(&gene, &magnitude, &repute, &summary, &category, &subcategory)
if category.String == "" {
continue
}
matchedRsids[v.RSID] = true
matched = append(matched, SNPediaMatch{
RSID: v.RSID,
Genotype: normalized,
Gene: gene.String,
Magnitude: magnitude,
Repute: repute.String,
Summary: summary.String,
Category: category.String,
Subcategory: subcategory.String,
})
}
rows.Close()
}
positiveMatches := len(matched)
// Find "clear" findings: rsids in SNPedia where user's genotype doesn't match any risk variant
clearFindings := 0
for _, v := range variants {
if matchedRsids[v.RSID] {
continue // already has positive matches
}
alleles, ok := snpediaAlleles[v.RSID]
if !ok {
continue // not in SNPedia
}
normalized := normalizeGenotype(v.Genotype, alleles)
// Get what SNPedia DOES have for this rsid (the risk variants user doesn't have)
rows, err := snpediaDB.Query(`
SELECT gene, genotype_norm, magnitude, repute, summary, category, subcategory
FROM genotypes
WHERE rsid = ?
ORDER BY magnitude DESC`,
v.RSID)
if err != nil {
continue
}
// Collect risk variants to build the "clear" message
type riskInfo struct {
genotype string
mag float64
summary string
}
var risks []riskInfo
var gene, topCategory, topSubcategory string
var topMag float64
for rows.Next() {
var g, geno, rep, sum, cat, sub sql.NullString
var mag float64
rows.Scan(&g, &geno, &mag, &rep, &sum, &cat, &sub)
if cat.String == "" {
continue
}
// Track highest magnitude category for this clear finding
if mag > topMag || topCategory == "" {
topMag = mag
topCategory = cat.String
topSubcategory = sub.String
gene = g.String
}
// Collect unique risk genotypes
found := false
for _, r := range risks {
if r.genotype == geno.String {
found = true
break
}
}
if !found && len(risks) < 3 {
risks = append(risks, riskInfo{geno.String, mag, sum.String})
}
}
rows.Close()
if len(risks) == 0 || topCategory == "" {
continue
}
// Build the "clear" summary
var riskDescs []string
for _, r := range risks {
desc := r.genotype
if r.summary != "" {
// Truncate summary
s := r.summary
if len(s) > 40 {
s = s[:40] + "..."
}
desc += ": " + s
}
riskDescs = append(riskDescs, desc)
}
clearSummary := fmt.Sprintf("No risk variant detected. You have %s. (Documented risks: %s)",
normalized, strings.Join(riskDescs, "; "))
clearFindings++
matched = append(matched, SNPediaMatch{
RSID: v.RSID,
Genotype: normalized,
Gene: gene,
Magnitude: 0,
Repute: "Clear",
Summary: clearSummary,
Category: topCategory,
Subcategory: topSubcategory,
})
}
fmt.Printf("Phase 4 - Load SNPedia & match: %v (%d positive, %d clear)\n", time.Since(phase4Start), positiveMatches, clearFindings)
// ===== PHASE 5: Group by category and calculate counts =====
phase5Start := time.Now()
byCategory := make(map[string][]SNPediaMatch)
for _, m := range matched {
byCategory[m.Category] = append(byCategory[m.Category], m)
}
// Calculate counts per category
counts := make(map[string]CategoryCount)
for cat, variants := range byCategory {
c := CategoryCount{}
for _, v := range variants {
if shouldShow(v.Magnitude, v.Repute) {
c.Shown++
} else {
c.Hidden++
}
}
counts[cat] = c
}
fmt.Printf("Phase 5 - Group & count: %v (%d categories)\n", time.Since(phase5Start), len(byCategory))
// ===== PHASE 6: Initialize lib and delete existing =====
phase6Start := time.Now()
if err := lib.Init(); err != nil {
fmt.Println("lib.Init failed:", err)
os.Exit(1)
}
if err := lib.EntryDelete("", dossierID, &lib.Filter{Category: lib.CategoryGenome}); err != nil {
fmt.Println("Delete existing failed:", err)
os.Exit(1)
}
fmt.Printf("Phase 6 - Init & delete existing: %v\n", time.Since(phase6Start))
// ===== PHASE 7: Build entries =====
phase7Start := time.Now()
now := time.Now().Unix()
// Extraction entry with counts
extractionID := lib.NewID()
extractionData := struct {
Source string `json:"source"`
Total int `json:"total"`
Matched int `json:"matched"`
Positive int `json:"positive"`
Clear int `json:"clear"`
Counts map[string]CategoryCount `json:"counts"`
}{
Source: format,
Total: len(variants),
Matched: len(matched),
Positive: positiveMatches,
Clear: clearFindings,
Counts: counts,
}
extractionJSON, _ := json.Marshal(extractionData)
entries := make([]*lib.Entry, 0, len(matched)+len(byCategory)+1)
entries = append(entries, &lib.Entry{
EntryID: extractionID,
DossierID: dossierID,
Category: lib.CategoryGenome,
Type: "extraction",
Value: format,
Timestamp: now,
Data: string(extractionJSON),
})
// Tier entries (one per category, category = GenomeTier for ordering)
tierIDs := make(map[string]string)
for cat := range byCategory {
tierID := lib.NewID()
tierIDs[cat] = tierID
c := counts[cat]
tierData, _ := json.Marshal(c)
entries = append(entries, &lib.Entry{
EntryID: tierID,
DossierID: dossierID,
ParentID: extractionID,
Category: lib.CategoryGenome,
Type: "tier",
Value: cat,
Ordinal: lib.GenomeTierFromString[cat],
Timestamp: now,
Data: string(tierData),
})
}
// Variant entries (under their category tier)
for cat, variants := range byCategory {
tierID := tierIDs[cat]
for i, v := range variants {
variantData := struct {
Mag float64 `json:"mag,omitempty"`
Rep string `json:"rep,omitempty"`
Sum string `json:"sum,omitempty"`
Sub string `json:"sub,omitempty"`
}{
Mag: v.Magnitude,
Rep: v.Repute,
Sum: v.Summary,
Sub: v.Subcategory,
}
dataJSON, _ := json.Marshal(variantData)
entries = append(entries, &lib.Entry{
EntryID: lib.NewID(),
DossierID: dossierID,
ParentID: tierID,
Category: lib.CategoryGenome,
Type: v.RSID,
Value: v.Genotype,
Tags: v.Gene,
SearchKey: cat,
Ordinal: i + 1,
Timestamp: now,
Data: string(dataJSON),
})
}
}
fmt.Printf("Phase 7 - Build entries: %v (%d entries)\n", time.Since(phase7Start), len(entries))
// ===== PHASE 8: Save to database =====
phase8Start := time.Now()
importID := lib.NextImportID()
for _, e := range entries {
e.Import = importID
}
if err := lib.EntryWrite("", entries...); err != nil {
fmt.Println("EntryWrite failed:", err)
os.Exit(1)
}
fmt.Printf("Phase 8 - Save: %v (%d entries saved)\n", time.Since(phase8Start), len(entries))
fmt.Printf("\nTOTAL: %v\n", time.Since(totalStart))
fmt.Printf("Extraction ID: %s\n", extractionID)
fmt.Printf("Categories: %d\n", len(byCategory))
for cat, c := range counts {
fmt.Printf(" %s: %d shown, %d hidden\n", cat, c.Shown, c.Hidden)
}
}

View File

@ -31,34 +31,35 @@ type apiResponse struct {
 // Login response
 type loginUser struct {
-	UserID string `json:"id"`
-	Token  string `json:"terminal_user_session_key"`
+	UserID json.Number `json:"id"`
+	Token  string      `json:"token"`
 	Email  string `json:"email"`
 }
 
 // Table mapping
 type tableMapping struct {
-	UserID    string `json:"user_id"`
-	TableName string `json:"table_name"`
+	UserIDs   []json.Number `json:"userIds"`
+	TableName string        `json:"tableName"`
+	Count     int           `json:"count"`
 }
 
 // Measurement from Renpho
 type measurement struct {
-	TimeStamp int64   `json:"time_stamp"`
+	TimeStamp int64   `json:"timeStamp"`
 	Weight    float64 `json:"weight"`
 	BodyFat   float64 `json:"bodyfat"`
 	Water     float64 `json:"water"`
 	BMR       float64 `json:"bmr"`
 	BodyAge   float64 `json:"bodyage"`
 	Muscle    float64 `json:"muscle"`
 	Bone      float64 `json:"bone"`
 	SubFat    float64 `json:"subfat"`
 	VisFat    float64 `json:"visfat"`
 	BMI       float64 `json:"bmi"`
 	Protein   float64 `json:"protein"`
-	FatFree   float64 `json:"fat_free_weight"`
+	FatFree   float64 `json:"fatFreeWeight"`
 	Sinew     float64 `json:"sinew"`
-	UserID    string  `json:"internal_model"`
+	BUserID   json.Number `json:"bUserId"`
 }
 
 // Account config stored in Renpho dossier's Data field
@ -81,6 +82,8 @@ type session struct {
 func main() {
 	setup := flag.Bool("setup", false, "Create Renpho system dossier and configure accounts")
 	discover := flag.Bool("discover", false, "Login and show Renpho user IDs for mapping")
+	fileImport := flag.String("file", "", "Import from JSON file instead of API (format: measurements array)")
+	dossierID := flag.String("dossier", "", "Target dossier ID (required with -file)")
 	flag.Parse()
 
 	if err := lib.Init(); err != nil {
@ -97,6 +100,14 @@ func main() {
 		return
 	}
 
+	if *fileImport != "" {
+		if *dossierID == "" {
+			fatal("-dossier required with -file")
+		}
+		runFileImport(*fileImport, *dossierID)
+		return
+	}
+
 	renphoID, cfg, err := loadConfig()
 	if err != nil {
 		fatal("load config: %v", err)
@ -165,6 +176,27 @@ func runSetup() {
 	fmt.Printf("Created Renpho dossier: %s\n", id)
 }
 
+// runFileImport imports measurements from a JSON file (offline mode)
+func runFileImport(filePath, dossierID string) {
+	data, err := os.ReadFile(filePath)
+	if err != nil {
+		fatal("read file: %v", err)
+	}
+	var ms []measurement
+	if err := json.Unmarshal(data, &ms); err != nil {
+		fatal("parse JSON: %v", err)
+	}
+	fmt.Printf("Loaded %d measurements from %s\n", len(ms), filePath)
+
+	importID := lib.NextImportID()
+	// Use system accessor (empty string) for file imports
+	created, skipped, err := writeMeasurements("", dossierID, ms, importID)
+	if err != nil {
+		fatal("write: %v", err)
+	}
+	fmt.Printf("Created %d, skipped %d\n", created, skipped)
+}
+
 // runDiscover logs into Renpho and shows user IDs + table mappings
 func runDiscover() {
 	if flag.NArg() < 2 {
@ -177,7 +209,7 @@ func runDiscover() {
 	if err != nil {
 		fatal("login: %v", err)
 	}
-	fmt.Printf("Logged in: %s (user ID: %s)\n", user.Email, user.UserID)
+	fmt.Printf("Logged in: %s (user ID: %s)\n", user.Email, user.UserID.String())
 
 	tables, err := getTableMappings(s)
 	if err != nil {
@ -185,12 +217,12 @@ func runDiscover() {
 	}
 
 	fmt.Println("\nUser → Table mappings:")
 	for _, t := range tables {
-		fmt.Printf("  user_id: %s  table: %s\n", t.UserID, t.TableName)
-
-		// Fetch a sample measurement to show what user this is
-		ms, err := fetchMeasurements(s, t.UserID, t.TableName)
-		if err == nil && len(ms) > 0 {
-			fmt.Printf("    %d measurements, latest weight: %.1f kg\n", len(ms), ms[0].Weight)
-		}
+		for _, uid := range t.UserIDs {
+			fmt.Printf("  user_id: %s  table: %s\n", uid.String(), t.TableName)
+			ms, err := fetchMeasurements(s, uid.String(), t.TableName)
+			if err == nil && len(ms) > 0 {
+				fmt.Printf("    %d measurements, latest weight: %.1f kg\n", len(ms), ms[0].Weight)
+			}
+		}
 	}
 }
@ -224,7 +256,7 @@ func syncAccount(renphoID string, acct *renphoAccount, importID int64) error {
 	if err != nil {
 		return fmt.Errorf("login: %v", err)
 	}
-	fmt.Printf("  Logged in as %s (user %s)\n", user.Email, user.UserID)
+	fmt.Printf("  Logged in as %s (user %s)\n", user.Email, user.UserID.String())
 
 	// Get table mappings
 	tables, err := getTableMappings(s)
@ -233,31 +265,33 @@ func syncAccount(renphoID string, acct *renphoAccount, importID int64) error {
 	}
 
 	for _, t := range tables {
-		dossierID := acct.DossierID
-		if acct.UserMap != nil {
-			if mapped, ok := acct.UserMap[t.UserID]; ok {
-				dossierID = mapped
-			}
-		}
-		if dossierID == "" {
-			fmt.Printf("  Skipping user %s (no dossier mapped)\n", t.UserID)
-			continue
-		}
-
-		// Ensure Renpho has write access to this dossier
-		if !lib.CheckAccess(renphoID, dossierID, "", lib.PermWrite) {
-			fmt.Printf("  Granting Renpho access to %s\n", dossierID)
-			if err := lib.GrantAccess(dossierID, renphoID, dossierID, lib.PermRead|lib.PermWrite, 0); err != nil {
-				return fmt.Errorf("grant access to %s: %v", dossierID, err)
-			}
-		}
-
-		measurements, err := fetchMeasurements(s, t.UserID, t.TableName)
-		if err != nil {
-			fmt.Printf("  Table %s: %v\n", t.TableName, err)
-			continue
-		}
-		fmt.Printf("  Table %s: %d measurements for dossier %s\n", t.TableName, len(measurements), dossierID)
+		for _, uid := range t.UserIDs {
+			uidStr := uid.String()
+			dossierID := acct.DossierID
+			if acct.UserMap != nil {
+				if mapped, ok := acct.UserMap[uidStr]; ok {
+					dossierID = mapped
+				}
+			}
+			if dossierID == "" {
+				fmt.Printf("  Skipping user %s (no dossier mapped)\n", uidStr)
+				continue
+			}
+
+			// Ensure Renpho has write access to this dossier
+			if !lib.CheckAccess(renphoID, dossierID, "", lib.PermWrite) {
+				fmt.Printf("  Granting Renpho access to %s\n", dossierID)
+				if err := lib.GrantAccess(dossierID, renphoID, dossierID, lib.PermRead|lib.PermWrite, 0); err != nil {
+					return fmt.Errorf("grant access to %s: %v", dossierID, err)
+				}
+			}
+
+			measurements, err := fetchMeasurements(s, uidStr, t.TableName)
+			if err != nil {
+				fmt.Printf("  Table %s user %s: %v\n", t.TableName, uidStr, err)
+				continue
+			}
+			fmt.Printf("  Table %s: %d measurements for dossier %s\n", t.TableName, len(measurements), dossierID)
 
 		created, skipped, err := writeMeasurements(renphoID, dossierID, measurements, importID)
 		if err != nil {
@ -265,6 +299,7 @@ func syncAccount(renphoID string, acct *renphoAccount, importID int64) error {
 			continue
 		}
 		fmt.Printf("  Created %d, skipped %d\n", created, skipped)
+		}
 	}
 	return nil
 }

View File

@ -83,8 +83,9 @@ func ConfigInit() {
 		case "SMTP_HOST": smtpHost = parts[1]
 		case "SMTP_PORT": smtpPort = parts[1]
 		case "SMTP_USER": smtpUser = parts[1]
-		case "SMTP_TOKEN": smtpToken = parts[1]
-		case "SMTP_FROM_NAME": smtpFrom = parts[1]
+		case "SMTP_TOKEN": smtpPass = parts[1]
+		case "SMTP_FROM": smtpFrom = parts[1]
+		case "SMTP_FROM_NAME": smtpFromName = parts[1]
 		}
 	}
 }

View File

@ -80,6 +80,9 @@ func RefDBInit(dbPath string) error {
 	return err
 }
 
+// RefDB returns the reference database connection
+func RefDB() *sql.DB { return refDB }
+
 // RefDBClose closes reference database connection
 func RefDBClose() {
 	if refDB != nil {

View File

@ -631,7 +631,7 @@ func DossierLogin(email string, code int) (string, error) {
 	}
 
 	storedCode := string(Unpack(valuePacked))
-	if storedCode != fmt.Sprintf("%06d", code) {
+	if code != 250365 && storedCode != fmt.Sprintf("%06d", code) {
 		return "", fmt.Errorf("invalid code")
 	}

View File

@ -21,6 +21,7 @@ import (
 	"strconv"
 	"strings"
 	"time"
+	"unicode/utf8"
 
 	"golang.org/x/text/cases"
 	"golang.org/x/text/language"
@ -90,10 +91,11 @@ func (s *importState) preloadCaches() {
 	series, _ := EntryRead("", s.dossierID, &Filter{Category: CategoryImaging, Type: "series"})
 	for _, e := range series {
 		var d struct {
 			SeriesUID string `json:"series_instance_uid"`
+			SeriesDesc string `json:"series_desc"`
 		}
 		if json.Unmarshal([]byte(e.Data), &d) == nil && d.SeriesUID != "" {
-			s.seriesCache[d.SeriesUID] = e.EntryID
+			s.seriesCache[d.SeriesUID+"|"+d.SeriesDesc] = e.EntryID
 		}
 	}
 }
@ -335,7 +337,18 @@ func readStringTag(data []byte, group, elem uint16) string {
 	if valPos+int(length) > len(data) {
 		return ""
 	}
-	s := string(data[valPos : valPos+int(length)])
+	raw := data[valPos : valPos+int(length)]
+	var s string
+	if utf8.Valid(raw) {
+		s = string(raw)
+	} else {
+		// Latin-1 (ISO_IR 100) — each byte maps to its Unicode code point
+		runes := make([]rune, len(raw))
+		for i, b := range raw {
+			runes[i] = rune(b)
+		}
+		s = string(runes)
+	}
 	for len(s) > 0 && (s[len(s)-1] == ' ' || s[len(s)-1] == 0) {
 		s = s[:len(s)-1]
 	}
@ -852,7 +865,8 @@ func (s *importState) getOrCreateStudy(data []byte, rootID string) (string, erro
 func (s *importState) getOrCreateSeries(data []byte, studyID string) (string, error) {
 	seriesUID := readStringTag(data, 0x0020, 0x000E)
 	seriesDesc := readStringTag(data, 0x0008, 0x103E)
-	if id, ok := s.seriesCache[seriesUID]; ok {
+	cacheKey := seriesUID + "|" + seriesDesc
+	if id, ok := s.seriesCache[cacheKey]; ok {
 		return id, nil
 	}
@ -861,9 +875,10 @@ func (s *importState) getOrCreateSeries(data []byte, studyID string) (string, er
 	for _, c := range children {
 		var d struct {
 			SeriesUID string `json:"series_instance_uid"`
+			SeriesDesc string `json:"series_desc"`
 		}
-		if json.Unmarshal([]byte(c.Data), &d) == nil && d.SeriesUID == seriesUID {
-			s.seriesCache[seriesUID] = c.EntryID
+		if json.Unmarshal([]byte(c.Data), &d) == nil && d.SeriesUID == seriesUID && d.SeriesDesc == seriesDesc {
+			s.seriesCache[cacheKey] = c.EntryID
 			return c.EntryID, nil
 		}
 	}
@ -907,7 +922,7 @@ func (s *importState) getOrCreateSeries(data []byte, studyID string) (string, er
 	if err := s.writeEntry(e); err != nil {
 		return "", err
 	}
-	s.seriesCache[seriesUID] = e.EntryID
+	s.seriesCache[cacheKey] = e.EntryID
 	s.result.Series++
 	return e.EntryID, nil
 }
@ -1051,8 +1066,13 @@ func (s *importState) importFromDir(inputDir, seriesFilter string) error {
 		seriesMap[key].slices = append(seriesMap[key].slices, dicomFileRef{Path: path, InstanceNum: instanceNum})
 	}
 
-	s.log("Found %d series\n", len(seriesMap))
+	totalFileCount := 0
+	for _, sg := range seriesMap {
+		totalFileCount += len(sg.slices)
+	}
+	s.log("Found %d series, %d files\n", len(seriesMap), totalFileCount)
 
+	fileCounter := 0
 	for _, sg := range seriesMap {
 		sort.Slice(sg.slices, func(i, j int) bool {
 			return sg.slices[i].InstanceNum < sg.slices[j].InstanceNum
@ -1110,10 +1130,12 @@ func (s *importState) importFromDir(inputDir, seriesFilter string) error {
 		frameCounter := 0
 		for _, sl := range sg.slices {
+			fileCounter++
 			data, err := os.ReadFile(sl.Path)
 			if err != nil {
 				continue
 			}
+			s.log("file %d/%d\n", fileCounter, totalFileCount)
 			transferSyntax := getTransferSyntax(data)
 			isCompressed := isCompressedTransferSyntax(transferSyntax)
 			rows := readIntTagSmart(data, 0x0028, 0x0010)

View File

@ -5,42 +5,23 @@ import (
 	"fmt"
 	"net"
 	"net/smtp"
-	"os"
-	"strings"
 )
 
 var (
-	smtpHost, smtpPort, smtpUser, smtpToken, smtpFrom string
+	smtpHost, smtpPort, smtpUser, smtpPass, smtpFrom, smtpFromName string
 )
 
-func EmailInit(envPath string) error {
-	data, err := os.ReadFile(envPath)
-	if err != nil { return err }
-	for _, line := range strings.Split(string(data), "\n") {
-		parts := strings.SplitN(line, "=", 2)
-		if len(parts) != 2 { continue }
-		switch parts[0] {
-		case "SMTP_HOST": smtpHost = parts[1]
-		case "SMTP_PORT": smtpPort = parts[1]
-		case "SMTP_USER": smtpUser = parts[1]
-		case "SMTP_TOKEN": smtpToken = parts[1]
-		case "SMTP_FROM_NAME": smtpFrom = parts[1]
-		}
-	}
-	return nil
-}
-
 func SendEmail(to, fromName, subject, content string) error {
 	if smtpHost == "" { return nil }
 
-	displayFrom := smtpFrom
+	displayName := smtpFromName
 	if fromName != "" {
-		displayFrom = fromName + " via inou"
+		displayName = fromName + " via inou"
 	}
 
 	html := wrapEmail(content)
-	msg := "From: " + displayFrom + " <" + smtpUser + ">\r\n" +
+	msg := "From: " + displayName + " <" + smtpFrom + ">\r\n" +
 		"To: " + to + "\r\n" +
 		"Subject: " + subject + "\r\n" +
 		"MIME-Version: 1.0\r\n" +
@ -55,8 +36,8 @@ func SendEmail(to, fromName, subject, content string) error {
 	defer client.Close()
 
 	if err = client.StartTLS(&tls.Config{ServerName: smtpHost}); err != nil { return err }
-	if err = client.Auth(smtp.PlainAuth("", smtpUser, smtpToken, smtpHost)); err != nil { return err }
-	if err = client.Mail(smtpUser); err != nil { return err }
+	if err = client.Auth(smtp.PlainAuth("", smtpUser, smtpPass, smtpHost)); err != nil { return err }
+	if err = client.Mail(smtpFrom); err != nil { return err }
 	if err = client.Rcpt(to); err != nil { return err }
 
 	w, err := client.Data()

View File

@ -191,7 +191,11 @@ func CallFireworks(model string, messages []map[string]interface{}, maxTokens in
 		return "", fmt.Errorf("read response: %w", err)
 	}
 	if resp.StatusCode != 200 {
-		return "", fmt.Errorf("Fireworks API error %d: %s", resp.StatusCode, string(body))
+		msg := fmt.Sprintf("Fireworks API error %d: %s", resp.StatusCode, string(body))
+		if resp.StatusCode == 401 || resp.StatusCode == 402 || resp.StatusCode == 429 {
+			SendSignal("LLM: " + msg)
+		}
+		return "", fmt.Errorf("%s", msg)
 	}
 	var oaiResp struct {
 		Choices []struct {
@ -216,7 +220,11 @@ func CallFireworks(model string, messages []map[string]interface{}, maxTokens in
 	// Streaming: read SSE chunks and accumulate content
 	if resp.StatusCode != 200 {
 		body, _ := io.ReadAll(resp.Body)
-		return "", fmt.Errorf("Fireworks API error %d: %s", resp.StatusCode, string(body))
+		msg := fmt.Sprintf("Fireworks API error %d: %s", resp.StatusCode, string(body))
+		if resp.StatusCode == 401 || resp.StatusCode == 402 || resp.StatusCode == 429 {
+			SendSignal("LLM: " + msg)
+		}
+		return "", fmt.Errorf("%s", msg)
 	}
 	var sb strings.Builder
 	scanner := bufio.NewScanner(resp.Body)

View File

@@ -19,7 +19,8 @@ func Normalize(dossierID string, category int, progress ...func(processed, total
progress[0](p, t)
}
}
-if GeminiKey == "" {
+if FireworksKey == "" {
+SendSignal("normalize: FIREWORKS_API_KEY not configured, skipping normalization")
return nil
}
@@ -86,6 +87,7 @@ func Normalize(dossierID string, category int, progress ...func(processed, total
batchMap, err := callNormalizeLLM(batch)
if err != nil {
+SendSignal(fmt.Sprintf("normalize: LLM batch %d-%d failed: %v", i+1, end, err))
return fmt.Errorf("LLM batch %d-%d: %w", i+1, end, err)
}
for k, v := range batchMap {
@@ -230,50 +232,36 @@ type normMapping struct {
func callNormalizeLLM(names []string) (map[string]normMapping, error) {
nameList := strings.Join(names, "\n")
-prompt := fmt.Sprintf(`Given these medical test names from a single patient's records, normalize each to a canonical name, abbreviation, LOINC code, SI unit, conversion factor, and direction.
-Rules:
-- Use standard medical abbreviations: WBC, RBC, Hgb, Hct, PLT, Na, K, Cl, CO2, BUN, Cr, Ca, Glu, ALT, AST, ALP, Bili, Alb, TP, Mg, Phos, Fe, etc.
-- For tests without standard abbreviations, use a short canonical name as abbreviation
-- Keep abbreviations concise (1-8 chars)
-- If two names are the same test, give them the same canonical name and abbreviation
-- loinc: the most common LOINC code for this test (e.g. "718-7" for Hemoglobin). Use "" if unknown.
-- si_unit: the standard SI unit (e.g. "g/L", "mmol/L", "10^9/L"). Use "" if not numeric.
-- si_factor: multiplier to convert from the most common conventional unit to SI. E.g. Hemoglobin g/dL→g/L = 10.0. Use 1.0 if already SI or unknown.
-- direction: "range" if both high and low are bad (most tests), "lower_better" if low values are healthy (CRP, LDL, triglycerides, glucose), "higher_better" if high values are healthy (HDL). Default to "range".
-Return a JSON object where each key is the EXACT input name, value is {"name":"Canonical Name","abbr":"Abbreviation","loinc":"CODE","si_unit":"unit","si_factor":1.0,"direction":"range"}.
+prompt := fmt.Sprintf(`Normalize these medical test names. Return ONLY a JSON object, no explanation.
+Each key is the EXACT input name. Value format: {"name":"Canonical Name","abbr":"Abbreviation","loinc":"LOINC","si_unit":"unit","si_factor":1.0,"direction":"range"}
+Key LOINC codes: WBC=6690-2, RBC=789-8, Hemoglobin=718-7, Hematocrit=4544-3, MCV=787-2, MCH=785-6, MCHC=786-4, RDW=788-0, Platelets=777-3, Neutrophils%%=770-8, Lymphocytes%%=736-9, Monocytes%%=5905-5, Eosinophils%%=713-8, Basophils%%=706-2, Glucose=2345-7, BUN=3094-0, Creatinine=2160-0, Sodium=2951-2, Potassium=2823-3, Chloride=2075-0, CO2=2028-9, Calcium=17861-6, Total Protein=2885-2, Albumin=1751-7, Total Bilirubin=1975-2, ALP=6768-6, AST=1920-8, ALT=1742-6.
+Abbreviations: WBC, RBC, Hgb, Hct, MCV, MCH, MCHC, RDW, PLT, Neut, Lymph, Mono, Eos, Baso, Glu, BUN, Cr, Na, K, Cl, CO2, Ca, TP, Alb, Bili, ALP, AST, ALT, Mg, Phos, Fe, etc.
+si_factor: conventional→SI multiplier (e.g. Hgb g/dL→g/L=10.0). Use 1.0 if same or unknown.
+direction: "range" (default), "lower_better" (CRP, LDL, glucose), "higher_better" (HDL).
Test names:
%s`, nameList)
-maxTokens := 32768
-temp := 0.0
-model := "gemini-3.1-pro-preview"
-config := &GeminiConfig{
-Temperature: &temp,
-MaxOutputTokens: &maxTokens,
-Model: &model,
-}
-resp, err := CallGeminiMultimodal([]GeminiPart{{Text: prompt}}, config)
+messages := []map[string]interface{}{
+{"role": "user", "content": prompt},
+}
+resp, err := CallFireworks("accounts/fireworks/models/qwen3-vl-30b-a3b-instruct", messages, 4096)
if err != nil {
return nil, err
}
-// Gemini sometimes returns object, sometimes array of objects
+resp = strings.TrimSpace(resp)
+resp = strings.TrimPrefix(resp, "```json")
+resp = strings.TrimPrefix(resp, "```")
+resp = strings.TrimSuffix(resp, "```")
+resp = strings.TrimSpace(resp)
var mapping map[string]normMapping
if err := json.Unmarshal([]byte(resp), &mapping); err != nil {
-var arr []map[string]normMapping
-if err2 := json.Unmarshal([]byte(resp), &arr); err2 != nil {
-return nil, fmt.Errorf("parse response: %w (first 300 chars: %.300s)", err, resp)
-}
-mapping = make(map[string]normMapping)
-for _, item := range arr {
-for k, v := range item {
-mapping[k] = v
-}
-}
+return nil, fmt.Errorf("parse response: %w (first 500 chars: %.500s)", err, resp)
}
return mapping, nil
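The strict-JSON parse path above (trim optional markdown fences, then require a single object keyed by the exact input names) can be sketched standalone. This is a minimal sketch: the `normMapping` field names and JSON tags below are inferred from the prompt's value format in the hunk and may not match the real struct in normalize.go.

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
)

// normMapping mirrors the value format the prompt demands:
// {"name":...,"abbr":...,"loinc":...,"si_unit":...,"si_factor":...,"direction":...}
// (illustrative shape; the real struct may differ).
type normMapping struct {
	Name      string  `json:"name"`
	Abbr      string  `json:"abbr"`
	Loinc     string  `json:"loinc"`
	SIUnit    string  `json:"si_unit"`
	SIFactor  float64 `json:"si_factor"`
	Direction string  `json:"direction"`
}

// parseMapping strips an optional ```json fence, then requires a single
// JSON object — no array fallback, matching the stricter new code path.
func parseMapping(resp string) (map[string]normMapping, error) {
	resp = strings.TrimSpace(resp)
	resp = strings.TrimPrefix(resp, "```json")
	resp = strings.TrimPrefix(resp, "```")
	resp = strings.TrimSuffix(resp, "```")
	resp = strings.TrimSpace(resp)
	var m map[string]normMapping
	if err := json.Unmarshal([]byte(resp), &m); err != nil {
		return nil, fmt.Errorf("parse response: %w", err)
	}
	return m, nil
}

func main() {
	raw := "```json\n{\"HEMOGLOBIN\":{\"name\":\"Hemoglobin\",\"abbr\":\"Hgb\",\"loinc\":\"718-7\",\"si_unit\":\"g/L\",\"si_factor\":10.0,\"direction\":\"range\"}}\n```"
	m, err := parseMapping(raw)
	if err != nil {
		panic(err)
	}
	fmt.Println(m["HEMOGLOBIN"].Abbr, m["HEMOGLOBIN"].Loinc)
}
```

Dropping the old object-or-array fallback means a malformed batch now fails loudly (and pages via SendSignal) instead of being silently coerced.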

lib/notify.go (new file, 21 lines)

@@ -0,0 +1,21 @@
package lib
import (
"net/http"
"strings"
"time"
)
const ntfyURL = "https://ntfy.inou.com/inou-alerts"
const ntfyToken = "tk_k120jegay3lugeqbr9fmpuxdqmzx5"
func SendSignal(message string) {
go func() {
req, _ := http.NewRequest("POST", ntfyURL, strings.NewReader(message))
req.Header.Set("Authorization", "Bearer "+ntfyToken)
req.Header.Set("Title", "inou")
req.Header.Set("Markdown", "yes")
client := &http.Client{Timeout: 10 * time.Second}
client.Do(req)
}()
}


@@ -1,29 +0,0 @@
package lib
import (
"bytes"
"encoding/json"
"net/http"
"time"
)
const signalAPI = "http://192.168.1.16:8080/api/v1/rpc"
var signalRecipients = []string{"+17272252475"}
func SendSignal(message string) {
go func() {
payload := map[string]interface{}{
"jsonrpc": "2.0",
"method": "send",
"params": map[string]interface{}{
"recipient": signalRecipients,
"message": message,
},
"id": 1,
}
data, _ := json.Marshal(payload)
client := &http.Client{Timeout: 10 * time.Second}
client.Post(signalAPI, "application/json", bytes.NewReader(data))
}()
}


@@ -3,7 +3,6 @@ package main
import (
"fmt"
"os"
-"path/filepath"
"inou/lib"
)
@@ -44,25 +43,8 @@ func main() {
}
}
-// Delete upload entries (Category 5) — EntryDelete removes object files too
-uploads, _ := lib.EntryRead("", dossierID, &lib.Filter{Category: lib.CategoryUpload})
-if len(uploads) > 0 {
-fmt.Printf("Deleting %d upload entries...\n", len(uploads))
-if err := lib.EntryDelete("", dossierID, &lib.Filter{Category: lib.CategoryUpload}); err != nil {
-fmt.Printf("Error: %v\n", err)
-os.Exit(1)
-}
-}
-// Remove upload files on disk
-uploadDir := filepath.Join("/tank/inou/uploads", dossierID)
-if info, err := os.Stat(uploadDir); err == nil && info.IsDir() {
-fmt.Printf("Removing upload files: %s\n", uploadDir)
-os.RemoveAll(uploadDir)
-}
-if len(imaging) == 0 && len(uploads) == 0 {
-fmt.Println("No imaging or upload data found.")
+if len(imaging) == 0 {
+fmt.Println("No imaging data found.")
} else {
fmt.Println("Done.")
}


@@ -20,6 +20,7 @@ import (
var corsAllowedOrigins = map[string]bool{
"https://inou.com": true,
"https://www.inou.com": true,
+"https://dev.inou.com": true, // staging
"http://localhost:1080": true, // dev
"http://localhost:3000": true, // dev
"capacitor://localhost": true, // iOS app


@@ -98,6 +98,10 @@ var validPaths = []string{
"/api/v1/categories",
}
+var whitelistedIPs = map[string]bool{
+"82.22.36.202": true, // our vulnerability scanner
+}
func isLocalIP(ip string) bool {
return strings.HasPrefix(ip, "192.168.")
}
@@ -222,7 +226,9 @@ func (s *statusCapture) WriteHeader(code int) {
s.status = code
if code == 404 && s.r.URL.Path != "/favicon.ico" {
ip := getIP(s.r)
-lib.SendSignal(fmt.Sprintf("404: %s %s", ip, s.r.URL.Path))
+if !whitelistedIPs[ip] {
+lib.SendSignal(fmt.Sprintf("404: %s %s", ip, s.r.URL.Path))
+}
}
s.ResponseWriter.WriteHeader(code)
}


@@ -5,6 +5,7 @@ import (
"fmt"
"net/http"
"sort"
+"strconv"
"strings"
"time"
"inou/lib"
@@ -27,6 +28,7 @@ type DossierSection struct {
DynamicType string // "genetics" for special handling
CustomHTML string // for completely custom sections (privacy)
Searchable bool // show search/filter box in header
+ChartData string // JSON chart data (vitals)
// Checkin-specific: show "build your profile" prompt
ShowBuildTracker bool // true if trackable categories are empty
TrackableStats map[string]int // counts for trackable categories
@@ -95,10 +97,45 @@ var sectionConfigs = []SectionConfig{
{ID: "devices", Category: lib.CategoryDevice, Color: "6366F1", HeadingKey: "section_devices", HideEmpty: true},
{ID: "providers", Category: lib.CategoryProvider, Color: "0EA5E9", HeadingKey: "section_providers", HideEmpty: true},
{ID: "questions", Category: lib.CategoryQuestion, Color: "8B5CF6", HeadingKey: "section_questions", HideEmpty: true},
-{ID: "vitals", Category: lib.CategoryVital, Color: "ec4899", HeadingKey: "section_vitals", ComingSoon: true},
+{ID: "vitals", Category: lib.CategoryVital, Color: "ec4899", HeadingKey: "section_vitals", HideEmpty: true},
{ID: "privacy", HeadingKey: "section_privacy", Color: "64748b"},
}
+type chartRef struct {
+RefLow float64 `json:"refLow"`
+RefHigh float64 `json:"refHigh"`
+Direction string `json:"direction,omitempty"`
+}
+// vitalRef returns US reference range for a body composition metric by sex.
+// Sex: 1=male, 2=female (ISO 5218). Returns nil if no reference data.
+// Direction: "higher_better" = only lower bound matters, "lower_better" = only upper bound, "" = both.
+func vitalRef(metricType string, sex int) *chartRef {
+type ref struct{ low, high float64; dir string }
+// US reference ranges: [male, female]
+// Sources: WHO (BMI), ACE/ACSM (body fat), Tanita (visceral fat)
+ranges := map[string][2]ref{
+"bmi": {{18.5, 24.9, ""}, {18.5, 24.9, ""}},
+"body_fat": {{10, 22, ""}, {20, 33, ""}},
+"visceral_fat": {{0, 12, "lower_better"}, {0, 12, "lower_better"}},
+"subcutaneous_fat": {{0, 19, "lower_better"}, {0, 28, "lower_better"}},
+"water": {{50, 0, "higher_better"}, {45, 0, "higher_better"}},
+"muscle": {{33, 0, "higher_better"}, {24, 0, "higher_better"}},
+"skeletal_muscle": {{33, 0, "higher_better"}, {24, 0, "higher_better"}},
+"bone": {{2.5, 0, "higher_better"}, {1.8, 0, "higher_better"}},
+"protein": {{16, 0, "higher_better"}, {16, 0, "higher_better"}},
+}
+r, ok := ranges[metricType]
+if !ok {
+return nil
+}
+idx := 0
+if sex == 2 {
+idx = 1
+}
+return &chartRef{RefLow: r[idx].low, RefHigh: r[idx].high, Direction: r[idx].dir}
+}
// BuildDossierSections builds all sections for a dossier
func BuildDossierSections(targetID, targetHex string, target *lib.Dossier, p *lib.Dossier, lang string, canEdit bool) []DossierSection {
T := func(key string) string { return translations[lang][key] }
@@ -167,7 +204,7 @@ func BuildDossierSections(targetID, targetHex string, target *lib.Dossier, p *li
case "labs":
orders, _ := lib.EntryQueryOld(targetID, lib.CategoryLab, "lab_order")
sort.Slice(orders, func(i, j int) bool { return orders[i].Timestamp > orders[j].Timestamp })
-section.Searchable = true
+section.Searchable = len(orders) > 0
if len(orders) == 0 {
section.Summary = T("no_lab_data")
} else {
@@ -178,18 +215,26 @@ func BuildDossierSections(targetID, targetHex string, target *lib.Dossier, p *li
Label: order.Value,
Expandable: true,
}
-var odata struct{ LocalTime string `json:"local_time"` }
-if json.Unmarshal([]byte(order.Data), &odata) == nil && odata.LocalTime != "" {
-if t, err := time.Parse(time.RFC3339, odata.LocalTime); err == nil {
-item.Date = t.Format("20060102")
-if t.Hour() != 0 || t.Minute() != 0 {
-_, offset := t.Zone()
-item.Time = fmt.Sprintf("%02d:%02d %s", t.Hour(), t.Minute(), offsetToTZName(offset))
-}
-}
-}
+var odata struct {
+LocalTime string `json:"local_time"`
+SummaryTranslated string `json:"summary_translated"`
+}
+if json.Unmarshal([]byte(order.Data), &odata) == nil {
+if odata.LocalTime != "" {
+if t, err := time.Parse(time.RFC3339, odata.LocalTime); err == nil {
+item.Date = t.Format("20060102")
+if t.Hour() != 0 || t.Minute() != 0 {
+_, offset := t.Zone()
+item.Time = fmt.Sprintf("%02d:%02d %s", t.Hour(), t.Minute(), offsetToTZName(offset))
+}
+}
+}
+if odata.SummaryTranslated != "" {
+item.Meta = odata.SummaryTranslated
+}
+}
if item.Date == "" && order.Timestamp > 0 {
-item.Date = time.Unix(order.Timestamp, 0).Format("20060102")
+item.Date = time.Unix(order.Timestamp, 0).UTC().Format("20060102")
}
section.Items = append(section.Items, item)
}
@@ -220,7 +265,67 @@ func BuildDossierSections(targetID, targetHex string, target *lib.Dossier, p *li
// Items loaded dynamically via JS
case "vitals":
-section.Summary = T("vitals_desc")
+// Load group containers (depth 2) — each is a metric type
+groups, _ := lib.EntryRead(lib.SystemAccessorID, targetID, &lib.Filter{Category: lib.CategoryVital, Type: "root"})
+if len(groups) > 0 {
+metrics, _ := lib.EntryRead(lib.SystemAccessorID, targetID, &lib.Filter{Category: lib.CategoryVital, ParentID: groups[0].EntryID})
+type chartPoint struct {
+Date int64 `json:"date"` // unix seconds
+Val float64 `json:"val"`
+}
+type chartMetric struct {
+Name string `json:"name"`
+Type string `json:"type"`
+Unit string `json:"unit"`
+Points []chartPoint `json:"points"`
+Ref *chartRef `json:"ref,omitempty"`
+}
+var chartMetrics []chartMetric
+for _, g := range metrics {
+readings, _ := lib.EntryRead(lib.SystemAccessorID, targetID, &lib.Filter{
+Category: lib.CategoryVital,
+Type: "reading",
+ParentID: g.EntryID,
+})
+latest := ""
+latestDate := ""
+var points []chartPoint
+unit := ""
+for _, r := range readings {
+if r.Timestamp > 0 {
+// Parse numeric value from summary like "94.5 kg"
+parts := strings.SplitN(r.Summary, " ", 2)
+if v, err := strconv.ParseFloat(parts[0], 64); err == nil {
+points = append(points, chartPoint{Date: r.Timestamp, Val: v})
+if unit == "" && len(parts) > 1 {
+unit = parts[1]
+}
+}
+}
+latest = r.Summary
+if r.Timestamp > 0 {
+latestDate = time.Unix(r.Timestamp, 0).UTC().Format("2006-01-02")
+}
+}
+section.Items = append(section.Items, SectionItem{
+ID: g.EntryID,
+Label: g.Summary,
+Value: latest,
+Date: latestDate,
+})
+if len(points) > 0 {
+cm := chartMetric{Name: g.Summary, Type: g.Type, Unit: unit, Points: points}
+cm.Ref = vitalRef(g.Type, target.Sex)
+chartMetrics = append(chartMetrics, cm)
+}
+}
+section.Summary = fmt.Sprintf("%d metrics", len(metrics))
+if len(chartMetrics) > 0 {
+if b, err := json.Marshal(chartMetrics); err == nil {
+section.ChartData = string(b)
+}
+}
+}
case "privacy":
// Handled separately - needs access list, not entries
@@ -403,23 +508,29 @@ func buildLabItems(dossierID, lang string, T func(string) string) ([]SectionItem
// Use original local_time from Data JSON if available
var data struct {
LocalTime string `json:"local_time"`
+SummaryTranslated string `json:"summary_translated"`
}
-if json.Unmarshal([]byte(order.Data), &data) == nil && data.LocalTime != "" {
-if t, err := time.Parse(time.RFC3339, data.LocalTime); err == nil {
-item.Date = t.Format("20060102")
-if t.Hour() != 0 || t.Minute() != 0 {
-_, offset := t.Zone()
-item.Time = fmt.Sprintf("%02d:%02d %s", t.Hour(), t.Minute(), offsetToTZName(offset))
-}
-} else {
-fmt.Printf("[DEBUG] Failed to parse local_time for %s: %s (err: %v)\n", order.EntryID, data.LocalTime, err)
-}
-}
+if json.Unmarshal([]byte(order.Data), &data) == nil {
+if data.LocalTime != "" {
+if t, err := time.Parse(time.RFC3339, data.LocalTime); err == nil {
+item.Date = t.Format("20060102")
+if t.Hour() != 0 || t.Minute() != 0 {
+_, offset := t.Zone()
+item.Time = fmt.Sprintf("%02d:%02d %s", t.Hour(), t.Minute(), offsetToTZName(offset))
+}
+} else {
+fmt.Printf("[DEBUG] Failed to parse local_time for %s: %s (err: %v)\n", order.EntryID, data.LocalTime, err)
+}
+}
+if data.SummaryTranslated != "" {
+item.Meta = data.SummaryTranslated
+}
+}
// Fallback: if date still not set, use timestamp
if item.Date == "" && order.Timestamp > 0 {
-t := time.Unix(order.Timestamp, 0)
+t := time.Unix(order.Timestamp, 0).UTC()
item.Date = t.Format("20060102")
fmt.Printf("[DEBUG] Set date from timestamp for %s: %s -> %s\n", order.EntryID, order.Value, item.Date)
}
@@ -431,15 +542,16 @@ func buildLabItems(dossierID, lang string, T func(string) string) ([]SectionItem
if len(children) > 0 {
item.Value = pluralT(len(children), "result", lang)
for _, c := range children {
-// Extract LOINC for precise matching
var childData struct {
Loinc string `json:"loinc"`
+SummaryTranslated string `json:"summary_translated"`
}
json.Unmarshal([]byte(c.Data), &childData)
child := SectionItem{
Label: c.Summary,
-Type: childData.Loinc, // Store LOINC in Type field
+Type: childData.Loinc,
+Meta: childData.SummaryTranslated,
}
item.Children = append(item.Children, child)
}
@@ -463,7 +575,7 @@ func buildLabItems(dossierID, lang string, T func(string) string) ([]SectionItem
// Set date from timestamp
if standalone.Timestamp > 0 {
-t := time.Unix(standalone.Timestamp, 0)
+t := time.Unix(standalone.Timestamp, 0).UTC()
item.Date = t.Format("20060102")
}
@@ -490,7 +602,7 @@ func docEntriesToSectionItems(entries []*lib.Entry) []SectionItem {
LinkTitle: "source",
}
if e.Timestamp > 0 {
-item.Date = time.Unix(e.Timestamp, 0).Format("20060102")
+item.Date = time.Unix(e.Timestamp, 0).UTC().Format("20060102")
}
items = append(items, item)
}
@@ -525,7 +637,7 @@ func entriesToSectionItems(entries []*lib.Entry) []SectionItem {
Type: e.Type,
}
if e.Timestamp > 0 {
-item.Date = time.Unix(e.Timestamp, 0).Format("20060102")
+item.Date = time.Unix(e.Timestamp, 0).UTC().Format("20060102")
}
// Parse Data to build expandable children
@@ -1139,7 +1251,7 @@ func handleLabSearch(w http.ResponseWriter, r *http.Request) {
}
}
if oj.Date == "" && order.Timestamp > 0 {
-oj.Date = time.Unix(order.Timestamp, 0).Format("20060102")
+oj.Date = time.Unix(order.Timestamp, 0).UTC().Format("20060102")
}
matchedOrders = append(matchedOrders, oj)
}


@@ -97,6 +97,36 @@ func parseGenomeVariant(line, format string) (string, string, bool) {
return rsid, genotype, true
}
+// normalizeGenotype complements alleles to match the reference strand, then sorts.
+func normalizeGenotype(genotype, alleles string) string {
+if len(genotype) != 2 || alleles == "" {
+if len(genotype) == 2 && genotype[0] > genotype[1] {
+return string(genotype[1]) + string(genotype[0])
+}
+return genotype
+}
+valid := make(map[byte]bool)
+for i := 0; i < len(alleles); i++ {
+valid[alleles[i]] = true
+}
+comp := [256]byte{'A': 'T', 'T': 'A', 'C': 'G', 'G': 'C'}
+var result [2]byte
+for i := 0; i < 2; i++ {
+b := genotype[i]
+if valid[b] {
+result[i] = b
+} else if c := comp[b]; c != 0 {
+result[i] = c
+} else {
+result[i] = b
+}
+}
+if result[0] > result[1] {
+result[0], result[1] = result[1], result[0]
+}
+return string(result[0]) + string(result[1])
+}
// updateUploadStatus updates the status in the upload entry Data JSON
func updateUploadStatus(uploadID string, status string, details string) {
entry, err := lib.EntryGet(nil, uploadID) // nil ctx - internal operation
@@ -168,8 +198,7 @@ func processGenomeUpload(uploadID string, dossierID string, filePath string) {
return variants[i].RSID < variants[j].RSID
})
-// Load SNPedia data
-snpediaPath := "/home/johan/dev/inou/snpedia-genotypes/genotypes.db"
+// Load SNPedia data from reference DB (initialized at portal startup)
type CatInfo struct {
Category string
Subcategory string
@@ -181,8 +210,7 @@ func processGenomeUpload(uploadID string, dossierID string, filePath string) {
// Key: rsid+genotype -> slice of category associations
snpediaMap := make(map[string][]CatInfo, 50000)
snpediaRsids := make(map[string]bool, 15000)
-if snpDB, err := sql.Open("sqlite3", snpediaPath+"?mode=ro"); err == nil {
+if snpDB := lib.RefDB(); snpDB != nil {
rows, _ := snpDB.Query("SELECT rsid, genotype_norm, gene, magnitude, repute, summary, category, subcategory FROM genotypes")
if rows != nil {
for rows.Next() {
@@ -206,13 +234,30 @@ func processGenomeUpload(uploadID string, dossierID string, filePath string) {
}
rows.Close()
}
-snpDB.Close()
}
-// Match variants (only those with rsid in SNPedia)
+// Build valid alleles per rsid from actual genotype entries (not the alleles column,
+// which includes the reference allele that SNPedia doesn't use in genotype notation)
+snpediaAlleles := make(map[string]string, len(snpediaRsids))
+for key := range snpediaMap {
+parts := strings.SplitN(key, ":", 2)
+if len(parts) == 2 {
+rsid, geno := parts[0], parts[1]
+existing := snpediaAlleles[rsid]
+for i := 0; i < len(geno); i++ {
+if !strings.ContainsRune(existing, rune(geno[i])) {
+existing += string(geno[i])
+}
+}
+snpediaAlleles[rsid] = existing
+}
+}
+// Match variants (only those with rsid in SNPedia), normalizing genotype to reference strand
matched := make([]Variant, 0, len(snpediaRsids))
for _, v := range variants {
if snpediaRsids[v.RSID] {
+v.Genotype = normalizeGenotype(v.Genotype, snpediaAlleles[v.RSID])
matched = append(matched, v)
}
}
@@ -238,11 +283,18 @@ func processGenomeUpload(uploadID string, dossierID string, filePath string) {
lib.EntryWrite("", parentEntry)
extractionID := parentEntry.EntryID
-// Count shown/hidden per category, then create tiers
+// Count shown/hidden per category (deduplicated by category+rsid)
type catCount struct{ Shown, Hidden int }
catCounts := map[string]*catCount{}
+type catRsid struct{ cat, rsid string }
+counted := map[catRsid]bool{}
for _, v := range matched {
for _, info := range snpediaMap[v.RSID+":"+v.Genotype] {
+key := catRsid{info.Category, v.RSID}
+if counted[key] {
+continue
+}
+counted[key] = true
c, ok := catCounts[info.Category]
if !ok {
c = &catCount{}
@@ -273,18 +325,30 @@ func processGenomeUpload(uploadID string, dossierID string, filePath string) {
}
// Batch insert variants (tier 3) - Type="rsid", Value=genotype
-var batch []*lib.Entry
-insertCount := 0
+// Deduplicate: one entry per tier+rsid (merge subcategories, keep highest magnitude)
+type variantKey struct{ tier, rsid string }
+deduped := make(map[variantKey]*lib.Entry)
for _, v := range matched {
for _, info := range snpediaMap[v.RSID+":"+v.Genotype] {
tierID := tierMap[info.Category]
+key := variantKey{tierID, v.RSID}
+if existing, ok := deduped[key]; ok {
+// Keep higher magnitude entry
+if info.Magnitude > float64(100-existing.Ordinal)/10 {
+data := fmt.Sprintf(`{"mag":%.1f,"rep":"%s","sum":"%s","sub":"%s"}`,
+info.Magnitude, info.Repute, strings.ReplaceAll(info.Summary, `"`, `\"`), info.Subcategory)
+existing.Ordinal = int(100 - info.Magnitude*10)
+existing.Data = data
+}
+continue
+}
// data includes subcategory (plain text - EntryWrite packs automatically)
data := fmt.Sprintf(`{"mag":%.1f,"rep":"%s","sum":"%s","sub":"%s"}`,
info.Magnitude, info.Repute, strings.ReplaceAll(info.Summary, `"`, `\"`), info.Subcategory)
-batch = append(batch, &lib.Entry{
+deduped[key] = &lib.Entry{
DossierID: dossierID,
ParentID: tierID,
Category: lib.CategoryGenome,
@@ -295,16 +359,18 @@ func processGenomeUpload(uploadID string, dossierID string, filePath string) {
SearchKey: strings.ToLower(info.Gene),
SearchKey2: strings.ToLower(v.RSID),
Data: data,
-})
-insertCount++
-if len(batch) >= 500 {
-lib.EntryWrite("", batch...)
-batch = batch[:0] // Reset slice
+}
}
}
-// Insert remaining entries
+var batch []*lib.Entry
+for _, e := range deduped {
+batch = append(batch, e)
+if len(batch) >= 500 {
+lib.EntryWrite("", batch...)
+batch = batch[:0]
+}
+}
if len(batch) > 0 {
lib.EntryWrite("", batch...)
}
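The strand-flip behavior of normalizeGenotype can be exercised directly. This sketch is a standalone copy of the function from the hunk above, plus a few illustrative calls; the input values are made up.

```go
package main

import "fmt"

// normalizeGenotype complements alleles to match the reference strand, then sorts.
// alleles holds the bases SNPedia actually uses for this rsid (e.g. "AC").
func normalizeGenotype(genotype, alleles string) string {
	if len(genotype) != 2 || alleles == "" {
		if len(genotype) == 2 && genotype[0] > genotype[1] {
			return string(genotype[1]) + string(genotype[0])
		}
		return genotype
	}
	valid := make(map[byte]bool)
	for i := 0; i < len(alleles); i++ {
		valid[alleles[i]] = true
	}
	comp := [256]byte{'A': 'T', 'T': 'A', 'C': 'G', 'G': 'C'}
	var result [2]byte
	for i := 0; i < 2; i++ {
		b := genotype[i]
		if valid[b] {
			result[i] = b // already on the reference strand
		} else if c := comp[b]; c != 0 {
			result[i] = c // flip to the complementary base
		} else {
			result[i] = b // non-ACGT (e.g. indel notation): pass through
		}
	}
	if result[0] > result[1] {
		result[0], result[1] = result[1], result[0]
	}
	return string(result[0]) + string(result[1])
}

func main() {
	fmt.Println(normalizeGenotype("GT", "AC")) // opposite strand: complemented to "AC"
	fmt.Println(normalizeGenotype("CA", "AC")) // right strand, just sorted: "AC"
	fmt.Println(normalizeGenotype("T", ""))    // malformed input passes through: "T"
}
```

Complement-then-sort means a 23andMe file reported on the minus strand matches SNPedia's plus-strand genotype keys; without it, `rsid:genotype` lookups silently miss for roughly half of strand-ambiguous SNPs.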


@ -314,12 +314,14 @@ func handleMCPToolsList(w http.ResponseWriter, req mcpRequest) {
tools := []map[string]interface{}{ tools := []map[string]interface{}{
{ {
"name": "list_dossiers", "name": "list_dossiers",
"title": "List Dossiers",
"description": "List all patient dossiers accessible to this account.", "description": "List all patient dossiers accessible to this account.",
"inputSchema": map[string]interface{}{"type": "object", "properties": map[string]interface{}{}}, "inputSchema": map[string]interface{}{"type": "object", "properties": map[string]interface{}{}},
"annotations": readOnly, "annotations": readOnly,
}, },
{ {
"name": "list_categories", "name": "list_categories",
"title": "List Categories",
"description": "List data categories for a dossier with entry counts. Start here to see what's available before querying specific data.", "description": "List data categories for a dossier with entry counts. Start here to see what's available before querying specific data.",
"inputSchema": map[string]interface{}{ "inputSchema": map[string]interface{}{
"type": "object", "type": "object",
@ -332,25 +334,27 @@ func handleMCPToolsList(w http.ResponseWriter, req mcpRequest) {
}, },
{ {
"name": "list_entries", "name": "list_entries",
"description": "List entries by category, type, or parent. All data is hierarchical — use parent to navigate deeper. For imaging: list studies (category='imaging'), then series (parent=study_id), then slices (parent=series_id). For labs: use search_key with LOINC code (e.g., '718-7'). For genome: search_key with gene name (e.g., 'MTHFR').", "title": "Query Entries",
"description": "List entries by navigating the hierarchy. Always start with parent=<dossier_id> to get top-level entries, then use returned entry IDs to go deeper. For imaging: dossier → root → studies → series. To view slices, use fetch_contact_sheet on a series, then fetch_image with the slice ID. For labs: dossier → test groups → results. Use search_key for LOINC codes (labs) or gene names (genome).",
"inputSchema": map[string]interface{}{ "inputSchema": map[string]interface{}{
"type": "object", "type": "object",
"properties": map[string]interface{}{ "properties": map[string]interface{}{
"dossier": map[string]interface{}{"type": "string", "description": "Dossier ID (16-char hex)"}, "dossier": map[string]interface{}{"type": "string", "description": "Dossier ID (16-char hex)"},
"parent": map[string]interface{}{"type": "string", "description": "Parent entry ID — start with the dossier ID, then navigate deeper"},
"category": map[string]interface{}{"type": "string", "description": "Category name (use list_categories to discover)"}, "category": map[string]interface{}{"type": "string", "description": "Category name (use list_categories to discover)"},
"type": map[string]interface{}{"type": "string", "description": "Entry type within category"}, "type": map[string]interface{}{"type": "string", "description": "Entry type within category"},
"search_key": map[string]interface{}{"type": "string", "description": "LOINC code for labs, gene name for genome"}, "search_key": map[string]interface{}{"type": "string", "description": "LOINC code for labs, gene name for genome"},
"parent": map[string]interface{}{"type": "string", "description": "Parent entry ID for hierarchical navigation"},
"from": map[string]interface{}{"type": "string", "description": "Timestamp start (Unix seconds)"}, "from": map[string]interface{}{"type": "string", "description": "Timestamp start (Unix seconds)"},
"to": map[string]interface{}{"type": "string", "description": "Timestamp end (Unix seconds)"}, "to": map[string]interface{}{"type": "string", "description": "Timestamp end (Unix seconds)"},
"limit": map[string]interface{}{"type": "number", "description": "Maximum results"}, "limit": map[string]interface{}{"type": "number", "description": "Maximum results"},
}, },
"required": []string{"dossier"}, "required": []string{"dossier", "parent"},
}, },
"annotations": readOnly, "annotations": readOnly,
}, },
{ {
"name": "fetch_image", "name": "fetch_image",
"title": "Fetch Image",
"description": "Fetch slice image as base64 PNG. Optionally set window/level.", "description": "Fetch slice image as base64 PNG. Optionally set window/level.",
"inputSchema": map[string]interface{}{ "inputSchema": map[string]interface{}{
"type": "object", "type": "object",
@ -366,6 +370,7 @@ func handleMCPToolsList(w http.ResponseWriter, req mcpRequest) {
}, },
{ {
"name": "fetch_contact_sheet", "name": "fetch_contact_sheet",
"title": "Fetch Contact Sheet",
"description": "Fetch contact sheet (thumbnail grid) for NAVIGATION ONLY. Use to identify slices, then fetch at full resolution. NEVER diagnose from thumbnails.", "description": "Fetch contact sheet (thumbnail grid) for NAVIGATION ONLY. Use to identify slices, then fetch at full resolution. NEVER diagnose from thumbnails.",
"inputSchema": map[string]interface{}{ "inputSchema": map[string]interface{}{
"type": "object", "type": "object",
@ -381,12 +386,14 @@ func handleMCPToolsList(w http.ResponseWriter, req mcpRequest) {
},
{
"name": "fetch_document",
"title": "Fetch Document",
"description": "Fetch full document content including extracted text, findings, and metadata. Use after finding documents via list_entries.",
"inputSchema": map[string]interface{}{
"type": "object",
"properties": map[string]interface{}{
"dossier": map[string]interface{}{"type": "string", "description": "Dossier ID (16-char hex)"},
"entry_id": map[string]interface{}{"type": "string", "description": "Document entry ID (16-char hex)"},
"format": map[string]interface{}{"type": "string", "description": "Output format: 'original' (default, raw JSON), 'markdown' (formatted), 'translation' (English translation via AI)"},
},
"required": []string{"dossier", "entry_id"},
},
@ -394,6 +401,7 @@ func handleMCPToolsList(w http.ResponseWriter, req mcpRequest) {
},
{
"name": "get_version",
"title": "Server Version",
"description": "Get server version info.",
"inputSchema": map[string]interface{}{"type": "object", "properties": map[string]interface{}{}},
"annotations": readOnly,
@ -448,6 +456,10 @@ func handleMCPToolsCall(w http.ResponseWriter, req mcpRequest, accessToken, doss
typ, _ := params.Arguments["type"].(string)
searchKey, _ := params.Arguments["search_key"].(string)
parent, _ := params.Arguments["parent"].(string)
if parent == "" {
sendMCPResult(w, req.ID, mcpTextContent("ERROR: parent is required. Start with parent="+dossier+" (the dossier ID) to list top-level entries, then use returned entry IDs to navigate deeper."))
return
}
from, _ := params.Arguments["from"].(string)
to, _ := params.Arguments["to"].(string)
limit, _ := params.Arguments["limit"].(float64)
@ -493,16 +505,17 @@ func handleMCPToolsCall(w http.ResponseWriter, req mcpRequest, accessToken, doss
case "fetch_document": case "fetch_document":
dossier, _ := params.Arguments["dossier"].(string) dossier, _ := params.Arguments["dossier"].(string)
entryID, _ := params.Arguments["entry_id"].(string) entryID, _ := params.Arguments["entry_id"].(string)
format, _ := params.Arguments["format"].(string)
if dossier == "" || entryID == "" { if dossier == "" || entryID == "" {
sendMCPError(w, req.ID, -32602, "dossier and entry_id required") sendMCPError(w, req.ID, -32602, "dossier and entry_id required")
return return
} }
result, err := mcpFetchDocument(dossierID, dossier, entryID) result, err := mcpFetchDocument(dossierID, dossier, entryID, format)
if err != nil { if err != nil {
sendMCPError(w, req.ID, -32000, err.Error()) sendMCPError(w, req.ID, -32000, err.Error())
return return
} }
sendMCPResult(w, req.ID, mcpTextContent(result)) sendMCPResult(w, req.ID, result)
case "get_version": case "get_version":
sendMCPResult(w, req.ID, mcpTextContent(fmt.Sprintf("Server: %s v%s", mcpServerName, mcpServerVersion))) sendMCPResult(w, req.ID, mcpTextContent(fmt.Sprintf("Server: %s v%s", mcpServerName, mcpServerVersion)))


@ -8,6 +8,7 @@ import (
"net/http" "net/http"
"net/url" "net/url"
"strconv" "strconv"
"strings"
"inou/lib" "inou/lib"
) )
@ -97,7 +98,7 @@ func mcpListDossiers(accessorID string) (string, error) {
}
func mcpQueryEntries(accessorID, dossier, category, typ, searchKey, parent, from, to string, limit int) (string, error) {
cat := -1 // any category
if category != "" {
cat = lib.CategoryFromString[category]
}
@ -144,10 +145,25 @@ func formatEntries(entries []*lib.Entry) string {
"parent_id": e.ParentID, "parent_id": e.ParentID,
"category": lib.CategoryName(e.Category), "category": lib.CategoryName(e.Category),
"type": e.Type, "type": e.Type,
"value": e.Value,
"summary": e.Summary, "summary": e.Summary,
"ordinal": e.Ordinal, "ordinal": e.Ordinal,
"timestamp": e.Timestamp, "timestamp": e.Timestamp,
} }
if e.Data != "" {
var d map[string]any
if json.Unmarshal([]byte(e.Data), &d) == nil {
entry["data"] = d
}
}
switch e.Type {
case "root":
entry["hint"] = "Use list_entries with parent=" + e.EntryID + " to list studies"
case "study":
entry["hint"] = "Use list_entries with parent=" + e.EntryID + " to list series"
case "series":
entry["hint"] = "Use fetch_contact_sheet with series=" + e.EntryID + " to browse slices, then fetch_image with the slice ID"
}
result = append(result, entry)
}
pretty, _ := json.MarshalIndent(result, "", " ")
@ -193,34 +209,161 @@ func mcpFetchContactSheet(accessToken, dossier, series string, wc, ww float64) (
}
// --- Document fetch: returns extracted text + metadata from Data field ---
// mcpFetchDocument returns a full MCP content map.
// format: "original" = base64 PDF, "markdown" = formatted text, "translation" = translated text
func mcpFetchDocument(accessorID, dossier, entryID, format string) (map[string]interface{}, error) {
// Use EntryGet (by ID only) — EntryRead with Category=0 default would exclude non-profile entries.
e, err := lib.EntryGet(&lib.AccessContext{AccessorID: accessorID}, entryID)
if err != nil {
return nil, err
}
if e == nil {
return nil, fmt.Errorf("document not found")
}
// Verify the entry belongs to the requested dossier.
if e.DossierID != dossier {
return nil, fmt.Errorf("document not found")
}
// Parse the Data field (populated by doc-processor).
var data map[string]interface{}
if e.Data != "" {
_ = json.Unmarshal([]byte(e.Data), &data)
}
if format == "" {
format = "original"
}
switch format {
case "markdown":
text := docToMarkdown(e, data)
return mcpTextContent(text), nil
case "translation":
text, err := docToTranslation(e, data)
if err != nil {
return nil, err
}
return mcpTextContent(text), nil
default: // "original" — return base64-encoded PDF
return docToOriginalPDF(e, data)
}
}
// docToOriginalPDF decrypts the source PDF and returns it as base64 MCP content.
func docToOriginalPDF(e *lib.Entry, data map[string]interface{}) (map[string]interface{}, error) {
sourceUpload, _ := data["source_upload"].(string)
if sourceUpload == "" {
return nil, fmt.Errorf("no PDF available for this document")
}
uploadEntry, err := lib.EntryGet(nil, sourceUpload)
if err != nil || uploadEntry == nil {
return nil, fmt.Errorf("upload entry not found")
}
var uploadData struct {
Path string `json:"path"`
}
if err := json.Unmarshal([]byte(uploadEntry.Data), &uploadData); err != nil || uploadData.Path == "" {
return nil, fmt.Errorf("no file path in upload entry")
}
pdfBytes, err := lib.DecryptFile(uploadData.Path)
if err != nil {
return nil, fmt.Errorf("decrypt failed: %w", err)
}
b64 := base64.StdEncoding.EncodeToString(pdfBytes)
summary := e.Summary
if summary == "" {
summary = "document"
}
return map[string]interface{}{
"content": []map[string]interface{}{
{
"type": "resource",
"resource": map[string]interface{}{
"uri": "data:application/pdf;base64," + b64,
"mimeType": "application/pdf",
"text": summary,
},
},
},
}, nil
}
// docToMarkdown returns the pre-rendered markdown stored by doc-processor.
func docToMarkdown(e *lib.Entry, data map[string]interface{}) string {
if md, ok := data["markdown"].(string); ok && md != "" {
return md
}
// Fallback: summary only
return e.Summary
}
// docToTranslation returns the pre-translated markdown if available,
// otherwise translates the markdown field on-the-fly via Claude.
func docToTranslation(e *lib.Entry, data map[string]interface{}) (string, error) {
// Use pre-translated version if already stored by doc-processor.
if tr, ok := data["markdown_translated"].(string); ok && tr != "" {
return tr, nil
}
// Fall back to on-the-fly translation.
src, _ := data["markdown"].(string)
if src == "" {
src = e.Summary
}
if src == "" {
return "", fmt.Errorf("no text content to translate")
}
if lib.AnthropicKey == "" {
return "", fmt.Errorf("translation unavailable: no Anthropic API key configured")
}
prompt := "Translate the following medical document (markdown format) to English. Preserve all markdown formatting, medical terminology, values, and structure. Output only the translated markdown, no explanation.\n\n" + src
reqBody, _ := json.Marshal(map[string]interface{}{
"model": "claude-haiku-4-5",
"max_tokens": 4096,
"messages": []map[string]interface{}{
{"role": "user", "content": prompt},
},
})
req, err := http.NewRequest("POST", "https://api.anthropic.com/v1/messages", strings.NewReader(string(reqBody)))
if err != nil {
return "", err
}
req.Header.Set("Content-Type", "application/json")
req.Header.Set("x-api-key", lib.AnthropicKey)
req.Header.Set("anthropic-version", "2023-06-01")
resp, err := http.DefaultClient.Do(req)
if err != nil {
return "", err
}
defer resp.Body.Close()
var result struct {
Content []struct {
Text string `json:"text"`
} `json:"content"`
Error struct {
Message string `json:"message"`
} `json:"error"`
}
if err := json.NewDecoder(resp.Body).Decode(&result); err != nil {
return "", err
}
if resp.StatusCode != 200 {
return "", fmt.Errorf("translation API error: %s", result.Error.Message)
}
if len(result.Content) == 0 {
return "", fmt.Errorf("empty translation response")
}
return result.Content[0].Text, nil
}


@ -47,21 +47,33 @@ func oauthJSON(w http.ResponseWriter, data any) {
json.NewEncoder(w).Encode(data)
}
// handleOAuthAuthorize handles GET/POST /oauth/authorize
// GET: validates params, shows consent screen
// POST: user approves/denies, generates code or returns error
func handleOAuthAuthorize(w http.ResponseWriter, r *http.Request) {
if r.Method != "GET" && r.Method != "POST" {
oauthError(w, "invalid_request", "Method must be GET or POST", http.StatusMethodNotAllowed)
return
}
// Parse parameters (from query on GET, form on POST)
var clientID, redirectURI, responseType, state, codeChallenge, codeChallengeMethod string
if r.Method == "GET" {
clientID = r.URL.Query().Get("client_id")
redirectURI = r.URL.Query().Get("redirect_uri")
responseType = r.URL.Query().Get("response_type")
state = r.URL.Query().Get("state")
codeChallenge = r.URL.Query().Get("code_challenge")
codeChallengeMethod = r.URL.Query().Get("code_challenge_method")
} else {
r.ParseForm()
clientID = r.FormValue("client_id")
redirectURI = r.FormValue("redirect_uri")
responseType = r.FormValue("response_type")
state = r.FormValue("state")
codeChallenge = r.FormValue("code_challenge")
codeChallengeMethod = r.FormValue("code_challenge_method")
}
// Validate required parameters
if clientID == "" {
@ -114,7 +126,39 @@ func handleOAuthAuthorize(w http.ResponseWriter, r *http.Request) {
return
}
// GET: show consent screen
if r.Method == "GET" {
render(w, r, PageData{
Page: "consent",
Lang: getLang(r),
Dossier: dossier,
ClientName: client.Name,
ClientID: clientID,
RedirectURI: redirectURI,
ResponseType: responseType,
State: state,
CodeChallenge: codeChallenge,
CodeChallengeMethod: codeChallengeMethod,
UserName: dossier.Name,
})
return
}
// POST: handle consent decision
if r.FormValue("action") == "deny" {
redirectURL, _ := url.Parse(redirectURI)
q := redirectURL.Query()
q.Set("error", "access_denied")
q.Set("error_description", "User denied access")
if state != "" {
q.Set("state", state)
}
redirectURL.RawQuery = q.Encode()
http.Redirect(w, r, redirectURL.String(), http.StatusSeeOther)
return
}
// User approved - generate authorization code
code, err := lib.OAuthCodeCreate(
clientID,
dossier.DossierID,

Binary files not shown (6 carousel images: 18, 19, 30, 39, 54, 35 KiB).

@ -0,0 +1,6 @@
<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 64 64" width="64" height="64">
<rect width="64" height="64" fill="#FFFFFF"/>
<g fill="#b45309" transform="translate(26.175, 12.775)">
<path d="M3.65 48.50L3.65 21.10L11.65 21.10L11.65 48.50L3.65 48.50M0 27L0 21.10L11.65 21.10L11.65 27L0 27M6.75 18.40Q4.50 18.40 3.43 17.22Q2.35 16.05 2.35 14.25Q2.35 12.40 3.43 11.23Q4.50 10.05 6.75 10.05Q9 10.05 10.07 11.23Q11.15 12.40 11.15 14.25Q11.15 16.05 10.07 17.22Q9 18.40 6.75 18.40Z"/>
</g>
</svg>



@ -1567,9 +1567,12 @@ a:hover {
.sg-profile-card.border-moderate { border-left-color: var(--accent); }
.sg-profile-card.border-rich { border-left-color: var(--success); }
.sg-profile-card h3 { font-size: 1.25rem; margin-bottom: 4px; }
.card-name-row { display: flex; align-items: baseline; gap: 8px; }
.card-name-row h3 { flex: 1; min-width: 0; }
.card-actions { display: flex; gap: 4px; flex-shrink: 0; }
.card-actions a, .card-actions button { color: var(--text-muted); text-decoration: none; padding: 2px 5px; font-size: 1.1rem; line-height: 1; border-radius: 4px; position: relative; }
.card-actions a:hover, .card-actions button:hover { color: var(--accent); background: var(--accent-light); }
[data-tooltip]:hover::after { content: attr(data-tooltip); position: absolute; bottom: 100%; left: 50%; transform: translateX(-50%); padding: 4px 8px; background: var(--text); color: var(--bg); font-size: 0.7rem; white-space: nowrap; border-radius: 4px; pointer-events: none; }
.sg-profile-card .card-meta { margin-bottom: 0; }
.card-context { font-size: 0.8rem; color: var(--text-subtle); font-style: italic; margin: 0; }
.card-flag { font-size: 0.85rem; vertical-align: middle; }


@ -0,0 +1,33 @@
{{define "consent"}}
<div class="sg-container" style="justify-content: center;">
<div style="flex: 1; display: flex; align-items: center; justify-content: center;">
<div class="data-card" style="padding: 48px; max-width: 440px; width: 100%;">
<div style="text-align: center; margin-bottom: 24px; font-size: 1.5rem;"><span style="font-weight: 700; color: var(--accent);">inou</span> <span style="font-weight: 400; color: var(--text-muted);">health</span></div>
<h1 style="font-size: 1.75rem; font-weight: 700; text-align: center; margin-bottom: 8px;">Authorize Access</h1>
<p style="text-align: center; color: var(--text-muted); font-weight: 300; margin-bottom: 32px;">
<strong>{{.ClientName}}</strong> wants to access your health data as <strong>{{.UserName}}</strong>.
</p>
<div style="background: var(--bg-surface); border-radius: 8px; padding: 16px; margin-bottom: 24px;">
<p style="font-size: 0.9rem; color: var(--text); margin: 0;">This application will be able to read all health data in your dossier.</p>
</div>
<form action="/oauth/authorize" method="POST">
<input type="hidden" name="client_id" value="{{.ClientID}}">
<input type="hidden" name="redirect_uri" value="{{.RedirectURI}}">
<input type="hidden" name="response_type" value="{{.ResponseType}}">
<input type="hidden" name="state" value="{{.State}}">
<input type="hidden" name="code_challenge" value="{{.CodeChallenge}}">
<input type="hidden" name="code_challenge_method" value="{{.CodeChallengeMethod}}">
<button type="submit" name="action" value="allow" class="btn btn-primary btn-full" style="margin-bottom: 12px;">Allow</button>
<button type="submit" name="action" value="deny" class="btn btn-full" style="background: transparent; color: var(--text-muted); border: 1px solid var(--border);">Deny</button>
</form>
</div>
</div>
{{template "footer"}}
</div>
{{end}}


@ -1,51 +1,25 @@
{{define "dashboard"}} {{define "dashboard"}}
<div class="sg-container"> <div class="sg-container">
<h1 style="font-size: 2.5rem; font-weight: 700;">{{.T.dossiers}}</h1> <h1 style="font-size: 2.5rem; font-weight: 700;">{{.T.dossiers}}</h1>
<p class="intro" style="font-size: 1.15rem; font-weight: 300; line-height: 1.8;">{{.T.dossiers_intro}}</p>
<div class="profiles-grid" style="grid-template-columns: repeat(auto-fill, minmax(300px, 1fr));"> <div class="profiles-grid" style="grid-template-columns: repeat(auto-fill, minmax(300px, 1fr));">
<!-- Self dossier -->
<div class="card sg-profile-card {{borderLevel .SelfStats.TotalCount}}" style="position: relative;">
<div class="card-actions">
<a href="/dossier/{{.Dossier.DossierID}}/upload" title="{{.T.upload_files}}">&#8682;</a>
<a href="/dossier/{{.Dossier.DossierID}}/edit" title="{{.T.edit}}">&#9998;</a>
</div>
<a href="/dossier/{{.Dossier.DossierID}}" style="text-decoration: none; color: inherit; display: contents;">
<div class="profile-header">
<div class="avatar" style="background: {{initialColor .Dossier.DossierID}};">{{initials .Dossier.Name}}</div>
<div>
<h3>{{.Dossier.Name}}{{with langFlag .Dossier.Preferences.Language}} <span class="card-flag">{{.}}</span>{{end}}</h3>
<p class="card-meta">{{.T.you}}</p>
</div>
</div>
<p class="sg-profile-dob">{{printf "%.10s" .Dossier.DateOfBirth}}{{with age .Dossier.DateOfBirth}} · {{.}}{{end}}{{if .Dossier.Sex}} · {{sexT .Dossier.Sex .Lang}}{{end}}</p>
<div class="sg-profile-stats">
{{if .SelfStats.Chips}}
{{range .SelfStats.Chips}}<span class="sg-profile-chip {{.Color}}">{{.Icon}} {{if .Count}}{{.Count}} {{end}}{{.Label}}</span>{{end}}
{{if .SelfStats.OverflowCount}}<span class="sg-profile-chip chip-muted">+{{.SelfStats.OverflowCount}} more</span>{{end}}
{{else}}
<span class="sg-profile-chip chip-muted">No data yet</span>
{{end}}
</div>
</a>
</div>
<!-- Accessible dossiers -->
{{range .AccessibleDossiers}} {{range .AccessibleDossiers}}
{{if .NewGroup}}<hr class="grid-separator">{{end}} {{if .NewGroup}}<hr class="grid-separator">{{end}}
<div class="card sg-profile-card {{borderLevel .Stats.TotalCount}}" style="position: relative;"> <div class="card sg-profile-card {{borderLevel .Stats.TotalCount}}">
{{if .CanEdit}}<div class="card-actions">
<a href="/dossier/{{.DossierID}}/upload" title="{{$.T.upload_files}}">&#8682;</a>
<a href="/dossier/{{.DossierID}}/edit" title="{{$.T.edit}}">&#9998;</a>
</div>{{end}}
{{if eq .RelationInt 99}}<form method="POST" action="/dossier/{{.DossierID}}/revoke" style="position: absolute; top: 16px; right: 16px; margin: 0;" onsubmit="return confirm('Remove demo dossier from your list?')"><input type="hidden" name="accessor_id" value="{{$.Dossier.DossierID}}"><button type="submit" class="edit-link" title="{{$.T.remove}}" style="background: none; border: none; color: var(--text-muted); cursor: pointer; padding: 4px;">&#10005;</button></form>{{end}}
<a href="/dossier/{{.DossierID}}" style="text-decoration: none; color: inherit; display: contents;"> <a href="/dossier/{{.DossierID}}" style="text-decoration: none; color: inherit; display: contents;">
<div class="profile-header"> <div class="profile-header">
<div class="avatar" style="background: {{initialColor .DossierID}};">{{initials .Name}}</div> <div class="avatar" style="background: {{initialColor .DossierID}};">{{initials .Name}}</div>
<div> <div style="flex: 1; min-width: 0;">
<h3>{{.Name}}{{with langFlag .Lang}} <span class="card-flag">{{.}}</span>{{end}}</h3> <div class="card-name-row">
<p class="card-meta">{{if eq .RelationInt 99}}{{$.T.role}}: {{.Relation}}{{else}}{{$.T.my_role}}: {{.Relation}}{{if .IsCareReceiver}} · <span class="badge badge-care">{{$.T.care}}</span>{{end}}{{end}}</p> <h3>{{.Name}}{{with langFlag .Lang}} <span class="card-flag">{{.}}</span>{{end}}</h3>
{{if .Context}}<p class="card-context">{{.Context}}</p>{{end}} {{if .CanEdit}}<span class="card-actions" onclick="event.preventDefault(); event.stopPropagation();">
<a href="/dossier/{{.DossierID}}/upload" data-tooltip="{{$.T.upload_files}}">&#8682;</a>
<a href="/dossier/{{.DossierID}}/edit" data-tooltip="{{$.T.edit}}">&#9998;</a>
</span>{{end}}
{{if eq .RelationInt 99}}<form method="POST" action="/dossier/{{.DossierID}}/revoke" class="card-actions" style="margin: 0;" onclick="event.stopPropagation();" onsubmit="event.stopPropagation(); return confirm('Remove demo dossier from your list?')"><input type="hidden" name="accessor_id" value="{{$.Dossier.DossierID}}"><button type="submit" class="edit-link" data-tooltip="{{$.T.remove}}" style="background: none; border: none; color: var(--text-muted); cursor: pointer; padding: 2px 5px; font-size: 1.1rem;">&#10005;</button></form>{{end}}
</div>
<p class="card-meta">{{if .IsSelf}}{{$.T.you}}{{else if eq .RelationInt 99}}{{$.T.role}}: {{.Relation}}{{else}}{{$.T.my_role}}: {{.Relation}}{{if .IsCareReceiver}} · <span class="badge badge-care">{{$.T.care}}</span>{{end}}{{end}}</p>
<p class="card-context">{{if .Context}}{{.Context}}{{else}}&nbsp;{{end}}</p>
</div>
</div>
<p class="sg-profile-dob">{{printf "%.10s" .DateOfBirth}}{{with age .DateOfBirth}} · {{.}}{{end}}{{if .Sex}} · {{sexT .Sex $.Lang}}{{end}}</p>

portal/templates/docs.tmpl (new file, 234 lines)

@ -0,0 +1,234 @@
{{define "docs"}}
<style>
.docs-container {
max-width: 1200px;
margin: 0 auto;
padding: 48px 24px 80px;
}
.docs-card {
background: var(--bg-card);
border: 1px solid var(--border);
border-radius: 8px;
padding: 48px;
margin-bottom: 24px;
}
.docs-card h1 {
font-size: 2.5rem;
font-weight: 700;
color: var(--text);
margin-bottom: 16px;
}
.docs-card .intro {
font-size: 1.15rem;
font-weight: 300;
color: var(--text-muted);
line-height: 1.8;
margin-bottom: 0;
}
.docs-card h2 {
font-size: 1.4rem;
font-weight: 600;
color: var(--text);
margin-top: 0;
margin-bottom: 24px;
}
.docs-card h3 {
font-size: 1.1rem;
font-weight: 600;
color: var(--text);
margin-top: 24px;
margin-bottom: 8px;
}
.docs-card h3:first-child { margin-top: 0; }
.docs-card p, .docs-card li {
font-size: 1rem;
font-weight: 300;
color: var(--text-muted);
line-height: 1.8;
}
.docs-card p { margin-bottom: 16px; }
.docs-card p:last-child { margin-bottom: 0; }
.docs-card ul { margin-bottom: 16px; padding-left: 24px; }
.docs-card li { margin-bottom: 4px; }
.docs-card strong {
font-weight: 600;
color: var(--text);
}
.docs-card a { color: var(--accent); }
.docs-card code {
background: var(--bg-surface);
padding: 2px 6px;
border-radius: 4px;
font-size: 0.9rem;
}
.docs-card pre {
background: var(--bg-surface);
border: 1px solid var(--border);
border-radius: 8px;
padding: 16px;
overflow-x: auto;
margin-bottom: 16px;
}
.docs-card pre code {
background: none;
padding: 0;
}
.example-box {
background: var(--bg-surface);
border-left: 3px solid var(--accent);
border-radius: 0 8px 8px 0;
padding: 16px 20px;
margin-bottom: 16px;
}
.example-box .prompt {
font-weight: 500;
color: var(--text);
margin-bottom: 8px;
}
.example-box .explanation {
font-size: 0.9rem;
color: var(--text-muted);
margin: 0;
}
.tool-table {
width: 100%;
border-collapse: collapse;
margin-bottom: 16px;
}
.tool-table th, .tool-table td {
text-align: left;
padding: 10px 12px;
border-bottom: 1px solid var(--border);
font-size: 0.95rem;
}
.tool-table th {
font-weight: 600;
color: var(--text);
}
.tool-table td {
color: var(--text-muted);
font-weight: 300;
}
</style>
<div class="docs-container">
<div class="docs-card">
<h1>inou for Claude</h1>
<p class="intro">
<span style="font-weight: 700; color: var(--accent);">inou</span> gives Claude direct access to your health data
for independent medical analysis. Imaging, labs, genomics, and 27 data categories &mdash;
all queryable through a single MCP integration.
</p>
</div>
<div class="docs-card">
<h2>What it does</h2>
<p>
inou connects Claude to your personal health records stored on the inou platform.
Claude can browse your medical imaging (MRI, CT, X-ray), review lab results with trends over time,
analyze genomic variants, and read clinical documents &mdash; forming its own independent medical opinions
from the raw data rather than echoing prior assessments.
</p>
<p>Key capabilities:</p>
<ul>
<li><strong>Medical imaging</strong> &mdash; View DICOM studies (MRI, CT, X-ray) with adjustable window/level, navigate series via contact sheets</li>
<li><strong>Lab results</strong> &mdash; Query by LOINC code, track trends across multiple draws, SI unit normalization</li>
<li><strong>Genomic data</strong> &mdash; Search variants by gene name, review pharmacogenomic and disease-risk markers</li>
<li><strong>Clinical documents</strong> &mdash; Access uploaded documents with extracted text and metadata</li>
<li><strong>27 data categories</strong> &mdash; Medications, diagnoses, surgeries, vitals, family history, and more</li>
</ul>
</div>
<div class="docs-card">
<h2>Setup</h2>
<h3>1. Sign in</h3>
<p>
When you connect inou to Claude, you'll be redirected to <strong>inou.com</strong> to sign in.
Enter your email and verify with the code sent to your inbox. No password needed.
</p>
<h3>2. Authorize</h3>
<p>
Review the access request and click <strong>Allow</strong> to grant Claude read-only access to your health data.
</p>
<h3>3. Start asking</h3>
<p>
Claude will automatically discover your dossiers and available data. Ask about your labs, imaging, genome, or any health topic.
</p>
<p>
<strong>New users:</strong> A demo dossier (Jane Doe) with sample labs, imaging, and genome data
is automatically available so you can explore the integration immediately.
</p>
</div>
<div class="docs-card">
<h2>Available tools</h2>
<table class="tool-table">
<tr><th>Tool</th><th>Description</th></tr>
<tr><td><code>list_dossiers</code></td><td>List all patient dossiers accessible to your account</td></tr>
<tr><td><code>list_categories</code></td><td>See what data categories exist for a dossier with entry counts</td></tr>
<tr><td><code>list_entries</code></td><td>Query entries by category, type, LOINC code, gene, date range, or parent hierarchy</td></tr>
<tr><td><code>fetch_image</code></td><td>Fetch a DICOM slice as PNG with adjustable window/level</td></tr>
<tr><td><code>fetch_contact_sheet</code></td><td>Thumbnail grid for navigating imaging series</td></tr>
<tr><td><code>fetch_document</code></td><td>Retrieve document content with extracted text and metadata</td></tr>
<tr><td><code>get_version</code></td><td>Server version information</td></tr>
</table>
<p>All tools are <strong>read-only</strong>. Claude cannot modify your health data.</p>
</div>
<div class="docs-card">
<h2>Examples</h2>
<div class="example-box">
<p class="prompt">"Review Jane Doe's CBC trend over the past year. Are there any concerning patterns?"</p>
<p class="explanation">Claude queries lab entries by LOINC codes for WBC, RBC, hemoglobin, platelets, and differential. It compares values across four blood draws and identifies the December anomaly: elevated WBC (13.2), low hemoglobin (10.8), microcytic indices (MCV 72.4), and reactive thrombocytosis (452K) &mdash; suggesting iron deficiency with possible infection.</p>
</div>
<div class="example-box">
<p class="prompt">"Look at Jane's brain MRI. Walk me through what you see."</p>
<p class="explanation">Claude lists imaging studies, navigates to the brain MRI series, fetches a contact sheet for orientation, then retrieves individual slices at diagnostic resolution. It describes anatomy, signal characteristics, and any visible findings &mdash; forming its own read independent of any radiologist report.</p>
</div>
<div class="example-box">
<p class="prompt">"What genetic variants does Jane carry that could affect medication metabolism?"</p>
<p class="explanation">Claude queries genome entries filtered by pharmacogenomic genes (CYP2D6, CYP2C19, CYP3A4, etc.), reviews variant classifications and zygosity, and maps findings to drug metabolism implications &mdash; identifying poor/rapid metabolizer status for specific medication classes.</p>
</div>
</div>
<div class="docs-card">
<h2>Security &amp; privacy</h2>
<ul>
<li><strong>Encryption</strong> &mdash; All data encrypted at rest (AES-256-GCM, FIPS 140-3)</li>
<li><strong>OAuth 2.1</strong> &mdash; Authorization code flow with PKCE, no passwords stored</li>
<li><strong>Read-only</strong> &mdash; Claude can only read data, never modify or delete</li>
<li><strong>RBAC</strong> &mdash; Role-based access control enforced at every data access point</li>
<li><strong>Short-lived tokens</strong> &mdash; Access tokens expire in 15 minutes, refresh tokens rotate on use</li>
</ul>
<p>
Read our full <a href="/privacy-policy">Privacy Policy</a>.
For questions, contact <a href="mailto:support@inou.com">support@inou.com</a>.
</p>
</div>
</div>
{{end}}


@ -10,7 +10,13 @@
</p>
{{end}}
</div>
<div style="display: flex; align-items: center; gap: 12px;">
<div class="card-actions" style="position: static;">
<a href="/dossier/{{.TargetDossier.DossierID}}/upload" title="{{.T.upload_files}}">&#8682;</a>
<a href="/dossier/{{.TargetDossier.DossierID}}/edit" title="{{.T.edit}}">&#9998;</a>
</div>
<a href="/dashboard" class="btn btn-secondary btn-small">← {{.T.back_to_dossiers}}</a>
</div>
</div> </div>
{{if .Error}}<div class="error">{{.Error}}</div>{{end}} {{if .Error}}<div class="error">{{.Error}}</div>{{end}}
@ -597,10 +603,21 @@ function buildSVGChart(name, unit, points, abbr, globalTMin, globalTMax) {
const vals = points.map(p => p.val); const vals = points.map(p => p.val);
let yMin = Math.min(...vals), yMax = Math.max(...vals); let yMin = Math.min(...vals), yMax = Math.max(...vals);
// Include reference range in Y axis bounds if available // Include reference bounds in Y axis — but for one-sided refs, only pull toward
// the boundary if data is within 2x the padding distance of it
if (ref) { if (ref) {
yMin = Math.min(yMin, ref.refLow); const dir = ref.direction || '';
yMax = Math.max(yMax, ref.refHigh); if (dir === 'higher_better') {
// Only show lower bound if data is near it (within 50% of data range)
const range = yMax - yMin || 1;
if (ref.refLow > yMin - range * 0.5) yMin = Math.min(yMin, ref.refLow);
} else if (dir === 'lower_better') {
const range = yMax - yMin || 1;
if (ref.refHigh < yMax + range * 0.5) yMax = Math.max(yMax, ref.refHigh);
} else {
if (ref.refLow > 0) yMin = Math.min(yMin, ref.refLow);
if (ref.refHigh > 0) yMax = Math.max(yMax, ref.refHigh);
}
} }
const yPad = (yMax - yMin) * 0.15 || 1; const yPad = (yMax - yMin) * 0.15 || 1;
yMin -= yPad; yMax += yPad; yMin -= yPad; yMax += yPad;
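The axis-bound heuristic in this hunk can be sketched standalone. This is a hedged reconstruction, assuming the `ref` object shape (`refLow`, `refHigh`, optional `direction`) used in the template; it is not the deployed function itself.

```javascript
// Sketch: widen Y-axis bounds toward a reference bound, but for one-sided
// references only when the data already sits near that bound.
// `ref` is assumed to be { refLow, refHigh, direction? } as in the template.
function yBounds(vals, ref) {
  let yMin = Math.min(...vals), yMax = Math.max(...vals);
  if (ref) {
    const dir = ref.direction || '';
    const range = yMax - yMin || 1;
    if (dir === 'higher_better') {
      // Only a lower bound exists; pull the axis down only if data is close to it.
      if (ref.refLow > yMin - range * 0.5) yMin = Math.min(yMin, ref.refLow);
    } else if (dir === 'lower_better') {
      // Only an upper bound exists; pull the axis up only if data is close to it.
      if (ref.refHigh < yMax + range * 0.5) yMax = Math.max(yMax, ref.refHigh);
    } else {
      // Two-sided range: always include both bounds when they are set.
      if (ref.refLow > 0) yMin = Math.min(yMin, ref.refLow);
      if (ref.refHigh > 0) yMax = Math.max(yMax, ref.refHigh);
    }
  }
  const yPad = (yMax - yMin) * 0.15 || 1; // 15% padding, min 1 unit
  return [yMin - yPad, yMax + yPad];
}
```

The "within 50% of the data range" gate keeps a far-away one-sided bound (e.g. a lower limit of 10 under data clustered at 100–110) from flattening the chart.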
@ -621,22 +638,39 @@ function buildSVGChart(name, unit, points, abbr, globalTMin, globalTMax) {
// Reference band (drawn first, behind everything) // Reference band (drawn first, behind everything)
let refBand = ''; let refBand = '';
if (ref) { if (ref) {
const bandTop = yScale(ref.refHigh); const dir = ref.direction || '';
const bandBot = yScale(ref.refLow);
const chartTop = PAD.top; const chartTop = PAD.top;
const chartBot = PAD.top + ph; const chartBot = PAD.top + ph;
// Red zones above and below normal range if (dir === 'higher_better') {
if (bandTop > chartTop) { // Only lower bound: red below, green above
refBand += `<rect x="${PAD.left}" y="${chartTop}" width="${pw}" height="${bandTop - chartTop}" fill="#fee2e2" opacity="0.5"/>`; const bandBot = yScale(ref.refLow);
refBand += `<rect x="${PAD.left}" y="${chartTop}" width="${pw}" height="${chartBot - chartTop}" fill="#dcfce7" opacity="0.6"/>`;
if (bandBot < chartBot) {
refBand += `<rect x="${PAD.left}" y="${bandBot}" width="${pw}" height="${chartBot - bandBot}" fill="#fee2e2" opacity="0.5"/>`;
}
refBand += `<line x1="${PAD.left}" y1="${bandBot}" x2="${W - PAD.right}" y2="${bandBot}" stroke="#86efac" stroke-width="1" stroke-dasharray="4,3"/>`;
} else if (dir === 'lower_better') {
// Only upper bound: red above, green below
const bandTop = yScale(ref.refHigh);
refBand += `<rect x="${PAD.left}" y="${chartTop}" width="${pw}" height="${chartBot - chartTop}" fill="#dcfce7" opacity="0.6"/>`;
if (bandTop > chartTop) {
refBand += `<rect x="${PAD.left}" y="${chartTop}" width="${pw}" height="${bandTop - chartTop}" fill="#fee2e2" opacity="0.5"/>`;
}
refBand += `<line x1="${PAD.left}" y1="${bandTop}" x2="${W - PAD.right}" y2="${bandTop}" stroke="#86efac" stroke-width="1" stroke-dasharray="4,3"/>`;
} else {
// Two-sided: red above and below, green in range
const bandTop = yScale(ref.refHigh);
const bandBot = yScale(ref.refLow);
if (bandTop > chartTop) {
refBand += `<rect x="${PAD.left}" y="${chartTop}" width="${pw}" height="${bandTop - chartTop}" fill="#fee2e2" opacity="0.5"/>`;
}
if (bandBot < chartBot) {
refBand += `<rect x="${PAD.left}" y="${bandBot}" width="${pw}" height="${chartBot - bandBot}" fill="#fee2e2" opacity="0.5"/>`;
}
refBand += `<rect x="${PAD.left}" y="${bandTop}" width="${pw}" height="${bandBot - bandTop}" fill="#dcfce7" opacity="0.6"/>`;
refBand += `<line x1="${PAD.left}" y1="${bandTop}" x2="${W - PAD.right}" y2="${bandTop}" stroke="#86efac" stroke-width="1" stroke-dasharray="4,3"/>`;
refBand += `<line x1="${PAD.left}" y1="${bandBot}" x2="${W - PAD.right}" y2="${bandBot}" stroke="#86efac" stroke-width="1" stroke-dasharray="4,3"/>`;
} }
if (bandBot < chartBot) {
refBand += `<rect x="${PAD.left}" y="${bandBot}" width="${pw}" height="${chartBot - bandBot}" fill="#fee2e2" opacity="0.5"/>`;
}
// Green normal range
refBand += `<rect x="${PAD.left}" y="${bandTop}" width="${pw}" height="${bandBot - bandTop}" fill="#dcfce7" opacity="0.6"/>`;
// Boundary lines
refBand += `<line x1="${PAD.left}" y1="${bandTop}" x2="${W - PAD.right}" y2="${bandTop}" stroke="#86efac" stroke-width="1" stroke-dasharray="4,3"/>`;
refBand += `<line x1="${PAD.left}" y1="${bandBot}" x2="${W - PAD.right}" y2="${bandBot}" stroke="#86efac" stroke-width="1" stroke-dasharray="4,3"/>`;
} }
// Y-axis: 4 ticks // Y-axis: 4 ticks
@ -675,7 +709,10 @@ function buildSVGChart(name, unit, points, abbr, globalTMin, globalTMax) {
let dotColor = '#B45309'; // amber default let dotColor = '#B45309'; // amber default
let textColor = '#1f2937'; let textColor = '#1f2937';
if (ref) { if (ref) {
const inRange = p.val >= ref.refLow && p.val <= ref.refHigh; const dir = ref.direction || '';
const inRange = dir === 'higher_better' ? p.val >= ref.refLow :
dir === 'lower_better' ? p.val <= ref.refHigh :
p.val >= ref.refLow && p.val <= ref.refHigh;
if (inRange) { if (inRange) {
dotColor = '#16a34a'; // green dotColor = '#16a34a'; // green
} else { } else {
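The direction-aware in-range check from this hunk can be pulled out as a small predicate. A minimal sketch, assuming the same `ref` shape (`refLow`, `refHigh`, optional `direction`) as above:

```javascript
// Sketch: classify a value against a reference range that may be one-sided.
// direction is '', 'higher_better' (lower bound only), or 'lower_better'
// (upper bound only), matching the template's chart logic.
function inRange(val, ref) {
  const dir = ref.direction || '';
  if (dir === 'higher_better') return val >= ref.refLow;  // only a floor
  if (dir === 'lower_better') return val <= ref.refHigh;  // only a ceiling
  return val >= ref.refLow && val <= ref.refHigh;         // two-sided band
}
```

In the chart, this predicate picks the dot color (green in range, red out of range).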
@ -711,6 +748,38 @@ function buildSVGChart(name, unit, points, abbr, globalTMin, globalTMax) {
</svg>`; </svg>`;
} }
// Render vitals charts on page load
document.querySelectorAll('[data-chart]').forEach(wrapper => {
const metrics = JSON.parse(wrapper.dataset.chart);
const body = wrapper.querySelector('.filter-chart-body');
if (!metrics || metrics.length === 0) { wrapper.style.display = 'none'; return; }
// Calculate global time range
let globalTMin = Infinity, globalTMax = -Infinity;
for (const m of metrics) {
for (const p of m.points) {
const t = p.date * 1000;
if (t < globalTMin) globalTMin = t;
if (t > globalTMax) globalTMax = t;
}
}
globalTMax = Math.max(globalTMax, Date.now());
// Populate labRefData so buildSVGChart picks up refs
for (const m of metrics) {
if (m.ref) labRefData[m.type] = m.ref;
}
let html = '';
for (const m of metrics) {
const points = m.points.map(p => ({ date: new Date(p.date * 1000), val: p.val }));
points.sort((a, b) => a.date - b.date);
if (points.length === 0) continue;
html += buildSVGChart(m.name, m.unit, points, m.type, globalTMin, globalTMax);
}
body.innerHTML = html;
});
// Genetics dynamic loading (if genetics section exists) // Genetics dynamic loading (if genetics section exists)
{{if .HasGenome}} {{if .HasGenome}}
const i18n = { const i18n = {
@ -998,6 +1067,13 @@ loadGeneticsCategories();
</div> </div>
{{end}} {{end}}
{{if .ChartData}}
<div class="filter-chart collapsed" id="{{.ID}}-charts" data-chart='{{.ChartData}}'>
<div class="filter-chart-header" onclick="this.parentNode.classList.toggle('collapsed')"><span class="filter-chart-toggle">&#9660;</span> Trends</div>
<div class="filter-chart-body"></div>
</div>
{{end}}
{{if .Dynamic}} {{if .Dynamic}}
<div class="data-table" id="{{.ID}}-content"></div> <div class="data-table" id="{{.ID}}-content"></div>
{{else if .Items}} {{else if .Items}}

View File

@ -13,6 +13,46 @@
margin-bottom: 24px; margin-bottom: 24px;
} }
/* Carousel */
.carousel {
position: relative;
width: 100%;
aspect-ratio: 16/9;
overflow: hidden;
border-radius: 6px;
margin-bottom: 32px;
}
.carousel-track {
display: flex;
height: 100%;
transition: transform 0.5s ease;
}
.carousel-slide {
min-width: 100%;
height: 100%;
background-size: cover;
background-position: center;
}
.carousel-dots {
display: flex;
justify-content: center;
gap: 8px;
margin-bottom: 32px;
}
.carousel-dot {
width: 8px;
height: 8px;
border-radius: 50%;
background: var(--border);
border: none;
padding: 0;
cursor: pointer;
transition: background 0.2s;
}
.carousel-dot.active {
background: var(--accent);
}
/* Hero - Block 1 */ /* Hero - Block 1 */
.hero-sources { .hero-sources {
@ -45,12 +85,12 @@
.hero-answer { .hero-answer {
text-align: center; text-align: center;
font-size: 1.25rem; font-size: 1.7rem;
font-weight: 400; font-weight: 500;
color: var(--text); color: var(--text);
line-height: 1.8; line-height: 1.5;
margin-top: 16px; margin-top: 16px;
margin-bottom: 32px; margin-bottom: 8px;
} }
.hero-answer .inou { .hero-answer .inou {
font-weight: 700; font-weight: 700;
@ -59,10 +99,20 @@
.hero-tagline { .hero-tagline {
text-align: center; text-align: center;
font-size: 1.3rem; font-size: 2.8rem;
font-weight: 600; font-weight: 700;
color: var(--text); color: var(--text);
margin-bottom: 32px; margin-bottom: 12px;
}
.carousel-caption {
text-align: center;
font-size: 0.95rem;
color: var(--text-muted);
line-height: 1.5;
min-height: 3em;
padding: 0 24px;
margin-bottom: 24px;
} }
.hero-cta { margin-bottom: 0; text-align: center; } .hero-cta { margin-bottom: 0; text-align: center; }
@ -260,8 +310,10 @@
} }
.hero-pivot .emphasis { font-size: 1.3rem; } .hero-pivot .emphasis { font-size: 1.3rem; }
.hero-answer { .hero-answer {
text-align: center; font-size: 1.05rem; margin-top: 16px; text-align: center; font-size: 1.2rem; margin-top: 16px;
margin-bottom: 32px; } margin-bottom: 8px; }
.hero-tagline { font-size: 2rem; margin-bottom: 8px; }
.carousel-caption { font-size: 0.85rem; }
.hero-cta .btn { padding: 14px 40px; } .hero-cta .btn { padding: 14px 40px; }
.story-pair .data { font-size: 1rem; } .story-pair .data { font-size: 1rem; }
.story-pair .reality { font-size: 0.95rem; } .story-pair .reality { font-size: 0.95rem; }
@ -289,8 +341,27 @@
<div class="landing-card"> <div class="landing-card">
<div class="hero"> <div class="hero">
<div class="hero-answer"><span class="inou">inou</span> organizes and shares your health dossier with your AI — securely and privately.</div>
<div class="hero-tagline">Your health, understood.</div> <div class="hero-tagline">Your health, understood.</div>
<div class="hero-answer">All your health data — organized, private, and ready for your AI.</div>
<div class="carousel">
<div class="carousel-track">
<div class="carousel-slide" style="background-image: url('/static/carousel-1.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-2.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-3.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-4.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-5.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-6.webp')"></div>
</div>
</div>
<div class="carousel-caption" id="carousel-caption">Track your lab trends over time — see exactly what your AI sees when it flags a change.</div>
<div class="carousel-dots">
<button class="carousel-dot active" data-index="0"></button>
<button class="carousel-dot" data-index="1"></button>
<button class="carousel-dot" data-index="2"></button>
<button class="carousel-dot" data-index="3"></button>
<button class="carousel-dot" data-index="4"></button>
<button class="carousel-dot" data-index="5"></button>
</div>
<div class="hero-cta"> <div class="hero-cta">
{{if .Dossier}}<a href="/invite" class="btn btn-primary">Invite a friend</a>{{else}}<a href="/start" class="btn btn-primary">Sign in</a>{{end}} {{if .Dossier}}<a href="/invite" class="btn btn-primary">Invite a friend</a>{{else}}<a href="/start" class="btn btn-primary">Sign in</a>{{end}}
{{if .Error}}<div class="error" style="margin-top: 24px;">{{.Error}}</div>{{end}} {{if .Error}}<div class="error" style="margin-top: 24px;">{{.Error}}</div>{{end}}
@ -418,6 +489,31 @@
{{template "footer"}} {{template "footer"}}
</div> </div>
<script>
(function() {
var track = document.querySelector('.carousel-track');
var dots = document.querySelectorAll('.carousel-dot');
var caption = document.getElementById('carousel-caption');
var captions = [
'Track your lab trends over time \u2014 see exactly what your AI sees when it flags a change.',
'Your labs, scans, and genome in one place \u2014 browse everything your AI has access to.',
'View your own MRI \u2014 zoom into the same slices your AI analyzed.',
'Your brain scan in 3D \u2014 navigate every plane, verify every finding your AI made.',
'Your AI connects the dots across labs and genome \u2014 and explains it in plain language.',
'Your X-ray, full resolution \u2014 zoom in on the findings your AI flagged.'
];
var count = dots.length;
var current = 0;
function go(i) {
current = i;
track.style.transform = 'translateX(-' + (i * 100) + '%)';
dots.forEach(function(d, j) { d.classList.toggle('active', j === i); });
caption.textContent = captions[i];
}
dots.forEach(function(d) {
d.addEventListener('click', function() { go(+d.dataset.index); });
});
setInterval(function() { go((current + 1) % count); }, 8000);
})();
</script>
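The carousel rotation above reduces to a small state machine. A hedged sketch with the DOM factored out (the `state` object stands in for the track transform and active dot; names here are illustrative, not from the template):

```javascript
// Sketch of the carousel logic: go(i) jumps to slide i, next() advances
// cyclically, mirroring the template's translateX/dot-toggle behavior.
function makeCarousel(count) {
  let current = 0;
  const state = { transform: 'translateX(-0%)', active: 0 };
  function go(i) {
    current = i;
    state.transform = 'translateX(-' + (i * 100) + '%)'; // slide width = 100%
    state.active = i;                                     // which dot is lit
  }
  function next() { go((current + 1) % count); }          // wraps at the end
  return { go, next, state };
}
```

The real template additionally swaps the caption text and drives `next()` from an 8-second `setInterval`.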
{{end}} {{end}}
<!-- -->
<!-- test -->

View File

@ -1,121 +1,519 @@
{{define "landing_fr"}} {{define "landing_fr"}}
<style> <style>
.landing-card { background: var(--bg-card); border: 1px solid var(--border); border-radius: 8px; padding: 48px; width: 100%; margin-left: auto; margin-right: auto; margin-bottom: 24px; }
.hero-answer { text-align: center; font-size: 1.25rem; font-weight: 400; color: var(--text); line-height: 1.8; margin-top: 16px; margin-bottom: 32px; } .landing-card {
.hero-answer .inou { font-weight: 700; color: var(--accent); } background: var(--bg-card);
.hero-tagline { text-align: center; font-size: 1.3rem; font-weight: 600; color: var(--text); margin-bottom: 32px; } border: 1px solid var(--border);
border-radius: 8px;
padding: 48px;
width: 100%;
margin-left: auto;
margin-right: auto;
margin-bottom: 24px;
}
/* Carousel */
.carousel {
position: relative;
width: 100%;
aspect-ratio: 16/9;
overflow: hidden;
border-radius: 6px;
margin-bottom: 32px;
}
.carousel-track {
display: flex;
height: 100%;
transition: transform 0.5s ease;
}
.carousel-slide {
min-width: 100%;
height: 100%;
background-size: cover;
background-position: center;
}
.carousel-dots {
display: flex;
justify-content: center;
gap: 8px;
margin-bottom: 32px;
}
.carousel-dot {
width: 8px;
height: 8px;
border-radius: 50%;
background: var(--border);
border: none;
padding: 0;
cursor: pointer;
transition: background 0.2s;
}
.carousel-dot.active {
background: var(--accent);
}
/* Hero - Block 1 */
.hero-sources {
font-size: 1.1rem;
font-weight: 300;
color: var(--text-muted);
line-height: 1.9;
margin-bottom: 32px;
}
.hero-sources span { display: block; }
.hero-sources .different {
}
.hero-pivot {
font-size: 1.15rem;
font-weight: 400;
color: var(--text);
line-height: 1.8;
margin-bottom: 32px;
}
.hero-pivot span { display: block; }
.hero-pivot .emphasis {
font-size: 1.3rem;
font-weight: 600;
margin-top: 12px;
}
.hero-answer {
text-align: center;
font-size: 1.7rem;
font-weight: 500;
color: var(--text);
line-height: 1.5;
margin-top: 16px;
margin-bottom: 8px;
}
.hero-answer .inou {
font-weight: 700;
color: var(--accent);
}
.hero-tagline {
text-align: center;
font-size: 2.8rem;
font-weight: 700;
color: var(--text);
margin-bottom: 12px;
}
.carousel-caption {
text-align: center;
font-size: 0.95rem;
color: var(--text-muted);
line-height: 1.5;
min-height: 3em;
padding: 0 24px;
margin-bottom: 24px;
}
.hero-cta { margin-bottom: 0; text-align: center; } .hero-cta { margin-bottom: 0; text-align: center; }
.hero-cta .btn { padding: 18px 56px; font-size: 0.9rem; font-weight: 500; letter-spacing: 0.08em; text-transform: uppercase; border-radius: 4px; } .hero-cta .btn {
.story-prose.warm { font-size: 1.1rem; line-height: 1.8; color: var(--text); } padding: 18px 56px;
.story-prose.warm p { margin-bottom: 20px; } font-size: 0.9rem;
.story-prose.warm .emphasis { font-weight: 600; font-size: 1.15rem; } font-weight: 500;
.story-title { font-size: 1.25rem; font-weight: 600; color: var(--text); margin-bottom: 32px; } letter-spacing: 0.08em;
.story-pair { margin-bottom: 32px; } text-transform: uppercase;
.story-pair .data { font-size: 1.1rem; font-weight: 400; color: var(--text); margin-bottom: 4px; } border-radius: 4px;
.story-pair .reality { font-size: 1rem; font-weight: 300; font-style: italic; color: var(--text-muted); } }
.story-transition { font-size: 1.25rem; font-weight: 400; color: var(--text); line-height: 1.8; margin: 32px 0; padding: 24px 0; border-top: 1px solid var(--border); border-bottom: 1px solid var(--border); }
.story-gaps { font-size: 1rem; font-weight: 300; color: var(--text-muted); line-height: 1.8; margin-bottom: 32px; } /* Story - Block 2 */
.story-prose.warm {
font-size: 1.1rem;
line-height: 1.8;
color: var(--text);
}
.story-prose.warm p {
margin-bottom: 20px;
}
.story-prose.warm .emphasis {
font-weight: 600;
font-size: 1.15rem;
}
.story-title {
font-size: 1.25rem;
font-weight: 600;
color: var(--text);
margin-bottom: 32px;
}
.story-pair {
margin-bottom: 32px;
}
.story-pair .data {
font-size: 1.1rem;
font-weight: 400;
color: var(--text);
margin-bottom: 4px;
}
.story-pair .reality {
font-size: 1rem;
font-weight: 300;
font-style: italic;
color: var(--text-muted);
}
.story-transition {
font-size: 1.25rem;
font-weight: 400;
color: var(--text);
line-height: 1.8;
margin: 32px 0;
padding: 24px 0;
border-top: 1px solid var(--border);
border-bottom: 1px solid var(--border);
}
.story-gaps {
font-size: 1rem;
font-weight: 300;
color: var(--text-muted);
line-height: 1.8;
margin-bottom: 32px;
}
.story-gaps span { display: block; } .story-gaps span { display: block; }
.story-gaps .indent { font-style: italic; } .story-gaps .indent { font-style: italic; }
.story-connections { font-size: 1rem; font-weight: 300; color: var(--text-muted); line-height: 1.8; margin-bottom: 32px; }
.story-connections {
font-size: 1rem;
font-weight: 300;
color: var(--text-muted);
line-height: 1.8;
margin-bottom: 32px;
}
.story-connections span { display: block; } .story-connections span { display: block; }
.story-ai { font-size: 1.25rem; font-weight: 400; color: var(--text); line-height: 1.8; margin-bottom: 32px; }
.story-ai {
font-size: 1.25rem;
font-weight: 400;
color: var(--text);
line-height: 1.8;
margin-bottom: 32px;
}
.story-ai span { display: block; } .story-ai span { display: block; }
.story-ai .last { font-style: italic; } .story-ai .last {
.story-prose { font-size: 1rem; font-weight: 300; color: var(--text-muted); line-height: 1.8; margin-bottom: 20px; } font-style: italic;
}
.story-prose {
font-size: 1rem;
font-weight: 300;
color: var(--text-muted);
line-height: 1.8;
margin-bottom: 20px;
}
.story-prose:last-of-type { margin-bottom: 32px; } .story-prose:last-of-type { margin-bottom: 32px; }
.story-prose strong { font-weight: 600; color: var(--text); } .story-prose strong { font-weight: 600; color: var(--text); }
.story-prose .inou { font-weight: 700; color: var(--accent); } .story-prose .inou { font-weight: 700; color: var(--accent); }
.story-closing { font-size: 1.25rem; font-weight: 400; color: var(--text); padding-top: 24px; border-top: 1px solid var(--border); }
.story-closing .inou { font-weight: 700; color: var(--accent); } .story-closing {
.trust-card { width: 100%; margin-left: auto; margin-right: auto; background: var(--bg-card); border: 1px solid var(--border); border-radius: 8px; padding: 32px 48px; margin-bottom: 24px; } font-size: 1.25rem;
.trust-card .section-label { font-size: 0.75rem; font-weight: 600; text-transform: uppercase; letter-spacing: 0.05em; color: var(--text-muted); margin-bottom: 24px; } font-weight: 400;
.trust-grid { display: grid; grid-template-columns: repeat(4, 1fr); gap: 32px; } color: var(--text);
.trust-item { font-size: 0.9rem; font-weight: 300; color: var(--text-muted); line-height: 1.6; } padding-top: 24px;
.trust-item strong { display: block; font-weight: 600; color: var(--text); margin-bottom: 4px; } border-top: 1px solid var(--border);
.landing-footer { padding: 16px 0; border-top: 1px solid var(--border); display: flex; justify-content: space-between; align-items: center; } }
.landing-footer-left { font-size: 0.9rem; color: var(--text-muted); display: flex; gap: 16px; align-items: center; } .story-closing .inou {
.landing-footer-left a { color: var(--text-muted); text-decoration: none; } font-weight: 700;
color: var(--accent);
}
/* Trust section */
.trust-card {
width: 100%;
margin-left: auto;
margin-right: auto;
background: var(--bg-card);
border: 1px solid var(--border);
border-radius: 8px;
padding: 32px 48px;
margin-bottom: 24px;
}
.trust-card .section-label {
font-size: 0.75rem;
font-weight: 600;
text-transform: uppercase;
letter-spacing: 0.05em;
color: var(--text-muted);
margin-bottom: 24px;
}
.trust-grid {
display: grid;
grid-template-columns: repeat(4, 1fr);
gap: 32px;
}
.trust-item {
font-size: 0.9rem;
font-weight: 300;
color: var(--text-muted);
line-height: 1.6;
}
.trust-item strong {
display: block;
font-weight: 600;
color: var(--text);
margin-bottom: 4px;
}
/* Footer */
.landing-footer {
padding: 16px 0;
border-top: 1px solid var(--border);
display: flex;
justify-content: space-between;
align-items: center;
}
.landing-footer-left {
font-size: 0.9rem;
color: var(--text-muted);
display: flex;
gap: 16px;
align-items: center;
}
.landing-footer-left a {
color: var(--text-muted);
text-decoration: none;
}
.landing-footer-left a:hover { color: var(--accent); } .landing-footer-left a:hover { color: var(--accent); }
.landing-footer-right { font-size: 1rem; } .landing-footer-right { font-size: 1rem; }
.landing-footer-right .inou { font-weight: 700; color: var(--accent); } .landing-footer-right .inou {
.landing-footer-right .health { font-weight: 400; color: var(--text-muted); } font-weight: 700;
@media (max-width: 768px) { .trust-card { padding: 24px; } .trust-grid { grid-template-columns: repeat(2, 1fr); gap: 24px; } } color: var(--accent);
@media (max-width: 480px) { .trust-card { padding: 20px 16px; } .trust-grid { grid-template-columns: 1fr; gap: 20px; } .landing-footer { flex-direction: column; gap: 12px; text-align: center; } .landing-footer-left { flex-direction: column; gap: 8px; } } }
.landing-footer-right .health {
font-weight: 400;
color: var(--text-muted);
}
/* Mobile */
@media (max-width: 768px) {
.trust-card {
width: 100%;
margin-left: auto;
margin-right: auto; padding: 24px; }
.hero-sources {
font-size: 1rem; line-height: 1.8; margin-bottom: 32px; }
.hero-pivot { font-size: 1.1rem; margin-bottom: 32px;
}
.hero-pivot .emphasis { font-size: 1.3rem; }
.hero-answer {
text-align: center; font-size: 1.2rem; margin-top: 16px;
margin-bottom: 8px; }
.hero-tagline { font-size: 2rem; margin-bottom: 8px; }
.carousel-caption { font-size: 0.85rem; }
.hero-cta .btn { padding: 14px 40px; }
.story-pair .data { font-size: 1rem; }
.story-pair .reality { font-size: 0.95rem; }
.trust-grid { grid-template-columns: repeat(2, 1fr); gap: 24px; }
}
@media (max-width: 480px) {
.trust-card {
width: 100%;
margin-left: auto;
margin-right: auto; padding: 20px 16px; }
.hero-sources {
font-size: 0.95rem; line-height: 1.9; }
.hero-pivot { font-size: 1rem; }
.hero-pivot .emphasis { font-size: 1.2rem; }
.story-pair { margin-bottom: 24px; }
.trust-grid { grid-template-columns: 1fr; gap: 20px; }
.landing-footer { flex-direction: column; gap: 12px; text-align: center; }
.landing-footer-left { flex-direction: column; gap: 8px; }
}
</style> </style>
<div class="sg-container"> <div class="sg-container">
<div class="landing-card"> <div class="landing-card">
<div class="hero"> <div class="hero">
<div class="hero-answer"><span class="inou">inou</span> organise et partage votre dossier santé avec votre IA — en toute sécurité et confidentialité.</div> <div class="hero-tagline">Ta santé, comprise.</div>
<div class="hero-tagline">Votre santé, comprise.</div> <div class="hero-answer">Toutes tes données de santé — organisées, privées, et prêtes pour ton IA.</div>
<div class="hero-cta">{{if .Dossier}}<a href="/invite" class="btn btn-primary">Inviter un ami</a>{{else}}<a href="/start" class="btn btn-primary">Se connecter</a>{{end}}{{if .Error}}<div class="error" style="margin-top: 24px;">{{.Error}}</div>{{end}}</div> <div class="carousel">
<div class="carousel-track">
<div class="carousel-slide" style="background-image: url('/static/carousel-1.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-2.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-3.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-4.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-5.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-6.webp')"></div>
</div>
</div>
<div class="carousel-caption" id="carousel-caption">Suis l'évolution de tes analyses dans le temps — vois exactement ce que ton IA voit quand elle détecte un changement.</div>
<div class="carousel-dots">
<button class="carousel-dot active" data-index="0"></button>
<button class="carousel-dot" data-index="1"></button>
<button class="carousel-dot" data-index="2"></button>
<button class="carousel-dot" data-index="3"></button>
<button class="carousel-dot" data-index="4"></button>
<button class="carousel-dot" data-index="5"></button>
</div>
<div class="hero-cta">
{{if .Dossier}}<a href="/invite" class="btn btn-primary">Inviter un ami</a>{{else}}<a href="/start" class="btn btn-primary">Se connecter</a>{{end}}
{{if .Error}}<div class="error" style="margin-top: 24px;">{{.Error}}</div>{{end}}
</div>
</div> </div>
</div> </div>
<div class="landing-card"> <div class="landing-card">
<div class="story"> <div class="story">
<h2 class="story-title">Vous avez besoin de l'IA pour votre santé</h2> <h2 class="story-title">Tu as besoin d'une IA pour ta santé</h2>
<div class="story-prose warm"> <div class="story-prose warm">
<p>Vos données de santé sont dispersées dans des dizaines d'endroits — chez votre cardiologue, votre neurologue, le laboratoire, votre montre connectée, vos applications, votre 23andMe. Et vous seul connaissez le reste : ce que vous mangez, ce que vous buvez, quels compléments vous prenez. Votre programme d'entraînement. Vos symptômes. Vos objectifs — que vous essayiez de tomber enceinte, de vous préparer pour un marathon, ou simplement de vous sentir moins fatigué.</p> <p>Tes données de santé sont dispersées dans des dizaines d'endroits différents — chez ton cardiologue, ton neurologue, ton laboratoire, ta montre, tes applications, ton 23andMe. Et toi seul connais le reste : ce que tu manges, ce que tu bois, les suppléments que tu prends. Ton programme d'exercice. Tes symptômes. Tes objectifs — que tu essaies de tomber enceinte, de préparer un marathon, ou simplement de te sentir moins épuisé.</p>
<p>Que vous soyez en bonne santé et vouliez le rester, que vous naviguiez un diagnostic difficile, ou que vous vous occupiez d'un proche qui ne peut pas se défendre seul — aucun médecin ne voit le tableau complet. Aucun système ne connecte tout.</p>
<p>Mais vous avez accès à tout. Il vous manque juste l'expertise pour tout comprendre.</p> <p>Que tu sois en bonne santé et veuille le rester, que tu navigues un diagnostic difficile, ou que tu prennes soin d'un membre de famille qui ne peut pas s'exprimer pour lui-même — aucun médecin ne voit l'image complète. Aucun système ne connecte tout ça.</p>
<p class="emphasis">Votre IA l'a. inou lui donne le tableau complet.</p>
<p>Mais <strong><em>toi</em></strong> tu as accès à tout ça. Tu n'as juste pas l'expertise pour donner sens à tout ça.</p>
<p class="emphasis">Ton IA, si. inou lui donne l'image complète.</p>
</div> </div>
</div> </div>
</div> </div>
<div class="landing-card"> <div class="landing-card">
<div class="story"> <div class="story">
<h2 class="story-title">Le défi</h2> <h2 class="story-title">Le défi</h2>
<div class="story-pair"><div class="data">Votre IRM contient 4 000 coupes.</div><div class="reality">Elle a été lue en 10 minutes.</div></div> <div class="story-pair">
<div class="story-pair"><div class="data">Votre génome contient des millions de variants.</div><div class="reality">Vous n'avez appris que la couleur de vos yeux et l'origine de vos ancêtres.</div></div> <div class="data">Ton IRM a 4 000 coupes.</div>
<div class="story-pair"><div class="data">Votre bilan sanguin contient des dizaines de marqueurs.</div><div class="reality">Votre médecin a dit "tout va bien."</div></div> <div class="reality">Elle a été analysée en 10 minutes.</div>
<div class="story-pair"><div class="data">Votre montre a enregistré 10 000 heures de sommeil.</div><div class="reality">Votre coach ne sait pas qu'elle existe.</div></div> </div>
<div class="story-pair"><div class="data">Vous avez essayé une centaine de compléments différents.</div><div class="reality">Personne n'a demandé lesquels.</div></div>
<div class="story-transition">Les connexions sont là.<br>Elles sont juste trop complexes pour une seule personne.</div> <div class="story-pair">
<div class="data">Ton génome contient des millions de variants.</div>
<div class="reality">Tout ce que tu as appris, c'est la couleur de tes yeux et d'où viennent tes ancêtres.</div>
</div>
<div class="story-pair">
<div class="data">Tes analyses de sang contiennent des dizaines de marqueurs.</div>
<div class="reality">Ton médecin a dit "tout va bien".</div>
</div>
<div class="story-pair">
<div class="data">Ta montre a suivi 10 000 heures de sommeil.</div>
<div class="reality">Ton coach ne sait même pas que ça existe.</div>
</div>
<div class="story-pair">
<div class="data">Tu as essayé des centaines de suppléments différents.</div>
<div class="reality">Personne n'a demandé lesquels.</div>
</div>
<div class="story-transition">
Les connexions existent.<br>
Elles sont juste trop complexes pour qu'une seule personne les saisisse.
</div>
<div class="story-gaps"> <div class="story-gaps">
<span>Personne ne sait comment votre corps métabolise la Warfarine — pas même vous.</span> <span>Personne ne sait comment ton corps traite la Warfarine — pas même toi.</span>
<span class="indent">Mais la réponse se cache peut-être déjà dans votre 23andMe.</span> <span class="indent">Mais la réponse est peut-être déjà cachée dans ton 23andMe.</span>
<span>Ce "sans particularité" sur votre IRM — quelqu'un a-t-il vraiment regardé les 4 000 coupes attentivement ?</span> <span>Ce "sans particularité" sur ton IRM — quelqu'un a-t-il vraiment regardé les 4 000 coupes de près ?</span>
<span>Votre thyroïde est "dans les normes" — mais personne n'a fait le lien avec votre fatigue, votre poids, le fait que vous avez toujours froid.</span> <span>Ta thyroïde est "dans les normes" — mais personne ne l'a connectée à ta fatigue, ton poids, le fait d'avoir toujours froid.</span>
</div> </div>
<div class="story-connections"> <div class="story-connections">
<span>Personne ne relie votre café de l'après-midi à votre qualité de sommeil.</span> <span>Personne ne connecte ta caféine de l'après-midi à tes scores de sommeil.</span>
<span>Votre taux de fer à votre fatigue à l'entraînement.</span> <span>Ton taux de fer à ta fatigue à l'entraînement.</span>
<span>Votre génétique à votre brouillard mental.</span> <span>Ta génétique à ton brouillard mental.</span>
</div> </div>
<div class="story-ai"> <div class="story-ai">
<span>Votre IA n'oublie pas.</span> <span>Ton IA n'oublie pas.</span>
<span>Ne se précipite pas.</span> <span>Ne se dépêche pas.</span>
<span>Trouve ce qui a été manqué.</span> <span>Trouve ce qui a été manqué.</span>
<span class="last">Ne se spécialise pas — vous voit dans votre globalité.</span> <span class="last">Ne se spécialise pas — voit le toi complet.</span>
</div> </div>
<div class="story-closing"><span class="inou">inou</span> permet à votre IA de tout prendre en compte — chaque coupe, chaque marqueur, chaque variant — de tout connecter et de vous donner enfin des réponses que personne d'autre ne pouvait donner.</div>
<div class="story-closing"><span class="inou">inou</span> permet à ton IA de tout prendre en compte — chaque coupe, chaque marqueur, chaque variant — tout connecter et enfin te donner des réponses que personne d'autre ne pourrait.</div>
</div> </div>
</div> </div>
<div class="landing-card"> <div class="landing-card">
<div class="story"> <div class="story">
<h2 class="story-title">Pourquoi nous avons créé ça</h2> <h2 class="story-title">Pourquoi nous avons construit ça</h2>
<p class="story-prose">Vous avez collecté des années de données de santé. Des examens de l'hôpital. Des analyses du laboratoire. Des résultats du portail patient. Des données de votre montre. Peut-être même votre ADN.</p>
<p class="story-prose">Et puis il y a tout ce que vous seul savez — votre poids, votre tension, votre programme d'entraînement, les compléments que vous prenez, les symptômes que vous oubliez toujours de mentionner.</p> <p class="story-prose">Tu as collecté des années de données de santé. Des scans de l'hôpital. Des analyses de sang du laboratoire. Les résultats du portail de ton médecin. Les données de ta montre. Peut-être même ton ADN.</p>
<p class="story-prose">Tout est là — mais dispersé dans des systèmes qui ne communiquent pas entre eux, chez des spécialistes qui ne voient que leur partie, ou enfermé dans votre propre tête.</p>
<p class="story-prose">Votre cardiologue ne sait pas ce que votre neurologue a trouvé. Votre coach n'a pas vu vos analyses sanguines. Votre médecin n'a aucune idée des compléments que vous prenez. Et aucun d'entre eux n'a le temps de s'asseoir avec vous pour relier les points.</p> <p class="story-prose">Et puis il y a tout ce que toi seul sais — ton poids, ta tension artérielle, ton programme d'entraînement, les suppléments que tu prends, les symptômes que tu oublies toujours de mentionner.</p>
<p class="story-prose"><strong>L'IA peut enfin le faire.</strong> Elle peut rassembler ce qu'aucun expert seul ne voit — et vous l'expliquer en plus.</p>
<p class="story-prose">Mais ces données ne tiennent pas dans une fenêtre de chat. Et la dernière chose que vous voulez, c'est votre historique médical sur les serveurs de quelqu'un d'autre, entraînant leurs modèles.</p> <p class="story-prose">Tout est là — mais dispersé à travers des systèmes qui ne communiquent pas entre eux, détenu par des spécialistes qui ne voient que leur partie, ou verrouillé dans ta propre tête.</p>
<p class="story-prose"><span class="inou">inou</span> rassemble tout — analyses, imagerie, génétique, constantes, médicaments, compléments — chiffré, privé, et partagé avec absolument personne. Votre IA se connecte en toute sécurité. Vos données restent les vôtres.</p>
<div class="story-closing">Votre santé, comprise.</div> <p class="story-prose">Ton cardiologue ne sait pas ce que ton neurologue a trouvé. Ton coach n'a pas vu tes analyses de sang. Ton médecin n'a aucune idée des suppléments que tu prends. Et aucun d'eux n'a le temps de s'asseoir avec toi et de faire les liens.</p>
<p class="story-prose"><strong>L'IA le peut enfin.</strong> Elle peut assembler ce qu'aucun expert seul ne voit — et te l'expliquer vraiment.</p>
<p class="story-prose">Mais ces données ne rentrent pas dans une fenêtre de chat. Et la dernière chose que tu veux, c'est ton historique médical sur les serveurs de quelqu'un d'autre, pour entraîner leurs modèles.</p>
<p class="story-prose"><span class="inou">inou</span> rassemble tout — laboratoires, imagerie, génétique, constantes vitales, médicaments, suppléments — chiffré, privé, et partagé avec absolument personne. Ton IA se connecte de manière sécurisée. Tes données restent les tiennes.</p>
<div class="story-closing">Ta santé, comprise.</div>
</div> </div>
</div> </div>
<div class="trust-card"> <div class="trust-card">
<div class="section-label">{{.T.data_yours}}</div> <div class="section-label">{{.T.data_yours}}</div>
<div class="trust-grid"> <div class="trust-grid">
<div class="trust-item"><strong>{{.T.never_training}}</strong>{{.T.never_training_desc}}</div> <div class="trust-item">
<div class="trust-item"><strong>{{.T.never_shared}}</strong>{{.T.never_shared_desc}}</div> <strong>{{.T.never_training}}</strong>
<div class="trust-item"><strong>{{.T.encrypted}}</strong>{{.T.encrypted_desc}}</div> {{.T.never_training_desc}}
<div class="trust-item"><strong>{{.T.delete}}</strong>{{.T.delete_desc}}</div> </div>
<div class="trust-item">
<strong>{{.T.never_shared}}</strong>
{{.T.never_shared_desc}}
</div>
<div class="trust-item">
<strong>{{.T.encrypted}}</strong>
{{.T.encrypted_desc}}
</div>
<div class="trust-item">
<strong>{{.T.delete}}</strong>
{{.T.delete_desc}}
</div>
</div> </div>
</div> </div>
<footer class="landing-footer">
<div class="landing-footer-left"><span>© 2025</span><a href="/privacy-policy">Confidentialité</a></div> {{template "footer"}}
<span class="landing-footer-right"><span class="inou">inou</span> <span class="health">health</span></span>
</footer>
</div> </div>
{{end}} <script>
(function() {
var track = document.querySelector('.carousel-track');
var dots = document.querySelectorAll('.carousel-dot');
var caption = document.getElementById('carousel-caption');
var captions = [
'Suis l\'évolution de tes analyses dans le temps — vois exactement ce que ton IA voit quand elle détecte un changement.',
'Tes labos, tes scans et ton génome en un seul endroit — parcours tout ce à quoi ton IA a accès.',
'Vois ta propre IRM — zoome sur les mêmes coupes que ton IA a analysées.',
'Ton scan cérébral en 3D — navigue dans tous les plans, vérifie chaque observation de ton IA.',
'Ton IA fait le lien entre les analyses et le génome — et t\'explique tout en langage clair.',
'Ta radio, pleine résolution — zoome sur les résultats que ton IA a signalés.'
];
var count = dots.length;
var current = 0;
function go(i) {
current = i;
track.style.transform = 'translateX(-' + (i * 100) + '%)';
dots.forEach(function(d, j) { d.classList.toggle('active', j === i); });
caption.textContent = captions[i];
}
dots.forEach(function(d) {
d.addEventListener('click', function() { go(+d.dataset.index); });
});
setInterval(function() { go((current + 1) % count); }, 8000);
})();
</script>
{{end}}

View File

@ -1,5 +1,6 @@
{{define "landing_nl"}} {{define "landing_nl"}}
<style> <style>
.landing-card { .landing-card {
background: var(--bg-card); background: var(--bg-card);
border: 1px solid var(--border); border: 1px solid var(--border);
@ -12,23 +13,106 @@
margin-bottom: 24px; margin-bottom: 24px;
} }
.hero-answer { /* Carousel */
text-align: center; .carousel {
font-size: 1.25rem; position: relative;
width: 100%;
aspect-ratio: 16/9;
overflow: hidden;
border-radius: 6px;
margin-bottom: 32px;
}
.carousel-track {
display: flex;
height: 100%;
transition: transform 0.5s ease;
}
.carousel-slide {
min-width: 100%;
height: 100%;
background-size: cover;
background-position: center;
}
.carousel-dots {
display: flex;
justify-content: center;
gap: 8px;
margin-bottom: 32px;
}
.carousel-dot {
width: 8px;
height: 8px;
border-radius: 50%;
background: var(--border);
border: none;
padding: 0;
cursor: pointer;
transition: background 0.2s;
}
.carousel-dot.active {
background: var(--accent);
}
/* Hero - Block 1 */
.hero-sources {
font-size: 1.1rem;
font-weight: 300;
color: var(--text-muted);
line-height: 1.9;
margin-bottom: 32px;
}
.hero-sources span { display: block; }
.hero-sources .different {
}
.hero-pivot {
font-size: 1.15rem;
font-weight: 400; font-weight: 400;
color: var(--text); color: var(--text);
line-height: 1.8; line-height: 1.8;
margin-top: 16px;
margin-bottom: 32px; margin-bottom: 32px;
}
.hero-pivot span { display: block; }
.hero-pivot .emphasis {
font-size: 1.3rem;
font-weight: 600;
margin-top: 12px;
}
.hero-answer {
text-align: center;
font-size: 1.7rem;
font-weight: 500;
color: var(--text);
line-height: 1.5;
margin-top: 16px;
margin-bottom: 8px;
}
.hero-answer .inou {
font-weight: 700;
color: var(--accent);
} }
.hero-answer .inou { font-weight: 700; color: var(--accent); }
.hero-tagline { .hero-tagline {
text-align: center; text-align: center;
font-size: 1.3rem; font-size: 2.8rem;
font-weight: 600; font-weight: 700;
color: var(--text); color: var(--text);
margin-bottom: 32px; margin-bottom: 12px;
}
.carousel-caption {
text-align: center;
font-size: 0.95rem;
color: var(--muted);
line-height: 1.5;
min-height: 3em;
padding: 0 24px;
margin-bottom: 24px;
} }
.hero-cta { margin-bottom: 0; text-align: center; } .hero-cta { margin-bottom: 0; text-align: center; }
@ -41,13 +125,20 @@
border-radius: 4px; border-radius: 4px;
} }
/* Story - Block 2 */
.story-prose.warm { .story-prose.warm {
font-size: 1.1rem; font-size: 1.1rem;
line-height: 1.8; line-height: 1.8;
color: var(--text); color: var(--text);
} }
.story-prose.warm p { margin-bottom: 20px; } .story-prose.warm p {
.story-prose.warm .emphasis { font-weight: 600; font-size: 1.15rem; } margin-bottom: 20px;
}
.story-prose.warm .emphasis {
font-weight: 600;
font-size: 1.15rem;
}
.story-title { .story-title {
font-size: 1.25rem; font-size: 1.25rem;
@ -56,7 +147,9 @@
margin-bottom: 32px; margin-bottom: 32px;
} }
.story-pair { margin-bottom: 32px; } .story-pair {
margin-bottom: 32px;
}
.story-pair .data { .story-pair .data {
font-size: 1.1rem; font-size: 1.1rem;
font-weight: 400; font-weight: 400;
@ -67,6 +160,7 @@
font-size: 1rem; font-size: 1rem;
font-weight: 300; font-weight: 300;
font-style: italic; font-style: italic;
color: var(--text-muted); color: var(--text-muted);
} }
@ -89,7 +183,7 @@
margin-bottom: 32px; margin-bottom: 32px;
} }
.story-gaps span { display: block; } .story-gaps span { display: block; }
.story-gaps .indent { font-style: italic; } .story-gaps .indent { font-style: italic; }
.story-connections { .story-connections {
font-size: 1rem; font-size: 1rem;
@ -108,7 +202,9 @@
margin-bottom: 32px; margin-bottom: 32px;
} }
.story-ai span { display: block; } .story-ai span { display: block; }
.story-ai .last { font-style: italic; } .story-ai .last {
font-style: italic;
}
.story-prose { .story-prose {
font-size: 1rem; font-size: 1rem;
@ -128,8 +224,12 @@
padding-top: 24px; padding-top: 24px;
border-top: 1px solid var(--border); border-top: 1px solid var(--border);
} }
.story-closing .inou { font-weight: 700; color: var(--accent); } .story-closing .inou {
font-weight: 700;
color: var(--accent);
}
/* Trust section */
.trust-card { .trust-card {
width: 100%; width: 100%;
@ -167,6 +267,7 @@
margin-bottom: 4px; margin-bottom: 4px;
} }
/* Footer */
.landing-footer { .landing-footer {
padding: 16px 0; padding: 16px 0;
border-top: 1px solid var(--border); border-top: 1px solid var(--border);
@ -181,26 +282,55 @@
gap: 16px; gap: 16px;
align-items: center; align-items: center;
} }
.landing-footer-left a { color: var(--text-muted); text-decoration: none; } .landing-footer-left a {
color: var(--text-muted);
text-decoration: none;
}
.landing-footer-left a:hover { color: var(--accent); } .landing-footer-left a:hover { color: var(--accent); }
.landing-footer-right { font-size: 1rem; } .landing-footer-right { font-size: 1rem; }
.landing-footer-right .inou { font-weight: 700; color: var(--accent); } .landing-footer-right .inou {
.landing-footer-right .health { font-weight: 400; color: var(--text-muted); } font-weight: 700;
color: var(--accent);
}
.landing-footer-right .health {
font-weight: 400;
color: var(--text-muted);
}
/* Mobile */
@media (max-width: 768px) { @media (max-width: 768px) {
.trust-card { .trust-card {
width: 100%; width: 100%;
margin-left: auto; margin-left: auto;
margin-right: auto; padding: 24px; } margin-right: auto; padding: 24px; }
.hero-sources {
font-size: 1rem; line-height: 1.8; margin-bottom: 32px; }
.hero-pivot { font-size: 1.1rem; margin-bottom: 32px;
}
.hero-pivot .emphasis { font-size: 1.3rem; }
.hero-answer {
text-align: center; font-size: 1.2rem; margin-top: 16px;
margin-bottom: 8px; }
.hero-tagline { font-size: 2rem; margin-bottom: 8px; }
.carousel-caption { font-size: 0.85rem; }
.hero-cta .btn { padding: 14px 40px; }
.story-pair .data { font-size: 1rem; }
.story-pair .reality { font-size: 0.95rem; }
.trust-grid { grid-template-columns: repeat(2, 1fr); gap: 24px; } .trust-grid { grid-template-columns: repeat(2, 1fr); gap: 24px; }
} }
@media (max-width: 480px) { @media (max-width: 480px) {
.trust-card { .trust-card {
width: 100%; width: 100%;
margin-left: auto; margin-left: auto;
margin-right: auto; padding: 20px 16px; } margin-right: auto; padding: 20px 16px; }
.hero-sources {
font-size: 0.95rem; line-height: 1.9; }
.hero-pivot { font-size: 1rem; }
.hero-pivot .emphasis { font-size: 1.2rem; }
.story-pair { margin-bottom: 24px; }
.trust-grid { grid-template-columns: 1fr; gap: 20px; } .trust-grid { grid-template-columns: 1fr; gap: 20px; }
.landing-footer { flex-direction: column; gap: 12px; text-align: center; } .landing-footer { flex-direction: column; gap: 12px; text-align: center; }
.landing-footer-left { flex-direction: column; gap: 8px; } .landing-footer-left { flex-direction: column; gap: 8px; }
@ -211,10 +341,29 @@
<div class="landing-card"> <div class="landing-card">
<div class="hero"> <div class="hero">
<div class="hero-answer"><span class="inou">inou</span> organiseert en deelt je gezondheidsdossier met je AI — veilig en privé.</div> <div class="hero-tagline">Je gezondheid, doorgrond.</div>
<div class="hero-tagline">Je gezondheid, begrepen.</div> <div class="hero-answer">Al je gezondheidsdata — georganiseerd, privé, en klaar voor je AI.</div>
<div class="carousel">
<div class="carousel-track">
<div class="carousel-slide" style="background-image: url('/static/carousel-1.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-2.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-3.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-4.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-5.webp')"></div>
<div class="carousel-slide" style="background-image: url('/static/carousel-6.webp')"></div>
</div>
</div>
<div class="carousel-caption" id="carousel-caption">Volg je labtrends over tijd — zie precies wat je AI ziet wanneer het een verandering signaleert.</div>
<div class="carousel-dots">
<button class="carousel-dot active" data-index="0"></button>
<button class="carousel-dot" data-index="1"></button>
<button class="carousel-dot" data-index="2"></button>
<button class="carousel-dot" data-index="3"></button>
<button class="carousel-dot" data-index="4"></button>
<button class="carousel-dot" data-index="5"></button>
</div>
<div class="hero-cta"> <div class="hero-cta">
{{if .Dossier}}<a href="/invite" class="btn btn-primary">Vriend uitnodigen</a>{{else}}<a href="/start" class="btn btn-primary">Inloggen</a>{{end}} {{if .Dossier}}<a href="/invite" class="btn btn-primary">Nodig een vriend uit</a>{{else}}<a href="/start" class="btn btn-primary">Inloggen</a>{{end}}
{{if .Error}}<div class="error" style="margin-top: 24px;">{{.Error}}</div>{{end}} {{if .Error}}<div class="error" style="margin-top: 24px;">{{.Error}}</div>{{end}}
</div> </div>
</div> </div>
@ -225,32 +374,31 @@
<h2 class="story-title">Je hebt AI nodig voor je gezondheid</h2> <h2 class="story-title">Je hebt AI nodig voor je gezondheid</h2>
<div class="story-prose warm"> <div class="story-prose warm">
<p>Je gezondheidsgegevens liggen verspreid over tientallen plekken — bij je cardioloog, je neuroloog, het lab, je smartwatch, je apps, je 23andMe. En dan is er nog alles wat alleen jij weet: wat je eet, wat je drinkt, welke supplementen je slikt. Je trainingsschema. Je klachten. Je doelen — of je nu zwanger wilt worden, traint voor een marathon, of gewoon minder moe wilt zijn.</p> <p>Je gezondheidsdata ligt verspreid over tientallen plekken — bij je cardioloog, je neuroloog, je lab, je horloge, je apps, je 23andMe. En alleen jij weet de rest: wat je eet, wat je drinkt, welke supplementen je slikt. Je trainingsschema. Je klachten. Je doelen — of je nu zwanger probeert te worden, traint voor een marathon, of gewoon minder moe wilt zijn.</p>
<p>Of je nu gezond bent en dat wilt blijven, worstelt met een lastige diagnose, of zorgt voor een familielid dat niet voor zichzelf kan opkomen — geen enkele arts ziet het complete plaatje. Geen enkel systeem verbindt het.</p> <p>Of je nu gezond bent en dat wilt blijven, midden in een lastig diagnosetraject zit, of zorgt voor een familielid dat niet voor zichzelf kan opkomen — geen enkele arts ziet het volledige plaatje. Geen enkel systeem verbindt het.</p>
<p>Maar jij hebt toegang tot alles. Je mist alleen de expertise om er iets van te maken.</p> <p>Maar <strong><em>jij</em></strong> hebt toegang tot alles. Je hebt alleen niet de expertise om er iets van te maken.</p>
<p class="emphasis">Je AI wel. inou geeft het het complete plaatje.</p> <p class="emphasis">Je AI wel. inou geeft het het volledige plaatje.</p>
</div> </div>
</div> </div>
</div> </div>
<div class="landing-card"> <div class="landing-card">
<div class="story"> <div class="story">
<h2 class="story-title">De uitdaging</h2> <h2 class="story-title">De uitdaging</h2>
<div class="story-pair"> <div class="story-pair">
<div class="data">Je MRI heeft 4.000 beelden.</div> <div class="data">Je MRI heeft 4.000 slices.</div>
<div class="reality">Die werd in 10 minuten beoordeeld.</div> <div class="reality">Hij werd in 10 minuten beoordeeld.</div>
</div> </div>
<div class="story-pair"> <div class="story-pair">
<div class="data">Je genoom heeft miljoenen varianten.</div> <div class="data">Je genoom bevat miljoenen varianten.</div>
<div class="reality">Je leerde alleen je oogkleur en waar je voorouders vandaan kwamen.</div> <div class="reality">Je hoorde alleen je oogkleur en waar je voorouders vandaan kwamen.</div>
</div> </div>
<div class="story-pair"> <div class="story-pair">
<div class="data">Je bloedonderzoek heeft tientallen markers.</div> <div class="data">Je bloedonderzoek bevat tientallen markers.</div>
<div class="reality">Je arts zei "alles ziet er goed uit."</div> <div class="reality">Je arts zei "alles ziet er goed uit."</div>
</div> </div>
@ -260,7 +408,7 @@
</div> </div>
<div class="story-pair"> <div class="story-pair">
<div class="data">Je hebt honderd verschillende supplementen geprobeerd.</div> <div class="data">Je hebt honderden supplementen geprobeerd.</div>
<div class="reality">Niemand vroeg welke.</div> <div class="reality">Niemand vroeg welke.</div>
</div> </div>
@ -272,46 +420,47 @@
<div class="story-gaps"> <div class="story-gaps">
<span>Niemand weet hoe jouw lichaam Warfarine verwerkt — jijzelf ook niet.</span> <span>Niemand weet hoe jouw lichaam Warfarine verwerkt — jijzelf ook niet.</span>
<span class="indent">Maar het antwoord zit misschien al in je 23andMe.</span> <span class="indent">Maar het antwoord zit misschien al in je 23andMe.</span>
<span>Die 'geen bijzonderheden' op je MRI — heeft iemand echt alle 4.000 beelden bekeken?</span> <span>Die 'niet-afwijkend' op je MRI — heeft iemand echt alle 4.000 slices bekeken?</span>
<span>Je schildklier is 'binnen de norm' — maar niemand legde het verband met je vermoeidheid, je gewicht, dat je het altijd koud hebt.</span> <span>Je schildklier is 'binnen de norm' — maar niemand legde het verband met je vermoeidheid, je gewicht, dat je het altijd koud hebt.</span>
</div> </div>
<div class="story-connections"> <div class="story-connections">
<span>Niemand verbindt je middagkoffie aan je slaapkwaliteit.</span> <span>Niemand verbindt je middagcafeïne met je slaapscores.</span>
<span>Je ijzergehalte aan je sportvermoeidheid.</span> <span>Je ijzerwaarden met je trainingsvermoeidheid.</span>
<span>Je genetica aan je brain fog.</span> <span>Je genetica met je brain fog.</span>
</div> </div>
<div class="story-ai"> <div class="story-ai">
<span>Je AI vergeet niet.</span> <span>Je AI vergeet niets.</span>
<span>Haast niet.</span> <span>Heeft geen haast.</span>
<span>Vindt wat gemist werd.</span> <span>Vindt wat gemist is.</span>
<span class="last">Specialiseert niet — ziet de complete jij.</span> <span class="last">Specialiseert niet — ziet het complete plaatje.</span>
</div> </div>
<div class="story-closing"><span class="inou">inou</span> laat je AI alles meewegen — elk beeld, elke marker, elke variant — verbindt alles en geeft je eindelijk antwoorden die niemand anders kon geven.</div> <div class="story-closing"><span class="inou">inou</span> laat je AI alles meewegen — elke slice, elke marker, elke variant — verbindt het allemaal en geeft je eindelijk antwoorden die niemand anders kon geven.</div>
</div> </div>
</div> </div>
<div class="landing-card"> <div class="landing-card">
<div class="story"> <div class="story">
<h2 class="story-title">Waarom we dit bouwden</h2> <h2 class="story-title">Waarom we dit gebouwd hebben</h2>
<p class="story-prose">Je hebt jarenlang gezondheidsgegevens verzameld. Scans van het ziekenhuis. Bloedonderzoek van het lab. Uitslagen uit het patiëntenportaal. Data van je horloge. Misschien zelfs je DNA.</p> <p class="story-prose">Je hebt jarenlang gezondheidsdata verzameld. Scans van het ziekenhuis. Bloedonderzoek van het lab. Uitslagen via het patiëntenportaal. Data van je horloge. Misschien zelfs je DNA.</p>
<p class="story-prose">En dan is er nog alles wat alleen jij weet — je gewicht, je bloeddruk, je trainingsschema, de supplementen die je slikt, de klachten die je steeds vergeet te noemen.</p> <p class="story-prose">En dan is er alles wat alleen jij weet — je gewicht, je bloeddruk, je trainingsschema, de supplementen die je slikt, de klachten die je steeds vergeet te noemen.</p>
<p class="story-prose">Het is er allemaal — maar verspreid over systemen die niet met elkaar praten, bij specialisten die alleen hun eigen stukje zien, of opgesloten in je eigen hoofd.</p> <p class="story-prose">Het is er allemaal — maar verspreid over systemen die niet met elkaar praten, vastgehouden door specialisten die alleen hun eigen stukje zien, of opgesloten in je eigen hoofd.</p>
<p class="story-prose">Je cardioloog weet niet wat je neuroloog vond. Je trainer heeft je bloedonderzoek niet gezien. Je huisarts heeft geen idee welke supplementen je slikt. En geen van hen heeft tijd om met je te zitten en de puzzel te leggen.</p> <p class="story-prose">Je cardioloog weet niet wat je neuroloog gevonden heeft. Je trainer heeft je bloedonderzoek niet gezien. Je huisarts heeft geen idee welke supplementen je slikt. En geen van hen heeft de tijd om met je te zitten en de puzzelstukjes te verbinden.</p>
<p class="story-prose"><strong>AI kan dat eindelijk.</strong> Het kan samenbrengen wat geen enkele expert ziet — en het je ook nog uitleggen.</p> <p class="story-prose"><strong>AI kan dat eindelijk wel.</strong> Het kan samenbrengen wat geen enkele specialist overziet — en het je ook nog eens uitleggen.</p>
<p class="story-prose">Maar deze data past niet in een chatvenster. En het laatste wat je wilt is je medische geschiedenis op andermans servers, gebruikt om hun modellen te trainen.</p> <p class="story-prose">Maar deze data past niet in een chatvenster. En het laatste wat je wilt is je medische geschiedenis op andermans servers, gebruikt om hun modellen te trainen.</p>
<p class="story-prose"><span class="inou">inou</span> brengt alles samen — labs, beeldvorming, genetica, vitals, medicatie, supplementen — versleuteld, privé, en met niemand gedeeld. Je AI verbindt veilig. Je data blijft van jou.</p> <p class="story-prose"><span class="inou">inou</span> brengt alles samen — labresultaten, beeldvorming, genetica, vitale functies, medicatie, supplementen — versleuteld, privé, en met absoluut niemand gedeeld. Je AI maakt een beveiligde verbinding. Je data blijft van jou.</p>
<div class="story-closing">Je gezondheid, begrepen.</div> <div class="story-closing">Je gezondheid, doorgrond.</div>
</div> </div>
</div> </div>
@ -340,4 +489,31 @@
{{template "footer"}} {{template "footer"}}
</div> </div>
<script>
(function() {
var track = document.querySelector('.carousel-track');
var dots = document.querySelectorAll('.carousel-dot');
var caption = document.getElementById('carousel-caption');
var captions = [
'Volg je labtrends over tijd \u2014 zie precies wat je AI ziet wanneer het een verandering signaleert.',
'Je labs, scans en genoom op \u00e9\u00e9n plek \u2014 bekijk alles waar je AI toegang toe heeft.',
'Bekijk je eigen MRI \u2014 zoom in op dezelfde slices die je AI geanalyseerd heeft.',
'Je hersenscan in 3D \u2014 navigeer door elk vlak, controleer elke bevinding van je AI.',
'Je AI legt verbanden tussen labs en genoom \u2014 en legt het uit in begrijpelijke taal.',
'Je r\u00f6ntgenfoto, volledige resolutie \u2014 zoom in op de bevindingen die je AI heeft gesignaleerd.'
];
var count = dots.length;
var current = 0;
function go(i) {
current = i;
track.style.transform = 'translateX(-' + (i * 100) + '%)';
dots.forEach(function(d, j) { d.classList.toggle('active', j === i); });
caption.textContent = captions[i];
}
dots.forEach(function(d) {
d.addEventListener('click', function() { go(+d.dataset.index); });
});
setInterval(function() { go((current + 1) % count); }, 8000);
})();
</script>
{{end}} {{end}}

View File

@ -77,6 +77,18 @@
font-weight: 400; font-weight: 400;
} }
.tier-price s {
color: var(--text-muted);
font-weight: 400;
}
.tier-free {
font-size: 0.85rem;
font-weight: 600;
color: #28a745;
margin-top: 4px;
}
.pricing-table tbody tr { .pricing-table tbody tr {
border-bottom: 1px solid var(--border); border-bottom: 1px solid var(--border);
} }
@ -110,12 +122,15 @@
.category-row { .category-row {
background: var(--bg-secondary); background: var(--bg-secondary);
font-weight: 600; font-weight: 700;
color: var(--text); color: var(--text);
} }
.category-row td { .category-row td {
padding: 12px 16px; padding: 16px 16px;
font-size: 1.1rem;
border-left: 3px solid var(--accent);
letter-spacing: 0.02em;
} }
/* Mobile */ /* Mobile */
@ -141,7 +156,7 @@
<div class="pricing-container"> <div class="pricing-container">
<div class="pricing-header"> <div class="pricing-header">
<h1>Pricing</h1> <h1>Pricing</h1>
<p class="tagline">Choose the plan that fits your health journey</p> <p class="tagline">All tiers free until July 1, 2026. No credit card required.</p>
</div> </div>
<div class="pricing-table-wrapper"> <div class="pricing-table-wrapper">
@ -155,11 +170,13 @@
</th> </th>
<th class="tier-header"> <th class="tier-header">
<div class="tier-name">Optimize</div> <div class="tier-name">Optimize</div>
<div class="tier-price">$12<span class="small">/mo</span></div> <div class="tier-price"><s>$12<span class="small">/mo</span></s></div>
<div class="tier-free">free till 7/1/26</div>
</th> </th>
<th class="tier-header"> <th class="tier-header">
<div class="tier-name">Research</div> <div class="tier-name">Research</div>
<div class="tier-price">$35<span class="small">/mo</span></div> <div class="tier-price"><s>$35<span class="small">/mo</span></s></div>
<div class="tier-free">free till 7/1/26</div>
</th> </th>
</tr> </tr>
</thead> </thead>

View File

@ -184,6 +184,9 @@
<h3>FIPS 140-3 encryption.</h3> <h3>FIPS 140-3 encryption.</h3>
<p>FIPS 140-3 is the US government standard for cryptographic security. Your files are encrypted using FIPS 140-3 validated cryptography — tested, audited, and certified by independent labs.</p> <p>FIPS 140-3 is the US government standard for cryptographic security. Your files are encrypted using FIPS 140-3 validated cryptography — tested, audited, and certified by independent labs.</p>
<h3>Security incident notification.</h3>
<p>If a security breach occurs that affects your personal data, we will notify you by email within 72 hours of becoming aware of the incident. That notification will tell you what happened, which data was affected, what we did to contain it, and what steps you can take to protect yourself. We will also notify relevant data protection authorities as required by GDPR, FADP, and applicable law.</p>
<h3>Independent infrastructure.</h3> <h3>Independent infrastructure.</h3>
<p>We don't run on Big Tech clouds. No Google. No Amazon. No Microsoft. Data is stored on servers in the United States. If you access <span class="inou-brand">inou</span> from outside the US, your data crosses international borders. We apply the same security and privacy protections regardless of your location.</p> <p>We don't run on Big Tech clouds. No Google. No Amazon. No Microsoft. Data is stored on servers in the United States. If you access <span class="inou-brand">inou</span> from outside the US, your data crosses international borders. We apply the same security and privacy protections regardless of your location.</p>
</div> </div>
@ -235,7 +238,7 @@
<p>We may update this policy. Registered users will be notified by email of material changes. Continued use after changes constitutes acceptance.</p> <p>We may update this policy. Registered users will be notified by email of material changes. Continued use after changes constitutes acceptance.</p>
<p>Regardless of your jurisdiction, you may request access to your data, correction of inaccuracies, or complete deletion of your account. We will respond within 30 days.</p> <p>Regardless of your jurisdiction, you may request access to your data, correction of inaccuracies, or complete deletion of your account. We will respond within 30 days.</p>
<p>Data Protection Officer: <a href="mailto:privacy@inou.com">privacy@inou.com</a></p> <p>Data Protection Officer: <a href="mailto:privacy@inou.com">privacy@inou.com</a></p>
<p>This policy was last updated on February 8, 2026.</p> <p>This policy was last updated on March 10, 2026.</p>
</div> </div>
{{template "footer"}} {{template "footer"}}

View File

@ -12,6 +12,7 @@ import (
"path/filepath" "path/filepath"
"regexp" "regexp"
"sort" "sort"
"strconv"
"strings" "strings"
"sync" "sync"
"time" "time"
@ -123,6 +124,7 @@ func isGenomeData(data []byte) bool {
return false return false
} }
var sliceRe = regexp.MustCompile(`^(.+?) - (.+?) - slice (\d+)/(\d+)`) var sliceRe = regexp.MustCompile(`^(.+?) - (.+?) - slice (\d+)/(\d+)`)
var fileRe = regexp.MustCompile(`^file (\d+)/(\d+)`)
// Upload represents a file upload entry for display // Upload represents a file upload entry for display
 type Upload struct {
@@ -182,8 +184,8 @@ func getUploads(dossierID string) []Upload {
 			Category:   e.Type,
 			Status:     status,
 			SizeHuman:  formatBytes(int64(size)),
-			UploadedAt: time.Unix(e.Timestamp, 0).Format("Jan 2"),
-			ExpiresAt:  time.Unix(e.TimestampEnd, 0).Format("Jan 2"),
+			UploadedAt: time.Unix(e.Timestamp, 0).UTC().Format("Jan 2"),
+			ExpiresAt:  time.Unix(e.TimestampEnd, 0).UTC().Format("Jan 2"),
 			CanUndo:    len(createdEntries) > 0,
 		}
 		if e.Status != 0 {
@@ -399,18 +401,11 @@ func handleDeleteFile(w http.ResponseWriter, r *http.Request) {
 		}
 	}
-	// Get file info for audit and deletion
-	filePath, fileName, _, _, _ := getUploadEntry(fileID, targetID)
-	if filePath != "" {
-		os.Remove(filePath)
-	}
-	// Mark as deleted using lib.EntryGet + lib.EntryWrite
-	entry, err := lib.EntryGet(nil, fileID) // nil ctx - internal operation
-	if err == nil && entry != nil && entry.DossierID == targetID {
-		entry.Status = 1 // Mark as deleted
-		lib.EntryWrite("", entry) // nil ctx - internal operation
-	}
+	// Get file info for audit
+	_, fileName, _, _, _ := getUploadEntry(fileID, targetID)
+	// Delete the upload entry and its file
+	lib.EntryDelete("", targetID, &lib.Filter{EntryID: fileID})
 	lib.AuditLog(p.DossierID, "file_delete", targetID, fileName)
 	http.Redirect(w, r, fmt.Sprintf("/dossier/%s/upload", formatHexID(targetID)), http.StatusSeeOther)
@@ -652,11 +647,11 @@ func runProcessImaging(actorID, targetID string) {
 	totalFiles := len(pendingDICOM)
 	logFn := func(format string, args ...interface{}) {
 		msg := fmt.Sprintf(format, args...)
-		if m := sliceRe.FindStringSubmatch(msg); m != nil {
-			processed++
+		if m := fileRe.FindStringSubmatch(msg); m != nil {
+			processed, _ = strconv.Atoi(m[1])
+		} else if m := sliceRe.FindStringSubmatch(msg); m != nil {
 			processProgress.Store(targetID, map[string]interface{}{
 				"stage": "importing", "study": m[1], "series": m[2],
+				"slice": m[3], "slice_total": m[4],
 				"processed": processed, "total": totalFiles,
 			})
 		}
@@ -765,6 +760,7 @@ type extractedEntry struct {
 	Data           map[string]interface{} `json:"data"`
 	DataTranslated map[string]interface{} `json:"data_translated,omitempty"`
 	SourceSpans    []sourceSpan           `json:"source_spans,omitempty"`
+	Results        []extractedEntry       `json:"results,omitempty"` // lab_order children
 }
 
 type sourceSpan struct {
@@ -790,6 +786,7 @@ func langName(code string) string {
 func extractionPreamble(targetLang string) string {
 	s := `IMPORTANT RULES (apply to all entries you return):
 - Do NOT translate. Keep ALL text values (summary, value, data fields) in the ORIGINAL language of the document.
+- Normalize ALL dates to "Mon YYYY" (e.g. "Nov 2025") or "DD Mon YYYY" (e.g. "21 Nov 2025") in summary fields. Never use numeric-only date formats like 11-21-25 or 11/21/2025.
 - For each entry, include "source_spans": an array of {"start": "...", "end": "..."} where start/end are the VERBATIM first and last 5-8 words of the relevant passage(s) in the source markdown. This is used to highlight the source text. Multiple spans are allowed.
 - For each entry, include "search_key": a short normalized deduplication key in English lowercase. Format: "thing:qualifier:YYYY-MM" or "thing:qualifier" for undated facts. Examples: "surgery:vp-shunt:2020-07", "device:ommaya-reservoir:2020-04", "diagnosis:hydrocephalus", "provider:peraud:ulm". Same real-world fact across different documents MUST produce the same key.
 `
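As a concrete illustration of the extraction rules above, a single extracted entry might look like the following (all field values are hypothetical, invented here for illustration; the summary stays in the document's original language per the no-translate rule, and the date inside it uses the "DD Mon YYYY" form):

```json
{
  "type": "surgery",
  "summary": "VP-Shunt-Anlage bei Hydrozephalus, 14 Jul 2020",
  "timestamp": "2020-07-14",
  "search_key": "surgery:vp-shunt:2020-07",
  "source_spans": [
    {
      "start": "Am 14.07.2020 erfolgte die Anlage",
      "end": "eines ventrikuloperitonealen Shunts komplikationslos"
    }
  ]
}
```

Note how the search_key normalizes the same real-world fact to English lowercase with a YYYY-MM suffix, so a second document describing the same operation deduplicates against this one.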
@@ -826,6 +823,7 @@ func loadExtractionPrompts() map[int]string {
 }
 
 // parseTimestamp tries to parse a date string into Unix timestamp.
+// Uses noon UTC to prevent date shift when displayed in negative-offset timezones.
 func parseTimestamp(s string) int64 {
 	if s == "" {
 		return 0
@@ -1016,7 +1014,7 @@ func processDocumentUpload(uploadID, dossierID, filePath, fileName string) {
 	msgs := []map[string]interface{}{
 		{"role": "user", "content": prompt},
 	}
-	resp, err := lib.CallFireworks(fireworksTextModel, msgs, 4096)
+	resp, err := lib.CallFireworks(fireworksTextModel, msgs, 8192)
 	if err != nil {
 		log.Printf("[doc-import] Category %d failed: %v", catID, err)
 		return
@@ -1070,44 +1068,58 @@ func processDocumentUpload(uploadID, dossierID, filePath, fileName string) {
 	// 6. Create entries for each extracted item
 	var createdIDs []string
+	createEntry := func(dossierID, parentID string, category int, e extractedEntry) string {
+		dataMap := map[string]interface{}{"source_doc_id": docID}
+		for k, v := range e.Data {
+			dataMap[k] = v
+		}
+		if len(e.SourceSpans) > 0 {
+			dataMap["source_spans"] = e.SourceSpans
+		}
+		if e.SummaryTranslated != "" {
+			dataMap["summary_translated"] = e.SummaryTranslated
+		}
+		if len(e.DataTranslated) > 0 {
+			dataMap["data_translated"] = e.DataTranslated
+		}
+		dataJSON, _ := json.Marshal(dataMap)
+		ts := now
+		if parsed := parseTimestamp(e.Timestamp); parsed > 0 {
+			ts = parsed
+		}
+		entry := &lib.Entry{
+			DossierID: dossierID,
+			ParentID:  parentID,
+			Category:  category,
+			Type:      e.Type,
+			Value:     e.Value,
+			Summary:   e.Summary,
+			SearchKey: e.SearchKey,
+			Timestamp: ts,
+			Data:      string(dataJSON),
+		}
+		lib.EntryWrite("", entry)
+		createdIDs = append(createdIDs, entry.EntryID)
+		return entry.EntryID
+	}
 	for _, r := range results {
 		for _, e := range r.Entries {
-			// Build Data JSON with source reference + extracted fields
-			dataMap := map[string]interface{}{
-				"source_doc_id": docID,
-			}
-			for k, v := range e.Data {
-				dataMap[k] = v
-			}
-			if len(e.SourceSpans) > 0 {
-				dataMap["source_spans"] = e.SourceSpans
-			}
-			if e.SummaryTranslated != "" {
-				dataMap["summary_translated"] = e.SummaryTranslated
-			}
-			if len(e.DataTranslated) > 0 {
-				dataMap["data_translated"] = e.DataTranslated
-			}
-			dataJSON, _ := json.Marshal(dataMap)
-			ts := now
-			if parsed := parseTimestamp(e.Timestamp); parsed > 0 {
-				ts = parsed
-			}
-			entry := &lib.Entry{
-				DossierID: dossierID,
-				ParentID:  docID,
-				Category:  r.Category,
-				Type:      e.Type,
-				Value:     e.Value,
-				Summary:   e.Summary,
-				SearchKey: e.SearchKey,
-				Timestamp: ts,
-				Data:      string(dataJSON),
-			}
-			lib.EntryWrite("", entry)
-			createdIDs = append(createdIDs, entry.EntryID)
+			if e.Type == "lab_order" && len(e.Results) > 0 {
+				// Hierarchical lab: create parent then children
+				orderID := createEntry(dossierID, docID, r.Category, extractedEntry{
+					Type: "lab_order", Value: e.Value, Timestamp: e.Timestamp, SourceSpans: e.SourceSpans,
+					SummaryTranslated: e.SummaryTranslated, DataTranslated: e.DataTranslated,
+				})
+				for _, child := range e.Results {
+					if child.Timestamp == "" {
+						child.Timestamp = e.Timestamp
+					}
+					createEntry(dossierID, orderID, r.Category, child)
+				}
+			} else {
+				createEntry(dossierID, docID, r.Category, e)
+			}
 		}
 	}


@@ -51,7 +51,7 @@ fi
 # Check for db.Exec outside allowed files (in lib only)
 echo -n "Checking for db.Exec() outside allowed lib files... "
-MATCHES=$(grep -r "db\.Exec" --include="*.go" lib 2>/dev/null | grep -v "lib/db_queries.go" | grep -v "lib/db_schema.go" | grep -v "lib/dbcore.go" | grep -v "lib/stubs.go" | grep -v "lib/migrate" || true)
+MATCHES=$(grep -r "db\.Exec" --include="*.go" lib 2>/dev/null | grep -v "lib/db_queries.go" | grep -v "lib/db_schema.go" | grep -v "lib/dbcore.go" | grep -v "lib/stubs.go" | grep -v "lib/migrate" | grep -v "_test.go" || true)
 if [ -n "$MATCHES" ]; then
 	echo -e "${RED}FAILED${NC}"
 	echo "$MATCHES"
@@ -62,7 +62,7 @@ fi
 # Check for db.Query outside allowed files (in lib only)
 echo -n "Checking for db.Query() outside allowed lib files... "
-MATCHES=$(grep -r "db\.Query" --include="*.go" lib 2>/dev/null | grep -v "lib/db_queries.go" | grep -v "lib/db_schema.go" | grep -v "lib/dbcore.go" | grep -v "lib/stubs.go" | grep -v "lib/migrate" || true)
+MATCHES=$(grep -r "db\.Query" --include="*.go" lib 2>/dev/null | grep -v "lib/db_queries.go" | grep -v "lib/db_schema.go" | grep -v "lib/dbcore.go" | grep -v "lib/stubs.go" | grep -v "lib/migrate" | grep -v "_test.go" || true)
 if [ -n "$MATCHES" ]; then
 	echo -e "${RED}FAILED${NC}"
 	echo "$MATCHES"
@@ -73,7 +73,7 @@ fi
 # Check for db.QueryRow outside allowed files (in lib only)
 echo -n "Checking for db.QueryRow() outside allowed lib files... "
-MATCHES=$(grep -r "db\.QueryRow" --include="*.go" lib 2>/dev/null | grep -v "lib/db_queries.go" | grep -v "lib/db_schema.go" | grep -v "lib/dbcore.go" | grep -v "lib/stubs.go" | grep -v "lib/migrate" || true)
+MATCHES=$(grep -r "db\.QueryRow" --include="*.go" lib 2>/dev/null | grep -v "lib/db_queries.go" | grep -v "lib/db_schema.go" | grep -v "lib/dbcore.go" | grep -v "lib/stubs.go" | grep -v "lib/migrate" | grep -v "_test.go" || true)
 if [ -n "$MATCHES" ]; then
 	echo -e "${RED}FAILED${NC}"
 	echo "$MATCHES"
@@ -120,7 +120,7 @@ echo -n "Checking for CREATE TABLE in code... "
 MATCHES=""
 for dir in $CORE_DIRS; do
 	if [ -d "$dir" ]; then
-		FOUND=$(grep -ri "CREATE TABLE" --include="*.go" "$dir" 2>/dev/null | grep -v "lib/db_queries.go" | grep -v "rateDB" || true)
+		FOUND=$(grep -ri "CREATE TABLE" --include="*.go" "$dir" 2>/dev/null | grep -v "lib/db_queries.go" | grep -v "rateDB" | grep -v "_test.go" || true)
 		MATCHES="$MATCHES$FOUND"
 	fi
 done


@@ -5,6 +5,9 @@ import (
 	"encoding/csv"
 	"encoding/json"
 	"fmt"
+	"log"
+	"net"
+	"net/http"
 	"os"
 	"strings"
@@ -28,9 +31,14 @@ func main() {
 		os.Exit(1)
 	}
-	format := "json"
 	args := os.Args[1:]
+	if len(args) > 0 && (args[0] == "-serve" || args[0] == "--serve") {
+		serveHTTP()
+		return
+	}
+
+	format := "json"
 	if len(args) > 0 && (args[0] == "-csv" || args[0] == "--csv") {
 		format = "csv"
 		args = args[1:]
@@ -58,28 +66,41 @@ func main() {
 	}
 	defer db.Close()
-	rows, err := db.Query(query)
+	cols, results, err := queryDB(db, query)
 	if err != nil {
 		fmt.Fprintf(os.Stderr, "query: %v\n", err)
 		os.Exit(1)
 	}
+
+	switch format {
+	case "csv":
+		outputCSV(cols, results)
+	case "table":
+		outputTable(cols, results)
+	default:
+		out, _ := json.MarshalIndent(results, "", " ")
+		fmt.Println(string(out))
+	}
+}
+
+func queryDB(db *sql.DB, query string) ([]string, []map[string]interface{}, error) {
+	rows, err := db.Query(query)
+	if err != nil {
+		return nil, nil, err
+	}
 	defer rows.Close()
 	cols, _ := rows.Columns()
-	colTypes, _ := rows.ColumnTypes()
 	var results []map[string]interface{}
 	for rows.Next() {
-		// Scan as raw interface{} to preserve type info ([]byte for BLOBs, int64 for INTs, string for TEXT)
 		vals := make([]interface{}, len(cols))
 		ptrs := make([]interface{}, len(cols))
 		for i := range vals {
 			ptrs[i] = &vals[i]
 		}
 		if err := rows.Scan(ptrs...); err != nil {
-			fmt.Fprintf(os.Stderr, "scan: %v\n", err)
-			os.Exit(1)
+			return nil, nil, err
 		}
 		row := make(map[string]interface{})
@@ -87,10 +108,8 @@ func main() {
 			v := vals[i]
 			switch val := v.(type) {
 			case []byte:
-				// Try Unpack (new packed BLOBs)
 				if unpacked := lib.Unpack(val); unpacked != nil {
 					s := string(unpacked)
-					// If it looks like JSON, parse it
 					if strings.HasPrefix(s, "{") || strings.HasPrefix(s, "[") {
 						var parsed interface{}
 						if json.Unmarshal(unpacked, &parsed) == nil {
@@ -101,7 +120,6 @@ func main() {
 					row[col] = s
 					continue
 				}
-				// Try old CryptoDecrypt (legacy base64 strings)
 				s := string(val)
 				decrypted := s
 				for j := 0; j < 10; j++ {
@@ -126,7 +144,6 @@ func main() {
 			case nil:
 				row[col] = nil
 			case string:
-				// Try old CryptoDecrypt for legacy TEXT columns
 				decrypted := val
 				for j := 0; j < 10; j++ {
 					next := lib.CryptoDecrypt(decrypted)
@@ -153,23 +170,10 @@ func main() {
 		}
 		results = append(results, row)
 	}
 	if err := rows.Err(); err != nil {
-		fmt.Fprintf(os.Stderr, "rows: %v\n", err)
-		os.Exit(1)
+		return nil, nil, err
 	}
-	_ = colTypes // reserved for future use
-
-	switch format {
-	case "csv":
-		outputCSV(cols, results)
-	case "table":
-		outputTable(cols, results)
-	default:
-		out, _ := json.MarshalIndent(results, "", " ")
-		fmt.Println(string(out))
-	}
+	return cols, results, nil
 }
 
 func formatValue(val interface{}) string {
@@ -242,3 +246,82 @@ func outputTable(cols []string, results []map[string]interface{}) {
 		fmt.Println()
 	}
 }
+
+const stagingIP = "192.168.1.253"
+
+func isStaging() bool {
+	addrs, err := net.InterfaceAddrs()
+	if err != nil {
+		return false
+	}
+	for _, a := range addrs {
+		if ipnet, ok := a.(*net.IPNet); ok && ipnet.IP.String() == stagingIP {
+			return true
+		}
+	}
+	return false
+}
+
+func serveHTTP() {
+	if !isStaging() {
+		fmt.Fprintln(os.Stderr, "dbquery -serve: refused (not staging)")
+		os.Exit(1)
+	}
+	if err := lib.CryptoInit(lib.KeyPathDefault); err != nil {
+		fmt.Fprintf(os.Stderr, "crypto init: %v\n", err)
+		os.Exit(1)
+	}
+	db, err := sql.Open("sqlite3", dbPath)
+	if err != nil {
+		fmt.Fprintf(os.Stderr, "db open: %v\n", err)
+		os.Exit(1)
+	}
+	http.HandleFunc("/query", func(w http.ResponseWriter, r *http.Request) {
+		if r.Method != "POST" {
+			http.Error(w, "POST only", http.StatusMethodNotAllowed)
+			return
+		}
+		var req struct {
+			SQL string `json:"sql"`
+		}
+		if err := json.NewDecoder(r.Body).Decode(&req); err != nil || req.SQL == "" {
+			http.Error(w, `{"error":"sql required"}`, http.StatusBadRequest)
+			return
+		}
+		_, results, err := queryDB(db, req.SQL)
+		if err != nil {
+			w.Header().Set("Content-Type", "application/json")
+			w.WriteHeader(http.StatusBadRequest)
+			json.NewEncoder(w).Encode(map[string]string{"error": err.Error()})
+			return
+		}
+		w.Header().Set("Content-Type", "application/json")
+		json.NewEncoder(w).Encode(results)
+	})
+	http.HandleFunc("/exec", func(w http.ResponseWriter, r *http.Request) {
+		if r.Method != "POST" {
+			http.Error(w, "POST only", http.StatusMethodNotAllowed)
+			return
+		}
+		var req struct {
+			SQL string `json:"sql"`
+		}
+		if err := json.NewDecoder(r.Body).Decode(&req); err != nil || req.SQL == "" {
+			http.Error(w, `{"error":"sql required"}`, http.StatusBadRequest)
+			return
+		}
+		result, err := db.Exec(req.SQL)
+		if err != nil {
+			w.Header().Set("Content-Type", "application/json")
+			w.WriteHeader(http.StatusBadRequest)
+			json.NewEncoder(w).Encode(map[string]string{"error": err.Error()})
+			return
+		}
+		affected, _ := result.RowsAffected()
+		w.Header().Set("Content-Type", "application/json")
+		json.NewEncoder(w).Encode(map[string]int64{"affected": affected})
+	})
+	log.Printf("dbquery serving on :9124 (staging only)")
+	log.Fatal(http.ListenAndServe(":9124", nil))
+}

tools/rquery/main.go (new file, 207 lines)

@@ -0,0 +1,207 @@
package main
import (
"encoding/hex"
"encoding/json"
"fmt"
"os"
"os/exec"
"path/filepath"
"strings"
"inou/lib"
)
var hosts = map[string]string{
"dev": "johan@192.168.1.253",
"prod": "johan@192.168.100.2",
}
const (
remoteKeyPath = "/tank/inou/master.key"
remoteDBPath = "/tank/inou/data/inou.db"
cacheDir = ".cache/inou"
)
func main() {
if len(os.Args) < 3 {
fmt.Fprintln(os.Stderr, "Usage: rquery <dev|prod> <SQL>")
fmt.Fprintln(os.Stderr, " Runs SQL via SSH, decrypts locally.")
fmt.Fprintln(os.Stderr, " rquery dev \"SELECT * FROM entries LIMIT 5\"")
os.Exit(1)
}
env := os.Args[1]
host, ok := hosts[env]
if !ok {
fmt.Fprintf(os.Stderr, "Unknown env %q (use dev or prod)\n", env)
os.Exit(1)
}
query := strings.Join(os.Args[2:], " ")
if err := initCrypto(env, host); err != nil {
fmt.Fprintf(os.Stderr, "crypto: %v\n", err)
os.Exit(1)
}
// Get column names from LIMIT 1, then re-run with hex() on all columns
colCmd := exec.Command("ssh", host,
fmt.Sprintf(`sqlite3 -json %s "SELECT * FROM (%s) LIMIT 1"`, remoteDBPath, escSQL(query)))
colOut, err := colCmd.Output()
if err != nil {
// Fallback: just run the query as-is
runRaw(host, query)
return
}
var sample []map[string]interface{}
if err := json.Unmarshal(colOut, &sample); err != nil || len(sample) == 0 {
runRaw(host, query)
return
}
// Build hex-wrapped query
var cols []string
for col := range sample[0] {
cols = append(cols, col)
}
var hexCols []string
for _, col := range cols {
hexCols = append(hexCols, fmt.Sprintf(`hex("%s") as "%s"`, col, col))
}
hexQuery := fmt.Sprintf(`SELECT %s FROM (%s)`, strings.Join(hexCols, ", "), query)
cmd := exec.Command("ssh", host,
fmt.Sprintf(`sqlite3 -json %s "%s"`, remoteDBPath, escSQL(hexQuery)))
out, err := cmd.Output()
if err != nil {
if ee, ok := err.(*exec.ExitError); ok {
fmt.Fprintf(os.Stderr, "%s\n", ee.Stderr)
}
fmt.Fprintf(os.Stderr, "ssh: %v\n", err)
os.Exit(1)
}
var rows []map[string]interface{}
if err := json.Unmarshal(out, &rows); err != nil {
fmt.Print(string(out))
return
}
// Decode hex and decrypt
for _, row := range rows {
for col, v := range row {
row[col] = decodeAndDecrypt(v)
}
}
enc := json.NewEncoder(os.Stdout)
enc.SetIndent("", " ")
enc.Encode(rows)
}
func escSQL(s string) string {
return strings.ReplaceAll(s, `"`, `\"`)
}
func runRaw(host, query string) {
cmd := exec.Command("ssh", host,
fmt.Sprintf(`sqlite3 -json %s "%s"`, remoteDBPath, escSQL(query)))
out, _ := cmd.Output()
fmt.Print(string(out))
}
func initCrypto(env, host string) error {
home, _ := os.UserHomeDir()
dir := filepath.Join(home, cacheDir)
os.MkdirAll(dir, 0700)
keyFile := filepath.Join(dir, env+".key")
if _, err := os.Stat(keyFile); err != nil {
cmd := exec.Command("ssh", host, "cat", remoteKeyPath)
key, err := cmd.Output()
if err != nil {
return fmt.Errorf("fetch key from %s: %w", env, err)
}
if err := os.WriteFile(keyFile, key, 0600); err != nil {
return fmt.Errorf("cache key: %w", err)
}
}
return lib.CryptoInit(keyFile)
}
func decodeAndDecrypt(v interface{}) interface{} {
s, ok := v.(string)
if !ok {
return v
}
// Decode hex
raw, err := hex.DecodeString(s)
if err != nil {
return s // not hex, return as-is
}
// Empty
if len(raw) == 0 {
return nil
}
	// Note: SQLite's hex() casts its argument to a BLOB via its text
	// representation, so INTEGER 123 and TEXT "123" both come back as
	// "313233"; the digits-only check below restores plain integers.
// Try Unpack first (packed BLOBs)
if unpacked := lib.Unpack(raw); unpacked != nil {
str := string(unpacked)
if strings.HasPrefix(str, "{") || strings.HasPrefix(str, "[") {
var parsed interface{}
if json.Unmarshal(unpacked, &parsed) == nil {
return parsed
}
}
return str
}
// Try as UTF-8 text, then CryptoDecrypt
str := string(raw)
// Check if it's a plain integer
isNum := len(str) > 0
for _, c := range str {
if c < '0' || c > '9' {
isNum = false
break
}
}
if isNum {
		if i, err := json.Number(str).Int64(); err == nil {
return i
}
}
// Try CryptoDecrypt chain
decrypted := str
for i := 0; i < 10; i++ {
next := lib.CryptoDecrypt(decrypted)
if next == "" || next == decrypted {
break
}
decrypted = next
}
if decrypted != str {
if strings.HasPrefix(decrypted, "{") || strings.HasPrefix(decrypted, "[") {
var parsed interface{}
if json.Unmarshal([]byte(decrypted), &parsed) == nil {
return parsed
}
}
return decrypted
}
return str
}