Docs · all 10 specs · all 7 live properties

How do I actually use the Kinetic Gain Protocol Suite?

Step-by-step quickstart guides for publishing each Kinetic Gain document type, validating against the schema, integrating with the unified visualizer and MCP server, and reading documents from another vendor. Five minutes per role.

30-second quickstart

  1. Pick the spec that matches what you want to declare: entity → AEO, agent → Agent Cards, tutor → Tutor Cards, clinical AI → Clinical AI Disclosure, etc.
  2. Fork an example from the spec's examples/ directory. Each spec ships 1–3 canonical reference documents.
  3. Validate with any JSON Schema 2020-12 validator. The standard incantation: `npx -p ajv-cli -p ajv-formats ajv validate -s <schema>.json -d <example>.json -c ajv-formats --spec=draft2020 --strict=false`
  4. Serve the JSON at the canonical well-known path (see the table below) with Content-Type: application/json.
  5. Verify that the unified visualizer auto-detects your document in its editor view.

Quickstart paths: pick your role

I run an enterprise AI agent

  1. Publish an Agent Card at /.well-known/agents/<agent_id>.json
  2. If LLM-based, publish Prompt Provenance records for each prompt version
  3. If the agent uses tools, publish MCP Tool Cards at /.well-known/mcp-tools/<name>.json
  4. On the first incident, publish an AI Incident Card at /.well-known/ai-incidents/<id>.json
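As a sketch of the Agent Card shape from steps 1 and 3: only agent_card_version, refusal_taxonomy, and capabilities.tools[].mcp_tool_card_uri are named on this page; every other field and value below is illustrative, so fork a real document from the spec's examples/ directory rather than this:

```json
{
  "agent_card_version": "1.0",
  "agent_id": "support-bot",
  "refusal_taxonomy": ["prompt_injection", "pii_exfiltration"],
  "capabilities": {
    "tools": [
      { "mcp_tool_card_uri": "https://example.com/.well-known/mcp-tools/search.json" }
    ]
  }
}
```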

I'm an EdTech vendor selling into K-12

  1. Publish an AI Tutor Card at /.well-known/tutors/<tutor_id>.json
  2. Set data_privacy.coppa_compliant = true if your audience includes children under 13 (schema-enforced when age_range_min < 13)
  3. Link the underlying Agent Card via agent_card_uri
  4. Run prompt-injection-bench against your tutor and publish the pass rate in evaluations[]
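The four steps above can be sketched in one Tutor Card. Only tutor_card_version, age_range_min, data_privacy.coppa_compliant, agent_card_uri, and evaluations[] are named on this page; the evaluation-entry shape and all values are assumptions, so consult the spec's examples/ directory for the canonical form:

```json
{
  "tutor_card_version": "1.0",
  "age_range_min": 8,
  "data_privacy": { "coppa_compliant": true },
  "agent_card_uri": "https://example.com/.well-known/agents/math-tutor.json",
  "evaluations": [
    { "benchmark": "prompt-injection-bench", "pass_rate": 0.97 }
  ]
}
```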

I'm a school district choosing AI vendors

  1. Author your Classroom AI AUP with vendor_requirements (FERPA / COPPA / GDPR / state laws, retention max, content-filter strength)
  2. Serve at /.well-known/ai-aup.json on your district domain
  3. Require students to submit a Student AI Disclosure with their work referencing your aup_uri
  4. Use aup_check_compliance in mcp-kinetic-gain to mechanically join AUP + Disclosure into an allow/deny per submission
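The mechanical join in step 4 is what the aup_check_compliance tool performs. As a rough local sketch of the same idea: vendor_requirements and aup_uri come from this page, but every other key name (uri, frameworks, tool_compliance) is a hypothetical stand-in, not the spec's schema:

```python
def check_compliance(aup: dict, disclosure: dict) -> bool:
    """Rough allow/deny join of a district AUP against a Student AI Disclosure."""
    # The disclosure must reference this district's AUP.
    if disclosure.get("aup_uri") != aup.get("uri"):
        return False
    # Every framework the AUP's vendor_requirements demand (e.g. FERPA, COPPA)
    # must be claimed by the tool the student used.
    required = set(aup.get("vendor_requirements", {}).get("frameworks", []))
    claimed = set(disclosure.get("tool_compliance", []))
    return required <= claimed

aup = {"uri": "https://district.example/.well-known/ai-aup.json",
       "vendor_requirements": {"frameworks": ["FERPA", "COPPA"]}}
ok = {"aup_uri": aup["uri"], "tool_compliance": ["FERPA", "COPPA", "GDPR"]}
bad = {"aup_uri": aup["uri"], "tool_compliance": ["FERPA"]}
print(check_compliance(aup, ok), check_compliance(aup, bad))  # True False
```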

I'm a healthcare AI vendor selling into hospitals

  1. Publish a Clinical AI Disclosure at /.well-known/clinical-ai/<system_id>.json
  2. For autonomous decision support, is_medical_device MUST be true (FDA position, schema-enforced)
  3. For SaMD class II+ or pre-authorization use, bias_audit_uri is required
  4. Declare EHR integration: FHIR version, SMART-on-FHIR, CDS Hooks, EHR vendors supported
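A shape sketch covering the four steps above: clinical_ai_card_version, is_medical_device, and bias_audit_uri are named on this page, but the ehr_integration block's key names and all values are illustrative assumptions, so fork a real example from the spec instead:

```json
{
  "clinical_ai_card_version": "1.0",
  "is_medical_device": true,
  "bias_audit_uri": "https://vendor.example/audits/sepsis-model-2025.pdf",
  "ehr_integration": {
    "fhir_version": "R4",
    "smart_on_fhir": true,
    "cds_hooks": true,
    "ehr_vendors": ["Epic", "Cerner"]
  }
}
```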

I'm a CISO auditing AI vendors

  1. Use well-known-walker to scan a vendor's domain in one click
  2. Install mcp-kinetic-gain in Claude Desktop: 34 tools across 8 specs, one config entry
  3. Cross-check Agent Card refusal_taxonomy[] claims against prompt-injection-bench results
  4. Subscribe to the vendor's /.well-known/ai-incidents.json index for ongoing monitoring
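The cross-check in step 3 can be sketched as a small diff. The refusal_taxonomy field is from this page; the per-category pass map for prompt-injection-bench results is a hypothetical shape, not the bench's actual output format:

```python
def unverified_claims(agent_card: dict, bench_results: dict) -> list:
    """Refusal categories the Agent Card claims but the bench did not confirm."""
    claimed = agent_card.get("refusal_taxonomy", [])
    return [c for c in claimed if not bench_results.get(c, False)]

card = {"refusal_taxonomy": ["prompt_injection", "data_exfiltration"]}
results = {"prompt_injection": True}   # hypothetical per-category pass map
print(unverified_claims(card, results))  # ['data_exfiltration']
```

Any category that comes back unverified is a question to put to the vendor before the contract is signed.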

I'm an answer engine / LLM agent fetching disclosures

  1. For entity facts: fetch /.well-known/aeo.json on the answer source
  2. For citation provenance: surface AI Evidence objects with each claim
  3. For agent-on-agent discovery: chain agent_card_uri → capabilities.tools[].mcp_tool_card_uri
  4. If a tool you invoke misbehaves, surface its Incident Card in your refusal message
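The discovery chain in step 3 can be sketched as a walk over a fetched Agent Card. The field path capabilities.tools[].mcp_tool_card_uri is from this page; the surrounding tool-entry shape and sample values are assumptions:

```python
def discover_tool_cards(agent_card: dict) -> list:
    """Follow agent card -> capabilities.tools[].mcp_tool_card_uri."""
    tools = agent_card.get("capabilities", {}).get("tools", [])
    return [t["mcp_tool_card_uri"] for t in tools if "mcp_tool_card_uri" in t]

card = {"capabilities": {"tools": [
    {"mcp_tool_card_uri": "https://a.example/.well-known/mcp-tools/search.json"},
    {"name": "local-only"}]}}  # tools without a card URI are skipped
print(discover_tool_cards(card))
```

Each returned URI is itself a fetch target, so the same walk recurses one hop per tool.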

The well-known path map

Where to serve each spec's document. The unified visualizer auto-detects via the top-level *_version field; the MCP server has dedicated fetch tools for every path that has one.

| Spec | Path | Detect field |
| --- | --- | --- |
| AEO Protocol | /.well-known/aeo.json | aeo_version |
| Agent Cards | /.well-known/agents/<agent_id>.json | agent_card_version |
| MCP Tool Cards | /.well-known/mcp-tools/<tool_name>.json | tool_card_version |
| AI Tutor Cards | /.well-known/tutors/<tutor_id>.json | tutor_card_version |
| Classroom AI AUP | /.well-known/ai-aup.json | aup_version |
| Clinical AI Disclosure | /.well-known/clinical-ai/<system_id>.json | clinical_ai_card_version |
| AI Incident Card | /.well-known/ai-incidents/<id>.json and /.well-known/ai-incidents.json (index) | incident_card_version |
| Prompt Provenance | (no path; travels with the prompt) | provenance_version |
| AI Evidence Format | (no path; travels with the LLM response) | evidence_version |
| Student AI Disclosure | (no path; travels with the artifact) | disclosure_version |
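The detect-field column above maps directly to the visualizer's auto-detection. A minimal sketch of that lookup (the mapping is taken verbatim from the table; the function itself is illustrative, not the visualizer's actual code):

```python
# Spec detection by top-level *_version field, mirroring the table above.
DETECT_FIELDS = {
    "aeo_version": "AEO Protocol",
    "agent_card_version": "Agent Cards",
    "tool_card_version": "MCP Tool Cards",
    "tutor_card_version": "AI Tutor Cards",
    "aup_version": "Classroom AI AUP",
    "clinical_ai_card_version": "Clinical AI Disclosure",
    "incident_card_version": "AI Incident Card",
    "provenance_version": "Prompt Provenance",
    "evidence_version": "AI Evidence Format",
    "disclosure_version": "Student AI Disclosure",
}

def detect_spec(doc: dict):
    """Return the spec name for a document, or None if no detect field matches."""
    for field, spec in DETECT_FIELDS.items():
        if field in doc:
            return spec
    return None

print(detect_spec({"aup_version": "1.0"}))  # Classroom AI AUP
```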

One MCP server, one visualizer, ten landing pages

The fastest way to understand the Suite is to install mcp-kinetic-gain in Claude Desktop (one config entry → 34 tools across 8 specs) and open the unified visualizer in a tab. Then paste any spec document and watch the right renderer light up.

The canonical front door is suite.kineticgain.com. Every spec also has a dedicated landing page on its own subdomain; see the GitHub profile for the full directory.