# cortex-m-matrix-connector

A single-file Python connector that bridges a Matrix room to a Cortex-M assistant instance using the Connector Protocol over WebSockets.
## What it does

```mermaid
sequenceDiagram
    participant U as Matrix User
    participant B as connector.py (Matrix bot)
    participant C as Cortex-M
    U->>B: sends message in room
    B->>C: GET /connector → sessionId
    B->>C: WS /connector/<sessionId>
    B->>C: assistant.message.inbound (CloudEvent)
    C-->>B: assistant.message.outbound (CloudEvent)
    B-->>U: replies in room (Markdown → HTML)
```
Two concurrent async loops run at all times:
| Loop | Description |
|---|---|
| `ws_loop()` | Persistent WebSocket to Cortex-M with exponential-backoff reconnect (2 s → 60 s) and in-flight message re-queuing |
| `client.sync_forever()` | Matrix long-poll sync, receiving `RoomMessageText` events via matrix-nio |
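The concurrent structure can be sketched with `asyncio.gather`; the coroutine bodies below are stand-ins for illustration, not the real `connector.py` code:

```python
import asyncio

results = []

async def ws_loop():
    # Stand-in for the persistent Cortex-M WebSocket loop.
    results.append("ws loop started")

async def matrix_sync():
    # Stand-in for client.sync_forever() (matrix-nio long-poll sync).
    results.append("matrix sync started")

async def main():
    # Both loops run concurrently; if either raises, gather propagates
    # the exception so the process can fail fast and restart cleanly.
    await asyncio.gather(ws_loop(), matrix_sync())

asyncio.run(main())
```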
## Requirements
- Python 3.12+
- A Matrix homeserver (e.g. Synapse, Dendrite, or a hosted service like matrix.org)
- A Cortex-M instance reachable at `CORTEX_M_URL`
- Docker + Docker Compose (for containerised deployment)
## Setup

### 1. Create a Matrix bot account
Register a dedicated bot account on your homeserver. Most homeservers expose a registration endpoint; with Synapse you can use the admin API or `register_new_matrix_user`:

```shell
register_new_matrix_user -c /etc/matrix-synapse/homeserver.yaml \
  --user bot --password 'super-secret-password' --no-admin
```
Invite the bot to the rooms it should monitor:
```
/invite @bot:example.org
```
Alternatively, set MATRIX_AUTO_JOIN=true so the bot automatically accepts any invite it receives. When MATRIX_ALLOWLIST is configured, only invites from listed users will be accepted.
### 2. (Optional) Obtain an access token
If you prefer token-based auth (avoids storing a password):

```shell
curl -X POST https://matrix.example.org/_matrix/client/v3/login \
  -H 'Content-Type: application/json' \
  -d '{"type":"m.login.password","user":"@bot:example.org","password":"super-secret-password"}'
```

Copy `access_token` and `device_id` from the response and set them as `MATRIX_ACCESS_TOKEN` / `MATRIX_DEVICE_ID`.
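The two fields can be lifted straight out of the login response. The field names (`access_token`, `device_id`) come from the Matrix client-server API; the response body below is a made-up example:

```python
import json

# Example /login response body (values are placeholders).
response_body = (
    '{"user_id": "@bot:example.org",'
    ' "access_token": "syt_example_token",'
    ' "device_id": "ABCDEFGH"}'
)

creds = json.loads(response_body)

# Emit the corresponding .env lines.
env_lines = [
    f"MATRIX_ACCESS_TOKEN={creds['access_token']}",
    f"MATRIX_DEVICE_ID={creds['device_id']}",
]
print("\n".join(env_lines))
```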
## Configuration

Copy `.env.example` to `.env` and fill in the values:

```shell
cp .env.example .env
$EDITOR .env
```
| Variable | Required | Default | Description |
|---|---|---|---|
| `CORTEX_M_URL` | ✅ | — | Base URL, e.g. `http://cortex-m:8080/api/cortex-m/v1` |
| `MATRIX_HOMESERVER` | ✅ | — | Homeserver URL, e.g. `https://matrix.example.org` |
| `MATRIX_USER` | ✅ | — | Full MXID of the bot, e.g. `@bot:example.org` |
| `MATRIX_PASSWORD` | ✅* | — | Bot account password (*required if no access token) |
| `MATRIX_ACCESS_TOKEN` | ❌ | — | Pre-existing access token (skips password login) |
| `MATRIX_DEVICE_ID` | ❌ | — | Device ID associated with the access token |
| `CONNECTOR_ID` | ❌ | `matrix-1` | Stable connector identifier sent in CloudEvents |
| `MATRIX_ALLOWLIST` | ❌ | — | Comma-separated allowed MXIDs, e.g. `@alice:example.org,@bob:example.org` |
| `CORTEX_M_TIMEOUT` | ❌ | `180` | Reply timeout in seconds |
| `MATRIX_AUTO_JOIN` | ❌ | `false` | Automatically join rooms when the bot is invited |
| `GITHUB_REPOSITORY` | ❌ | — | Used by Docker Compose to pull the GHCR image |
> ⚠️ **Security:** If `MATRIX_ALLOWLIST` is not set, the bot will respond to all users in its joined rooms. Always restrict access in production.
## Running

### Local (virtualenv)

```shell
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
cp .env.example .env
# edit .env …
export $(grep -v '^#' .env | xargs)
python -m connector
```
### Docker Compose

```shell
cp .env.example .env
# edit .env …

# Build and start (uses local Dockerfile if image not found)
docker compose up -d

# View logs
docker compose logs -f
```
The `docker-compose.yml` will try to pull `ghcr.io/${GITHUB_REPOSITORY}:latest` and fall back to a local build if `GITHUB_REPOSITORY` is not set.
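The relevant `docker-compose.yml` pattern looks roughly like this; the service name and the default repository value are placeholders, not necessarily what the repo ships:

```yaml
services:
  connector:
    # Compose pulls this image when it is available in the registry…
    image: ghcr.io/${GITHUB_REPOSITORY:-example-org/cortex-m-matrix}:latest
    # …and falls back to building from the local Dockerfile otherwise.
    build: .
    pull_policy: missing
    env_file: .env
    restart: unless-stopped
```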
## Publishing the Docker image to GHCR

```shell
# Authenticate
echo "$GITHUB_TOKEN" | docker login ghcr.io -u "$GITHUB_ACTOR" --password-stdin

export GITHUB_REPOSITORY=workaround-org/cortex-m-matrix

# Build
docker build -t ghcr.io/${GITHUB_REPOSITORY}:latest .

# Push
docker push ghcr.io/${GITHUB_REPOSITORY}:latest
```
To automate this with GitHub Actions, add a workflow that builds and pushes on every push to `main`.
## Connector Protocol

The connector follows the standard Cortex-M Connector Protocol:

- **Session** — `GET <CORTEX_M_URL>/connector` returns a plain-text UUID session ID.
- **WebSocket** — Connect to `<CORTEX_M_URL_as_ws>/connector/<sessionId>`.
- **Inbound event** — Send an `assistant.message.inbound` CloudEvent JSON frame for each Matrix message.
- **Outbound event** — Receive `assistant.message.outbound` CloudEvent JSON frames and reply to the room.
- **Keepalive** — Send plain-text `ping` every 30 s; the server replies `pong`.
- **Reconnect** — Exponential backoff from 2 s to 60 s; in-flight messages are re-queued on reconnect.
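An inbound frame can be pictured as a CloudEvents 1.0 JSON envelope. The required CloudEvents attributes (`specversion`, `type`, `source`, `id`) are standard; the `data` field names below are an assumption based on the protocol summary, not the authoritative schema:

```python
import json
import uuid

def inbound_event(connector_id: str, room_id: str,
                  sender: str, text: str) -> str:
    """Build an assistant.message.inbound CloudEvent JSON frame (illustrative)."""
    event = {
        "specversion": "1.0",
        "type": "assistant.message.inbound",
        "source": connector_id,            # e.g. CONNECTOR_ID = "matrix-1"
        "id": str(uuid.uuid4()),
        "datacontenttype": "application/json",
        "data": {
            "conversationId": room_id,     # Matrix room ID doubles as conversation ID
            "sender": sender,
            "text": text,
        },
    }
    return json.dumps(event)

frame = json.loads(inbound_event(
    "matrix-1", "!room:example.org", "@alice:example.org", "hello"))
```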
## Broadcast

When Cortex-M sends a response with `conversationId == "broadcast"`, the connector forwards the message to all Matrix rooms it has seen activity in during the current session.
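A minimal sketch of that fan-out, assuming the connector tracks rooms with activity in a per-session set:

```python
seen_rooms: set[str] = set()

def record_activity(room_id: str) -> None:
    """Called whenever a message is handled in a room this session."""
    seen_rooms.add(room_id)

def target_rooms(conversation_id: str) -> list[str]:
    # "broadcast" fans out to every room seen this session;
    # any other conversationId maps back to its originating room.
    if conversation_id == "broadcast":
        return sorted(seen_rooms)
    return [conversation_id]
```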
## Dependencies

| Package | Version | Purpose |
|---|---|---|
| `matrix-nio` | `0.*` | Async Matrix client |
| `httpx` | `0.*` | HTTP client (session endpoint) |
| `websockets` | `13.*` | WebSocket client (Cortex-M) |
| `markdown-it-py` | `3.*` | Markdown → HTML conversion |
## Commit conventions

This project follows Gitmoji commit conventions, e.g.:
- ✨ new features
- 🐛 bug fixes
- 📝 documentation
- 🔧 configuration
- ♻️ refactoring
- 🐳 Docker changes