# Scentience Skills

Agent skill definitions for olfaction-aware AI — part of the [Scentience](https://scentience.ai)
open olfaction platform. Built to the [agentskills.io](https://agentskills.io) open standard;
compatible with Claude Code, OpenAI Codex, and Google Antigravity.

## Skill Registry

| Skill | Purpose |
|-------|---------|
| [ble-device](./ble-device/SKILL.md) | Connect to Reconnaisscent or Scentinel via BLE; sample or stream raw sensor readings |
| [olfactory-navigation](./olfactory-navigation/SKILL.md) | Plume contact classification and chemical source localization |
| [olfactory-inertial-odometry](./olfactory-inertial-odometry/SKILL.md) | GPS-denied position estimation by fusing olfaction signals with IMU |
| [colip-embeddings](./colip-embeddings/SKILL.md) | Cross-modal retrieval and semantic labeling via COLIP models |

## Decision Guide

| Intent | Use |
|--------|-----|
| Connect to hardware, read sensor data | [ble-device](./ble-device/SKILL.md) |
| I have live OPU readings — where should I go? | [olfactory-navigation](./olfactory-navigation/SKILL.md) |
| Navigate without GPS using smell + motion | [olfactory-inertial-odometry](./olfactory-inertial-odometry/SKILL.md) |
| Semantically label or match a smell episode | [colip-embeddings](./colip-embeddings/SKILL.md) |
| Classify a chemical compound from sensor data | [colip-embeddings](./colip-embeddings/SKILL.md) (`ovl-classifier` model) |
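The decision guide above amounts to a small dispatch table. A minimal sketch in plain Python: the skill names are the real directories from this repo, but the intent keys and the `route` helper are illustrative assumptions, not part of the SDK or any skill schema:

```python
# Map coarse intents to skills. Skill names match the registry above;
# the intent keys themselves are illustrative, not a defined vocabulary.
SKILL_FOR_INTENT = {
    "read_sensor": "ble-device",
    "navigate_to_source": "olfactory-navigation",
    "localize_without_gps": "olfactory-inertial-odometry",
    "label_smell_episode": "colip-embeddings",
    "classify_compound": "colip-embeddings",  # via the ovl-classifier model
}

def route(intent: str) -> str:
    """Return the skill directory that should handle an intent."""
    if intent not in SKILL_FOR_INTENT:
        raise ValueError(f"no skill registered for intent {intent!r}")
    return SKILL_FOR_INTENT[intent]

print(route("classify_compound"))  # prints: colip-embeddings
```

Note that two intents resolve to the same skill; the distinction between labeling and compound classification lives in model choice, not skill choice.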

## Typical Workflows

**Real-time robotic navigation:**
`ble-device` → `olfactory-navigation` → motor controller

**GPS-denied localization:**
`ble-device` → `olfactory-inertial-odometry` → path planner

**Semantic labeling pipeline:**
`ble-device` → `colip-embeddings` → natural language annotation

**Full hybrid system:**
`ble-device` → `olfactory-navigation` + `olfactory-inertial-odometry` + `colip-embeddings`
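The real-time navigation workflow can be sketched in plain Python. The SDK surface is not documented in this README, so every name below (`OpuReading`, `NavDecision`, `classify_plume`, `motor_command`) is an illustrative stand-in for the skill boundaries, not the official API, and the plume heuristic is a deliberately crude assumption:

```python
from dataclasses import dataclass

@dataclass
class OpuReading:
    """One raw sample, as ble-device would stream it (hypothetical shape)."""
    channels: list[float]   # per-channel sensor response
    timestamp_ms: int

@dataclass
class NavDecision:
    """Output shape suggested by olfactory-navigation: heading + confidence."""
    heading_deg: float
    in_plume: bool
    confidence: float       # calibrated, per the design principles below

def classify_plume(window: list[OpuReading]) -> NavDecision:
    # Toy stand-in for plume contact classification:
    # a rising mean response across the window counts as plume contact.
    means = [sum(r.channels) / len(r.channels) for r in window]
    rising = means[-1] > means[0]
    return NavDecision(heading_deg=0.0 if rising else 90.0,
                       in_plume=rising,
                       confidence=0.8 if rising else 0.5)

def motor_command(decision: NavDecision, min_confidence: float = 0.6) -> str:
    # Gate actuation on the confidence field rather than acting blindly.
    if decision.confidence < min_confidence:
        return "hold"       # resample instead of committing to a turn
    return f"turn:{decision.heading_deg:.0f}"

window = [OpuReading([0.1, 0.2], 0), OpuReading([0.4, 0.5], 100)]
print(motor_command(classify_plume(window)))  # prints: turn:0
```

The point of the sketch is the data flow, not the classifier: readings stream in from the device skill, the navigation skill turns a window of them into a bounded-confidence decision, and only then does anything touch the motor controller.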

## Design Principles

- **Olfaction as first-class modality** — chemical sensing receives the same structural
  rigor as vision or audio; temporal signals are never collapsed to a single scalar
- **Temporal fidelity** — all skills preserve signal history, intermittency, and trend
  context across the full reading window
- **Calibrated uncertainty** — every output carries a `confidence` field; no skill
  returns conclusions without explicit bounds
- **Control ≠ semantics** — navigation decisions (`olfactory-navigation`,
  `olfactory-inertial-odometry`) and semantic interpretation (`colip-embeddings`) are
  deliberately separate skills with separate output schemas
- **SDK-grounded** — all examples use the official Scentience SDK; no pseudocode
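The calibrated-uncertainty principle can be enforced mechanically. A minimal sketch: only the `confidence` field name comes from this README — the dict payload shape, the [0, 1] bounds, and the validator itself are assumptions for illustration:

```python
def validate_output(payload: dict) -> dict:
    """Reject any skill output that lacks an explicit, bounded confidence.

    Hypothetical validation policy; the 0..1 range is an assumed convention.
    """
    if "confidence" not in payload:
        raise ValueError("skill outputs must carry a confidence field")
    c = payload["confidence"]
    if not isinstance(c, (int, float)) or not 0.0 <= c <= 1.0:
        raise ValueError(f"confidence out of bounds: {c!r}")
    return payload

validate_output({"heading_deg": 42.0, "confidence": 0.73})  # passes
```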

## SDK

| Language | Package | Install |
|----------|---------|---------|
| Python | [scentience](https://pypi.org/project/scentience/) | `pip install scentience` |
| JavaScript | [scentience](https://npmjs.com/package/scentience) | `npm install scentience` |
| Rust | [scentience](https://crates.io/crates/scentience) | `cargo add scentience` |
| C++ | [scentience/2.0.0](https://scentience.jfrog.io/ui/packages/conan:%2F%2Fscentience/2.0.0) | Conan |

API docs: [scentience.github.io/docs-api](https://scentience.github.io/docs-api/) ·
API keys: [dashboard.scentience.ai](https://dashboard.scentience.ai)