Simplicity is beautiful: NeuroChain DSL makes it real. AI operations are part of the code, not hidden layers. Scripts and the CLI are the primary way to work; the WebUI is a light demo/test surface. Run ONNX models on the CPU and keep full control.
# Sentiment
set mood from AI: "This product is great!"
if mood == "Positive":
say "👍"
NeuroChain DSL unifies AI and logic into one language so you can build solutions smartly and simply.
Trained SST-2, Toxic, FactCheck, and Intent models run on the CPU without heavy infrastructure.
No external cloud layers: smaller attack surface, more control.
AI built-ins are part of the syntax: no hidden layers.
AI: "models/sst2/model.onnx"
set mood from AI: "I love this movie."
if mood == "Positive": neuro "👍"
No heavy environments or unnecessary dependencies: just code and run.
Enter NeuroChain DSL commands and run them directly in the browser. Pick a model and see results and logs instantly.
NeuroChain scripts are plain .nc files. NeuroChain drives Snake in real time: the intent model reads commands and the game reacts instantly.
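As an illustration of that loop, here is a minimal Rust sketch, assuming the intent model returns a plain label such as "Left" or "Right" that the game maps to a direction; the function and label names are hypothetical, not the demo's actual code.

```rust
/// Direction the snake can head in.
#[derive(Debug, Clone, Copy)]
enum Direction {
    Up,
    Down,
    Left,
    Right,
}

/// Map an intent label (as classified from a player command) to a move.
/// Unrecognized labels keep the current heading.
fn direction_from_intent(intent: &str) -> Option<Direction> {
    match intent {
        "Up" => Some(Direction::Up),
        "Down" => Some(Direction::Down),
        "Left" => Some(Direction::Left),
        "Right" => Some(Direction::Right),
        _ => None,
    }
}

fn main() {
    // e.g. the intent model classified the command "turn left" as the label "Left"
    if let Some(dir) = direction_from_intent("Left") {
        println!("snake turns {:?}", dir);
    }
}
```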
[Engine (NeuroChain)]
Lexer → Parser → Interpreter → Inference (ONNX Runtime, CPU)
# Local inference — no external cloud layers
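To make that pipeline shape concrete, here is a minimal Rust sketch of the interpreter stage with a pluggable inference backend; every type and function name is an illustrative assumption rather than the engine's actual API, and a dummy classifier stands in for ONNX Runtime.

```rust
use std::collections::HashMap;

/// A parsed statement (hypothetical AST shape produced by the lexer and parser).
enum Stmt {
    /// `set <name> from AI: "<text>"`
    SetFromAi { name: String, text: String },
    /// `if <name> == "<value>": neuro "<output>"`
    IfEq { name: String, value: String, then_emit: String },
}

/// Inference backend; the real engine would call ONNX Runtime on the CPU here.
trait Inference {
    fn classify(&self, text: &str) -> String;
}

/// Stand-in model so the sketch runs without ONNX Runtime.
struct DummyModel;
impl Inference for DummyModel {
    fn classify(&self, _text: &str) -> String {
        "Positive".to_string()
    }
}

/// Interpreter stage: walk the AST, calling the inference backend for AI statements.
fn interpret(program: &[Stmt], model: &dyn Inference) {
    let mut vars: HashMap<String, String> = HashMap::new();
    for stmt in program {
        match stmt {
            Stmt::SetFromAi { name, text } => {
                vars.insert(name.clone(), model.classify(text));
            }
            Stmt::IfEq { name, value, then_emit } => {
                if vars.get(name) == Some(value) {
                    println!("{then_emit}");
                }
            }
        }
    }
}

fn main() {
    // Hand-built AST standing in for lexer + parser output of the sentiment example above.
    let program = vec![
        Stmt::SetFromAi { name: "mood".into(), text: "I love this movie.".into() },
        Stmt::IfEq { name: "mood".into(), value: "Positive".into(), then_emit: "👍".into() },
    ];
    interpret(&program, &DummyModel);
}
```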
[API (Rust, Axum)]
POST /api/analyze # parses NeuroChain DSL and runs ONNX inference (CPU)
POST /api/generate # optional: produces NeuroChain DSL
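For a sense of how POST /api/analyze might be wired in Axum (0.7-style), here is a minimal sketch; the JSON field names `script` and `output` are assumptions rather than the documented contract, and the handler body is a placeholder where the real engine would run the DSL.

```rust
use axum::{routing::post, Json, Router};
use serde::{Deserialize, Serialize};

/// Request body: NeuroChain DSL source to parse and run (field name is an assumption).
#[derive(Deserialize)]
struct AnalyzeRequest {
    script: String,
}

/// Response body: interpreter output / logs (field name is an assumption).
#[derive(Serialize)]
struct AnalyzeResponse {
    output: String,
}

/// POST /api/analyze handler: in the real engine this would run
/// Lexer -> Parser -> Interpreter -> ONNX inference on the CPU.
async fn analyze(Json(req): Json<AnalyzeRequest>) -> Json<AnalyzeResponse> {
    Json(AnalyzeResponse {
        output: format!("received {} bytes of NeuroChain DSL", req.script.len()),
    })
}

#[tokio::main]
async fn main() {
    let app = Router::new().route("/api/analyze", post(analyze));
    let listener = tokio::net::TcpListener::bind("127.0.0.1:3000").await.unwrap();
    axum::serve(listener, app).await.unwrap();
}
```

With a contract like this, the WebUI could simply POST the editor contents and render the returned output as the result log.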
Try the WebUI or add your own script and get moving right away.