This is the software side of Stera.
The apps that turn a machine (yours or a Stera machine) into an AI that actually does your work. Three pieces that build on each other: Loom builds your apps, Fabric runs as your AI center, and Workspaces let Fabric act autonomously on your behalf. Everything runs locally, reusing your own sessions and apps — no third-party API keys to juggle.
Loom is the desktop app that turns a description into a working project. You tell Loom what you want — "a plant-watering tracker with photos," "a dashboard that pulls my Shopify metrics every morning" — and Loom plans, scaffolds, writes the code, runs a visual smoke test, and hands you a live dev server. Apps land in ~/Documents/Loom Apps/ as regular React + Vite projects you own outright.
Under the hood, Loom orchestrates a planning pass, a slice-by-slice code generator, and a self-repair loop that retries when something breaks. The model doing the work depends on your routing preference: cloud (best quality), local (Stera Mini on your GPU, no internet), or smart-routed (Loom picks per task).
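The generate → smoke-test → retry cycle described above can be sketched as a small loop. Everything here is illustrative — `Generate`, `SmokeTest`, and `withSelfRepair` are invented names, not Loom's real API — but the control flow matches the text: each retry sees the previous failure.

```typescript
// Illustrative sketch of a self-repair loop; these names are hypothetical,
// not Loom's actual internals.
type Generate = (spec: string, lastError?: string) => string;
// A smoke test returns null on pass, or a failure message to feed back in.
type SmokeTest = (code: string) => string | null;

function withSelfRepair(
  generate: Generate,
  smokeTest: SmokeTest,
  spec: string,
  maxRetries = 3,
): string {
  let lastError: string | undefined;
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const code = generate(spec, lastError); // retries see the previous failure
    const failure = smokeTest(code);
    if (failure === null) return code;      // smoke test passed: ship the slice
    lastError = failure;                    // otherwise loop with the error in hand
  }
  throw new Error(`still failing after ${maxRetries} retries: ${lastError}`);
}
```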
Fabric is a built-by-Loom app that runs as your personal AI assistant. It reads your email, watches your calendar, remembers what you've told it, and drives your real Chrome to do things on the web — not through an API key, through the same session you already have open.
Fabric's architecture is layered. The core/ folder ships with Loom and gets refreshed every release. The user/ folder is yours — modules you add, your data, your overrides — and is never touched by core updates. If a new Loom release adds a tool your custom module depends on, it just appears. If you write a custom module that replaces something in core, your version wins.
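The override rule above is small enough to state as code. This is not Fabric's real loader — just a minimal sketch of the contract: user/ entries shadow core/ entries of the same name, and everything else from core passes through untouched.

```typescript
// Minimal sketch of the core/user layering rule (hypothetical, not Fabric's
// actual module loader).
type ModuleTable = Record<string, string>; // module name -> path it loads from

function resolveModules(core: ModuleTable, user: ModuleTable): ModuleTable {
  // Spread order is the whole trick: user keys overwrite matching core keys.
  return { ...core, ...user };
}
```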
Loom can write Fabric modules for you on demand. Ask Loom for a trading-guardrail module, a Slack-summary module, a Plex-library-diff module — whatever shape — and it lands in user/modules/ with its tools registered the next time Fabric loads.
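To make "its tools registered the next time Fabric loads" concrete, here is a purely hypothetical module shape — the real Fabric interface is not shown in this document. It only illustrates the lifecycle: a module exports tools, and they land in a registry at load time.

```typescript
// Hypothetical module and tool shapes, for illustration only.
interface Tool {
  name: string;
  run: (input: string) => string;
}
interface FabricModule {
  name: string;
  tools: Tool[];
}

function registerModules(modules: FabricModule[]): Map<string, Tool> {
  const registry = new Map<string, Tool>();
  for (const mod of modules) {
    for (const tool of mod.tools) {
      registry.set(`${mod.name}.${tool.name}`, tool); // namespaced by module
    }
  }
  return registry;
}
```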
A workspace is a goal that Fabric keeps working on without waiting for you to ask. You give it a name, a goal, and (optionally) a cadence — every minute, every hour, daily, weekly. At each tick, Fabric reads the plan, checks its memory of prior ticks, picks the next action, and executes. Results land in your inbox if anything needs your attention; otherwise the workspace quietly keeps going.
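A single tick, sketched with assumed names (`Memory`, `pickNextAction`, `execute` are illustrative, not Fabric's actual API), mirrors the sequence in the text: read memory of prior ticks, pick the next action, execute, remember, and only surface a result when it needs attention.

```typescript
// One workspace tick, sketched under invented names.
interface Memory {
  recall(goal: string): string[];
  remember(goal: string, note: string): void;
}
interface TickResult { summary: string; needsAttention: boolean; }

function runTick(
  goal: string,
  memory: Memory,
  pickNextAction: (goal: string, history: string[]) => string,
  execute: (action: string) => TickResult,
): string | null {
  const history = memory.recall(goal);          // what earlier ticks learned
  const action = pickNextAction(goal, history); // choose the next step
  const result = execute(action);
  memory.remember(goal, result.summary);        // the next tick starts from here
  return result.needsAttention ? result.summary : null; // null = keep going quietly
}
```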
The pieces a workspace has access to:
remember / recall across ticks, so the workspace learns from prior rounds instead of starting fresh.

The three layers are deliberately shaped so that most new needs don't require us to ship anything new. A user with a novel requirement — "watch these markets and act on rules I'm still writing," "summarize this Slack workspace each morning into a digest I can act on" — assembles it from three levers:
Code-level rules go in the custom module so they can't be talked around. Judgment-level rules go in the workspace plan so they can evolve. Nothing domain-specific has to enter core.
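A toy "code-level rule" in the spirit of the trading-guardrail example shows why it can't be talked around: the cap lives in module code, not in a prompt. All names here, and the $500 limit, are invented for illustration.

```typescript
// Hypothetical hard guardrail inside a custom module.
function checkOrderAllowed(amountUsd: number, hardCapUsd = 500): void {
  if (amountUsd > hardCapUsd) {
    throw new Error(`order of $${amountUsd} exceeds the hard cap of $${hardCapUsd}`);
  }
  // ...the module would place the order only after passing this check...
}
```

Judgment-level rules, by contrast, would live in the workspace plan as plain text the model reasons over — editable without touching code.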
Most AI assistants need a separate API key for every service: one for Gmail, one for Google Calendar, one for Slack, one for each bank and broker, and a painful consent prompt in between. Stera skips that model entirely.
For web apps, Loom launches a pinned Chromium (Chrome for Testing) pointed at your real Chrome profile directory. Your logged-in sessions, cookies, 2FA remember-me tokens, and extensions are already there. Fabric opens mail.google.com and reads your inbox the way you would; it opens Google Calendar and clicks the buttons.
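For a sense of what "pointed at your real Chrome profile directory" means mechanically, here is an illustrative flag builder. `--user-data-dir` and `--profile-directory` are real Chromium command-line switches; the paths are placeholders, and how Loom actually invokes Chrome for Testing is not specified here.

```typescript
// Illustrative only: flags for launching a pinned Chromium against an
// existing Chrome profile.
function buildLaunchArgs(userDataDir: string, profile = "Default"): string[] {
  return [
    `--user-data-dir=${userDataDir}`, // sessions, cookies, and extensions live here
    `--profile-directory=${profile}`,
    "--no-first-run",                 // skip the fresh-install prompts
  ];
}
```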
For native apps, the same idea applies. Fabric ships a learn_app tool that watches how an application behaves — its windows, its menus, the UI elements and the actions that produce results — and records a capability spec for it. Once learned, Loom and Fabric can drive that app the same way they drive a website: open it, click, type, read, react. Photoshop, Logic Pro, a legacy ERP client, your in-house trading terminal — whatever you can operate, Stera can learn to operate.
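What a recorded capability spec could look like, as a data shape: this is entirely hypothetical — learn_app's real output format isn't documented here — but it captures the idea that windows, elements, and actions become data any Fabric can replay.

```typescript
// Hypothetical capability-spec shape; the app and steps are invented.
interface CapabilitySpec {
  app: string;
  windows: string[];                            // window titles observed
  actions: { name: string; steps: string[] }[]; // named action -> recorded UI steps
}

const exported: CapabilitySpec = {
  app: "Example Editor",
  windows: ["Example Editor - Untitled"],
  actions: [
    { name: "export_png", steps: ["menu:File", "item:Export As...", "select:PNG", "click:Save"] },
  ],
};
```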
Learning travels. When one Stera learns an app, the capability spec is shared back to the Stera platform so every other Stera user starts from that knowledge on day one. The second person who installs a given app doesn't re-teach it — their Fabric simply knows it already. The network gets smarter as the community uses it.
That's the whole design: your machine, your browser, your native apps, your sessions, your rules. Stera is the assistant that uses the access you already have.
Loom is a free download, but it needs a Stera account to sign in. Creating one is free and comes with 100 joules — the unit Stera uses for anything that costs compute later on. Sign up at stera.se/account, then grab Loom from stera.se/loom/download.
Once Loom is running, the Loom Guide walks through install, sign-in, and your first build. After that, try a workspace with a short cadence and a simple goal — a daily brief, a watcher, an inbox triage. That's the fastest way to feel what autonomous AI on your own machine actually does.