ThinClaw Host
Share your GPU to power AI inference
Choose a Model
Model
Select a model...
Load Model & Start Hosting
Preparing...
Offline
Your host code. Tell users to run:
/model host:<code>
Session Stats
Requests: 0
Tokens In: 0
Tokens Out: 0
Avg tok/s: -
Stop Hosting