AI Inference

Getting Started

Slowly AI is a platform for running AI inference workloads. This section introduces the platform and explains how to get started. First, you will need to obtain $SLOWLY tokens, which can be acquired from various sources. Once you hold $SLOWLY tokens, you can run AI inference workloads such as Llama and Stable Diffusion. The pages in this section provide detailed documentation to help you set up your workloads quickly and easily.
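This section does not document the platform's client API, so as a purely illustrative sketch, a token-gated inference job might be described by a payload along these lines. Everything here is an assumption for illustration: the field names, the `build_inference_request` helper, and the idea of capping the $SLOWLY fee per job are not taken from the real API.

```python
import json

# Hypothetical request builder -- field names and the per-job $SLOWLY fee cap
# are illustrative assumptions, not the platform's actual API.
def build_inference_request(model: str, prompt: str, max_slowly_fee: float) -> str:
    """Serialize a hypothetical inference job for a model such as Llama or Stable Diffusion."""
    payload = {
        "model": model,                    # e.g. "llama" or "stable-diffusion"
        "prompt": prompt,                  # the text prompt for the model
        "max_fee_slowly": max_slowly_fee,  # cap on $SLOWLY spent on this job
    }
    return json.dumps(payload)

# Example: request a Llama completion with a fee cap of 0.5 $SLOWLY.
request_body = build_inference_request("llama", "Hello, world", 0.5)
print(request_body)
```

The actual request format, endpoints, and fee mechanics are covered by the Llama and Stable Diffusion pages that follow.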

