BAD AI COMPANY
AI Infrastructure · Rust · AGPL-3.0

GAISe

Generative AI Service — one API, every provider

Write your AI application once. Switch between OpenAI, Anthropic, VertexAI, Ollama, and AWS Bedrock without changing a line of business logic.

The problem

Every AI provider has its own request format, response shape, streaming protocol, and auth pattern. If you're building production AI systems, you're either locked to one vendor or maintaining parallel integrations that drift apart. GAISe eliminates that.

What it does

GAISe provides a single, strongly typed Rust API that maps to every major LLM provider. You configure a provider, send a request, and get a normalised response — regardless of whether the model is running locally on Ollama or remotely on VertexAI.
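The core idea, write against one abstraction and swap concrete providers underneath, can be sketched in plain Rust. This is an illustrative sketch only: the names (`Provider`, `Completion`, `summarise`) are hypothetical and do not come from gaise's actual API, and the provider impls stub out the network calls a real integration would make.

```rust
// Hypothetical sketch of provider abstraction; not gaise's real API.

/// One trait that every backend implements.
trait Provider {
    fn name(&self) -> &'static str;
    fn complete(&self, prompt: &str) -> Completion;
}

/// A normalised response, identical regardless of backend.
#[derive(Debug)]
struct Completion {
    provider: &'static str,
    text: String,
}

struct Ollama;
struct OpenAi;

impl Provider for Ollama {
    fn name(&self) -> &'static str { "ollama" }
    fn complete(&self, prompt: &str) -> Completion {
        // A real implementation would call the local Ollama HTTP API here.
        Completion { provider: self.name(), text: format!("[ollama] {prompt}") }
    }
}

impl Provider for OpenAi {
    fn name(&self) -> &'static str { "openai" }
    fn complete(&self, prompt: &str) -> Completion {
        // A real implementation would call the OpenAI API here.
        Completion { provider: self.name(), text: format!("[openai] {prompt}") }
    }
}

/// Business logic depends only on the trait, never on a concrete vendor.
fn summarise(p: &dyn Provider, doc: &str) -> Completion {
    p.complete(&format!("Summarise: {doc}"))
}

fn main() {
    let providers: Vec<Box<dyn Provider>> = vec![Box::new(Ollama), Box::new(OpenAi)];
    for p in &providers {
        let c = summarise(p.as_ref(), "quarterly report");
        println!("{} -> {}", c.provider, c.text);
    }
}
```

Switching vendors then means constructing a different `Provider` at the edge of the program; the rest of the code is untouched.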

5 providers · Async-first · 0 vendor lock-in

Supported providers

Ollama

Local models. No API keys. Full privacy.

OpenAI

GPT-4o, o1, and the full OpenAI model lineup.

Anthropic Claude

Direct API access to Claude models.

Google VertexAI

Gemini models via service account auth.

AWS Bedrock

Enterprise-grade access via IAM credentials.

Capabilities

  • Instruct requests with system + user messages
  • Streaming responses via async iterators
  • Embeddings generation across providers
  • Multi-modal support (text + images)
  • Tool calling with function definitions
  • Structured JSON responses with schema enforcement
  • Correlation ID logging for distributed tracing
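To make the first and last capabilities concrete, here is a minimal std-only sketch of what a normalised instruct request with role-tagged messages and a correlation ID might look like. All type and method names here (`InstructRequest`, `Message`, `Role`) are assumptions for illustration, not gaise's actual types.

```rust
// Hypothetical request shapes; gaise's real types may differ.

#[derive(Debug, Clone, PartialEq)]
enum Role { System, User, Assistant }

#[derive(Debug, Clone)]
struct Message { role: Role, content: String }

#[derive(Debug, Default)]
struct InstructRequest {
    messages: Vec<Message>,
    // Carried through every provider call so log lines can be tied
    // back to a single distributed trace.
    correlation_id: Option<String>,
}

impl InstructRequest {
    fn new() -> Self { Self::default() }

    fn system(mut self, content: &str) -> Self {
        self.messages.push(Message { role: Role::System, content: content.into() });
        self
    }

    fn user(mut self, content: &str) -> Self {
        self.messages.push(Message { role: Role::User, content: content.into() });
        self
    }

    fn correlation_id(mut self, id: &str) -> Self {
        self.correlation_id = Some(id.into());
        self
    }
}

fn main() {
    let req = InstructRequest::new()
        .system("You are a terse assistant.")
        .user("List three Rust async runtimes.")
        .correlation_id("req-42");
    println!("{req:?}");
}
```

Because every provider consumes the same request shape, the builder above would not change when the backend does.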

Quick start

# Add to Cargo.toml

gaise = { git = "https://github.com/ikcore/gaise" }

Note: GAISe requires Rust 1.91+. It's async-first, built on Tokio, and designed to be embedded in larger applications — not run as a standalone service.