# AI Effect: Calling LLMs from AILANG
AILANG v0.5.10 provides a simple, high-level AI effect for calling external AI/ML systems directly from your code. Perfect for game NPCs, agents, CLI tools, and data pipelines.
## Overview
The AI effect (`std/ai`) is a general-purpose AI oracle: an opaque string-to-string interface for calling LLMs.

```ailang
import std/ai (call)

func ask_ai(question: string) -> string ! {AI} =
  call(question)
```
**Key Features:**
- Simple string → string interface (JSON by convention)
- Multi-provider support (Anthropic, OpenAI, Google)
- Vertex AI ADC support (no API key required with gcloud auth)
- Deterministic stub for testing
- Effect-typed for capability tracking
## Quick Start

### 1. Basic Usage
```ailang
-- AI Effect Example
-- Demonstrates calling AI from AILANG
--
-- Run with stub handler (testing):
--   ailang run --caps IO,AI --ai-stub --entry main examples/runnable/ai_effect.ail
--
-- Run with Claude Haiku (requires ANTHROPIC_API_KEY):
--   ailang run --caps IO,AI --ai claude-haiku-4-5 --entry main examples/runnable/ai_effect.ail
--
-- Run with GPT-5 Mini (requires OPENAI_API_KEY):
--   ailang run --caps IO,AI --ai gpt5-mini --entry main examples/runnable/ai_effect.ail
--
-- Run with Gemini Flash (requires GOOGLE_API_KEY):
--   ailang run --caps IO,AI --ai gemini-2-5-flash --entry main examples/runnable/ai_effect.ail
--
-- The AI effect is a general-purpose AI oracle:
-- - String→string interface (JSON by convention)
-- - Pluggable handlers (stub, Anthropic, OpenAI, Google)
-- - Model lookup from models.yml (or guessed from prefix)
-- - No silent fallbacks (nil handler = error)
module examples/runnable/ai_effect

import std/ai (call)
import std/io (println)

-- Ask the AI a question and print the response
export func main() -> () ! {IO, AI} = {
  println("Asking AI: What is the capital of France?");
  let response = call("What is the capital of France? Reply in one word.");
  println("AI says: " ++ response)
}
```
### 2. Run with Different Providers

```bash
# Claude (Anthropic) - requires ANTHROPIC_API_KEY
ailang run --caps IO,AI --ai claude-haiku-4-5 --entry main my_app.ail

# GPT (OpenAI) - requires OPENAI_API_KEY
ailang run --caps IO,AI --ai gpt5-mini --entry main my_app.ail

# Gemini (Google) - uses Vertex AI ADC if no GOOGLE_API_KEY
ailang run --caps IO,AI --ai gemini-2-5-flash --entry main my_app.ail

# Stub (deterministic testing)
ailang run --caps IO,AI --ai-stub --entry main my_app.ail
```
## Supported Providers

| Provider | Models | Auth Method | Environment Variable |
|---|---|---|---|
| Anthropic | `claude-sonnet-4-5`, `claude-haiku-4-5`, etc. | API Key | `ANTHROPIC_API_KEY` |
| OpenAI | `gpt-5`, `gpt-5-mini`, etc. | API Key | `OPENAI_API_KEY` |
| Google | `gemini-2-5-pro`, `gemini-2-5-flash`, etc. | API Key or ADC | `GOOGLE_API_KEY` (optional) |
### Google Vertex AI (ADC)

For Google models, if no `GOOGLE_API_KEY` is set, AILANG automatically falls back to Application Default Credentials (ADC):

```bash
# Configure ADC once
gcloud auth application-default login

# Run without an API key - uses ADC automatically
unset GOOGLE_API_KEY
ailang run --caps IO,AI --ai gemini-2-5-flash --entry main my_app.ail
```
## Game Development Example

Here's a complete example of using AI for NPC dialogue generation:

```ailang
-- npc_dialogue.ail - Game NPC Dialogue using AI Effect
-- Used in: docs/docs/guides/ai-effect.md
module examples/docs/npc_dialogue

import std/ai (call)
import std/io (println)

-- Build dialogue prompt for an NPC
pure func makePrompt(npcName: string, role: string, personality: string, topic: string) -> string =
  "You are " ++ npcName ++ ", a " ++ role ++ " in a fantasy RPG. " ++
  "Personality: " ++ personality ++ ". " ++
  "Generate ONE short line of dialogue (1-2 sentences) about: " ++ topic

-- Generate NPC dialogue using AI
func askNpc(npcName: string, role: string, personality: string, topic: string) -> string ! {AI} = {
  let prompt = makePrompt(npcName, role, personality, topic);
  call(prompt)
}

-- Main: demonstrate NPC dialogue generation
export func main() -> () ! {IO, AI} = {
  println("=== NPC Dialogue Demo ===");
  println("");
  println("[At the Forge]");
  println("Player: Can you forge me an enchanted sword?");
  let response = askNpc("Grimnar", "blacksmith", "gruff but kind", "forging an enchanted sword");
  println("Grimnar: " ++ response)
}
```
Run it:

```bash
# With Claude
ailang run --caps IO,AI --ai claude-haiku-4-5 --entry main examples/docs/npc_dialogue.ail
```

Output:

```
=== NPC Dialogue Demo ===

[At the Forge]
Player: Can you forge me an enchanted sword?
Grimnar: *pounds hammer on anvil* The metal's got a stubborn spirit—takes patience and a steady hand to coax out the magic.
```
## Testing with Stub Handler

For deterministic testing, use `--ai-stub`:

```bash
ailang run --caps IO,AI --ai-stub --entry main my_app.ail
```

The stub handler returns `{"kind":"Wait"}` for all inputs, making tests predictable and fast.
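Since the stub's response never varies, any program run under `--ai-stub` produces the same output every time. A minimal sketch (the module path is illustrative):

```ailang
module examples/docs/stub_check

import std/ai (call)
import std/io (println)

-- Under --ai-stub, every call returns the literal {"kind":"Wait"},
-- so this program prints identical output on every run.
export func main() -> () ! {IO, AI} = {
  let response = call("any prompt at all");
  println("stub said: " ++ response)
}
```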
## JSON Input/Output Pattern

By convention, use JSON for structured input and output:

```ailang
import std/ai (call)
import std/json (encode, decode)

type GameContext = {
  player_health: int,
  enemy_count: int,
  has_weapon: bool
}

type Action = Wait | Attack | Retreat | Heal

func decide_action(ctx: GameContext) -> Action ! {AI} =
  let input = encode(ctx) in
  let output = call(input) in
  match decode[Action](output) {
    Ok(action) => action,
    Err(_) => Wait  -- Safe fallback
  }
```
## Effect System Integration

The AI effect integrates with AILANG's capability system:

```ailang
-- Effect declared in the signature
func my_ai_func() -> string ! {AI} = ...
```

```bash
# Requires --caps AI at runtime
ailang run --caps IO,AI --ai <model> --entry main file.ail
```
Effects are tracked at the type level, ensuring:
- AI calls are explicit in function signatures
- Capability requirements are validated at compile time
- Effect boundaries are clear and auditable
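As a sketch of how this plays out in practice (the `summarize` helper is hypothetical), a caller must carry the `{AI}` effect of every function it invokes:

```ailang
import std/ai (call)
import std/io (println)

-- A helper that performs an AI call declares {AI} in its signature
func summarize(text: string) -> string ! {AI} =
  call("Summarize in one sentence: " ++ text)

-- main must declare {AI} too, because it calls summarize;
-- dropping it from the signature would be a type error.
export func main() -> () ! {IO, AI} = {
  println(summarize("AILANG tracks effects in function types."))
}
```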
## Configuration

### CLI Flags

| Flag | Description |
|---|---|
| `--ai <model>` | Set the AI model to use |
| `--ai-stub` | Use the deterministic stub handler |
| `--caps AI` | Enable the AI capability |
### Model Lookup

Models are looked up in `models.yml` or guessed from name prefixes:

- `claude-*` → Anthropic
- `gpt-*` → OpenAI
- `gemini-*` → Google
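An entry in `models.yml` might look roughly like the following. This is a hypothetical sketch only: the field names and schema are illustrative, not the real file format, so check the `models.yml` shipped with your installation.

```yaml
# Hypothetical models.yml entry - field names are illustrative
claude-haiku-4-5:
  provider: anthropic
```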
## Builtin Documentation

View the full builtin documentation:

```bash
ailang builtins show _ai_call
```
## Comparison: AI Effect vs HTTP API

| Feature | AI Effect (`std/ai`) | HTTP API (`std/net`) |
|---|---|---|
| Complexity | Simple string→string | Full HTTP control |
| Provider setup | CLI flag | Manual headers/auth |
| JSON handling | By convention | Required |
| Best for | Quick LLM calls | Custom API integration |

Use the AI effect for simple LLM calls. Use `std/net` when you need full control over HTTP requests (custom endpoints, streaming, etc.).
## Related Documentation
- AI API Integration (HTTP) - Raw HTTP approach
- Effects Reference - Effect system overview
- Module Execution - Running AILANG modules