Getting Started

Get up and running with BIAS in minutes. Follow this guide to install and integrate BIAS into your LLM applications.

Quick Start

Three simple steps to start saving 44-52% on LLM token costs.

1️⃣

Install BIAS Core

Download and run the BIAS gRPC server (biasd) on your infrastructure. Available for Linux, macOS, and Windows.

2️⃣

Install Language Bindings

Install the BIAS client library for your language (Python, TypeScript, JavaScript). Simple npm/pip install.

3️⃣

Start Encoding

Replace JSON.stringify() with bias.encode(). That's it! Immediate 44-52% token savings on every LLM call.

Step 1: Install BIAS Core (Server)

Prerequisites

Binaries Available on Request

BIAS Core server binaries are available by contacting the BIAS Foundation. Pre-built binaries for Linux, macOS, and Windows are provided for production use.

Contact for Access

For production deployments, licensing, or enterprise support, please contact: agentforgelabs@gmail.com

Verify Installation

Bash
# Quick connectivity check (curl cannot speak gRPC, but a
# "connection refused" error means biasd is not listening)
curl http://localhost:50051

# Or use grpcurl
grpcurl -plaintext localhost:50051 list

# Expected output:
# grpc.health.v1.Health
# bias.BiasService
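
If grpcurl is not available, the standard gRPC health service shown in the output above can also be probed from Python. This is a minimal sketch, assuming the grpcio-health-checking package is installed alongside grpcio:

Python
# health_check.py
import grpc
from grpc_health.v1 import health_pb2, health_pb2_grpc

channel = grpc.insecure_channel('localhost:50051')
stub = health_pb2_grpc.HealthStub(channel)

# An empty service name asks for the overall server status
response = stub.Check(health_pb2.HealthCheckRequest(service=''))
print('Server status:', health_pb2.HealthCheckResponse.ServingStatus.Name(response.status))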

Run as System Service (Linux)

Bash
# Create systemd service file
sudo tee /etc/systemd/system/biasd.service <<EOF
[Unit]
Description=BIAS gRPC Server
After=network.target

[Service]
Type=simple
User=bias
WorkingDirectory=/opt/bias
ExecStart=/opt/bias/target/release/biasd
Restart=always

[Install]
WantedBy=multi-user.target
EOF

# Start service
sudo systemctl daemon-reload
sudo systemctl enable biasd
sudo systemctl start biasd

# Check status
sudo systemctl status biasd

Step 2: Install Language Bindings

Python

Bash
# Clone bindings repository
git clone https://gitlab.com/agentforgelabs-group/BIAS-Bindings.git
cd BIAS-Bindings/python

# Install dependencies
pip install grpcio grpcio-tools

# Install BIAS client
pip install -e .

# Verify installation
python -c "from bias_bindings import bias_pb2; print('BIAS Python bindings installed!')"

TypeScript/JavaScript (Node.js)

Bash
# Clone bindings repository
git clone https://gitlab.com/agentforgelabs-group/BIAS-Bindings.git
cd BIAS-Bindings/typescript

# Install dependencies
npm install

# Build TypeScript
npm run build

# Link for local development
npm link

# Use in your project
cd /your/project
npm link @bias/client

Verify Bindings

Python
# test_bias.py
import grpc
from bias_bindings import bias_pb2, bias_pb2_grpc
import json

# Connect to BIAS server
channel = grpc.insecure_channel('localhost:50051')
stub = bias_pb2_grpc.BiasServiceStub(channel)

# Test encoding
data = {"test": "Hello BIAS!", "value": 123}
request = bias_pb2.EncodeRequest(
    json_input=json.dumps(data),
    format="inline"
)
response = stub.Encode(request)

print("Original:", json.dumps(data))
print("Encoded:", response.bias_output)
print("✓ BIAS is working!")

Step 3: First Integration

Basic Python Example

Python
import grpc
from bias_bindings import bias_pb2, bias_pb2_grpc
import json
import openai

# Setup BIAS client
channel = grpc.insecure_channel('localhost:50051')
bias = bias_pb2_grpc.BiasServiceStub(channel)

# Your data
context = {
    "user": {"id": 123, "name": "Alice"},
    "items": [
        {"id": 1, "name": "Widget", "price": 9.99},
        {"id": 2, "name": "Gadget", "price": 19.99}
    ]
}

# Encode with BIAS
request = bias_pb2.EncodeRequest(
    json_input=json.dumps(context),
    format="inline"
)
encoded = bias.Encode(request).bias_output

print(f"Original tokens: ~{len(json.dumps(context).split())}")
print(f"BIAS tokens: ~{len(encoded.split())}")
print(f"Savings: ~{(1 - len(encoded)/len(json.dumps(context)))*100:.0f}%")

# Use with OpenAI (openai>=1.0 client API)
client = openai.OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4",
    messages=[{
        "role": "user",
        "content": f"Analyze this data (BIAS format):\n{encoded}"
    }]
)

print(response.choices[0].message.content)

TypeScript Example

TypeScript
import { BiasClient } from '@bias/client';
import Anthropic from '@anthropic-ai/sdk';

const bias = new BiasClient({ endpoint: 'localhost:50051' });
const anthropic = new Anthropic({ apiKey: process.env.ANTHROPIC_API_KEY });

async function main() {
  // Your data
  const context = {
    user: { id: 123, name: 'Alice' },
    items: [
      { id: 1, name: 'Widget', price: 9.99 },
      { id: 2, name: 'Gadget', price: 19.99 }
    ]
  };

  // Encode with BIAS
  const encoded = await bias.encode(context, { format: 'inline' });

  console.log('Original:', JSON.stringify(context).length, 'bytes');
  console.log('BIAS:', encoded.length, 'bytes');
  console.log('Savings:', ((1 - encoded.length / JSON.stringify(context).length) * 100).toFixed(0) + '%');

  // Use with Claude
  const message = await anthropic.messages.create({
    model: 'claude-3-opus-20240229',
    max_tokens: 1024,
    messages: [{
      role: 'user',
      content: `Analyze this data (BIAS format):\n${encoded}`
    }]
  });

  const reply = message.content[0];
  if (reply.type === 'text') {
    console.log(reply.text);
  }
}

main();

Configuration

Server Configuration

TOML - config.toml
[server]
host = "0.0.0.0"
port = 50051
workers = 4

[limits]
max_depth = 128
max_entities = 100000
max_payload_size = "10MB"

[performance]
enable_caching = true
cache_size = 1000

[logging]
level = "info"
format = "json"

Client Configuration

Python
import grpc
from bias_bindings import bias_pb2_grpc

# Configure connection options
options = [
    ('grpc.max_send_message_length', 10 * 1024 * 1024),  # 10MB
    ('grpc.max_receive_message_length', 10 * 1024 * 1024),
    ('grpc.keepalive_time_ms', 10000),
]

# Create channel with options
channel = grpc.insecure_channel('localhost:50051', options=options)
bias = bias_pb2_grpc.BiasServiceStub(channel)

Deployment Options

🐳 Docker

docker pull bias/core:latest
docker run -p 50051:50051 bias/core:latest

☸️ Kubernetes

kubectl apply -f bias-deployment.yaml
kubectl expose deployment bias --port=50051

☁️ Cloud Run / Lambda

Deploy as serverless function with automatic scaling. BIAS starts in <100ms with minimal memory footprint.

Troubleshooting

Connection Refused

Error: "Failed to connect to localhost:50051"
Solution: Ensure biasd is running and port 50051 is accessible. Check firewall rules and SELinux settings.
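
To tell a refused connection apart from a slow or firewalled one, a short probe with a timeout can help. This is a sketch using only the grpcio package installed in Step 2:

Python
import grpc

channel = grpc.insecure_channel('localhost:50051')
try:
    # Blocks until the channel connects, or raises after the timeout
    grpc.channel_ready_future(channel).result(timeout=5)
    print("Connected to biasd")
except grpc.FutureTimeoutError:
    print("Could not reach localhost:50051 - check that biasd is running and the port is open")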

Import Errors

Error: "No module named 'bias_bindings'"
Solution: Reinstall bindings with pip install -e . and verify PYTHONPATH includes the bindings directory.

Encoding Errors

Error: "Max depth exceeded"
Solution: Reduce nesting depth or increase max_depth limit in server config. Check for circular references.
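
To see how deeply a payload nests before sending it, a small standard-library helper (not part of the BIAS client) can measure the container depth for comparison against the max_depth limit in config.toml:

Python
def json_depth(value):
    """Rough nesting depth of a decoded JSON value (counts dict/list levels)."""
    if isinstance(value, dict):
        return 1 + max((json_depth(v) for v in value.values()), default=0)
    if isinstance(value, list):
        return 1 + max((json_depth(v) for v in value), default=0)
    return 0

context = {"user": {"id": 123, "profile": {"tags": ["a", "b"]}}}
print(json_depth(context))  # 4 - compare against max_depth (128 in the example config)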

Performance Issues

Symptom: Slow encoding/decoding
Solution: Enable caching, increase worker count, or use connection pooling. Check server resource usage.
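
A common source of client-side overhead is opening a new gRPC channel for every request; channels are designed to be long-lived and multiplex many calls over one connection. A sketch of reusing a single channel and stub (module-level here for simplicity):

Python
import json
import grpc
from bias_bindings import bias_pb2, bias_pb2_grpc

# Create the channel and stub once at startup and reuse them for all calls
_channel = grpc.insecure_channel('localhost:50051')
_bias = bias_pb2_grpc.BiasServiceStub(_channel)

def encode(data: dict) -> str:
    request = bias_pb2.EncodeRequest(json_input=json.dumps(data), format="inline")
    return _bias.Encode(request).bias_output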

Next Steps

📚

Explore Examples

Check out real-world code examples for Python, TypeScript, and JavaScript.

View Examples
📊

Review Benchmarks

See detailed performance data and cost analysis across different use cases.

See Benchmarks
🔧

Advanced Features

Learn about multi-format support, streaming, caching, and optimization techniques.

View Features

Community & Support

📖 Documentation

Code examples and integration guides available in the public bindings repository.

View Examples

🐛 Issue Tracker

Report bugs, request features, or get help from the community.

Report Issue

💬 Contributing

Contribute to BIAS development. Public bindings repository accepts PRs.

Contribute

Need Help?

Check out our examples and benchmarks, or visit the GitLab repository for documentation.