AI in the Browser

A collection of three distinct projects demonstrating how to run AI models entirely in the browser without any server dependencies. Each project showcases different approaches to client-side machine learning using modern web technologies.

Projects Overview

1. ONNX Runtime Web - Sentiment Analysis

Real-time sentiment analysis using ONNX Runtime Web

  • Tech Stack: TypeScript, Vite, ONNX Runtime Web, DistilBERT
  • Features: Real-time text sentiment classification with confidence scores
  • Model: Pre-trained DistilBERT for sentiment analysis
  • Use Case: Demonstrates ONNX model deployment in browsers
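
To make the flow concrete, here is a minimal sketch of running an exported DistilBERT classifier with onnxruntime-web. The model path, the tensor names (input_ids, attention_mask, logits), and the label order are illustrative assumptions, and tokenization (turning text into token IDs) is omitted; the project's actual code may differ.

// Minimal sketch: run a DistilBERT sentiment model with onnxruntime-web.
// Model path, tensor names, and label order are illustrative assumptions.
import * as ort from "onnxruntime-web";

async function classify(inputIds: bigint[], attentionMask: bigint[]) {
  const session = await ort.InferenceSession.create("/models/distilbert-sst2.onnx");
  const shape = [1, inputIds.length];

  const results = await session.run({
    input_ids: new ort.Tensor("int64", BigInt64Array.from(inputIds), shape),
    attention_mask: new ort.Tensor("int64", BigInt64Array.from(attentionMask), shape),
  });

  // Two logits per input: [negative, positive]. Softmax them into a confidence score.
  const [neg, pos] = results["logits"].data as Float32Array;
  const positive = Math.exp(pos) / (Math.exp(neg) + Math.exp(pos));
  return positive >= 0.5
    ? { label: "POSITIVE", confidence: positive }
    : { label: "NEGATIVE", confidence: 1 - positive };
}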

2. Transformers.js - Multi-Tool AI Suite

Five AI tools powered by Hugging Face Transformers.js

  • Tech Stack: Svelte 5, TypeScript, Vite, Tailwind CSS, Transformers.js
  • Features: Text summarization, language translation, image captioning, object detection, text-to-speech
  • Models: Multiple specialized models (DistilBART, OPUS-MT, ViT-GPT2, DETR, SpeechT5)
  • Use Case: Comprehensive showcase of browser-based AI capabilities
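
For flavor, a minimal Transformers.js sketch of one of the five tools (summarization) follows. The package name and the exact checkpoint are assumptions, but each tool follows the same pipeline() pattern with a different task and model.

// Minimal sketch: a summarization pipeline with Transformers.js.
// Package name and model checkpoint are illustrative assumptions.
import { pipeline } from "@huggingface/transformers";

// First call downloads the model; later calls reuse the browser cache.
const summarizer = await pipeline("summarization", "Xenova/distilbart-cnn-6-6");

const longArticleText = "…long input text to condense…";
const summary = await summarizer(longArticleText, { max_new_tokens: 100 });
console.log(summary);

// The other tools swap the task and model, e.g.:
//   pipeline("translation", ...)       // OPUS-MT
//   pipeline("image-to-text", ...)     // ViT-GPT2 captioning
//   pipeline("object-detection", ...)  // DETR
//   pipeline("text-to-speech", ...)    // SpeechT5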

3. WebLLM + LangChain - Chat Interface

Conversational AI chat using WebLLM and LangChain

  • Tech Stack: TypeScript, Vite, WebLLM, LangChain
  • Features: Interactive chat with local large language model
  • Model: Phi-3-mini-4k-instruct (quantized for web)
  • Use Case: Full conversational AI without external APIs
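
Below is a rough sketch of the WebLLM half under assumed names: CreateMLCEngine from @mlc-ai/web-llm loads the quantized Phi-3 model (the exact model ID string is an assumption) and exposes an OpenAI-style chat API. The LangChain layer in the project would wrap an engine like this to manage prompts and conversation history.

// Minimal sketch of the WebLLM side (model ID string is an assumption).
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Downloads and compiles the quantized model on first run (WebGPU required).
const engine = await CreateMLCEngine("Phi-3-mini-4k-instruct-q4f16_1-MLC", {
  initProgressCallback: (report) => console.log(report.text),
});

// OpenAI-style chat completion, entirely in the browser.
const reply = await engine.chat.completions.create({
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Summarize what WebLLM does in one sentence." },
  ],
});
console.log(reply.choices[0].message.content);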

Quick Start

Each project is self-contained and can be run independently:

# Choose any project directory
cd [onnx-runtime-web|transformers-js|webllm-langchain]

# Install dependencies
npm install

# Start development server
npm run dev

What Makes This Special

  • No Server Required: All AI processing happens locally in your browser
  • Privacy-First: Your data never leaves your device
  • Offline Capable: Works without internet after initial model downloads
  • Modern Tech Stack: Built with the latest web technologies
  • Educational: Perfect for learning about browser-based AI

Technology Comparison

Project            | Primary Library | Model Type                | Complexity | Best For
ONNX Runtime Web   | ONNX Runtime    | Single model (DistilBERT) | Simple     | ONNX model deployment
Transformers.js    | Hugging Face    | Multiple specialized      | Moderate   | Multi-task AI suite
WebLLM + LangChain | WebLLM          | Large language model      | Advanced   | Conversational AI

Requirements

  • Node.js 14+ (18+ recommended)
  • Modern browser with WebAssembly support
  • RAM: 4GB+ recommended (8GB+ for WebLLM project)
  • Storage: ~500MB-2GB for model caching (varies by project)

Learning Path

  1. Start with ONNX Runtime Web - Simple single-model deployment
  2. Explore Transformers.js - Multiple AI tasks and model types
  3. Try WebLLM + LangChain - Advanced conversational AI

Key Features Across Projects

  • Client-side inference - No external API calls
  • Model caching - Models downloaded once, cached locally (configuration sketched after this list)
  • Type safety - Full TypeScript support
  • Responsive design - Works on desktop and mobile
  • Error handling - Robust error management and user feedback
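
As a concrete example of the model-caching point above, Transformers.js exposes env flags that control where models are fetched from and whether the browser Cache API is used. Whether these projects set them explicitly is an assumption; the defaults already cache downloads.

// Caching knobs in Transformers.js (illustrative; the projects may rely on defaults).
import { env } from "@huggingface/transformers";

env.useBrowserCache = true;   // store downloaded model files via the browser Cache API
env.allowLocalModels = false; // resolve model IDs against the Hugging Face Hub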

Development

Each project uses similar tooling:

  • Vite for fast development and building (a representative config is sketched below)
  • TypeScript for type safety
  • Modern ES modules for clean imports
  • Hot reload for rapid development
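
For reference, a representative vite.config.ts is sketched below; it is not copied from the repo, and the Svelte project would additionally register its framework plugin. Targeting esnext keeps top-level await, commonly used while loading models, working in the bundled output.

// Representative vite.config.ts (an illustration, not taken from the repo).
import { defineConfig } from "vite";

export default defineConfig({
  build: {
    target: "esnext", // allow top-level await when loading models
  },
});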

License

MIT License - Use these projects for learning, building, or inspiration!


Ready to explore the future of browser-based AI? Pick a project, clone the repo, and start experimenting!

This README was generated with Claude Code.
