Changelog

Major features and improvements in this fork of llmchat.co

Temporary Chat Visual Redesign

Updated temporary chat styling to be consistent with the rest of the project UI.

Improvement
  • Simplified the indicator to a single line of centered text (no card or border)
  • Greeting now matches the normal chat greeting style and typography
  • Removed complex gradients, shadows, and decorative elements
  • Chat input stays centered like normal chat mode
  • Added subtle "End session" and "Switch to normal" links

Image Generation Reliability

Improved Gemini image generation stability and resilience.

Fix
  • Removed the unsupported response MIME type configuration for Gemini 2.5 Flash Image Preview
  • Added exponential backoff retries for Gemini rate-limit and transient errors (see the sketch after this list)
  • Surfaced clearer messaging when prompts are blocked or no images are returned
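The backoff follows the usual pattern: retry on 429s and transient 5xx responses with a doubling delay, and surface anything non-transient (such as a blocked prompt) immediately. A minimal sketch of that pattern is below; the helper name, error shape, and retry parameters are illustrative rather than the fork's exact implementation.

```typescript
// Illustrative retry wrapper: retry calls that fail with rate-limit (429) or
// transient (5xx) errors, backing off exponentially. `callGemini` and the
// error shape are placeholders, not the project's real API.
async function withBackoff<T>(
  callGemini: () => Promise<T>,
  maxAttempts = 4,
  baseDelayMs = 1000,
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    try {
      return await callGemini();
    } catch (err: any) {
      lastError = err;
      const status = err?.status ?? err?.response?.status;
      const transient = status === 429 || (status >= 500 && status < 600);
      // Non-transient failures (e.g. blocked prompts) are rethrown right away.
      if (!transient || attempt === maxAttempts - 1) throw err;
      // Exponential backoff with jitter: ~1s, 2s, 4s, plus up to 250ms of noise.
      const delay = baseDelayMs * 2 ** attempt + Math.random() * 250;
      await new Promise((resolve) => setTimeout(resolve, delay));
    }
  }
  throw lastError;
}
```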

Development & Build Improvements

Enhanced development experience and build stability.

Chore
  • Configured the production build to complete successfully
  • Updated dependencies and resolved conflicts
  • Improved TypeScript configuration
  • Added proper error handling for build compatibility
  • Updated project structure and organization
  • Enhanced analytics and tracking configuration

Authentication & Security

Enhanced user authentication with custom sign-up and improved security.

Feature
  • Implemented CustomSignUp component for user registration
  • Added an email verification system (see the sketch after this list)
  • Streamlined the sign-in process for a better user experience
  • Improved user profile management
  • Updated authentication flow for mobile compatibility
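At a high level, sign-up is a two-step flow: collect credentials, then confirm an emailed verification code. The sketch below uses hypothetical `signUp` and `verifyEmail` helpers to show the shape of that flow; the real component is wired to the project's auth backend.

```tsx
// Illustrative two-step sign-up: create the account, then confirm the emailed
// verification code. `signUp` and `verifyEmail` are hypothetical helpers, not
// the fork's actual auth API.
import { useState } from "react";

declare function signUp(email: string, password: string): Promise<void>;
declare function verifyEmail(email: string, code: string): Promise<void>;

export function CustomSignUp() {
  const [step, setStep] = useState<"details" | "verify">("details");
  const [email, setEmail] = useState("");
  const [password, setPassword] = useState("");
  const [code, setCode] = useState("");

  if (step === "details") {
    return (
      <form
        onSubmit={async (e) => {
          e.preventDefault();
          await signUp(email, password); // triggers the verification email
          setStep("verify");
        }}
      >
        <input value={email} onChange={(e) => setEmail(e.target.value)} placeholder="Email" />
        <input
          type="password"
          value={password}
          onChange={(e) => setPassword(e.target.value)}
          placeholder="Password"
        />
        <button type="submit">Create account</button>
      </form>
    );
  }

  return (
    <form
      onSubmit={async (e) => {
        e.preventDefault();
        await verifyEmail(email, code); // completes registration
      }}
    >
      <input value={code} onChange={(e) => setCode(e.target.value)} placeholder="Verification code" />
      <button type="submit">Verify email</button>
    </form>
  );
}
```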

Mobile Support Enhancement

Full mobile support with responsive design and a touch-friendly interface.

Feature
  • Removed "desktop only" restriction
  • Added mobile hamburger menu navigation (see the sketch after this list)
  • Implemented responsive padding and layouts
  • Enhanced sidebar functionality for mobile devices
  • Optimized message display for smaller screens
  • Added mobile-first design principles
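The hamburger navigation boils down to a toggle that shows the sidebar as a slide-in drawer below the desktop breakpoint. A rough sketch follows, assuming a Tailwind-style utility class setup; the component and class names are placeholders for the fork's actual layout.

```tsx
// Illustrative mobile navigation: the sidebar is a fixed drawer on small
// screens and a static column on desktop. Names and Tailwind classes are
// placeholders, not the fork's real components.
import { useState, type ReactNode } from "react";

export function MobileNav({ sidebar }: { sidebar: ReactNode }) {
  const [open, setOpen] = useState(false);
  return (
    <>
      {/* Hamburger button, visible only below the md breakpoint */}
      <button className="p-2 md:hidden" aria-label="Open menu" onClick={() => setOpen(true)}>
        ☰
      </button>
      {/* Sidebar: slide-in drawer on mobile, static column on desktop */}
      <aside
        className={`fixed inset-y-0 left-0 z-40 w-64 bg-white transition-transform md:static md:translate-x-0 ${
          open ? "translate-x-0" : "-translate-x-full"
        }`}
        onClick={() => setOpen(false)}
      >
        {sidebar}
      </aside>
    </>
  );
}
```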

UI Component Simplification

Replaced complex UI library components with native solutions for better stability.

Improvement
  • Replaced Dialog and Popover components with custom modals (see the sketch after this list)
  • Converted DropdownMenu components to native solutions
  • Simplified Command components with basic alternatives
  • Replaced HoverCard with custom hover state management
  • Removed complex UI library dependencies for stable builds
  • Enhanced mobile responsiveness and layout
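Replacing a library Dialog with a native modal looks roughly like the following; the component, prop, and class names are placeholders rather than the fork's actual code.

```tsx
// Illustrative stand-in for a library Dialog: a plain React modal rendered
// conditionally, closed via backdrop click or the Escape key.
import { useEffect, type ReactNode } from "react";

type SimpleModalProps = {
  open: boolean;
  onClose: () => void;
  children: ReactNode;
};

export function SimpleModal({ open, onClose, children }: SimpleModalProps) {
  // Close on Escape, like the library Dialog did.
  useEffect(() => {
    const onKey = (e: KeyboardEvent) => {
      if (e.key === "Escape") onClose();
    };
    window.addEventListener("keydown", onKey);
    return () => window.removeEventListener("keydown", onKey);
  }, [onClose]);

  if (!open) return null;
  return (
    // Backdrop click closes; clicks inside the panel are stopped from bubbling.
    <div
      className="fixed inset-0 z-50 flex items-center justify-center bg-black/50"
      onClick={onClose}
    >
      <div className="rounded-lg bg-white p-4 shadow-lg" onClick={(e) => e.stopPropagation()}>
        {children}
      </div>
    </div>
  );
}
```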

Cost System Removal

Eliminated credit and cost systems for a completely free experience.

Improvement
  • Removed all credit-related UI components and logic
  • Eliminated API key requirements for model access
  • Removed cost calculations and credit displays
  • Simplified chat mode options without cost considerations
  • Updated settings to remove API key and credits sections

AI Provider Overhaul

Streamlined AI model support with focus on Google Gemini and OpenRouter integration.

Feature
  • Removed legacy providers (OpenAI, Anthropic, Together AI, Fireworks)
  • Added comprehensive Google Gemini provider support
  • Integrated OpenRouter for community models access
  • Added 5 new OpenRouter free models: Grok 4 Fast, GLM 4.5 Air, DeepSeek Chat v3.1, GPT-OSS 120B, and Dolphin Mistral 24B Venice (illustrative registry below)
  • Updated Gemini model identifiers for 2.5 Pro support
  • Simplified model selection to Gemini 2.0 and 2.5 Flash
  • Increased max tokens to 500,000 for all modes
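A sketch of how the new free models might be registered is below. The config shape and the exact OpenRouter model slugs are assumptions; check OpenRouter's model list for the current identifiers.

```typescript
// Illustrative model registry for the OpenRouter additions. The entry shape and
// the model slugs are assumptions, not the fork's actual configuration.
type ModelEntry = {
  id: string; // OpenRouter model slug (assumed)
  label: string; // Name shown in the model picker
  provider: "openrouter" | "google";
};

const openRouterFreeModels: ModelEntry[] = [
  { id: "x-ai/grok-4-fast:free", label: "Grok 4 Fast", provider: "openrouter" },
  { id: "z-ai/glm-4.5-air:free", label: "GLM 4.5 Air", provider: "openrouter" },
  { id: "deepseek/deepseek-chat-v3.1:free", label: "DeepSeek Chat v3.1", provider: "openrouter" },
  { id: "openai/gpt-oss-120b:free", label: "GPT-OSS 120B", provider: "openrouter" },
  {
    id: "cognitivecomputations/dolphin-mistral-24b-venice-edition:free",
    label: "Dolphin Mistral 24B Venice",
    provider: "openrouter",
  },
];
```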

Fork Origin

This project is a fork of the original llmchat.co, focused on simplification and enhanced AI provider support.

Update
  • Forked from llmchat.co codebase
  • Rebranded to focus on core chatbot functionality
  • Maintained privacy-first approach with local data storage

This is a fork of llmchat.co, enhanced with simplified AI provider support, mobile compatibility, and cost-free access. View the full commit history on GitHub.