A modern, feature-rich AI chat application built with Next.js 15, featuring persistent conversations, intelligent memory context, and seamless Ollama integration.
## Table of Contents

- About The Project
- Features
- Tech Stack
- Prerequisites
- Installation
- Configuration
- Running the Application
- Project Structure
- API Documentation
- Contributing
- Troubleshooting
- Deployment
- License
- Contact
- Acknowledgments
- Project Status
## About The Project

Cogenix is an intelligent conversational AI platform designed to provide seamless interactions with local AI models through Ollama. Built with modern web technologies, it offers a production-ready solution for deploying AI chat interfaces with enterprise-grade features.
The primary goals of Cogenix are to:
- Democratize AI Access: Provide an intuitive interface for interacting with locally-run AI models via Ollama
- Preserve Context: Maintain conversation history and context across sessions using MongoDB persistence
- Enable Organization: Offer thread-based conversation management for better workflow organization
- Enhance User Experience: Deliver real-time streaming responses with a beautiful, responsive UI
- Ensure Privacy: Keep all conversations local and secure with self-hosted infrastructure
Typical use cases:

- Personal AI Assistant: Daily tasks, brainstorming, and research
- Development Tool: Code assistance, debugging, and technical documentation
- Learning Platform: Educational conversations and knowledge exploration
- Enterprise Solution: Internal AI-powered support and documentation systems
## Features

- 💬 Real-time Streaming Chat: Experience AI responses as they're generated with server-sent events
- 🧠 Intelligent Memory Context: AI remembers and references previous conversations
- 🗂️ Thread Management: Organize conversations into separate threads with persistent storage
- 🎨 Advanced Theme System (see the sketch after this list):
  - Light Mode - Clean, professional interface
  - Dark Mode - Eye-friendly, reduced strain
  - System Mode - Automatically syncs with OS preferences
- 📊 Token Statistics: Track prompt and completion tokens for usage monitoring
- 💾 Data Persistence: All conversations securely stored in MongoDB
- 🎯 Model Selection: Switch between different Ollama models on the fly
- ⚡ Performance Optimized: Built with Next.js 15 Turbopack for fast development builds
- 🔒 Type-Safe: Full TypeScript implementation for robust code quality
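System Mode typically relies on the `prefers-color-scheme` media query. Here is a minimal sketch of how `src/contexts/ThemeContext.tsx` might sync with the OS preference; the names and details are illustrative, not the project's exact code:

```tsx
// Hypothetical sketch of System Mode syncing; the real ThemeContext.tsx may differ.
'use client';
import { createContext, useContext, useEffect, useState, type ReactNode } from 'react';

type Theme = 'light' | 'dark' | 'system';
const ThemeContext = createContext<{ theme: Theme; setTheme: (t: Theme) => void } | null>(null);

export function ThemeProvider({ children }: { children: ReactNode }) {
  const [theme, setTheme] = useState<Theme>('system');

  useEffect(() => {
    const media = window.matchMedia('(prefers-color-scheme: dark)');
    const apply = () => {
      const dark = theme === 'dark' || (theme === 'system' && media.matches);
      document.documentElement.classList.toggle('dark', dark);
    };
    apply();
    media.addEventListener('change', apply); // re-apply when the OS preference flips
    return () => media.removeEventListener('change', apply);
  }, [theme]);

  return <ThemeContext.Provider value={{ theme, setTheme }}>{children}</ThemeContext.Provider>;
}

export const useTheme = () => {
  const ctx = useContext(ThemeContext);
  if (!ctx) throw new Error('useTheme must be used within ThemeProvider');
  return ctx;
};
```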
Frontend architecture:

- Server-Sent Events (SSE) for real-time streaming
- React Query for efficient state management and caching
- Feature-based architecture for scalability
- Responsive design with Tailwind CSS 4

Backend architecture:

- MongoDB with Mongoose ODM for data modeling
- RESTful API architecture
- Error handling and retry logic
- Connection pooling and optimization (see the connection sketch below)
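In a Next.js app, connection pooling usually takes the form of a single cached Mongoose connection that survives hot reloads and serverless invocations. A minimal sketch of what `src/config/database.ts` could look like, as an assumed shape rather than the actual file:

```ts
// src/config/database.ts: assumed shape of a cached Mongoose connection.
import mongoose from 'mongoose';

const MONGODB_URI = process.env.MONGODB_URI ?? 'mongodb://localhost:27017/cogenix';

// Cache the connection on the global object so dev hot reloads and
// serverless invocations reuse one pool instead of opening new ones.
const globalAny = global as typeof globalThis & {
  _mongoose?: { conn: typeof mongoose | null; promise: Promise<typeof mongoose> | null };
};
const cached = globalAny._mongoose ?? (globalAny._mongoose = { conn: null, promise: null });

export async function connectToDatabase(): Promise<typeof mongoose> {
  if (cached.conn) return cached.conn;
  if (!cached.promise) {
    cached.promise = mongoose.connect(MONGODB_URI, { maxPoolSize: 10 });
  }
  cached.conn = await cached.promise;
  return cached.conn;
}
```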
## Tech Stack

Frontend:

- Framework: Next.js 15.5.5 - React framework with App Router
- UI Library: React 19.1.0 - Latest React with concurrent features
- Language: TypeScript 5.x - Type-safe development
- Styling: Tailwind CSS 4 - Utility-first CSS framework
- State Management: TanStack React Query 5.90.3 - Server state management
- HTTP Client: Axios 1.12.2 - Promise-based HTTP client

Backend:

- Database: MongoDB - NoSQL document database
- ODM: Mongoose 8.19.1 - MongoDB object modeling
- AI Backend: Ollama - Local AI model runtime
- API: Next.js API Routes - Serverless API endpoints

Development Tools:

- Build Tool: Turbopack - Next.js native bundler
- Package Manager: npm/yarn/pnpm - Dependency management
- Code Quality: ESLint, Prettier - Code formatting and linting
## Prerequisites

Before you begin, ensure you have the following installed:
1. Node.js (v20.x or higher)

   ```bash
   # Check version
   node --version   # Should output: v20.x.x or higher
   ```

2. npm (v10.x or higher), yarn, or pnpm

   ```bash
   npm --version
   ```

3. MongoDB (v6.0 or higher)

   - MongoDB Community Edition, or
   - MongoDB Atlas (free tier available)

4. Ollama (latest version)

   - Install from the Ollama official website
   - Ensure it's running on http://localhost:11434

   ```bash
   # Verify Ollama installation
   ollama --version

   # Pull a model (e.g., llama3)
   ollama pull llama3

   # Verify the model is available
   ollama list
   ```
Optional tools:

- Git: For version control
- VS Code: Recommended IDE with TypeScript support
- MongoDB Compass: GUI for MongoDB database management
## Installation

### 1. Clone the Repository

```bash
# Using HTTPS
git clone https://github.com/yourusername/cogenix.git

# Or using SSH
git clone git@github.com:yourusername/cogenix.git

# Navigate to project directory
cd cogenix
```

### 2. Install Dependencies

```bash
# Using npm
npm install

# Using yarn
yarn install

# Using pnpm
pnpm install
```

This will install all required dependencies listed in package.json.
## Configuration

### 1. Create Your Environment File

```bash
# Copy the template
cp env.template .env.local
```

Edit .env.local with your actual values:

```env
# =================================
# Backend Configuration
# =================================
# Ollama Backend URL
OLLAMA_URL=http://localhost:11434
# =================================
# Model Configuration
# =================================
# Default AI model (must be pulled in Ollama)
DEFAULT_MODEL=llama3
# =================================
# Database Configuration
# =================================
# MongoDB Connection String
# Local MongoDB:
MONGODB_URI=mongodb://localhost:27017/cogenix
# Or MongoDB Atlas (recommended for production):
# MONGODB_URI=mongodb+srv://username:password@cluster.mongodb.net/cogenix?retryWrites=true&w=majority
# =================================
# Application Settings
# =================================
# Node environment
NODE_ENV=development
# =================================
# Optional: Additional Configuration
# =================================
# Uncomment if using external AI services
# OPENAI_API_KEY=sk-...
# ANTHROPIC_API_KEY=sk-ant-...
# Uncomment if implementing authentication
# JWT_SECRET=your-super-secret-key-here
# Public variables (exposed to browser)
# NEXT_PUBLIC_APP_URL=http://localhost:3000
```
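A small typed wrapper keeps these variables in one place. A possible sketch of `src/config/env.ts` (the real file may differ):

```ts
// src/config/env.ts (illustrative shape, not the project's exact file)
export const env = {
  ollamaUrl: process.env.OLLAMA_URL ?? 'http://localhost:11434',
  defaultModel: process.env.DEFAULT_MODEL ?? 'llama3',
  mongodbUri: process.env.MONGODB_URI ?? 'mongodb://localhost:27017/cogenix',
  isProduction: process.env.NODE_ENV === 'production',
} as const;
```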
### 2. Verify Ollama Setup

Ensure Ollama is running and has models available:

```bash
# Start Ollama (if not already running)
ollama serve
# In another terminal, verify connection
curl http://localhost:11434/api/tags
# Pull required models
ollama pull llama3 # Recommended default
ollama pull mistral # Alternative option
ollama pull codellama   # For code-focused tasks
```

### 3. Verify MongoDB Setup

```bash
# If using local MongoDB, ensure it's running
mongosh
# Or test MongoDB Atlas connection
mongosh "your_mongodb_connection_string"Start the development server with hot-reload:
# Using npm
npm run dev
# Using yarn
yarn dev
# Using pnpm
pnpm dev
```

The application will be available at:
- Frontend: http://localhost:3000
- API: http://localhost:3000/api
### Production Mode

Build and start the production server:

```bash
# Build the application
npm run build
# Start production server
npm run start
```

### Verify the Setup

1. Open the application: Navigate to http://localhost:3000
2. Check Ollama connection: You should see available models in the UI
3. Send a test message: Type "Hello" and verify you get a response
4. Check thread persistence: Refresh the page and verify your conversation is saved
## Project Structure

```
cogenix/
├── public/                      # Static assets
│   ├── file.svg
│   ├── globe.svg
│   └── ...
├── src/
│   ├── app/                     # Next.js App Router
│   │   ├── api/                 # API Routes
│   │   │   ├── chat/            # Chat endpoints
│   │   │   │   └── route.ts     # POST /api/chat
│   │   │   ├── memory/          # Memory management
│   │   │   │   └── clear/
│   │   │   │       └── route.ts
│   │   │   ├── models/          # Model listing
│   │   │   │   └── route.ts
│   │   │   └── threads/         # Thread management
│   │   │       ├── route.ts     # GET/POST threads
│   │   │       └── [id]/
│   │   │           ├── route.ts
│   │   │           └── messages/
│   │   │               └── route.ts
│   │   ├── chat/                # Chat pages
│   │   │   └── [threadId]/
│   │   │       └── page.tsx
│   │   ├── layout.tsx           # Root layout
│   │   ├── page.tsx             # Home page
│   │   └── providers.tsx        # Context providers
│   ├── components/              # Shared components
│   │   ├── index.ts
│   │   └── SettingsModal.tsx
│   ├── config/                  # Configuration
│   │   ├── database.ts          # MongoDB connection
│   │   ├── env.ts               # Environment variables
│   │   └── index.ts
│   ├── constants/               # App constants
│   │   ├── api.ts               # API endpoints
│   │   └── index.ts
│   ├── contexts/                # React contexts
│   │   ├── ThemeContext.tsx     # Theme management
│   │   └── index.ts
│   ├── features/                # Feature modules
│   │   └── chat/                # Chat feature
│   │       ├── components/      # Chat components
│   │       │   ├── controls/
│   │       │   │   └── ModelSelector.tsx
│   │       │   ├── layout/
│   │       │   │   ├── ChatContainer.tsx
│   │       │   │   └── ChatContainerWithPersistence.tsx
│   │       │   ├── messages/
│   │       │   │   ├── ChatInput.tsx
│   │       │   │   ├── MemoryDisplay.tsx
│   │       │   │   └── MessageBubble.tsx
│   │       │   └── sidebar/
│   │       │       └── ThreadSidebar.tsx
│   │       ├── hooks/           # Custom React hooks
│   │       │   ├── useChat.ts
│   │       │   ├── useMemory.ts
│   │       │   ├── useModels.ts
│   │       │   ├── useThread.ts
│   │       │   └── useThreads.ts
│   │       ├── models/          # Data models
│   │       │   └── Thread.ts    # MongoDB Thread model
│   │       ├── services/        # API services
│   │       │   ├── chat.service.ts
│   │       │   ├── memory.service.ts
│   │       │   ├── model.service.ts
│   │       │   └── thread.service.ts
│   │       └── types/           # TypeScript types
│   │           ├── database.ts
│   │           └── index.ts
│   ├── lib/                     # Utility libraries
│   │   ├── axios.ts             # Axios configuration
│   │   └── index.ts
│   ├── styles/                  # Global styles
│   │   └── globals.css
│   └── types/                   # Global TypeScript types
│       └── api.ts
├── .env.local                   # Environment variables (create from template)
├── env.template                 # Environment template
├── next.config.ts               # Next.js configuration
├── package.json                 # Dependencies and scripts
├── postcss.config.mjs           # PostCSS configuration
├── tailwind.config.js           # Tailwind CSS configuration
├── tsconfig.json                # TypeScript configuration
└── README.md                    # This file
```
The project uses a feature-based architecture where each feature (like chat) contains:
- Components: UI components specific to the feature
- Hooks: Custom React hooks for business logic
- Services: API communication layer
- Types: TypeScript type definitions
- Models: Database schemas and models
This structure provides:
- Modularity: Easy to add/remove features
- Scalability: Clear separation of concerns
- Maintainability: Related code stays together
- Testability: Isolated units for testing
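To make the layering concrete, here is a hedged sketch of how a service and its React Query hook might pair up in the chat feature. The file and function names mirror the tree above, but the bodies are illustrative:

```ts
// src/features/chat/services/thread.service.ts (illustrative)
import axios from 'axios';

export interface Thread {
  _id: string;
  title: string;
  aiModel: string;
}

export const threadService = {
  // The service layer owns all HTTP details.
  async getThreads(): Promise<Thread[]> {
    const { data } = await axios.get<{ threads: Thread[] }>('/api/threads');
    return data.threads;
  },
};

// src/features/chat/hooks/useThreads.ts (illustrative)
import { useQuery } from '@tanstack/react-query';

export function useThreads() {
  // The hook layer adds caching, retries, and loading state via React Query.
  return useQuery({ queryKey: ['threads'], queryFn: threadService.getThreads });
}
```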
## API Documentation

Base URL: http://localhost:3000/api

### POST /api/chat

Request:

```http
POST /api/chat
Content-Type: application/json

{
"messages": [
{
"role": "user",
"content": "Hello, how are you?"
}
],
"model": "llama3",
"stream": true,
"threadId": "optional-thread-id"
}
```

Response (Streaming):

```text
data: {"content": "Hello", "done": false}
data: {"content": "! I'm", "done": false}
data: {"content": " doing well", "done": false}
data: [DONE]
```

Response (Non-Streaming):

```json
{
"content": "Hello! I'm doing well, thank you for asking.",
"model": "llama3",
"context": [],
"tokens": {
"prompt": 10,
"completion": 20,
"total": 30
}
}
```
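One way a browser client can consume the streaming response is to read the body with the Streams API and parse each `data:` line. A simplified sketch follows; the project's useChat hook may differ, and this version ignores chunks that split a line:

```ts
// Hypothetical streaming consumer for POST /api/chat (not the project's actual hook).
type ChatMessage = { role: 'user' | 'assistant'; content: string };

async function streamChat(messages: ChatMessage[], model = 'llama3'): Promise<string> {
  const res = await fetch('/api/chat', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ messages, model, stream: true }),
  });
  if (!res.ok || !res.body) throw new Error(`Chat request failed: ${res.status}`);

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let output = '';

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    // Each chunk holds zero or more "data: {...}" lines.
    for (const line of decoder.decode(value, { stream: true }).split('\n')) {
      const payload = line.replace(/^data: /, '').trim();
      if (!payload || payload === '[DONE]') continue;
      output += JSON.parse(payload).content; // append the streamed token(s)
    }
  }
  return output;
}
```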
### GET /api/models

Response:

```json
{
"models": [
{
"name": "llama3",
"size": "7B",
"modified_at": "2024-01-15T10:30:00Z"
}
]
}
```

### GET /api/threads

Response:

```json
{
"threads": [
{
"_id": "thread-id",
"title": "Conversation Title",
"aiModel": "llama3",
"messages": [...],
"createdAt": "2024-01-15T10:30:00Z",
"updatedAt": "2024-01-15T11:00:00Z"
}
]
}
```

### POST /api/threads

Request:

```http
POST /api/threads
Content-Type: application/json
{
"title": "New Conversation",
"aiModel": "llama3"
}
```

### Other Endpoints

- GET /api/threads/:id/messages
- DELETE /api/threads/:id
- POST /api/memory/clear
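As a quick usage sketch, a client might create a thread and then list threads like this, assuming the request/response shapes documented above:

```ts
// Hypothetical client-side calls against the thread endpoints above.
async function createThread(title: string, aiModel = 'llama3') {
  const res = await fetch('/api/threads', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ title, aiModel }),
  });
  if (!res.ok) throw new Error(`Failed to create thread: ${res.status}`);
  return res.json();
}

async function listThreads() {
  const res = await fetch('/api/threads');
  const { threads } = await res.json(); // shape documented in GET /api/threads above
  return threads;
}
```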
## Contributing

We welcome contributions from the community! Here's how you can help make Cogenix better.

1. Fork the Repository

   Click the 'Fork' button on GitHub.

2. Clone Your Fork

   ```bash
   git clone https://github.com/your-username/cogenix.git
   cd cogenix
   ```

3. Create a Feature Branch

   ```bash
   git checkout -b feature/your-feature-name
   # or
   git checkout -b fix/your-bug-fix
   ```
4. Make Your Changes
   - Write clean, maintainable code
   - Follow the existing code style
   - Add comments for complex logic
   - Update documentation if needed
5. Test Your Changes

   ```bash
   # Ensure the application runs without errors
   npm run dev

   # Test all affected features manually
   # TODO: Add automated tests when available
   ```
6. Commit Your Changes

   ```bash
   git add .
   git commit -m "feat: add amazing new feature"
   ```

   Commit Message Convention:

   - `feat:` New feature
   - `fix:` Bug fix
   - `docs:` Documentation changes
   - `style:` Code style changes (formatting)
   - `refactor:` Code refactoring
   - `test:` Adding tests
   - `chore:` Maintenance tasks
7. Push to Your Fork

   ```bash
   git push origin feature/your-feature-name
   ```
8. Create a Pull Request

   - Go to the original repository
   - Click "New Pull Request"
   - Select your fork and branch
   - Fill in the PR template
   - Wait for review
### Code Style

TypeScript:

```ts
// ✅ Good
interface ChatMessage {
  role: 'user' | 'assistant';
  content: string;
  timestamp: Date;
}

// ✅ Use descriptive names
const fetchUserMessages = async (userId: string) => {
  // Implementation
};

// ❌ Avoid
const a = async (b: string) => {
  // Implementation
};
```

React components:

```tsx
// ✅ Good - Use functional components with TypeScript
interface ButtonProps {
label: string;
onClick: () => void;
variant?: 'primary' | 'secondary';
}
export const Button: React.FC<ButtonProps> = ({
label,
onClick,
variant = 'primary'
}) => {
return (
<button onClick={onClick} className={`btn-${variant}`}>
{label}
</button>
);
};
```

File organization:

- One component per file
- Co-locate related files (component + styles + tests)
- Use index.ts for clean exports
- Keep files under 300 lines when possible
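For instance, a components barrel might look like this (a hypothetical file, with paths taken from the project tree above):

```ts
// src/features/chat/components/index.ts (hypothetical barrel file)
export { ChatInput } from './messages/ChatInput';
export { MessageBubble } from './messages/MessageBubble';
export { ThreadSidebar } from './sidebar/ThreadSidebar';
```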
### Areas for Contribution

- 🧪 Testing: Setting up Jest and React Testing Library
- 📱 Mobile Optimization: Improving mobile responsiveness
- 🌐 Internationalization: Adding multi-language support
- ♿ Accessibility: Improving ARIA labels and keyboard navigation
- 📝 Documentation: Improving inline code documentation
- 🐛 Bug Fixes: Check the Issues page
- ✨ Features: See Feature Requests
### Reporting Bugs

When reporting bugs, please include:
- Description: Clear description of the issue
- Steps to Reproduce: Detailed steps to reproduce the bug
- Expected Behavior: What should happen
- Actual Behavior: What actually happens
- Environment:
  - OS: (e.g., Windows 11, macOS 14, Ubuntu 22.04)
  - Node version: (e.g., v20.10.0)
  - Browser: (e.g., Chrome 120, Firefox 121)
  - Ollama version: (e.g., 0.1.20)
- Screenshots: If applicable
- Logs: Console errors or server logs
### Suggesting Features

We love new ideas! When suggesting features:
- Use Case: Explain why this feature is needed
- Description: Detailed description of the feature
- Mockups: Wireframes or mockups if applicable
- Alternatives: Alternative solutions you've considered
### Pull Request Review

All PRs go through review:
- Automated Checks: Linting and build checks (coming soon)
- Code Review: At least one maintainer reviews the code
- Testing: Manual testing of the feature
- Approval: PR is merged after approval
## Troubleshooting

### Ollama Connection Failed

Solution:

```bash
# 1. Verify Ollama is running
curl http://localhost:11434/api/tags
# 2. If not running, start Ollama
ollama serve
# 3. Check if the port is correct in .env.local
OLLAMA_URL=http://localhost:11434
```

### MongoDB Connection Error

Solution:

```bash
# 1. Check if MongoDB is running
mongosh
# 2. Verify connection string in .env.local
MONGODB_URI=mongodb://localhost:27017/cogenix
# 3. If using MongoDB Atlas, check:
# - IP whitelist includes your IP
# - Username and password are correct
# - Database name is correct
```

### Dependency Installation Errors

Solution:

```bash
# 1. Delete node_modules and package-lock.json
rm -rf node_modules package-lock.json
# 2. Clear npm cache
npm cache clean --force
# 3. Reinstall dependencies
npm install
```

### Port 3000 Already in Use

Solution:

```bash
# Option 1: Kill the process using port 3000
# On macOS/Linux:
lsof -ti:3000 | xargs kill -9
# On Windows:
netstat -ano | findstr :3000
taskkill /PID <PID> /F
# Option 2: Use a different port
PORT=3001 npm run dev
```

### Slow AI Responses

Solution:
- Check system resources (RAM, CPU)
- Ensure you're using an appropriate model size for your hardware
- Try a smaller model: `ollama pull llama2:7b`
- Check the Ollama server logs (macOS: ~/.ollama/logs/server.log; Linux: `journalctl -u ollama`)
### Getting Help

If you're still experiencing issues:
- Check existing issues: GitHub Issues
- Create a new issue: Use the bug report template
- Join discussions: GitHub Discussions
## Deployment

### Deploy on Vercel

1. Go to Vercel
2. Import your repository
3. Configure environment variables in the Vercel dashboard:
   - MONGODB_URI: Your MongoDB Atlas connection string
   - OLLAMA_URL: Your hosted Ollama instance (or use a cloud provider)
   - DEFAULT_MODEL: Your default AI model
4. Deploy
Note: For production, you'll need:
- MongoDB Atlas (free tier available)
- Hosted Ollama instance or alternative AI backend
### Other Platforms

The application can be deployed to any platform that supports Next.js:
- Netlify
- Railway
- DigitalOcean App Platform
- AWS Amplify
- Google Cloud Run
See Next.js deployment documentation for platform-specific guides.
## License

This project is licensed under the MIT License - see the LICENSE file for details.
## Contact

- Project Maintainer: Ashim
- Project Link: https://github.com/yourusername/cogenix
- Issues: https://github.com/yourusername/cogenix/issues
## Acknowledgments

- Next.js - The React framework for production
- Ollama - Run large language models locally
- MongoDB - Database platform
- Vercel - Hosting platform
- TanStack Query - Data synchronization
- Tailwind CSS - Utility-first CSS framework
## Project Status

Current Version: 0.1.0
Status: Active Development
Roadmap:
- Add automated testing (Jest, React Testing Library)
- Implement user authentication
- Add support for multiple AI backends (OpenAI, Anthropic)
- Create Docker deployment configuration
- Add conversation export functionality
- Implement voice input/output
- Add plugin system for extensibility
- Create mobile applications (React Native)
Made with ❤️ by the Cogenix Team

⭐ Star us on GitHub - it motivates us to keep improving!