An interactive portfolio that leverages the Gemini API to provide a dynamic, conversational experience. This is not just a static portfolio; it's an interactive application where users can chat with an AI assistant to learn more about my work.
Traditional portfolios are static and passive. This project transforms the conventional portfolio into an engaging, interactive experience, allowing visitors to directly query an AI assistant about projects and skills, providing a deeper, more personalized understanding of my work.
- ✨ Features
- 🛠️ Technology Stack
- 🏗️ Architecture
- 🧪 Testing
- 📚 Documentation
- 🔒 API Access Model & Security
- 🚀 Quick Start
- 🐳 Docker
- 📸 Visual Demo
- 🤝 Contributing
- 📄 License
- 📫 Contact
- 🤖 Conversational AI Chatbot: Engage directly with an AI assistant powered by the Gemini API to explore projects and gain insights.
- 🎨 Dynamic Project Showcase: A clean, modern interface designed to beautifully present diverse portfolio projects.
- 🔍 Intelligent Semantic Search: Leverage AI to search for projects with natural language queries, returning highly relevant results. This now includes a robust keyword fallback and graceful handling of API quota errors, so search remains available and user-friendly.
- 📞 Seamless Contact Integration: The chatbot is designed to guide users to an interactive contact form, simplifying communication. Note: this feature is planned for future implementation.
- 💾 Session-based Conversations: Chat history is automatically saved to `sessionStorage`, ensuring continuity within a single browser tab and clearing when the tab closes. The full conversation history is now sent with each request to the worker, so the AI model maintains context.
- 🎤 Intuitive Voice Input: The application is designed to allow hands-free interaction with the chatbot using voice-to-text functionality via the Web Speech API. Note: this feature is planned for future implementation.
- 🌓 Adaptive Light/Dark Mode: Personalize your viewing experience with a toggle for light and dark themes.
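The session-based persistence described above can be sketched in a few lines. `ChatMessage`, `HISTORY_KEY`, and the `KVStore` interface are illustrative names for this sketch, not the project's actual identifiers:

```typescript
// Sketch: persisting chat history for the lifetime of a browser tab.
// All identifiers here are illustrative, not taken from the codebase.
interface ChatMessage {
  role: "user" | "model";
  text: string;
}

const HISTORY_KEY = "chatHistory";

// A Storage-shaped interface so the logic is testable outside a browser.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

function saveHistory(store: KVStore, history: ChatMessage[]): void {
  store.setItem(HISTORY_KEY, JSON.stringify(history));
}

function loadHistory(store: KVStore): ChatMessage[] {
  const raw = store.getItem(HISTORY_KEY);
  if (!raw) return [];
  try {
    return JSON.parse(raw) as ChatMessage[];
  } catch {
    return []; // corrupted entry: start a fresh conversation
  }
}

// In the browser, `sessionStorage` satisfies KVStore directly:
// saveHistory(sessionStorage, history);
```

Because `sessionStorage` is scoped to the tab, closing the tab clears the conversation automatically, with no cleanup code needed.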
This project is built with a selection of modern and efficient technologies, chosen for their performance, flexibility, and developer experience.
- Frontend: TypeScript, HTML5, CSS3 (no framework; JavaScript template literals are used for HTML templating)
- AI Layer: Cloudflare Workers (secure API proxy, distributed KV-backed rate limiting, refined guardrails, embedding generation with caching, calling the Google Gemini API directly); Google Gemini API (`gemini-2.0-flash` chat model, `embedding-001` embedding model)
- Testing: Playwright (for end-to-end testing), Vitest (for Worker unit testing)
- Speech Recognition: Web Speech API
The application is a client-side, single-page application (SPA). Rather than calling the Google Gemini API directly from the browser, it routes all AI requests through a Cloudflare Worker that acts as a secure proxy.
For a better viewing experience on GitHub, the diagram is rendered from a `.mmd` file. It includes icons, which require Font Awesome to be available in the rendering environment.
```mermaid
flowchart LR
    subgraph "Browser"
        A[Vite SPA]
    end
    subgraph "Cloudflare"
        B[Worker]
        C[KV RATE_LIMIT_KV]
        G[Guardrails]
        T[Tools]
        E[KV PROJECT_EMBEDDINGS_KV]
    end
    subgraph "Google Cloud"
        D[Gemini API]
    end
    %% Connections
    A -- "POST /chat (prompt, history)" --> B
    A -- "POST /api/generateEmbedding" --> B
    B -- "Auth & Rate Limit" --> C
    B -- "Apply Guardrails" --> G
    G -- "If safe, proceed" --> B
    B -- "generateContent (with tools, history)" --> D
    D -- "response (text/tool_call)" --> B
    B -- "Execute Tool (e.g., projectSearch)" --> T
    T -- "Tool Output (projects, notice)" --> B
    B -- "Cache Query Embedding" --> E
    E -- "Retrieve Project Embeddings" --> T
    B -- "Streaming SSE (text/tool_response)" --> A
```
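The "Streaming SSE" edge in the diagram means the frontend consumes the worker's reply incrementally as `data:` events. A minimal sketch of splitting a received chunk into payloads; the exact event format (one payload per `data:` line, with a `[DONE]` sentinel) is an assumption for illustration, not taken from the worker's code:

```typescript
// Sketch: extracting `data:` payloads from a chunk of an SSE stream.
// The `[DONE]` sentinel and one-payload-per-line format are assumptions.
function parseSseChunk(chunk: string): string[] {
  return chunk
    .split("\n")
    .filter((line) => line.startsWith("data:"))
    .map((line) => line.slice("data:".length).trim())
    .filter((data) => data.length > 0 && data !== "[DONE]");
}
```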
The Cloudflare Worker acts as a secure proxy and backend for AI-related functionality, exposing the following key endpoints:
- `/chat`: Handles conversational requests, forwarding them to the Gemini API, applying rate limiting, and enforcing guardrails to prevent sensitive content injection.
- `/api/generateEmbedding`: Generates vector embeddings for text, also protected by rate limiting and guardrails. This endpoint is designed for internal use by the application (e.g., for semantic search), not for direct client access.
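The KV-backed rate limiting both endpoints share can be sketched as a fixed-window counter. The key format, window length, and request limit below are illustrative assumptions, not the worker's actual values:

```typescript
// Sketch: fixed-window rate limiting backed by a KV-shaped store.
// WINDOW_SECONDS, MAX_REQUESTS, and the key format are illustrative.
interface CounterStore {
  get(key: string): Promise<string | null>;
  put(key: string, value: string): Promise<void>;
}

const WINDOW_SECONDS = 60;
const MAX_REQUESTS = 10;

async function isRateLimited(
  kv: CounterStore,
  clientId: string,
  nowMs: number,
): Promise<boolean> {
  // Bucket requests into fixed windows so old counters expire naturally.
  const windowId = Math.floor(nowMs / 1000 / WINDOW_SECONDS);
  const key = `rl:${clientId}:${windowId}`;
  const count = parseInt((await kv.get(key)) ?? "0", 10);
  if (count >= MAX_REQUESTS) return true;
  await kv.put(key, String(count + 1));
  return false;
}
```

In a real Worker, the KV binding (e.g., the `RATE_LIMIT_KV` namespace) satisfies this interface, and a TTL on each key would let Cloudflare garbage-collect expired windows.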
- Technologies: Vanilla TypeScript, HTML, CSS.
- Responsibilities: Renders the main portfolio page, including the header, hero section, and project cards. It also provides the user interface for the chatbot, including the chat window, message history, and input form. All UI manipulation is handled directly via the DOM.
- Technologies: TypeScript.
- Responsibilities: This is the core of the application, running entirely in the user's browser.
- State Management: Manages the application state, such as the conversation history.
- AI Integration: Handles communication with the Cloudflare Worker, which processes and simplifies the Gemini API's raw response before sending a clean, structured response to the frontend.
- Orchestration Logic: Contains the logic to interpret user intent based on keywords.
- Data Persistence: Uses the browser's `localStorage` to save and load the chat history.
- Project Data: Project information is sourced from `frontend/projects.ts` and sent with each chat request to the worker.
- Conversation History: Stored in a JavaScript array in memory during the session and persisted to `localStorage`.
- Vector Embeddings: Project embeddings for semantic search are generated by the Cloudflare Worker (via the `/api/generateEmbedding` endpoint) and cached in frontend memory (`projectEmbeddings`) on application load.
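Once the embeddings are cached, semantic search reduces to ranking projects by cosine similarity between the query embedding and each project embedding. A sketch under assumed data shapes (`ProjectEmbedding` is illustrative, not the project's actual type):

```typescript
// Sketch: ranking cached project embeddings against a query embedding.
// The ProjectEmbedding shape is an assumption for illustration.
interface ProjectEmbedding {
  title: string;
  vector: number[];
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  // Guard against zero vectors to avoid dividing by zero.
  return dot / (Math.sqrt(normA) * Math.sqrt(normB) || 1);
}

function rankProjects(
  query: number[],
  projects: ProjectEmbedding[],
  topK = 3,
): ProjectEmbedding[] {
  return [...projects]
    .sort((p, q) => cosineSimilarity(query, q.vector) - cosineSimilarity(query, p.vector))
    .slice(0, topK);
}
```

When embedding generation fails (for example, on an API quota error), the keyword fallback mentioned in the features list can take over on the same project data.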
- Technologies: Docker, Nginx, GitHub Pages, Cloudflare Workers.
- Responsibilities: The application includes a multi-stage `Dockerfile` for containerization and is configured for automated deployment to GitHub Pages via GitHub Actions. The AI backend is deployed as a Cloudflare Worker.
Frontend Browser -> Cloudflare Worker -> Google Gemini API
✅ Enhanced Security: The `GEMINI_API_KEY` and `ALLOWED_ORIGINS` values are securely stored as Cloudflare Worker secrets, preventing their exposure. The `VITE_WORKER_URL` for the frontend is stored as a GitHub repository secret. This robust approach is suitable for production environments. The Cloudflare Worker also implements refined guardrails with an adjusted `TRIPWIRE` regex to prevent false positives while maintaining strong protection against sensitive content injection.
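A tripwire of this kind is essentially a pattern check run on each prompt before it reaches the model. The worker's actual `TRIPWIRE` regex is not shown in this document, so the pattern below is purely illustrative:

```typescript
// Sketch: a pre-flight guardrail check. This pattern is a hypothetical
// stand-in; the worker's real TRIPWIRE regex is more refined.
const TRIPWIRE = /\b(api[_\s-]?key|secret|ignore (all|previous) instructions)\b/i;

function passesGuardrails(prompt: string): boolean {
  return !TRIPWIRE.test(prompt);
}
```

Tuning such a pattern is exactly the false-positive balancing act described above: too broad, and legitimate questions get blocked; too narrow, and injection attempts slip through.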
To ensure the reliability and quality of the application, a comprehensive testing strategy is employed:
- End-to-End (E2E) Testing with Playwright:
- Simulates real user interactions in a browser to validate the entire application workflow, including UI, application logic, and API integrations.
- Covers key scenarios like general conversation, contact form submission, rate limiting, and guardrail enforcement. It also includes comprehensive security tests to validate guardrails against various attack scenarios.
- All E2E tests are currently passing.
- To run E2E tests: `npx playwright test`
- Worker Unit Testing with Vitest:
- Ensures the individual components and logic of the Cloudflare Worker function correctly.
- All worker unit tests are currently passing. A critical bug related to the Gemini model was recently identified and fixed; all tests continue to pass after the resolution, ensuring the chatbot's stability.
- To run Worker unit tests: `npm test --prefix worker`
1. Install dependencies:
   - Run `npm install` in the project root.
   - Run `npm install --prefix worker` in the project root to install worker-specific dependencies.
2. Set up Environment Variables (Development):
   - In the `frontend` directory, create a `.env.local` file with the following content:

     ```
     VITE_WORKER_URL="http://127.0.0.1:8787"
     ```

   - In the `worker` directory, create a `.dev.vars` file with the following content:

     ```
     GEMINI_API_KEY="YOUR_GOOGLE_AI_STUDIO_KEY_HERE"
     ALLOWED_ORIGINS="http://localhost:5173,http://127.0.0.1:5173"
     RATE_LIMIT_KV_ID="YOUR_KV_NAMESPACE_ID_HERE"
     ```
3. Set up Environment Variables (Production):
   - Cloudflare Worker Secrets:
     - `GEMINI_API_KEY`: Your Google AI Studio key (set via `npx wrangler secret put GEMINI_API_KEY`).
     - `ALLOWED_ORIGINS`: Your GitHub Pages URL (e.g., `https://gmpho.github.io`), set via `npx wrangler secret put ALLOWED_ORIGINS`.
     - `RATE_LIMIT_KV_ID`: The ID of your `RATE_LIMIT_KV` namespace, set via `npx wrangler secret put RATE_LIMIT_KV_ID`.
   - GitHub Repository Secret:
     - `VITE_WORKER_URL`: The URL of your deployed Cloudflare Worker (e.g., `https://ai-powered-static-portfolio-worker.<YOUR_ACCOUNT_NAME>.workers.dev`), set via `gh secret set VITE_WORKER_URL`.
4. Run the development servers:
   - In one terminal, start the frontend server: `npm run dev`
   - In a second terminal, start the worker server from the project root: `npx wrangler dev worker/src/index.ts`

For detailed troubleshooting, refer to the Debugging and Troubleshooting section in `GEMINI.md` and the Known Issues document for specific resolutions.
Containerize this application for consistent and isolated environments using Docker.
Build the image:

```sh
# The frontend Docker image does not require the API key.
docker build -t ai-portfolio .
```

Run the container:

```sh
docker run -p 8080:80 ai-portfolio
```

The application will be available at `http://localhost:8080`.
Experience the interactive AI-powered portfolio in action:
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
This project is licensed under the MIT License - see the LICENSE file for details.
For questions or feedback, please open an issue or contact me directly.