This project focuses on modernizing IT infrastructures by combining DevOps practices, automation, and Artificial Intelligence. It delivers a MERN stack portal with a dashboard and an intelligent chatbot that can answer technical questions using real infrastructure data stored in MongoDB. The backend, built with Node.js and Express.js, handles queries and interacts with automated workflows powered by n8n, which dynamically generate prompts for a local AI agent backed by an LLM served through Ollama. The entire solution is fully containerized with Docker, ensuring modular deployment of the frontend, backend, database, automation engine, AI components, and chatbot interface.

The project is divided into two main components:
- Infrastructure Modernization – Analysis of a traditional server-based infrastructure, identification of its limitations (performance, maintenance, scalability), and proposal of modern solutions (virtualization, hyperconvergence, private cloud) through technical and economic benchmarking.
- Automation & AI Integration – Development of a MERN stack application (MongoDB, Express.js, React.js, Node.js) replicating an infrastructure portal with a dashboard and an intelligent chatbot. The chatbot provides answers to technical questions using real infrastructure data stored in MongoDB.
- Data imported from heterogeneous files (`.csv`, `.json`, `.txt`) located in `shared/` (see the import sketch after this list)
- Node.js/Express backend for query handling
- Automated workflows with n8n connected to a local AI agent powered by an Ollama-served LLM (see the query-flow sketch below)
- Fully dockerized deployment (frontend, backend, database, n8n, AI, chatbot UI)
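As a minimal sketch of the import step referenced above, the following Node.js script reads the heterogeneous files from `shared/` and writes them into MongoDB. The `mongodb` and `csv-parse` packages, the `MONGODB_URI` value, and the `chat_bot.infrastructure` collection name are assumptions for illustration, not the project's actual implementation.

```js
// import-shared.js — minimal sketch: load .csv/.json/.txt files from shared/ into MongoDB.
// Assumptions (not taken from the project): the `mongodb` driver and `csv-parse` are installed,
// MONGODB_URI points at the compose database, and documents land in an `infrastructure` collection.
const fs = require("fs/promises");
const path = require("path");
const { MongoClient } = require("mongodb");
const { parse } = require("csv-parse/sync");

const SHARED_DIR = path.join(__dirname, "shared");
const MONGODB_URI = process.env.MONGODB_URI || "mongodb://localhost:27017";

async function loadFile(filePath) {
  const raw = await fs.readFile(filePath, "utf8");
  switch (path.extname(filePath).toLowerCase()) {
    case ".csv":
      return parse(raw, { columns: true, skip_empty_lines: true }); // one document per row
    case ".json":
      return [].concat(JSON.parse(raw)); // accept a single object or an array
    case ".txt":
      return [{ source: path.basename(filePath), content: raw }]; // keep raw text as one document
    default:
      return [];
  }
}

async function importShared() {
  const client = new MongoClient(MONGODB_URI);
  await client.connect();
  const collection = client.db("chat_bot").collection("infrastructure");
  for (const name of await fs.readdir(SHARED_DIR)) {
    const docs = await loadFile(path.join(SHARED_DIR, name));
    if (docs.length) await collection.insertMany(docs);
    console.log(`${name}: imported ${docs.length} document(s)`);
  }
  await client.close();
}

importShared().catch((err) => {
  console.error("Import failed:", err);
  process.exit(1);
});
```

Run it once the database container is up, e.g. `node import-shared.js` (the filename is only a placeholder).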
This project provides a practical demonstration of how infrastructure modernization, automation, containerization, and AI can be combined to deliver innovative, efficient, and scalable IT solutions.
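To make the query flow concrete, here is a hedged Express sketch of how the backend could forward a chatbot question to an n8n webhook, which then builds the prompt and queries the Ollama-served model. The route path, webhook URL, and payload fields are illustrative assumptions rather than the project's exact API.

```js
// chat-route.js — illustrative Express route forwarding chatbot questions to an n8n webhook.
// Assumptions: the webhook path, payload shape, and reply field are placeholders;
// in this setup, n8n enriches the question with MongoDB data and calls the local Ollama model.
const express = require("express");

const app = express();
app.use(express.json());

// n8n webhook URL (hypothetical path; configure it to match your workflow)
const N8N_WEBHOOK_URL =
  process.env.N8N_WEBHOOK_URL || "http://localhost:5678/webhook/chatbot";

app.post("/api/chat", async (req, res) => {
  try {
    // Forward the user's question to the n8n workflow (global fetch requires Node 18+).
    const n8nResponse = await fetch(N8N_WEBHOOK_URL, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ question: req.body.question }),
    });
    const data = await n8nResponse.json();
    res.json({ answer: data.answer ?? data }); // reply field name is an assumption
  } catch (err) {
    res.status(502).json({ error: "AI workflow unavailable", details: err.message });
  }
});

app.listen(process.env.PORT || 8000, () => console.log("Chat API listening"));
```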
- Monorepo layout: `backend/`, `frontend/`
- Container support: `docker-compose.yml`
- OS/Shell (example): Windows 10, PowerShell
- Docker and Docker Compose (or Docker Desktop on Windows)
- Git
- Optional (for local, non-Docker runs): Node.js and npm, the runtime and package manager used by the backend and frontend
```bash
# from the repository root
docker compose up -d --build

# view logs
docker compose logs -f

# stop
docker compose down
```

Default service URLs depend on your compose configuration. Common examples:
- Frontend: http://localhost:3000
- Backend API: http://localhost:8000 or http://localhost:5000
Check the compose file for actual ports.
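If you want a quick reachability check once the containers are running, the short Node.js script below pings the backend URL; the port and path are assumptions, so adjust them to your compose file.

```js
// smoke-check.js — verify that the backend container answers HTTP requests.
// The default port (8000) and root path are assumptions; match them to your compose file.
const BASE_URL = process.env.BACKEND_URL || "http://localhost:8000";

async function smokeCheck() {
  try {
    const res = await fetch(BASE_URL); // global fetch is available in Node 18+
    console.log(`Backend responded with HTTP ${res.status}`);
  } catch (err) {
    console.error(`Backend unreachable at ${BASE_URL}:`, err.message);
    process.exitCode = 1;
  }
}

smokeCheck();
```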
Run services independently from the repo root:
```bash
# backend (example)
cd backend
# install deps & start dev server (adjust to your stack)
# e.g. npm install && npm run dev

# frontend (example)
cd ../frontend
# install deps & start dev server (adjust to your stack)
# e.g. npm install && npm start
```

- Create environment files as required by `backend/` and `frontend/` (e.g., `.env`).
- Mirror any required values in your `docker-compose.yml` or Docker environment if using containers.
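As one way to wire those environment files into the backend, the sketch below loads them with the `dotenv` package; the variable names are assumptions and should match whatever your services actually read, both in `.env` and in the `environment:` section of `docker-compose.yml`.

```js
// config.js — minimal sketch of loading .env values in the backend with dotenv.
// The variable names below are assumptions; align them with your .env and docker-compose.yml.
require("dotenv").config();

module.exports = {
  port: Number(process.env.PORT) || 8000,
  mongodbUri: process.env.MONGODB_URI || "mongodb://localhost:27017/chat_bot",
  n8nWebhookUrl: process.env.N8N_WEBHOOK_URL || "http://localhost:5678/webhook/chatbot",
};
```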
```text
chat_bot/
  backend/              # server-side code (API, services, workers)
  frontend/             # client-side app (UI)
  docker-compose.yml    # container orchestration for local dev
  README.md             # this file
```
- Install dependencies: run the package manager commands inside `backend/` and `frontend/`.
- Lint/format: use the commands defined by each package (e.g., `npm run lint`, `npm run format`).
- Test: run test scripts for backend and frontend (e.g., `npm test`).
- Create a new branch from `main`.
- Make focused changes with clear commit messages.
- Ensure lint/tests pass.
- Open a pull request with a concise description and screenshots/logs where helpful.
- Containerized: build and push images, or deploy via your preferred platform using `docker-compose.yml` as a reference.
- Non-containerized: follow your backend/frontend deployment guides.
```bash
# initialize (if not already a repo)
git init
git add .
git commit -m "chore: initial commit"

# create a new GitHub repo, then set the remote and push
git remote add origin https://github.com/<your-username>/<your-repo>.git
git branch -M main
git push -u origin main
```

Choose and add a LICENSE file (MIT, Apache-2.0, etc.) if you haven’t already.
Please open an issue or discussion in the repository for questions and support.