This project is a production-ready, scalable, and secure Telegram bot that translates `.srt` subtitle files using Gemini AI. It is built on a microservices architecture, with full support for asynchronous tasks, caching, and monitoring.
- Translate `.srt` subtitle files using the Gemini AI API.
- Async architecture with `aiogram` and `aiohttp`.
- Redis caching to reduce repeated API calls.
- Task queue system powered by `Celery` (see the caching sketch after this list).
- SQLite/PostgreSQL database for persistent user storage.
- Monitoring support with Prometheus and Grafana.
- Linted, formatted, and fully documented codebase.
- Automated unit testing using `pytest`.
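To illustrate how the Celery queue and Redis cache can work together, here is a minimal sketch of a caching-aware translation task. It assumes the Celery app lives in `worker.py` (as the worker command further below suggests); the `call_gemini` helper, the cache key scheme, and the one-day TTL are illustrative assumptions, not the project's actual code.

```python
# tasks_sketch.py - hypothetical example, not the repository's actual task code.
import hashlib

import redis
from celery import Celery

# The project exposes its Celery app as worker.celery_app; the broker URL is assumed here.
celery_app = Celery("worker", broker="redis://localhost:6379/0")
cache = redis.Redis.from_url("redis://localhost:6379/0")


def call_gemini(text: str, target_lang: str) -> str:
    """Placeholder for the Gemini API request the real task performs."""
    raise NotImplementedError


@celery_app.task
def translate_block(text: str, target_lang: str) -> str:
    # Key the cache on language + text so repeated subtitle lines skip the API call.
    key = "srt:" + hashlib.sha256(f"{target_lang}:{text}".encode()).hexdigest()
    cached = cache.get(key)
    if cached is not None:
        return cached.decode()

    translated = call_gemini(text, target_lang)
    cache.set(key, translated, ex=86400)  # keep the result for one day (assumed TTL)
    return translated
```

A caller would enqueue such a task with `translate_block.delay(text, "es")` and let the worker process it asynchronously.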
- Python >= 3.8
- Redis
- Celery
- Docker & Docker Compose (optional for full setup)
```bash
git clone https://github.com/GeekNeuron/srt-Ai-Telegram-Bot.git
cd srt-Ai-Telegram-Bot
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

Create a `.env` file (see `.env.example`) with:

```
BOT_TOKEN=your_telegram_bot_token
GEMINI_API_KEY=your_gemini_api_key
REDIS_URL=redis://localhost:6379
```

Initialize the database:

```bash
python create_db.py
```
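For context, the project tree below lists `db.py` as the async SQLAlchemy setup that `create_db.py` builds on. The following is a minimal sketch of what such a setup can look like, assuming SQLAlchemy 2.x with the `aiosqlite` driver; the database URL and file name are placeholders, not the repository's actual configuration.

```python
# db_sketch.py - hypothetical sketch of an async SQLAlchemy setup (assumes SQLAlchemy 2.x
# and the aiosqlite driver); not the repository's actual db.py.
import asyncio

from sqlalchemy.ext.asyncio import async_sessionmaker, create_async_engine
from sqlalchemy.orm import DeclarativeBase


class Base(DeclarativeBase):
    """Declarative base that project models would inherit from."""


engine = create_async_engine("sqlite+aiosqlite:///./bot.db")
SessionLocal = async_sessionmaker(engine, expire_on_commit=False)


async def create_tables() -> None:
    # What a script like create_db.py typically does: create all mapped tables.
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)


if __name__ == "__main__":
    asyncio.run(create_tables())
```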
Worker:

```bash
celery -A worker.celery_app worker --loglevel=info
```

Bot:

```bash
python bot.py
```
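As a rough picture of the Telegram side, the sketch below shows an aiogram handler that accepts an `.srt` upload. It assumes aiogram 3.x syntax, and the handler body (replying instead of actually downloading the file and queueing a translation task) is purely illustrative.

```python
# bot_sketch.py - hypothetical aiogram 3.x handler, not the repository's actual bot.py.
# Assumes BOT_TOKEN is set in the environment, as configured in .env above.
import asyncio
import os

from aiogram import Bot, Dispatcher, F
from aiogram.types import Message

bot = Bot(token=os.environ["BOT_TOKEN"])
dp = Dispatcher()


@dp.message(F.document)
async def handle_subtitle(message: Message) -> None:
    # Accept only .srt uploads; the real bot would download the file and
    # enqueue a Celery translation task here.
    if not (message.document.file_name or "").endswith(".srt"):
        await message.answer("Please send an .srt subtitle file.")
        return
    await message.answer("Subtitle received, translation queued.")


async def main() -> None:
    await dp.start_polling(bot)


if __name__ == "__main__":
    asyncio.run(main())
```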
Prometheus Metrics (Optional): runs on port 8000.
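If you are curious how the metrics endpoint can be served on that port, here is a minimal sketch using the `prometheus_client` library; the metric name and where it is incremented are assumptions rather than the project's actual instrumentation.

```python
# metrics_sketch.py - hypothetical example using prometheus_client, not the project's code.
from prometheus_client import Counter, start_http_server

# Illustrative counter; the real bot defines its own metrics.
translations_total = Counter(
    "srt_translations_total", "Number of subtitle files translated"
)

# Serve the /metrics endpoint on port 8000, matching the port mentioned above.
start_http_server(8000)

# Somewhere in the translation flow the bot would call:
translations_total.inc()
```

Prometheus (started via Docker below, on port 9090) can then scrape `http://localhost:8000/metrics`.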
Run tests:
```bash
pytest tests/
```
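For orientation, a test in `tests/` might look like the minimal sketch below; the `parse_srt` helper and its behaviour are invented for the example and are not the project's real API.

```python
# tests/test_srt_sketch.py - hypothetical example test, not part of the repository.

def parse_srt(raw):
    """Stand-in for a project helper that splits raw .srt text into subtitle blocks."""
    return [block for block in raw.strip().split("\n\n") if block]


def test_parse_srt_splits_blocks():
    raw = (
        "1\n00:00:01,000 --> 00:00:02,000\nHello\n\n"
        "2\n00:00:03,000 --> 00:00:04,000\nWorld\n"
    )
    blocks = parse_srt(raw)
    assert len(blocks) == 2
    assert "Hello" in blocks[0]
    assert "World" in blocks[1]
```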
Lint and format:

```bash
flake8 .
black .
```

To run all services via Docker:

```bash
docker-compose up --build
```

- Prometheus: http://localhost:9090
- Grafana: http://localhost:3000 (default login: admin/admin)
├── bot_service/ # Telegram interaction logic
├── core/ # Translation, utils, DB, tasks
├── tests/ # Unit tests with pytest
├── db.py # SQLAlchemy async DB setup
├── worker.py # Celery app instance
├── create_db.py # Script to initialize tables
├── requirements.txt # Python dependencies
├── docker-compose.yml # Multi-service container config
└── .env.example # Env var sample
MIT License © 2024 GeekNeuron