DevOps Test
In this guide, we'll set up a microservice architecture with four components (an API
service, an authentication service, a static frontend, and a PostgreSQL database)
using Python, Docker, and GitHub Actions, documenting each step for clarity.
The project is laid out as follows:
microservice-architecture/
├── api-service/
│   ├── app/
│   ├── Dockerfile
│   ├── requirements.txt
│   └── tests/
├── frontend/
│   ├── public/
│   ├── src/
│   ├── Dockerfile
│   └── package.json
├── auth-service/
│   ├── app/
│   ├── Dockerfile
│   ├── requirements.txt
│   └── tests/
├── database/
│   └── init.sql
├── docker-compose.yml
└── .github/
    └── workflows/
        └── ci-cd.yml
Use Flask to handle API requests and interact with the database. Flask-Cors lets the
browser-based frontend call the API from a different origin, and the database URI can
be overridden through the DATABASE_URL environment variable for tests and CI.
api-service/app/main.py:
from flask import Flask, jsonify
from flask_sqlalchemy import SQLAlchemy
from flask_cors import CORS
import os

app = Flask(__name__)
# Allow the browser-based frontend (served on port 80) to call this API from another origin.
CORS(app)
# The default URI matches the db service in docker-compose.yml; tests and CI can override it.
app.config['SQLALCHEMY_DATABASE_URI'] = os.environ.get(
    'DATABASE_URL', 'postgresql://user:password@db/microservice')
db = SQLAlchemy(app)

class Item(db.Model):
    __tablename__ = 'items'  # matches the table created by database/init.sql
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80), nullable=False)

@app.route('/items', methods=['GET'])
def get_items():
    # Return all items as a JSON array of {"id", "name"} objects.
    items = Item.query.all()
    return jsonify([{"id": item.id, "name": item.name} for item in items])

if __name__ == "__main__":
    app.run(host='0.0.0.0', port=5000)
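Once the stack is up (docker-compose.yml below maps the service to port 5000), the
endpoint can be exercised from the host with, for example:
curl http://localhost:5000/items
which returns a JSON array of whatever rows database/init.sql seeded.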
Set Up the Dockerfile:
api-service/Dockerfile:
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "app/main.py"]
Define Requirements:
api-service/requirements.txt:
Flask
Flask-SQLAlchemy
Flask-Cors
psycopg2-binary
Write Unit Tests:
api-service/tests/test_api.py:
import os

# Point the app at an in-memory SQLite database so the tests run without PostgreSQL.
os.environ.setdefault('DATABASE_URL', 'sqlite:///:memory:')

import unittest
from app.main import app, db, Item

class TestAPI(unittest.TestCase):
    def setUp(self):
        # Recent Flask-SQLAlchemy releases require an application context for database work.
        self.ctx = app.app_context()
        self.ctx.push()
        self.app = app.test_client()
        db.create_all()
        db.session.add(Item(name="Test Item"))
        db.session.commit()

    def tearDown(self):
        db.session.remove()
        db.drop_all()
        self.ctx.pop()

    def test_get_items(self):
        response = self.app.get('/items')
        self.assertEqual(response.status_code, 200)
        data = response.get_json()
        self.assertEqual(len(data), 1)
        self.assertEqual(data[0]['name'], 'Test Item')

if __name__ == "__main__":
    unittest.main()
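Because the environment override above keeps the tests on SQLite, they can be run
locally without the Compose stack. Assuming the packages from requirements.txt are
installed:
cd api-service
python -m unittest discover -s tests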
Create a static page that fetches the items from the API service and renders them.
frontend/src/index.html:
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>Frontend</title>
</head>
<body>
    <h1>Items</h1>
    <ul id="items"></ul>
    <script>
        // Fetch the item list from the API service and render one <li> per item.
        fetch('http://localhost:5000/items')
            .then(response => response.json())
            .then(data => {
                const itemsList = document.getElementById('items');
                data.forEach(item => {
                    const li = document.createElement('li');
                    li.textContent = item.name;
                    itemsList.appendChild(li);
                });
            });
    </script>
</body>
</html>
Set Up the Dockerfile:
frontend/Dockerfile:
FROM nginx:alpine
COPY src /usr/share/nginx/html
Use Flask and PyJWT to verify JSON Web Tokens.
auth-service/app/main.py:
from flask import Flask, request, jsonify
import jwt

app = Flask(__name__)
# In a real deployment, load the secret from an environment variable instead of hard-coding it.
SECRET_KEY = 'your_secret_key'

@app.route('/verify', methods=['POST'])
def verify_token():
    token = request.json.get('token')
    try:
        # Raises if the signature is wrong, the token is malformed, or it has expired.
        jwt.decode(token, SECRET_KEY, algorithms=["HS256"])
        return jsonify({"message": "Token is valid"}), 200
    except jwt.ExpiredSignatureError:
        return jsonify({"message": "Token has expired"}), 401
    except jwt.InvalidTokenError:
        return jsonify({"message": "Invalid token"}), 401

if __name__ == "__main__":
    app.run(host='0.0.0.0', port=5001)
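The service only verifies tokens, so for manual testing a token can be generated with
PyJWT using the same secret. A minimal sketch (the user claim and five-minute expiry
are arbitrary illustrative choices, not part of the original design):
import datetime
import jwt

# Sign a short-lived token with the same hard-coded secret the auth service uses.
token = jwt.encode(
    {"user": "test", "exp": datetime.datetime.utcnow() + datetime.timedelta(minutes=5)},
    "your_secret_key",
    algorithm="HS256",
)
print(token)
The printed token can then be posted to the running service:
curl -X POST http://localhost:5001/verify -H "Content-Type: application/json" -d '{"token": "<paste the token here>"}'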
Set Up the Dockerfile:
auth-service/Dockerfile:
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt requirements.txt
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "app/main.py"]
Define Requirements:
auth-service/requirements.txt:
Flask
PyJWT
Write Unit Tests:
auth-service/tests/test_auth.py:
import unittest
import jwt
from app.main import app, SECRET_KEY

class TestAuth(unittest.TestCase):
    def setUp(self):
        self.app = app.test_client()

    def test_verify_token(self):
        # A token signed with the service's own secret is accepted.
        token = jwt.encode({"user": "test"}, SECRET_KEY, algorithm="HS256")
        response = self.app.post('/verify', json={"token": token})
        self.assertEqual(response.status_code, 200)
        self.assertEqual(response.get_json()["message"], "Token is valid")

    def test_invalid_token(self):
        # A malformed token is rejected with 401.
        response = self.app.post('/verify', json={"token": "invalid_token"})
        self.assertEqual(response.status_code, 401)
        self.assertEqual(response.get_json()["message"], "Invalid token")

if __name__ == "__main__":
    unittest.main()
docker-compose.yml:
version: '3'
services:
  db:
    image: postgres:latest
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: microservice
    volumes:
      - ./database/init.sql:/docker-entrypoint-initdb.d/init.sql
  api-service:
    build: ./api-service
    ports:
      - "5000:5000"
    depends_on:
      - db
  auth-service:
    build: ./auth-service
    ports:
      - "5001:5001"
  frontend:
    build: ./frontend
    ports:
      - "80:80"
.github/workflows/ci-cd.yml:
name: CI/CD

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.9'
      - name: Run API service tests
        working-directory: api-service
        run: |
          pip install -r requirements.txt
          python -m unittest discover -s tests
      - name: Run Auth service tests
        working-directory: auth-service
        run: |
          pip install -r requirements.txt
          python -m unittest discover -s tests
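The README below also lists a Docker build-and-push stage. The guide stops at the
test job, so the following is a sketch rather than the project's actual pipeline; it
would sit under jobs: in the same file and assumes the DOCKER_HUB_USERNAME and
DOCKER_HUB_ACCESS_TOKEN secrets configured in the final steps, with image names taken
from the service directories:
  docker-build-and-push:
    needs: build-and-test
    if: github.ref == 'refs/heads/main'
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v2
      - name: Log in to Docker Hub
        run: echo "${{ secrets.DOCKER_HUB_ACCESS_TOKEN }}" | docker login -u "${{ secrets.DOCKER_HUB_USERNAME }}" --password-stdin
      - name: Build and push images
        run: |
          docker build -t ${{ secrets.DOCKER_HUB_USERNAME }}/api-service:latest ./api-service
          docker build -t ${{ secrets.DOCKER_HUB_USERNAME }}/auth-service:latest ./auth-service
          docker build -t ${{ secrets.DOCKER_HUB_USERNAME }}/frontend:latest ./frontend
          docker push ${{ secrets.DOCKER_HUB_USERNAME }}/api-service:latest
          docker push ${{ secrets.DOCKER_HUB_USERNAME }}/auth-service:latest
          docker push ${{ secrets.DOCKER_HUB_USERNAME }}/frontend:latest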
README.md:
# Microservice Architecture
## Overview
A small microservice architecture consisting of a Flask API service, a Flask auth
service, a static frontend served by nginx, and a PostgreSQL database, orchestrated
with Docker Compose and built and tested through GitHub Actions.
## Project Structure
microservice-architecture/
├── api-service/
├── frontend/
├── auth-service/
├── database/
├── docker-compose.yml
└── .github/
    └── workflows/
## Getting Started
### Prerequisites
- Docker
- Docker Compose
- GitHub account (for CI/CD)
### Installation
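With Docker and Docker Compose installed, the whole stack can be built and started
from the repository root (the repository URL is a placeholder):
git clone <your-repository-url>
cd microservice-architecture
docker-compose up --build
The frontend is then available at http://localhost, the API service at
http://localhost:5000, and the auth service at http://localhost:5001.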
### API Service
The API service is built with Flask and provides a simple endpoint to retrieve a
list of items.
- **Endpoint**: `/items`
- **Method**: `GET`
- **Response**: JSON array of items
### Auth Service
The Auth service is built with Flask and uses JWT for authentication.
- **Endpoint**: `/verify`
- **Method**: `POST`
- **Request Body**: `{"token": "your_jwt_token"}`
- **Response**: JSON message indicating token validity
### Frontend
The frontend is a simple static website that fetches data from the API service and
displays it.
### Database
A PostgreSQL database initialized with a script to create the `items` table and
insert sample data.
## CI/CD
This project uses GitHub Actions for CI/CD. The workflow is defined in
`.github/workflows/ci-cd.yml`.
- **Build and Test**: Installs dependencies and runs unit tests for API and Auth
services.
- **Docker Build and Push**: Builds Docker images and pushes them to Docker Hub.
- **Deployment**: Deploys the services using Docker Compose.
## Contributions
Contributions are welcome; please open an issue or submit a pull request.
## License
This project is licensed under the MIT License.
Initialize the repository and commit the project:
git init
git add .
git commit -m "Initial commit"
Push to GitHub:
git remote add origin <your-repository-url>
git branch -M main
git push -u origin main
Final Steps:
Add the Docker Hub credentials used by the workflow as repository secrets
(Settings → Secrets and variables → Actions):
DOCKER_HUB_USERNAME
DOCKER_HUB_ACCESS_TOKEN
Verify the CI/CD pipeline:
Make a commit to the main branch and confirm that the workflow runs successfully in
the repository's Actions tab.