Full Stack Web Pipeline
Now let's create a full stack web app, a little more complex than the previous one: it is designed to leverage a combination of technologies rather than just a mini API.
The project integrates a Firebase database with a Python FastAPI backend (we can reuse the config from the previous section), served to users through a Vue.js frontend.
This architecture is chosen to support CRUD (Create, Read, Update, Delete) operations on a Story Firebase document within a Firebase collection, enabling dynamic content management and real-time data interaction.
Here our documents are called Story, with the following fields:
class Story(BaseModel):
audioURL: Optional[HttpUrl] = None
promptThemes: Optional[List[PromptTheme]] = []
timesSeen: Optional[int] = 0
timesListen: Optional[int] = 0
imageURL: Optional[List[HttpUrl]] = []
id: Optional[str] = None
story: Optional[str] = None
isPrinted: Optional[bool] = False
language: Optional[str] = None
title: str
isFavorite: Optional[bool] = False
Project architecture
- Vue.js (Frontend): JavaScript framework used for building user interfaces and single-page applications in a fast and clean way. Vue has a component-based architecture and supports routing and state management. We will be using Nuxt UI in order to speed up our work.
- Python FastAPI (Backend): modern and fast Python web framework for building APIs with Python 3.7+. Its key features are speed and ease of use, with automatic OpenAPI documentation generation.
- Firebase (Database): NoSQL database provided by Google, known for real-time data synchronization and automatic scaling. Its flexible, schemaless design suits the storage of complex nested objects.
- Docker Compose: to handle the network communication between services and expose our app through HTTPS with a simple nginx.conf file.
- Cloudron: minimalist web server with awesome applications and a Docker-ready deployment kit.
.
├── README.md
├── back
├── certs
├── docker-compose.yml
├── front
├── my.secrets
└── nginx.conf
Firebase Backend
Let's begin with the Python FastAPI backend and Firebase! Our goal, as you may know, is to connect Firebase to our Python FastAPI backend and test it.
First, go to the official documentation, then download and install the Firebase Admin SDK into your project.
Do not forget to add the Firebase SDK to your requirements.
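As a reference, a minimal requirements.txt for this backend could look like the list below (indicative only; pin the versions you actually use):
fastapi
uvicorn[standard]
firebase-admin
pydantic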
Then add the Firebase initialisation code to your Python FastAPI application like this:
from firebase_admin import credentials, firestore, initialize_app
# other imports
# your init code: environment variables, FastAPI app creation ...
app = FastAPI()
# initialize Firebase
cred = credentials.Certificate('./firebase-key.json')
initialize_app(cred)
db = firestore.client()
COLLECTION_NAME = "your-collection-name"
Now we want to check that the connection between Firebase and our mini Python API is working, so let's create two endpoints for that:
- @app.get("/") for testing purposes
- @app.get("/stories/") to fetch all the story documents and return them in JSON format
First Endpoints
Let's begin by adding some typing to our requests with Pydantic's BaseModel, as in the official documentation, to describe our Firebase documents.
from fastapi import FastAPI, HTTPException, Response
from fastapi.middleware.cors import CORSMiddleware
from firebase_admin import credentials, firestore, initialize_app
import json, os
from pydantic import BaseModel, HttpUrl, ValidationError
from typing import List, Optional, Dict, Any
import logging
# Initialize FastAPI & log
app = FastAPI()
logger = logging.getLogger(__name__)
# Configure CORS
app.add_middleware(
CORSMiddleware,
allow_origins=["*"], # Allows all origins
allow_credentials=True,
allow_methods=["*"], # Allows all methods
allow_headers=["*"], # Allows all headers
)
# Initialize Firebase
cred = credentials.Certificate('./firebase-key.json')
initialize_app(cred)
db = firestore.client()
COLLECTION_NAME = "private-stories"
#
# Helpers
#

# PromptTheme is another Pydantic model describing the prompt themes of a story.
# Its exact fields depend on your own documents; a minimal placeholder is used here.
class PromptTheme(BaseModel):
    name: Optional[str] = None

# edit this class according to your firebase documents/collections
class Story(BaseModel):
    audioURL: Optional[HttpUrl] = None
    promptThemes: Optional[List[PromptTheme]] = []
    timesSeen: Optional[int] = 0
    timesListen: Optional[int] = 0
    imageURL: Optional[List[HttpUrl]] = []
    id: Optional[str] = None
    story: Optional[str] = None
    isPrinted: Optional[bool] = False
    language: Optional[str] = None
    title: str
    isFavorite: Optional[bool] = False
# simple HTTP test endpoint
@app.get("/")
async def test():
return {"message": "OK"}
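At this point you can already run the API locally and hit the test endpoint, for example with uvicorn and curl (assuming your file is named main.py):
uvicorn main:app --reload
curl http://localhost:8000/
# should answer {"message": "OK"}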
Now that we have a clean setup, we can create the other endpoint, @app.get("/stories/"), to fetch all documents from the Firebase database.
If you are not comfortable with the Cross-Origin Resource Sharing (CORS) configuration, just take a look at the FastAPI documentation on this subject here.
@app.get("/stories/") Endpoint
This endpoint looks a little more complex than the other one.
No worries, let's focus on the role of the FastAPI backend: it receives requests from the frontend, processes them, and performs the necessary CRUD operations on the Firebase database. It acts as an intermediary, ensuring data validation, authentication (if applicable), and business logic execution.
Processing data from Firebase
Processing data in our case is essential, especially when dealing with data retrieved from a database like Firestore in a FastAPI application, for:
- Data Normalization: when you retrieve documents from a database, the data might not always be in the exact format you need for your application or API response (this is our case for the Null/None value).
- Data Consistency: by processing the data before returning it to the client (our Vue front), you ensure that the API's response is consistent, regardless of how the data is stored in the database.
- Error Handling and Validation: processing data allows you to validate the data and handle errors gracefully before sending it to the client.
- Adaptation to Client Needs: sometimes, data stored in a database contains more information than the client needs. Processing data allows you to reshape it, removing unnecessary fields or adding derived ones, to better match the client's requirements.
- Security: processing data before sending it out can also be a security measure. For example, if certain information should not be exposed to all clients, processing steps can filter out sensitive data based on the client's permissions.
For our case, let's write a mini Python function called process_doc_data():
#
# Helpers
#
def process_doc_data(doc_data: Dict[str, Any]) -> Dict[str, Any]:
# Replace 'Null' string with None for audioURL
if doc_data.get("audioURL") == "Null":
doc_data["audioURL"] = None
return doc_data
This process_doc_data() function normalizes the audioURL field by converting the string "Null" to Python's None. This is important because "Null" as a string is not the same as None in Python.
Now let's code our @app.get("/stories/") endpoint:
@app.get("/stories/")
async def read_stories():
try:
docs = db.collection(COLLECTION_NAME).stream()
stories = []
for doc in docs:
# Process each document data
processed_data = process_doc_data(doc.to_dict())
# Create a Story object and convert to a dictionary
story = Story(**processed_data).dict()
# Ensure URLs are serialized as strings
if story.get('audioURL'):
story['audioURL'] = str(story['audioURL'])
for idx, url in enumerate(story.get('imageURL', [])):
story['imageURL'][idx] = str(url)
stories.append(story)
return Response(content=json.dumps(stories, indent=4), media_type="application/json")
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
As you have seen, this read_stories method is designed to retrieve and return a list of stories from the database. You can read the comments in the function and test it part by part on your side with your own data.
CRUD Endpoints
Now it's time to code (and test) our CRUD endpoints. We will not go into detail on every CRUD method, since it's pretty much the same methodology for each. Just be careful with your data: if your documents are very specific, you may need some pre-processing functions like the process_doc_data() function we have just seen.
# read
@app.get("/story/{story_id}",)
def read_story_by_id(story_id: str):
try:
doc_ref = db.collection(COLLECTION_NAME).document(story_id)
doc = doc_ref.get()
# Check if the document exists
if not doc.exists:
return {"message": "Story not found"}
return Response(content=json.dumps(doc.to_dict(), indent=4), media_type="application/json")
except Exception as e:
raise HTTPException(status_code=500, detail=str(e))
# create
# NOTE: CreateStory is assumed to be a Pydantic model similar to Story
# (typically the same fields minus the auto-generated id).
@app.post("/story")
async def create_story(story: CreateStory):
    try:
        # Add a new document with an auto-generated ID
        doc_ref = db.collection(COLLECTION_NAME).document()
        story_data = story.dict(exclude_none=True)
        doc_ref.set(story_data)
        return {"id": doc_ref.id}  # Return the generated ID
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))
# delete
@app.delete("/story/{story_id}")
async def delete_story(story_id: str):
    try:
        doc_ref = db.collection(COLLECTION_NAME).document(story_id)
        doc = doc_ref.get()
        if not doc.exists:
            raise HTTPException(status_code=404, detail="Story not found")
        doc_ref.delete()
        # return a small confirmation payload (a 204 status would forbid any body)
        return {"detail": f"Story {doc_ref.id} deleted successfully"}
    except HTTPException:
        # let the 404 above pass through instead of turning it into a 500
        raise
    except Exception as e:
        raise HTTPException(status_code=500, detail=str(e))
# upload file method --> bonus
# NOTE: this endpoint assumes a few extra pieces not shown here:
#   - UploadFile and File imported from fastapi
#   - BUCKET_NAME read from your environment/.env file
#   - upload_to_gcp(), a small helper that pushes files to a GCP bucket and returns their URLs
@app.post("/uploadfiles/")
async def create_upload_files(files: List[UploadFile] = File(...)):
    if BUCKET_NAME is not None:
        uploaded_file_urls = upload_to_gcp(files, BUCKET_NAME, "TestUploatFromWebSite")
        return {"filenames": uploaded_file_urls}
    return {"message": "Bucket not assigned"}
# update
@app.put("/story/{story_id}")
async def update_story(story_id: str, updated_story: Story):
    try:
        doc_ref = db.collection(COLLECTION_NAME).document(story_id)
        doc = doc_ref.get()
        if not doc.exists:
            raise HTTPException(status_code=404, detail="Story not found")
        # my specific conditions before updating
        if (updated_story.title is not None) and (updated_story.story is not None):
            # Firestore's update() expects a dict, not a Pydantic model
            doc_ref.update(updated_story.dict(exclude_none=True))
        else:
            raise HTTPException(status_code=400, detail="No updates provided")
        return {"id": doc_ref.id}
    except HTTPException:
        raise
    except Exception as e:
        print(f"Error: {e}")
        raise HTTPException(status_code=500, detail=f"Server error: {str(e)}")
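Once the backend runs locally (for example with uvicorn on port 8000), a quick way to exercise these CRUD endpoints is a small script using the requests library. This is only a sketch, with an assumed local URL and a minimal payload:
import requests

BASE_URL = "http://localhost:8000"  # adjust to your local setup

# create a story (CreateStory is assumed to accept the same core fields as Story)
payload = {"title": "My first story", "story": "Once upon a time...", "language": "en"}
story_id = requests.post(f"{BASE_URL}/story", json=payload).json()["id"]

# read it back
print(requests.get(f"{BASE_URL}/story/{story_id}").json())

# update it (the endpoint requires both title and story to be set)
requests.put(f"{BASE_URL}/story/{story_id}", json={**payload, "title": "Updated title"})

# and finally delete it
requests.delete(f"{BASE_URL}/story/{story_id}")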
Docker Packaging
In this step we want to write the Dockerfile for the backend server and the docker-compose.yml file, in order to add an nginx service and serve our backend through HTTPS, as we said before.
Dockerfile
Let's start with the FastAPI base image here :
# official FastAPI image with uvicorn/gunicorn preinstalled
FROM tiangolo/uvicorn-gunicorn-fastapi:python3.9
WORKDIR /app
COPY . /app
# install the extra dependencies our backend needs
RUN pip install --no-cache-dir firebase-admin pydantic
EXPOSE 8000
#COPY .env /app
# Set environment variables from .env file
ENV ENV_FILE_LOCATION=/app/.env
# start.sh launches the uvicorn server (see the sketch below)
ENTRYPOINT [ "/app/start.sh" ]
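The ENTRYPOINT above expects a start.sh script inside ./back; its content is not shown in this article, but a minimal sketch could simply launch uvicorn (the base image already ships it):
#!/bin/sh
# minimal start script (sketch): run the FastAPI app defined in main.py
exec uvicorn main:app --host 0.0.0.0 --port 8000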
Do not forget to add a .env file into your ./back folder.
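The exact content of that .env file depends on your setup; as a purely illustrative example, it could hold the bucket name and the path to your Firebase key (variable names here are hypothetical):
BUCKET_NAME=your-gcp-bucket-name
GOOGLE_APPLICATION_CREDENTIALS=/app/firebase-key.json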
Docker-compose file
Here is the docker-compose.yml file exposing our services (backend + frontend) plus an nginx container for the SSL part.
version: '3.8'
services:
frontend:
build:
context: ./front
dockerfile: Dockerfile
ports:
- "3000:3000"
volumes:
- ./front:/src
environment:
- NODE_ENV=production
healthcheck:
test: [ "CMD", "curl", "-f", "http://0.0.0.0:3000/" ]
interval: 50s
timeout: 5s
retries: 3
backend:
build:
context: ./back
dockerfile: Dockerfile
ports:
- "8000:8000"
volumes:
- ./back:/app #path for live-reload
environment:
- ENV_FILE_LOCATION=/app/.env
healthcheck:
test: [ "CMD", "curl", "-f", "http://0.0.0.0:8000/" ]
interval: 30s
timeout: 10s
retries: 3
nginx:
image: nginx:latest
ports:
- "80:80"
- "443:443"
volumes:
- ./nginx.conf:/etc/nginx/nginx.conf
- ./certs:/etc/ssl/certs
depends_on:
- backend
restart: always
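With these three files in place, the whole stack can be built and checked locally with Docker Compose, for example:
docker compose up -d --build
docker compose ps        # frontend, backend and nginx should show up as healthy
docker compose logs -f backend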
Add nginx proxy
Then we can generate our SSL certificate in order to serve our backend application through HTTPS, by writing a simple nginx.conf file like this:
events {}
http {
upstream backend {
server backend:8000;
}
server {
listen 80;
listen [::]:80;
server_name yourdomain.com;
return 301 https://$host$request_uri;
}
server {
listen 443 ssl http2;
listen [::]:443 ssl http2;
server_name yourdomain.com;
ssl_certificate /etc/ssl/certs/cert.pem;
ssl_certificate_key /etc/ssl/certs/key.pem;
location / {
proxy_pass http://backend;
proxy_set_header Host $host;
proxy_set_header X-Real-IP $remote_addr;
proxy_set_header X-Forwarded-Proto https;
proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
}
}
}
Do not forget to generate a free certificate! If you are not comfortable with letsencrypt, go to the Docker for app deployment section of this course, then to the Docker and HTTPS article: you will find all the necessary instructions to handle your SSL key generation.
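For quick local testing only (before letsencrypt is in place), a self-signed pair matching the paths used in nginx.conf can be generated into the ./certs folder with openssl:
openssl req -x509 -nodes -newkey rsa:4096 -days 365 \
  -keyout certs/key.pem -out certs/cert.pem \
  -subj "/CN=yourdomain.com"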
Tests with pytest
We can test our backend and the Firebase connection with pytest by testing each endpoint one by one, like we did in the previous part. Here is a simple example of endpoint testing with pytest. As in the previous GitHub CI/CD part, we can rely on the documentation:
from fastapi.testclient import TestClient
from .main import app
client = TestClient(app)
def test_read_main():
response = client.get("/")
assert response.status_code == 200
assert response.json() == {"message": "OK"}
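Following the same pattern, a test for the /stories/ endpoint could look like the sketch below; note that it talks to your real Firebase project unless you mock db, so treat it as an integration test:
def test_read_stories():
    response = client.get("/stories/")
    assert response.status_code == 200
    stories = response.json()
    # every returned document should at least expose the required title field
    assert all("title" in story for story in stories)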
In this project we will not be testing all the endpoints of our backend directly with the pytest library. We will be using Cypress end-to-end tests in order to discover another approach to testing. We will see this in the frontend testing part.
Frontend
For this front part, we will be using Vue.js and the Nuxt UI framework in order to speed up our development and leverage the Nuxt components.
Frontend architecture
.
└── front
    ├── CloudronManifest.json   # our deployment file for Cloudron
    ├── Dockerfile
    ├── api.js
    ├── app.config.ts
    ├── app.vue
    ├── assets
    ├── components
    ├── composables
    ├── cypress                  # our tests will go there
    ├── cypress.config.ts
    ├── icon.png                 # our favicon
    ├── my.secrets               # our secrets for testing the git workflow locally
    ├── node_modules
    ├── nuxt.config.ts
    ├── package-lock.json
    ├── package.json
    ├── pages
    ├── plugins
    ├── public
    ├── server
    ├── tailwind.config.ts
    ├── tsconfig.json
    └── types
Quickstart Nuxt UI
You can check the Nuxt UI official guide here and install it with your favourite package manager, npm or yarn for example.
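With npm, installing Nuxt UI typically boils down to adding the package and registering the '@nuxt/ui' module in nuxt.config.ts (shown a bit further below); double-check the official guide, as the exact steps depend on your version:
npm install @nuxt/ui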
Then the good thing to do is to install some modules, like Tailwind CSS to deal with the CSS, plus the other plugins you need. For example, in my case I have several plugins:
- Algolia to put a search bar in my website
- VueFire to link our Firebase database to our front
- Icons and flags to add more style to our pages
For this you need to edit the nuxt.config.ts file at the project root like this:
// https://nuxt.com/docs/api/configuration/nuxt-config
export default defineNuxtConfig({
devtools: { enabled: true },
modules: [
'@nuxt/ui',
'@nuxtjs/algolia',
'@vueuse/nuxt',
'nuxt-vuefire',
],
runtimeConfig: {
firebase: {
bucketName: 'BUCKET_NAME',
pageSizeMax: 100,
pageSizeDefault: 10,
},
deepL: {
apiKey: 'DEEPL_API_KEY',
},
},
css: [
'@/assets/css/scrollbars.css',
],
algolia: {
apiKey: process.env.ALGOLIA_SEARCH_API_KEY,
applicationId: process.env.ALGOLIA_APPLICATION_ID,
instantSearch: {
theme: 'algolia',
},
},
ui: {
icons: ['mdi', 'flag'],
},
vuefire: {
config: {
},
},
})
As you can see, I used a .env file to avoid exposing the secrets on GitHub; I recommend you do the same.
Nuxt UI consists of the following concepts:
- Base components - simple components to build on top of, see documentation
- Plugins - packages that simplify the usage and customization of third-party libraries, see documentation
- Themes - a configuration file to keep your styling consistent across the app, see documentation
- Composables - functions to help you build an interactive experience for the users, see documentation
We will not go into details on how to code the full front Nuxt UI website, I will just show you the final result here:
As you can see, I used cards to display my Firebase data. Thanks to my friend and mentor Christophe Vilas Boas who helped me a lot with this project.
Continuous Integration
For this continuous integration part, we will use Cypress for end-to-end testing and GitHub Actions with the Cypress plugin in order to speed up our tests and deployment.
This is the cypress.yaml workflow file I used to test our application on each pull request, like good devs do:
name: Cypress Tests
on:
pull_request:
branches: [ "main" ]
workflow_call: # allow this workflow to be called from other workflows
secrets:
GOOGLE_APPLICATION_CREDENTIALS:
required: true
NUXT_FIREBASE_BUCKET_NAME:
required: true
ALGOLIA_API_KEY:
required: true
ALGOLIA_APPLICATION_ID:
required: true
jobs:
cypress_tests:
name: Cypress Tests
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4
- name: Setup Node
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Install Dependencies
run: yarn install --frozen-lockfile
- name: Start Server
env:
GOOGLE_APPLICATION_CREDENTIALS: ${{ secrets.GOOGLE_APPLICATION_CREDENTIALS }}
NUXT_FIREBASE_BUCKET_NAME: ${{ secrets.NUXT_FIREBASE_BUCKET_NAME }}
ALGOLIA_API_KEY: ${{secrets.ALGOLIA_API_KEY}}
ALGOLIA_APPLICATION_ID: ${{secrets.ALGOLIA_APPLICATION_ID}}
run: yarn dev --host & sleep 30
- name: Run Cypress Tests
run: npx cypress run
I will not go into all the tests because it would be very long. I will just drop here some test templates I used for the end-to-end testing of the frontend app part and of the /api backend part:
// cypress/e2e/spec.cy.js
describe('App', () => {
it('home-page', () => {
cy.visit('/', {
failOnStatusCode: true,
})
// A StoryCard should exist
cy.get('[data-test="StoryCard"]').should('exist')
})
it('api', () => {
cy.request({
method: 'GET',
url: '/api',
}).then((response) => {
// Expecting the response status code to be 200
expect(response.status).to.eq(200)
expect(response.body.message).to.eq('OK')
})
})
})
And for the backend part, here is a short version, because there is a lot to test, like all the CRUD operations:
// cypress/e2e/spec.cy.js
describe('Server API', () => {
it('Up', () => {
cy.request({
method: 'GET',
url: '/api',
}).then((response) => {
// Expecting the response status code to be 200
expect(response.status, 'response.status').to.eq(200)
expect(response.body.message, 'response.body.message').to.eq('OK')
})
  })
})
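Both specs use relative URLs (cy.visit('/'), /api), which Cypress resolves against the baseUrl defined in cypress.config.ts; a minimal sketch of that file, assuming the Nuxt dev server runs on port 3000, could be:
// cypress.config.ts
import { defineConfig } from 'cypress'

export default defineConfig({
  e2e: {
    // relative URLs in the specs are resolved against this base
    baseUrl: 'http://localhost:3000',
  },
})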
I think you see the point here: it is not very long to write, but it takes some time to identify all the test case scenarios and implement them without bugs.
Cloudron auto deployment
We will use GitHub Actions to auto-deploy our solution on Cloudron with the YAML workflow below:
name: Cloudron Auto Deployment
on:
push:
branches: [ "main" ]
jobs:
cypress_tests:
name: Cypress Tests
uses: ./.github/workflows/cypress.yml
secrets:
GOOGLE_APPLICATION_CREDENTIALS: ${{ secrets.GOOGLE_APPLICATION_CREDENTIALS }}
NUXT_FIREBASE_BUCKET_NAME: ${{ secrets.NUXT_FIREBASE_BUCKET_NAME }}
ALGOLIA_API_KEY: ${{ secrets.ALGOLIA_API_KEY }}
ALGOLIA_APPLICATION_ID: ${{ secrets.ALGOLIA_APPLICATION_ID }}
build_push_to_registry:
name: Build & Push Docker image to Docker Hub
runs-on: ubuntu-latest
needs: cypress_tests
steps:
- uses: actions/checkout@v4
- name: Build docker image
run: docker build . --file Dockerfile --tag docker.io/fairycorp/nuxt-fabla:gh-${{github.run_number}}
- name: Login to Docker Hub
uses: docker/login-action@v3
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_PASSWORD }}
- name: Push to Docker Hub
run: docker push docker.io/fairycorp/nuxt-fabla:gh-${{github.run_number}}
deploy:
name: Deploy to Cloudron
runs-on: ubuntu-latest
needs: build_push_to_registry
steps:
- uses: actions/checkout@v4
- name: Environment Setup
uses: actions/setup-node@v4
with:
node-version: '20'
- name: Deploy setup
run: npm install -g cloudron
- name: Update App
run: |
update="cloudron update \
--server ${{ secrets.CLOUDRON_SERVER }} \
--token ${{ secrets.CLOUDRON_TOKEN }} \
--app ${{ secrets.CLOUDRON_APP }} \
--image docker.io/fairycorp/nuxt-fabla:gh-${{github.run_number}}"
# Retry up to 5 times (with linear backoff)
NEXT_WAIT_TIME=0
until [ $NEXT_WAIT_TIME -eq 5 ] || $update; do
sleep $(( NEXT_WAIT_TIME++ ))
done
[ $NEXT_WAIT_TIME -lt 5 ]
This should look like the diagram below:
Wrap it up
In this article we have seen how to set up a full stack web project and how to auto-deploy it with Cloudron and GitHub Actions. As I told you earlier, I have not gone into detail about the frontend code itself.
And that's it, you have the complete full stack web pipeline. I hope you have learned a thing or two in this article!