Cloud functions

Cloud functions are an important notion in serverless computing because they provide a way to run code without having to manage servers. This makes it possible to build and deploy applications more quickly and easily, and it can also help to reduce costs.

Cloud functions are typically event-driven. This means that they are only executed when triggered by an event, such as an HTTP request or a message from a queue. This can help to improve performance and reduce costs, as you are only paying for the resources that you use.

Key benefits of using cloud functions

  • Agility: Cloud functions make it easy to quickly deploy and scale applications.
  • Cost-effectiveness: Cloud functions follow a pay-as-you-go model, so you only pay for the resources that you use.
  • Reliability: Cloud functions are highly reliable and scalable.
  • Security: Cloud functions are built with security in mind.

Cloud functions can be used to build a wide variety of applications, including:

  • Web applications
  • Mobile applications
  • APIs
  • Microservices
  • Data processing applications
  • Event-driven applications

Deploy a Simple HTTP-triggered Cloud Function

You're a developer at a startup and want to quickly prototype a backend service without setting up a full server or worrying about storage.

We will create a cloud service (called a cloud function) that can receive HTTP requests and send back a response without setting up a dedicated server.

Azure Functions

We will use Azure Functions to create an HTTP-triggered dummy function that returns a "Hello, World!" message.

First, you need a Microsoft Azure account, as seen in the introduction part. If you haven't already, install the official CLI, then go to the Azure Functions section in your dashboard.

On your local machine, you'll need to set up the Azure Functions development environment. Install the Azure Functions Core Tools and the Azure Functions extension for Visual Studio Code.

Create a new Azure function project

func init MyFunctionProj --python
Navigate to the project directory:
cd MyFunctionProj

Create a new function:

func new --name HelloWorldFunction --template "HTTP trigger" --authlevel "anonymous"

This will generate a directory named HelloWorldFunction with files for your new function. Open __init__.py in this directory and modify the code to:

import logging
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info('Python HTTP trigger function processed a request.')

    return func.HttpResponse("Hello, World!", status_code=200)

Deploy the Azure Function

From within the MyFunctionProj directory (replace MyFunctionApp with the name of a Function App that already exists in your Azure account):

func azure functionapp publish MyFunctionApp

After deployment, the CLI will provide you with the URL of the function. Visit this URL in your browser or use a tool like curl to trigger the function and see the "Hello, World!" message.
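For example, with curl (the URL below is a hypothetical placeholder; use the one printed by the publish command, which typically follows the https://&lt;app-name&gt;.azurewebsites.net/api/&lt;function-name&gt; pattern):

```shell
# Hypothetical URL; substitute the one printed after deployment.
curl https://myfunctionapp.azurewebsites.net/api/HelloWorldFunction
```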

Google Cloud Functions

Same deal here: you need a GCP account, as seen in the introduction part. If you haven't already, install the official CLI, then go to the Cloud Functions section in your dashboard.

Do not forget to authenticate and set up your project using gcloud init before writing the function. You can use the online editor in the Cloud Functions section of the web GUI, or edit the main.py file locally.

main.py
def hello_world(request):
    """Responds to any HTTP request.
    Args:
        request (flask.Request): The request object.
    Returns:
        The response text, or any set of values that can be turned into a
        Response object using `make_response`.
    """
    return 'Hello, World!'

Deploy the Cloud Function

Deploy the function using the following command:

gcloud functions deploy hello_world --runtime python310 --trigger-http --allow-unauthenticated
Alternatively, you can deploy the function from the web GUI.

After deployment, you'll receive a URL for the function. Visit the URL in your browser or use a tool like curl to trigger the function and see the "Hello, World!" message.

Data Storage Integration - build a mini pipeline

Context: Your startup's prototype is now expanding; you need to log user interactions, which means you need to manage a storage service 😅

Our objective here is to store logs of every HTTP request made to your function in a Google Cloud Storage bucket.

Below is a Python script that sends data to a cloud function, which will then be responsible for storing the logs in cloud storage. It is a general solution that uses plain HTTP requests, so the same client works with any provider.

import json
import time
import random
import requests

# Sample log messages
sample_logs = [
    {"level": "INFO", "message": "User logged in", "user_id": 1},
    {"level": "DEBUG", "message": "Query executed", "user_id": 3},
]
error_logs = [
    {"level": "ERROR", "message": "Failed to connect to database", "user_id": 2},
    {"level": "ERROR", "message": "Permission denied", "user_id": 4},
]

# Cloud function endpoint
CLOUD_FUNCTION_ENDPOINT = "YOUR_CLOUD_FUNCTION_URL"

def send_log_to_cloud_function(log):
    response = requests.post(CLOUD_FUNCTION_ENDPOINT, json=log)
    if response.status_code == 200:
        print(f"Sent log: {log}")
    else:
        print(f"Failed to send log: {log}. Status code: {response.status_code}")

def simulate_log_stream():
    while True:
        if random.random() < 0.1:
            log = random.choice(error_logs)
        else:
            log = random.choice(sample_logs)

        send_log_to_cloud_function(log)
        time.sleep(random.uniform(0.5, 3))

if __name__ == "__main__":
    simulate_log_stream()

Create a Google Cloud Storage Bucket

Navigate to the Google Cloud Storage section in the GCP Console and create a new bucket to store the logs. You can also do this with the gcloud command 🤓
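For instance, a bucket can be created from the CLI like this (the bucket name and location are hypothetical placeholders; bucket names must be globally unique):

```shell
# Hypothetical bucket name and region; adjust to your project.
gcloud storage buckets create gs://my-startup-logs --location=us-central1
```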

Here is a gcloud cheat sheet if you want to handle the GCP CLI like a pro.

Write the base template of the Cloud Function

The function below is a simple template to establish the communication between our two services, Cloud Functions and Cloud Storage. You can test this first task before writing the next part!

def log_request(request):
    from google.cloud import storage

    # Set up the storage client and bucket
    client = storage.Client()
    bucket = client.get_bucket('YOUR_BUCKET_NAME')

    # Extract data from the request
    log_data = f"{request.remote_addr} - {request.method} - {request.url}\n"

    # Write data to a blob in GCS (note: this overwrites logs.txt on every
    # request; use per-request blob names if you need to keep history)
    blob = bucket.blob('logs.txt')
    blob.upload_from_string(log_data, content_type='text/plain')

    return 'Logged request!', 200
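You can preview what this template writes without touching GCS by faking the request object; the FakeRequest class and its values below are hypothetical stand-ins for flask.Request:

```python
# Hypothetical stand-in for flask.Request, just to preview the log line
# format produced by log_request above (no GCS call is made here).
class FakeRequest:
    remote_addr = "203.0.113.5"
    method = "POST"
    url = "http://example.com/log"

log_line = f"{FakeRequest.remote_addr} - {FakeRequest.method} - {FakeRequest.url}\n"
print(log_line, end="")
```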

Deploy the Cloud Function

After testing the function like we have done in the previous part you can deploy it with this bash command here :

gcloud functions deploy log_request --runtime python310 --trigger-http --allow-unauthenticated

Write the cloud function with event trigger

For the cloud function part, we need to set up a function (here on GCP, but you can do the same on AWS Lambda, Azure Functions, etc.) that accepts these logs and writes them to a different cloud storage bucket depending on whether the level is ERROR or not.

from google.cloud import storage
import flask
import json

def store_log(request: flask.Request):
    log_data = request.get_json(silent=True)
    if log_data is None:
        return 'Invalid JSON payload!', 400

    # Set up the storage client
    client = storage.Client()

    # Determine the bucket based on log level
    if log_data.get('level') == 'ERROR':
        bucket_name = 'YOUR_ERROR_BUCKET_NAME'
    else:
        bucket_name = 'YOUR_GENERAL_BUCKET_NAME'
    bucket = client.get_bucket(bucket_name)

    # Write data to a blob in GCS (again, logs.txt is overwritten on each
    # call; real pipelines should use unique blob names)
    blob = bucket.blob('logs.txt')
    blob.upload_from_string(json.dumps(log_data) + "\n", content_type='text/plain')

    return 'Logged request!', 200
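The routing rule above can be isolated and tested on its own; the bucket names here are hypothetical placeholders:

```python
# Sketch of the routing rule in store_log: ERROR-level logs go to a
# separate bucket. Bucket names are hypothetical placeholders.
def pick_bucket(log_data):
    if log_data.get('level') == 'ERROR':
        return 'my-error-logs-bucket'
    return 'my-general-logs-bucket'

print(pick_bucket({"level": "ERROR", "message": "Permission denied"}))
print(pick_bucket({"level": "INFO", "message": "User logged in"}))
```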

Replace YOUR_ERROR_BUCKET_NAME and YOUR_GENERAL_BUCKET_NAME with the actual names of your buckets for error logs and other logs, respectively.

This function will now store logs with the level "ERROR" in a separate bucket from the other logs 🧐

Create a Serverless API

Context: Your storage service is not advanced enough; you need to add a "real database" 😅

Here we will create a serverless API that can perform CRUD operations on user data, using Google Cloud Functions and Firestore to illustrate the pattern.

Set Up Firestore

Firestore is a good choice for a NoSQL database because it is scalable, fast, flexible, secure, and fully managed. It can also be easily integrated with cloud functions to build complex and powerful applications.

Why Firestore is serverless

In more detail, Firestore is a good fit because it is:

  • Scalable: Firestore can scale horizontally to handle large amounts of data and traffic.
  • Fast: Firestore offers real-time data synchronization and low-latency reads and writes.
  • Flexible: Firestore supports a variety of data types and schema designs.
  • Secure: Firestore encrypts data at rest and in transit.
  • Fully managed: Google manages Firestore, so you don't have to worry about provisioning, scaling, or maintenance.

  1. Navigate to the Firestore section in the GCP Console.
  2. Create a new Firestore database.
  3. Create a collection named users.

Write the cloud function

from google.cloud import firestore
import json

db = firestore.Client()

def crud_api(request):
    # Default to an empty dict so GET requests without a body still work
    request_json = request.get_json(silent=True) or {}
    users_ref = db.collection('users')

    if request.method == 'GET':
        users = users_ref.stream()
        return json.dumps([{doc.id: doc.to_dict()} for doc in users])

    elif request.method == 'POST':
        user_data = request_json.get('data')
        users_ref.add(user_data)
        return 'User added!', 200

    elif request.method == 'PUT':
        user_id = request_json.get('id')
        user_data = request_json.get('data')
        users_ref.document(user_id).set(user_data)
        return 'User updated!', 200

    elif request.method == 'DELETE':
        user_id = request_json.get('id')
        users_ref.document(user_id).delete()
        return 'User deleted!', 200

    else:
        return 'Method not supported!', 400
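If you want to check the dispatch logic before deploying, here is a local sketch with a plain dict standing in for the Firestore users collection (names and generated IDs are hypothetical; in the real function, Firestore's add() generates its own document IDs):

```python
# Local sketch of the CRUD dispatch, with a plain dict standing in for
# the Firestore 'users' collection so routing can be tested without
# GCP credentials. Names and IDs are hypothetical.
import json

users = {}

def crud(method, payload=None):
    payload = payload or {}
    if method == 'GET':
        return json.dumps(users)
    elif method == 'POST':
        user_id = f"user{len(users) + 1}"   # Firestore's add() generates IDs
        users[user_id] = payload.get('data')
        return 'User added!', 200
    elif method == 'PUT':
        users[payload['id']] = payload.get('data')
        return 'User updated!', 200
    elif method == 'DELETE':
        users.pop(payload['id'], None)
        return 'User deleted!', 200
    return 'Method not supported!', 400

crud('POST', {'data': {'name': 'Ada'}})
crud('PUT', {'id': 'user1', 'data': {'name': 'Ada Lovelace'}})
print(crud('GET'))
```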

Do not forget to edit the requirements.txt file if you have not yet 🤓
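A minimal requirements.txt for this function would just list the Firestore client library:

```
google-cloud-firestore
```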

Deploy the Cloud Function

Deploy the function using the following command:

gcloud functions deploy crud_api --runtime python310 --trigger-http --allow-unauthenticated

Install Azure Functions Core Tools on Linux

  1. Log in to the Azure Portal
  2. Create a minimal Linux VM with the Azure Portal
  3. SSH into the Linux VM
  4. Install Azure Functions Core Tools by executing the following commands one by one
    sudo apt-get update -y
    sudo apt-get upgrade -y
    wget -q https://packages.microsoft.com/config/ubuntu/18.04/packages-microsoft-prod.deb -O packages-microsoft-prod.deb
    sudo dpkg -i packages-microsoft-prod.deb
    sudo apt-get update
    sudo apt-get install apt-transport-https -y
    sudo apt-get install dotnet-sdk-2.1 -y
    sudo apt-get install azure-functions-core-tools -y
    
  5. Test that the Azure Functions Core Tools are properly installed by running the following command in your VM's terminal
    func
    

Create and Run an Azure Function Locally Using Azure Functions Core Tools

  1. Create the Azure Functions project: start the creation of the project by issuing the following command:

    func init
    
    When prompted, select dotnet as the project type and press Enter. Wait a moment until it finishes. You can select the programming language you want; the official example is in C# 🤓
  2. Create the Azure Function: in the command prompt, issue the following command:

    func new
    
    When prompted, select HttpTrigger from the list and press Enter. Wait for the prompt to come back. Check that the function was created by listing the files inside your directory.
  3. Run the Function App from the command line:

    func start
    
    The function app will build and start and you will see the URL of this function.
  4. Invoke the function: you can use your favorite HTTP request tool (curl, wget, Postman...) to invoke the function.

You have set up a public cloud function (anybody with this URL can call it) on Microsoft Azure 🥳

Deploy Azure function

For local development and deployment with Visual Studio Code, you can follow the official tutorial and instantiate a function in your programming language here

Best practices

The exercise provided is a great starting point for understanding how to integrate serverless functions with databases and perform CRUD operations. However, in real-world scenarios, allowing unauthenticated access to perform CRUD operations on your database is a significant security risk. Here's why:

  • Data Integrity and Privacy: Without authentication and authorization, anyone can create, read, update, or delete records in your database. This can lead to data corruption, loss of data, or unauthorized access to sensitive information.
  • Data Tampering: Malicious actors can intentionally insert malicious or junk data, modify existing records, or even delete entire datasets.
  • Resource Exhaustion: Without any form of rate limiting or authentication, attackers can flood your serverless functions with requests, leading to increased costs and potential denial of service. Remember, with serverless architectures, you often pay per invocation.
  • Regulatory and Compliance Issues: Many industries have regulations and standards around data access and protection (e.g., GDPR, HIPAA). Allowing unauthenticated access can lead to non-compliance, which can result in hefty fines and legal actions.
  • Loss of Trust: If customers or users find out that their data can be accessed or modified by anyone, they will lose trust in your service or application. This can lead to a loss of users or customers and can harm your brand's reputation.
  • No Audit Trail: Without authentication, it's challenging to maintain an audit trail of who accessed or modified the data. An audit trail is crucial for understanding data breaches, debugging issues, and maintaining data accountability.
  • Complexity in Data Recovery: In the event of data tampering or deletion, recovering the original data can be complex, especially if backups were not regularly maintained or if the extent of the tampering is vast.

For these reasons, it's crucial to implement authentication and authorization mechanisms when exposing any form of data access or modification to the outside world. This ensures that only authorized users can perform specific actions, protecting the integrity and privacy of your data. In addition, it's also a good practice to implement rate limiting, logging, monitoring, and regular backups to further secure your applications and data.
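As the simplest possible flavor of such a gate, a function can reject requests that lack a shared secret before touching the database. This is only a sketch (the header name and key handling are hypothetical); prefer managed offerings such as IAM, Firebase Auth, or an API gateway in production:

```python
# Minimal shared-secret check, as a sketch only. The header name and
# key value are hypothetical; managed auth (IAM, Firebase Auth, an API
# gateway) is preferable in production.
def is_authorized(headers, expected_key):
    return headers.get('X-API-Key') == expected_key

# Inside a function handler you would do something like:
# if not is_authorized(request.headers, os.environ['API_KEY']):
#     return 'Unauthorized', 401

print(is_authorized({'X-API-Key': 's3cret'}, 's3cret'))   # True
print(is_authorized({}, 's3cret'))                        # False
```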

Conclusion

Cloud functions are a powerful and flexible way to build and deploy applications. By using cloud functions, developers can focus on writing code without having to worry about managing servers. This can lead to faster development times, lower costs, and more reliable and secure applications.

Now let's summarize what we know:

  • Basics of serverless architecture and why it differs from a monolithic architecture.
  • Understanding of basic HTTP-triggered functions.
  • Linking HTTP-triggered functions to a serverless database.
  • Familiarity with the Azure Functions and Google Cloud Functions deployment process and CLI.