Disclaimer - The work you will see in this article is experimental.
Building distributed infrastructure is a pain, let alone building globally distributed infrastructure. Building such infrastructure requires investment, manpower, and skills.
But the direction AWS has taken over the last couple of years has changed everything.
If you really look at how AWS builds its components, you will realize that building an API with minimal latency is no longer a difficult job. In fact, if you carefully understand each component, you will see a pattern which, utilized properly, can enable you to build insanely powerful infrastructure.
There are two components we are going to talk about in this article.
- Lambda@Edge
- DynamoDB global redundancy.
Part 1 will only talk about Lambda@Edge. We will touch on DynamoDB later in Part 2.
So let's start.
With serverless, deploying your functions to multiple regions and enabling regional API endpoints is not something everyone does, but the option is there and you can work with it. Even then, managing regional endpoints and multi-region Lambda deployments is a pain. Management is not simple, and plain Lambda is not the route we are taking here.
There is an aspect of AWS Lambda that we haven't explored much yet: a variant of Lambda called Lambda@Edge.
Lambda@Edge is truly global. It is compute that runs wherever the request lands. The challenge is to understand how Lambda@Edge can be used to run an API.
The general use case of Lambda@Edge is as a router or a redirector. Using it as an engine to host the functions behind our API endpoints is not a pattern we see much across the industry.
A large part of the problem is that not many people know how to structure the code so that it can run global APIs.
This problem is solved in one of the tools I built for Notion API extraction. Here is the link to an HTTP router that can help you build an API on top of Lambda@Edge: https://github.com/maddygoround/notionedge/tree/master/http
Let's dive into the code. We will build a simple global API endpoint, using the above library to achieve our objective.
The library behaves the same way Express behaves. It lets you break out of the standard coding pattern you follow for Lambda and write your endpoints the way you write Express endpoints.
See below:
const api = require('./http')({ provider: "awsEdge" });
const manifest = require("./manifest.json");
const { apis: { endpoints } } = manifest;

for (const route in endpoints) {
  const { file, middlewares } = endpoints[route];
  const middlewaresRequired = (middlewares && middlewares.length > 0)
    ? middlewares.map((middleware) => require(`./${middleware}`))
    : [];
  api.any(route, ...middlewaresRequired, require(`./${file}`));
}

api.any("/", () => {
  return {
    "error": "Route not found!",
    "routes": Object.keys(endpoints)
  };
});

exports.handler = async (event, context, callback) => {
  return await api.run(event, context, callback);
};
Let me walk through the code line by line.
As you can see, this line gives you the API instance; you are telling the HTTP router that you are creating the endpoints on Lambda@Edge:
const api = require('./http')({ provider: "awsEdge" });
We define our routes in the manifest.json file and load it here:
const manifest = require("./manifest.json");
In the manifest you define the endpoints, and you can also define middlewares in case you have any. It looks like this:
{
  "apis": {
    "endpoints": {
      "/v1/pages/:id": {
        "file": "api/routes/pages",
        "middlewares": [
          "api/middlewares/cors"
        ]
      },
      "/v1/tables/:id": {
        "file": "api/routes/tables",
        "middlewares": [
          "api/middlewares/cors"
        ]
      }
    }
  }
}
Here we iterate through the endpoints, resolve their middlewares, and register the routes.
for (const route in endpoints) {
  const { file, middlewares } = endpoints[route];
  const middlewaresRequired = (middlewares && middlewares.length > 0)
    ? middlewares.map((middleware) => require(`./${middleware}`))
    : [];
  api.any(route, ...middlewaresRequired, require(`./${file}`));
}
And finally, we hand the incoming event over to the router and return its result.
exports.handler = async (event, context, callback) => {
  return await api.run(event, context, callback);
};
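For reference, the raw event that CloudFront passes to the handler at the origin-request stage looks roughly like the snippet below, trimmed to the fields that matter for routing (the values are placeholders). The router turns this into the Express-style req/res pair used in the handlers that follow.
{
  "Records": [
    {
      "cf": {
        "config": {
          "distributionDomainName": "d111111abcdef8.cloudfront.net",
          "eventType": "origin-request"
        },
        "request": {
          "clientIp": "203.0.113.10",
          "method": "GET",
          "uri": "/v1/pages/2e22de6b770e4166be301490f6ffd420",
          "querystring": "",
          "headers": {
            "host": [{ "key": "Host", "value": "example.org" }]
          }
        }
      }
    }
  ]
}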
Each route handler is written inside the api/routes folder. The folder structure itself does not matter; just make sure your manifest file follows whatever structure you use, as sketched below.
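With the manifest above, the project layout would look roughly like this (only the paths referenced by the manifest are required; the handler file name and everything else are up to you):
.
├── index.js            // the Lambda@Edge handler shown above (name it whatever you deploy)
├── manifest.json
├── http/               // the HTTP router library
└── api/
    ├── middlewares/
    │   └── cors.js
    └── routes/
        ├── pages.js
        └── tables.js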
The code inside a route looks like this:
module.exports = async (req, res) => {
  try {
    return "hello"; // write your own logic or do a db operation here
  } catch (err) {
    throw err;
  }
};
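Because the routes in the manifest declare path parameters such as /v1/pages/:id, a slightly more realistic handler would read that parameter. The sketch below assumes the router exposes Express-style path parameters on req.params; check the library for the exact request shape.
// api/routes/pages.js: a hypothetical handler for /v1/pages/:id
module.exports = async (req, res) => {
  // Assumption: req.params holds the parsed path parameters (Express-style)
  const { id } = req.params;

  // Replace this with your own logic or a database lookup
  return {
    id,
    fetchedAt: new Date().toISOString()
  };
};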
Your middleware looks no different from the middleware you generally write in Node:
module.exports = async (req, res, next) => {
  // set CORS and caching (SPR / stale-while-revalidate) headers
  res.header('Access-Control-Allow-Origin', '*');
  res.header('Cache-Control', 's-maxage=5, stale-while-revalidate');
  res.header('Access-Control-Allow-Methods', 'GET');
  res.header('Access-Control-Allow-Headers', 'pragma');

  // answer CORS preflight requests directly at the edge
  if (req.method === 'OPTIONS') {
    res.status(200);
    res.end();
    return true;
  }
  next();
};
Deploy your code on Lambda@Edge, configure a CloudFront distribution, and attach your Lambda@Edge function to the "origin-request" trigger.
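If you manage the distribution with CloudFormation, the relevant piece is the LambdaFunctionAssociations entry on the cache behavior. Below is a minimal sketch; the origin, resource names, and ARN are placeholders, and remember that Lambda@Edge functions must live in us-east-1 and be referenced by a specific published version.
Resources:
  EdgeApiDistribution:
    Type: AWS::CloudFront::Distribution
    Properties:
      DistributionConfig:
        Enabled: true
        Origins:
          - Id: placeholder-origin
            DomainName: example.org          # dummy origin; the edge function answers the request itself
            CustomOriginConfig:
              OriginProtocolPolicy: https-only
        DefaultCacheBehavior:
          TargetOriginId: placeholder-origin
          ViewerProtocolPolicy: redirect-to-https
          ForwardedValues:
            QueryString: true
          LambdaFunctionAssociations:
            - EventType: origin-request
              IncludeBody: true
              LambdaFunctionARN: arn:aws:lambda:us-east-1:123456789012:function:edge-api:1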
That's it. Now you have a globally running API endpoint. No matter where in the world you make the call from, the code executes in the region nearest to you.
Here is a sample endpoint - https://notionedge.zoop.sh/v1/pages/2e22de6b770e4166be301490f6ffd420
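You can check this from wherever you are, for example with curl, and compare the total request time from different locations:
curl -s -o /dev/null -w 'total: %{time_total}s\n' https://notionedge.zoop.sh/v1/pages/2e22de6b770e4166be301490f6ffd420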
In Part 2, we will talk about how we can take this further with the introduction of a global database.
By the way, here is an application that contains the complete workflow: https://github.com/maddygoround/notionedge
Reach out in case you have any questions.