bnjmnrsh / CloudflareWorker-middleman-API

A middleman API boilerplate for Cloudflare Workers


A simple middleman API boilerplate for Cloudflare Workers

A recipe for a lightweight middleman API using Cloudflare Workers, which includes whitelisting origin requests, fetching multiple third-party endpoints, caching, and a single collated JSON response. PRs welcome.

Why?

Keeping secrets, secret 👮🏼‍♂️

With Cloudflare Workers we can leverage environment variables and secrets [1] to keep API keys and other sensitive details out of your HTTP requests, code base, and repos. 🎉

Simplicity & Cost 💰

Cloudflare Workers are 'serverless', written in JavaScript, and are easy to spin up. This cuts out the setup and maintenance overhead of complex tooling. Cloudflare's generous free tier makes them perfect for side projects, GitHub Pages sites, etc.

Speed 🐎

Cloudflare's global network of low-latency servers ensures that your requests are handled by the hub nearest your users. Further, any subsequent third-party API fetch calls your Worker makes travel over Cloudflare's best-in-class global network, resulting in flaming-hot tacos for response times 🌮🌮 (and who doesn't like tacos?).

Anecdotal experience based on flaky broadband in rural Scotland and an even shoddier 3G network suggests that this middleman API approach greatly improves the responsiveness of my apps, especially when collating two or more asynchronous fetch requests. Also, I can now run faster, learn a new language in a day, and have lasers for my eyes... your mileage may vary. 🏃🏼‍♂️ 🕶️

What's included?

Roll your own API from multiple sources 🚪🚪🚪

The aToFetch array provides a mechanism for naming multiple API endpoints, and all the responses are returned as one unified JSON object.
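
As a rough sketch (not the boilerplate's exact code -- the endpoint URLs and helper names below are illustrative, and WB_KEY is assumed to be the Worker's environment variable, exposed as a global in the classic Worker syntax), the idea looks something like this:

// Illustrative only: name each upstream endpoint...
function buildAToFetch(lat, lon) {
	return [
		['CURRENT', `https://api.weatherbit.io/v2.0/current?key=${WB_KEY}&lat=${lat}&lon=${lon}`],
		['DAILY', `https://api.weatherbit.io/v2.0/forecast/daily?key=${WB_KEY}&lat=${lat}&lon=${lon}`],
	]
}

// ...then fetch every named endpoint in parallel and collate the results
// into a single object, keyed by the names given above.
async function collate(aToFetch) {
	const entries = await Promise.all(
		aToFetch.map(async ([name, url]) => {
			const res = await fetch(url)
			return [name, await res.json()]
		})
	)
	return Object.fromEntries(entries)
}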

Hotlink protection ⛓️⛓️

You can also check the IP address of incoming requests; if a request isn't from one of your whitelisted origins (i.e. your app), it's rejected with a 403 response -- no tacos for you, sir/ma'am!
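
A minimal sketch of that check, assuming the whitelist is matched against the request's Origin/Referer headers (the boilerplate may check the connecting address differently; the domains and helper name below are placeholders):

// Illustrative whitelist check -- the domains below are placeholders.
const aAllowedOrigins = ['https://www.example.com', 'https://example.github.io']

function checkOrigin(request) {
	const origin = request.headers.get('Origin') || request.headers.get('Referer') || ''
	const ok = aAllowedOrigins.some((allowed) => origin.startsWith(allowed))
	if (ok) return null // caller proceeds with the real work
	return new Response('Requests are not allowed from this domain -- no tacos for you!', {
		status: 403,
		headers: { 'content-type': 'text/plain;charset=UTF-8' },
	})
}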

Caching 🚤

While Cloudflare Workers do have access to the powerful cache-control features of the Cache API, for Workers using fetch (as we are) Cloudflare offers a simplified caching API.
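
That simplified approach amounts to passing cache hints on the fetch call itself via the cf object; a minimal sketch (the TTL value is illustrative and may differ from what the boilerplate uses):

// Sketch: fetch an upstream endpoint with Cloudflare's simplified cache options.
async function cachedFetch(url) {
	return fetch(url, {
		cf: {
			cacheEverything: true, // cache even if the upstream sends no cache headers
			cacheTtl: 300, // keep the response at the edge for 5 minutes
		},
	})
}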

Errors 🚨

In addition to console logs in the Workers Quick Edit interface, HTTP and upstream API errors are passed through to the response object as handleable { 'error': response } entries for each failed request. A single non-responsive endpoint won't bring the whole thing down.
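
On the consuming side, that means each endpoint can be checked individually; a rough client-side sketch (the helper name is made up):

// Split a collated response into usable results and upstream failures.
function splitResults(collated) {
	const ok = {}
	const failed = {}
	for (const [name, value] of Object.entries(collated)) {
		if (value && value.error) {
			failed[name] = value.error
		} else {
			ok[name] = value
		}
	}
	return { ok, failed }
}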

What's cooking in this recipe? 🍲 🥘

In this recipe, for demonstration, we use the WeatherBit.io APIs, and we keep the API key hidden in an environment variable.

You'll need to:

  1. Have a Cloudflare Workers account
  2. Get an API key from WeatherBit.io
  3. Create a new Worker
  4. Create an environment variable on this Worker named WB_KEY for your shiny new API key (see the sketch after this list for a quick way to confirm it's wired up).
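
A throwaway way to confirm the variable is wired up (not part of the boilerplate; in the classic service-worker syntax the variable is exposed as a global):

// Temporary sanity check: report the key's length without leaking the key itself.
addEventListener('fetch', (event) => {
	event.respondWith(
		new Response(`WB_KEY is set and is ${WB_KEY.length} characters long`, { status: 200 })
	)
})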

Once your worker is published, try opening its URL in your browser: https://YOURWORKER.YOURACCOUNT.workers.dev

You should receive the following response:

403 Not a whitelisted domain.
content-length: 42
content-type: text/plain;charset=UTF-8
Requests are not allowed from this domain -- no tacos for you!

Now set the variable bDBG = true and re-run the request. You should get the following:

200 OK
access-control-allow-headers: *
access-control-allow-methods: GET
access-control-allow-origin: *
content-length: 381
content-type: application/json;charset=UTF-8
{"USEAGE":{"calls_remaining":49756,"historical_calls_count":null,"calls_count":"244","calls_reset_ts":1616457599,"historical_calls_reset_ts":null,"historical_calls_remaining":null},"CURRENT":{"error":"Invalid Parameters supplied."}, "HOURLY":{"error":"Invalid Parameters supplied."}, "DAILY":{"error":"Invalid Parameters supplied."}, "ALERTS":{"error":"Invalid Parameters supplied."}}

The WeatherBit API requires a location in order to do its ☀️ || ⛈ magic. Try adding latitude and longitude values: https://YOURWORKER.YOURACCOUNT.workers.dev/?lat=28.385233&lon=-81.563873

200 OK
access-control-allow-headers: *
access-control-allow-methods: GET
access-control-allow-origin: *
content-length: 381
content-type: application/json;charset=UTF-8
{"USEAGE":{"calls_remaining":XXXX,"historical_calls_count":null,"call_count":"XXX","calls_reset_ts":XXXXXXX,"historical_calls_reset_ts":null,"historical_calls_remaining":null},"CURRENT":{"data": โ›„ โ˜€๏ธโ›…โ˜” } ... }

๐Ÿธ!

Testing

Once your API is live, you probably don't want to set the bDBG boolean variable to true again. However, for a quick check of what your responses look like, you can pop open the console in your browser while on one of your whitelisted domains and run the following:

fetch('https://YOURWORKER.YOURACCOUNT.workers.dev/?lat=28.385233&lon=-81.563873')
	.then(function (response) {
		if (response.ok) {
			// Parse the collated JSON body.
			return response.json()
		}
		return Promise.reject(response)
	})
	.then(function (data) {
		// data is already a parsed object at this point.
		console.log(data)
	})
	.catch(function (error) {
		console.warn(error)
	})

What happens when... 🔥🔥🔥🔥

What happens if my Cloudflare worker uses up its quota?

Burst Rates

At the time of writing, the free Workers plan is subject to a burst limit of 1,000 requests per minute. Beyond this, the Worker will return an HTTP 429 response, which your application should handle gracefully.

Daily Limits

At the time of writing, the free Workers plan is subject to a daily limit of 100,000 requests. How requests beyond 100,000 a day are handled depends on how routes are set up for your Worker. For our purposes, the default 'Fail closed' behaviour will respond as if there is no Worker at all, returning an HTTP 552 status code, which your application should handle gracefully.

Details on limits: Workers Limits
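
Either way, your client should assume the Worker itself may occasionally be unavailable. A rough client-side sketch (the retry count and delay are arbitrary):

// Retry once after a short pause if the Worker is rate limited (429) or erroring.
async function fetchWithRetry(url, retries = 1) {
	const response = await fetch(url)
	if (response.ok) return response.json()
	if (retries > 0 && (response.status === 429 || response.status >= 500)) {
		await new Promise((resolve) => setTimeout(resolve, 2000))
		return fetchWithRetry(url, retries - 1)
	}
	return Promise.reject(response)
}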


How many fetch sub-requests can I make on a CF Worker?

CF caps the number of subrequests [1] at 50, with each redirect counting towards this limit. This means that the total number of subrequests may be greater than the total number of fetch(request) calls in your Worker's code. [2]

What if I go over quota on one of my 3rd party APIs?

Third parties may handle this differently, though rejection will likely come in the form of some flavour of 4XX, with 429 Too Many Requests typical for rate limiting. This example uses the WeatherBit API; when over quota, instead of sending its typical data object, WeatherBit responds with: { "status_code": 429, "status_message": "Your request count (1022) is over the allowed limit of 1000 per day - Upgrade your key, or retry after 848.16666666667 minutes" }

But as you can see, it still returns a valid JSON object. So long as the response is JSON, our example passes it along for the client to handle. In this case, testing for the lack of a data object and/or the presence of a status_code should be sufficient to handle the issue gracefully.
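
A small sketch of that test, using the key names from the WeatherBit responses shown above (the helper name is made up):

// Detect an over-quota (or otherwise broken) WeatherBit entry in the collated response.
function isOverQuota(entry) {
	return !entry || !entry.data || typeof entry.status_code !== 'undefined'
}

// e.g. if (isOverQuota(data.CURRENT)) { /* show a fallback, back off, etc. */ }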

What if I am using sloooowApi.com?

CF states that the typical CPU runtime for a Worker is less than one millisecond, with a cap of 10ms on the free tier and 50ms on the "Bundled" tier [3]. So, long-running compute processes have a hard ceiling. However, CPU time doesn't include time spent waiting on responses: there's no hard limit on the amount of "real time" a Worker may use waiting for a fetch response, as long as the client that made the request remains connected. [4]

Further Reading

If you're new to Cloudflare Workers, these articles are a good place to start:

This project was greatly inspired by these two articles by Chris Ferdinandi, who provides a great intro to the subject:


License: MIT

