Cloudflare Workers are cool

2025/06/15

Writing a Telegram bot, my first option was to make it a Google Cloud Run function. The bot is stateless and at its core is a simple webhook responder. I quickly jotted down logic that only responded “Hello world” to every message. Even though it was about ten lines of Python, it took a couple of minutes to deploy. I kind of expected that, knowing that Cloud Run functions run in a Docker container and building one can be slow due to the many network requests involved. A Cloud Run function’s response time is also higher for the first request, as it takes time to cold-start the container. Lastly, Docker containers are bulky: a hundred megabytes to run ten lines of code is off-putting.

Cloudflare Workers, on the other hand, boast 0 ms cold starts. That turned out to be true, thanks to the different approach they take. Instead of dedicated clusters running containerized functions, Cloudflare puts functions on edge servers as V8 isolates. The cold start is eliminated because the V8 engine is already running on the server. Deploying functions is also much faster, now that there is no container to build. Granted, a build step is still present if the program requires compilation (e.g. TypeScript), but it is quick and happens on my laptop. Only the code itself needs to be uploaded (a couple of kilobytes), as opposed to the full runtime environment of a Docker container.
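For a sense of scale, the whole “Hello world” responder fits in one small Worker. This is a sketch of how it could look, not the bot’s actual code; it leans on the Telegram Bot API shortcut that lets a webhook answer with a serialized method call in the HTTP response body, and the `TelegramUpdate` type here is a hand-rolled subset of the real Update object:

```typescript
// Minimal shape of a Telegram Update, just the fields we touch.
interface TelegramUpdate {
  message?: { chat: { id: number }; text?: string };
}

const worker = {
  async fetch(request: Request): Promise<Response> {
    // Telegram delivers updates as POST requests; answer anything
    // else (health checks, stray GETs) with a plain 200.
    if (request.method !== "POST") {
      return new Response("OK");
    }

    const update = (await request.json()) as TelegramUpdate;
    if (!update.message) {
      return new Response("OK"); // ignore non-message updates
    }

    // Instead of calling the Bot API, reply by returning a
    // sendMessage method call directly in the webhook response.
    return Response.json({
      method: "sendMessage",
      chat_id: update.message.chat.id,
      text: "Hello world",
    });
  },
};

export default worker;
```

Answering via the webhook response saves an outbound HTTP request per message, which keeps the handler genuinely stateless and fast.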

sarg@thinkpad tgbot$ wrangler deploy --minify

 ⛅️ wrangler 4.20.0
───────────────────
Total Upload: 22.12 KiB / gzip: 7.50 KiB
Worker Startup Time: 14 ms
Uploaded tgbot (6.29 sec)
Deployed tgbot triggers (1.19 sec)