
Canonical
on 11 December 2017

Silph Road embraces cloud and containers with Canonical


The Silph Road is the premier grassroots network for Pokémon GO players, offering research, tools, and resources to the largest Pokémon GO community worldwide, with up to 400,000 visitors per day.

Operating a volunteer-run community network with up to 400,000 daily visitors is no easy task, especially in the face of massive, unpredictable demand spikes and with developers spread all over the world. To serve that demand with a distributed volunteer team, The Silph Road’s operations must be cost-effective, flexible, and scalable.

This led the Pokémon GO network first to the cloud, and then to containers; in both cases, Canonical’s technology was the answer.

Highlights

  • Containerisation with Canonical’s Distribution of Kubernetes helped reduce the cloud bill by 40%
  • Autoscaling makes coping with spikes in user demand easy and cost-effective
  • Juju enables The Silph Road to migrate between public clouds with less than two minutes of downtime
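The autoscaling highlight above corresponds to Kubernetes’ HorizontalPodAutoscaler, which adds or removes pod replicas as load rises and falls. A minimal sketch of such a policy is shown below; the deployment name `silph-web`, the replica bounds, and the CPU threshold are illustrative assumptions, not details from the case study:

```yaml
# Scale the (hypothetical) silph-web deployment between 2 and 20 replicas,
# targeting an average CPU utilisation of 70% across pods.
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: silph-web
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: silph-web
  minReplicas: 2
  maxReplicas: 20
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

With a policy like this, capacity grows automatically during a demand spike and shrinks back afterwards, so the cluster is only paid for at the size the traffic actually requires.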

For more information and to view the full case study, please visit the Silph Road Case Study.
