Work is burning, yo.

Since 2017, in my spare time (ha!), I’ve been helping my colleague Eric Berger run a weather forecast site in the Houston area, Space City Weather. This is an interesting hosting challenge – on a typical day, SCW gets 20,000–30,000 page views to 10,000–15,000 unique visitors, a relatively easy load to manage with minimal work. But during severe weather events, especially in the summer when hurricanes lurk in the Gulf of Mexico, the site’s traffic can reach more than a million page views in 12 hours. This level of traffic requires a little more preparation to manage.
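A quick bit of arithmetic puts those two traffic regimes in perspective. The numbers below come straight from the article (25,000 daily page views as a typical midpoint, a million views in 12 hours during a severe weather event), and they only show averages; real surge traffic is far spikier than the mean:

```python
# Back-of-envelope traffic math for the two regimes described above.
SECONDS_PER_DAY = 24 * 3600

# Typical day: ~25,000 page views (midpoint of the 20,000-30,000 range).
typical_rps = 25_000 / SECONDS_PER_DAY

# Severe-weather event: ~1,000,000 page views in 12 hours.
surge_rps = 1_000_000 / (12 * 3600)

print(f"typical: {typical_rps:.2f} views/sec")
print(f"surge:   {surge_rps:.2f} views/sec")
print(f"ratio:   {surge_rps / typical_rps:.0f}x")
```

Even the surge average (roughly 23 views per second) is modest by web server standards; the real challenge is the peak concurrency and cache churn packed inside those 12 hours, not the mean rate.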

Hey, that's Space City Weather!

Lee Hutchinson

For a very long time, I ran SCW on a self-hosted backend stack of HAProxy for SSL termination, Varnish for in-memory caching, and Nginx as the actual web server application, all fronted by Cloudflare to absorb most of the load. (I wrote about this setup at length on Ars a few years ago for those who want more details.) The stack was fully battle-tested and ready to absorb whatever traffic we threw at it, but it was also annoyingly complex, with multiple cache layers to contend with, and that complexity made troubleshooting more difficult than I would have liked.
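The article doesn't reproduce the old configs, but the chain it describes (Cloudflare in front, HAProxy terminating TLS, Varnish caching, Nginx serving WordPress) can be sketched roughly as follows; every port, path, and certificate name here is an illustrative assumption, not the author's actual setup:

```
# haproxy.cfg (sketch) -- terminate TLS at HAProxy, pass plain HTTP to Varnish
frontend https_in
    bind *:443 ssl crt /etc/haproxy/certs/site.pem
    default_backend varnish

backend varnish
    server cache1 127.0.0.1:6081 check

# Varnish (default.vcl) would then point at Nginx on, say, 127.0.0.1:8080,
# and Nginx would serve the WordPress PHP application itself.
```

Each hop in that chain is another place a stale object can hide, which is the troubleshooting pain the paragraph above alludes to.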

So, during winter break two years ago, I took the opportunity to remove some of the complexity and reduce the hosting stack to a single monolithic web server application: OpenLiteSpeed.

Out with the old, in with the new

I didn’t know much about OpenLiteSpeed (“OLS” to its friends) except that it came up a lot in WordPress hosting discussions, and I started to wonder how well it would run SCW’s WordPress install. OLS has received a lot of praise for its integrated caching, especially where WordPress is involved; it was claimed to be quite fast compared to Nginx; and to be honest, after running the same stack for five years, I was interested in switching things up. OpenLiteSpeed it was!

OLS admin console showing vhosts. This is from my personal web server, not the Space City Weather server, but it looks the same. If you want more details about the OLS configuration I used, check out my blog. Yes, I still have a blog. I'm old.

Lee Hutchinson

The first major adjustment to get used to was that OLS is configured primarily through an actual GUI, with all the annoying potential problems that brings (another port to secure, another password to manage, another public point of access to the backend, more PHP resources dedicated solely to the admin interface). But the GUI was fast, and it mostly exposed the settings that needed exposing. Translating the existing Nginx WordPress configuration into OLS-speak was a decent adjustment job, and I eventually settled on Cloudflare Tunnels as a reasonable way to keep the admin console private and relatively secure.
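A Cloudflare Tunnel for the admin console would look something like this sketch; the hostname is hypothetical, and the only non-assumed details are that the OLS admin console listens on port 7080 by default and ships with a self-signed certificate:

```
# ~/.cloudflared/config.yml (sketch) -- expose the OLS admin console only
# through a Cloudflare Tunnel, never directly on the public internet.
tunnel: <tunnel-id>
credentials-file: /root/.cloudflared/<tunnel-id>.json

ingress:
  # OLS's WebAdmin console listens on 7080 by default.
  - hostname: admin.example.com
    service: https://localhost:7080
    originRequest:
      noTLSVerify: true   # OLS uses a self-signed cert out of the box
  # Everything else gets a 404.
  - service: http_status:404
```

With the tunnel in place, port 7080 can stay firewalled off entirely, and access control happens at Cloudflare's edge instead of on the box.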

Just get a taste of the options that await in the LiteSpeed Cache WordPress plugin.

Lee Hutchinson

The other major adjustment was OLS's LiteSpeed Cache plugin for WordPress, the main tool through which WordPress itself configures how it interacts with OLS and its internal cache. It's a substantial plugin, with pages and pages of configurable options, many of them related to driving the Quic.Cloud CDN service (operated by LiteSpeed Technologies, the company that created OpenLiteSpeed and its paid sibling, LiteSpeed).

Getting the most out of WordPress on OLS meant spending some time with the plugin, figuring out which options would help and which would hurt. (Perhaps unsurprisingly, there are plenty of ways to get yourself into silly problems by being too aggressive with caching.) Fortunately, Space City Weather makes an excellent testing ground for web servers, a nicely busy site with a largely cache-friendly workload, and so I came up with a starting configuration I was quite happy with, spoke the ancient sacred words of the ritual, and flipped the breaker switch. HAProxy, Varnish, and Nginx shut down, and OLS took the load.