
Recovering a Dead Nginx, Mysql, PHP WordPress website

July 10, 2021 by Simon

(laptrinhx.com – do not steal this post)

In early 2021 www.fearby.com died and it was all my fault. Here is my breakdown of the events.

On the 4th of January 2021, I woke to see my website not loading.

Cloudflare reporting my website was unavailable.

https://www.fearby.com had 2 servers.

  • Web server (www.fearby.com)
  • Database server (db.fearby.com)

Upon investigating why my website was down, I found that WordPress could not talk to the database server. I tried to log into the database server via SSH and it failed (no response). I tried logging into the db.fearby.com server via the root console and that did not work either.

I was locked out of my own server, and from memory this was caused by me playing with fail2ban and other system auditing tools a few months earlier.

I tried restoring the db.fearby.com server from backups (one at a time). I had the last 7 days as individual backups. I had no luck; all of my backups were no good (I had sat on the problem too long).

View of the last 7 days of backups.

In mid-2020 I had locked myself out of db.fearby.com (SSH and root console) because I set up aggressive fail2ban, AIDE intrusion detection and firewall rules. I could no longer access the db.fearby.com server via SSH or the root console. The database server was still operational and I foolishly left it running (with no access).

I did not know how to (or have enough time to) reset the root password on the Debian server, and I could not think of a fix to restore my website. I should have reset the root password; it is easy to do thanks to a post from Janne Roustemaa – How to reset root password on cloud server.

A few months ago I finally found out how to reset the root password of a Debian server.

How to Reset the Root Password on a Debian server on UpCloud

I followed Janne Roustemaa’s guide here: How to reset root password on cloud server.
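For reference, once the old system disk is attached and mounted on a helper server (which is exactly what I do below), the reset itself boils down to a couple of commands (a sketch only; the device name and mount point are assumptions based on my setup):

# mount the old server's system disk on the helper server
mount /dev/vdb1 /mnt
# set a new root password inside the old system
chroot /mnt /usr/bin/passwd root
umount /mnt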

I logged into the UpCloud Hub.

UpCloud Hub (login page)

I shut down db.fearby.com from the UpCloud Dashboard.

I created a backup of db.fearby.com (just in case). I upgraded my backup plan from 1 backup every 7 days to daily backups for 7 days and weekly backups for 1 month.

Database Backup plan selection

Deploy a temporary server to reset the root password

I deployed a new temporary (cheap) server alongside the dead server in Chicago.

Get $25 free credit on UpCloud and deploy your own server: Use this link to get $25 credit (new UpCloud users only).

Deploy $5/m server in Chicago.

I called the server “recovery.fearby.com” and set Debian 9 as the Operating System (same as the dead server).

Name: Recovery.fearby.com, Debian 9

I added the command “shutdown -h 1” to the Initialization script to ensure the server shut down after it was deployed.
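For reference, the whole initialization script can be as small as this (a minimal sketch; it just runs the shutdown command mentioned above on first boot):

#!/bin/bash
# halt this server shortly after it is deployed so its disk can be detached safely
shutdown -h 1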

I can only add the disk to the new server if the old db.fearby.com server is shut down.

Deploy server

I shut down db.fearby.com and recovery.fearby.com servers.

Shutting down servers GUI

Both servers have shut down

Shutdown

Detach the disk from db.fearby.com

I detached the disk from the db.fearby.com server.

In the UpCloud Dashboard, I opened db.fearby.com and clicked the Resize tab

Resize Disk

I clicked the Detach button.

Detach Disk

I clicked Continue

Continue

Attach db.fearby.com disk to recovery.fearby.com

Now I attached this disk as a secondary disk onto the recovery.fearby.com server.

In the UpCloud hub I clicked Servers then selected the recovery.fearby.com server, then clicked the Resize Tab

Resize recovery.fearby.com

I scrolled down and clicked Attach existing storage

Attach Disk

Attach existing storage dialogue

Attach device Dialog

I selected the system disk from db.fearby.com (that I detached earlier)

Attach Disk

I clicked the Add a storage device button

Attach Disk dialog

Now I have taken the storage from db.fearby.com and attached it to recovery.fearby.com as a secondary disk.

2 Disks attached

Starting the recovery.fearby.com server

I started the recovery.fearby.com server by clicking Start

Start

The server is starting

Starting

When the server started, I obtained its IP and connected to it with MobaXTerm.

MobaXTerm SSH Client.

Now I can access the db.fearby.com disk.

I do not want to reset the root password until I undelete the files I need.

Viewing Disks

I ran this command to verify the attached disks.

lsblk

Two disks were visible

2 disks were visible.

Alternatively, I can view partitions with the following command

cat /proc/partitions
major minor  #blocks  name
 254        0   26214400 vda
 254        1   26213376 vda1
 254       16   52428800 vdb
 254       17   52427776 vdb1

I can see partition data with these commands

recovery.fearby.com disk: /dev/vda1

fdisk -l /dev/vda1
Disk /dev/vda1: 25 GiB, 26842497024 bytes, 52426752 sectors
Units: sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes

db.fearby.com disk: /dev/vdb1

fdisk -l /dev/vdb1
Disk /dev/vdb1: 50 GiB, 53686042624 bytes, 104855552 sectors
Units: sectors of 1 * 512 = 512 bytes
Sector size (logical/physical): 512 bytes / 512 bytes
I/O size (minimum/optimal): 512 bytes / 512 bytes

I ran this command to mount the second disk

mount /dev/vdb1 /mnt

I looked in the “/mnt/Backup” folder as this was where daily MySQL dumps were being saved to on the old system.

The folder was empty. It looks like the file system was corrupted.

Finding Deleted Files

Assuming the file system was corrupt, I installed TestDisk (which includes PhotoRec) as I wanted to recover the deleted SQL dump backups from the backup folder.

sudo apt-get update
sudo apt-get install testdisk

I confirmed PhotoRec (part of the testdisk package) was installed

photorec --version

I ran PhotoRec and passed in the disk as a parameter

sudo photorec /dev/vdb1

I selected the Disk /dev/vdb1 – 53GB and pressed enter

I selected ext4 partition and pressed enter (not whole disk)

I selected ext2/ext3/ext4 filesystem and pressed enter

I selected Free to only scan in unallocated space and pressed enter

When asked to choose the recovery location to restore files to, I selected /recovery (on the recovery.fearby.com disk, not the db.fearby.com disk).

I just realized that the recovery.fearby.com disk is 25GB and the db.fearby.com disk is 50GB; let's hope I do not have more than 25GB of deleted files (or I will have to delete recovery.fearby.com, deploy a 100GB system and run the undelete again).

Recovered Files

After 20 minutes 32,809 files were recovered to /recovery/recup_dir

I pressed CTRL+C to exit photorec

I changed directory to /recovery

cd /recovery

I counted the recovered files

find . -type f | wc -l
>32810

Recovered Files were placed in sub folders.

drwxr-xr-x 68 root root  4096 Jan 19 13:28 .
drwxr-xr-x 23 root root  4096 Jan 19 13:28 ..
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.1
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.10
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.11
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.12
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.13
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.14
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.15
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.16
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.17
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.18
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.19
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.2
drwxr-xr-x  2 root root 36864 Jan 19 13:22 recup_dir.20
drwxr-xr-x  2 root root 24576 Jan 19 13:22 recup_dir.21
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.22
drwxr-xr-x  2 root root 24576 Jan 19 13:22 recup_dir.23
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.24
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.25
drwxr-xr-x  2 root root 32768 Jan 19 13:22 recup_dir.26
drwxr-xr-x  2 root root 32768 Jan 19 13:22 recup_dir.27
drwxr-xr-x  2 root root 36864 Jan 19 13:22 recup_dir.28
drwxr-xr-x  2 root root 32768 Jan 19 13:22 recup_dir.29
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.3
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.30
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.31
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.32
drwxr-xr-x  2 root root 20480 Jan 19 13:22 recup_dir.33
drwxr-xr-x  2 root root 36864 Jan 19 13:22 recup_dir.34
drwxr-xr-x  2 root root 36864 Jan 19 13:22 recup_dir.35
drwxr-xr-x  2 root root 36864 Jan 19 13:22 recup_dir.36
drwxr-xr-x  2 root root 32768 Jan 19 13:22 recup_dir.37
drwxr-xr-x  2 root root 20480 Jan 19 13:23 recup_dir.38
drwxr-xr-x  2 root root 20480 Jan 19 13:23 recup_dir.39
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.4
drwxr-xr-x  2 root root 20480 Jan 19 13:23 recup_dir.40
drwxr-xr-x  2 root root 20480 Jan 19 13:23 recup_dir.41
drwxr-xr-x  2 root root 20480 Jan 19 13:23 recup_dir.42
drwxr-xr-x  2 root root 20480 Jan 19 13:23 recup_dir.43
drwxr-xr-x  2 root root 20480 Jan 19 13:23 recup_dir.44
drwxr-xr-x  2 root root 20480 Jan 19 13:23 recup_dir.45
drwxr-xr-x  2 root root 20480 Jan 19 13:23 recup_dir.46
drwxr-xr-x  2 root root 20480 Jan 19 13:24 recup_dir.47
drwxr-xr-x  2 root root 20480 Jan 19 13:24 recup_dir.48
drwxr-xr-x  2 root root 20480 Jan 19 13:24 recup_dir.49
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.5
drwxr-xr-x  2 root root 20480 Jan 19 13:24 recup_dir.50
drwxr-xr-x  2 root root 20480 Jan 19 13:24 recup_dir.51
drwxr-xr-x  2 root root 20480 Jan 19 13:25 recup_dir.52
drwxr-xr-x  2 root root 20480 Jan 19 13:25 recup_dir.53
drwxr-xr-x  2 root root 20480 Jan 19 13:25 recup_dir.54
drwxr-xr-x  2 root root 20480 Jan 19 13:25 recup_dir.55
drwxr-xr-x  2 root root 20480 Jan 19 13:26 recup_dir.56
drwxr-xr-x  2 root root 20480 Jan 19 13:26 recup_dir.57
drwxr-xr-x  2 root root 24576 Jan 19 13:26 recup_dir.58
drwxr-xr-x  2 root root 36864 Jan 19 13:26 recup_dir.59
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.6
drwxr-xr-x  2 root root 20480 Jan 19 13:26 recup_dir.60
drwxr-xr-x  2 root root 20480 Jan 19 13:26 recup_dir.61
drwxr-xr-x  2 root root 20480 Jan 19 13:27 recup_dir.62
drwxr-xr-x  2 root root 20480 Jan 19 13:27 recup_dir.63
drwxr-xr-x  2 root root 20480 Jan 19 13:27 recup_dir.64
drwxr-xr-x  2 root root 20480 Jan 19 13:28 recup_dir.65
drwxr-xr-x  2 root root 12288 Jan 19 13:28 recup_dir.66
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.7
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.8
drwxr-xr-x  2 root root 20480 Jan 19 13:21 recup_dir.9
66 folders were recovered

I calculated the folder size

sudo  du -sh /recovery
13G     /recovery

I immediately started downloading ALL recovered files (13GB) to my local PC with MobaXTerm

On recovery.fearby.com I searched for any trace of recovered *.sql files that I was dumping daily via cron jobs

find /recovery -name "*.sql"

Many pages of recovered sql files were listed

/recovery/recup_dir.49/f36487168.sql
/recovery/recup_dir.49/f35110912.sql
/recovery/recup_dir.49/f36667392.sql
/recovery/recup_dir.49/f36995072.sql
/recovery/recup_dir.49/f35667968.sql
/recovery/recup_dir.49/f35078144.sql
/recovery/recup_dir.49/f37535744.sql
/recovery/recup_dir.49/f36913152.sql
/recovery/recup_dir.49/f33996800.sql
/recovery/recup_dir.49/f36421632.sql
/recovery/recup_dir.49/f35061760.sql
/recovery/recup_dir.49/f36143104.sql
/recovery/recup_dir.49/f36618240.sql
/recovery/recup_dir.49/f34979840.sql
/recovery/recup_dir.49/f37273600.sql
/recovery/recup_dir.49/f35995648.sql
/recovery/recup_dir.49/f36241408.sql
/recovery/recup_dir.49/f37732360.sql
/recovery/recup_dir.49/f34603008.sql
/recovery/recup_dir.49/f33980416.sql
/recovery/recup_dir.1/f0452896.sql
/recovery/recup_dir.1/f0437184.sql
/recovery/recup_dir.1/f0211232.sql
/recovery/recup_dir.17/f16547840.sql
/recovery/recup_dir.17/f16252928.sql
/recovery/recup_dir.17/f15122432.sql
/recovery/recup_dir.17/f17159720.sql
/recovery/recup_dir.17/f15089664.sql
/recovery/recup_dir.17/f15958016.sql
/recovery/recup_dir.17/f15761408.sql
/recovery/recup_dir.57/f69582848.sql
/recovery/recup_dir.57/f69533696.sql
/recovery/recup_dir.57/f69173248.sql
/recovery/recup_dir.57/f68321280.sql
/recovery/recup_dir.57/f70483968.sql
/recovery/recup_dir.57/f70746112.sql
/recovery/recup_dir.57/f68730880.sql
/recovery/recup_dir.57/f67862528.sql
/recovery/recup_dir.57/f70123520.sql
/recovery/recup_dir.57/f68337664.sql
/recovery/recup_dir.57/f70172672.sql
/recovery/recup_dir.57/f71057408.sql
/recovery/recup_dir.57/f68796416.sql
/recovery/recup_dir.57/f70533120.sql
/recovery/recup_dir.57/f69419008.sql
/recovery/recup_dir.57/f68239360.sql
/recovery/recup_dir.57/f69779456.sql
/recovery/recup_dir.57/f68255744.sql
/recovery/recup_dir.57/f67764224.sql
/recovery/recup_dir.57/f71204864.sql
/recovery/recup_dir.57/f70336512.sql
/recovery/recup_dir.57/f68501504.sql
/recovery/recup_dir.57/f67944448.sql
/recovery/recup_dir.50/f39059456.sql
/recovery/recup_dir.50/f38518784.sql
/recovery/recup_dir.50/f40206336.sql
/recovery/recup_dir.50/f40927232.sql
/recovery/recup_dir.50/f39485440.sql
/recovery/recup_dir.50/f39092224.sql
/recovery/recup_dir.50/f40861696.sql
/recovery/recup_dir.50/f39731200.sql
/recovery/recup_dir.50/f40337408.sql
/recovery/recup_dir.50/f38862848.sql
/recovery/recup_dir.50/f41664512.sql
/recovery/recup_dir.50/f41074688.sql
/recovery/recup_dir.50/f40828928.sql
/recovery/recup_dir.50/f41713664.sql
/recovery/recup_dir.50/f38092800.sql
/recovery/recup_dir.50/f39878656.sql
/recovery/recup_dir.50/f38305792.sql
/recovery/recup_dir.50/f38830080.sql
/recovery/recup_dir.50/f39534592.sql
/recovery/recup_dir.50/f39813120.sql
/recovery/recup_dir.50/f40435712.sql
/recovery/recup_dir.50/f41467904.sql
/recovery/recup_dir.50/f37901728.sql
/recovery/recup_dir.50/f38682624.sql
/recovery/recup_dir.50/f38191104.sql
/recovery/recup_dir.50/f38174720.sql
/recovery/recup_dir.50/f40878080.sql
//and many more

It would take a few hours to download 32,000 files from the other side of the world. Next time I deploy fearby.com, I will deploy it to Sydney, Australia.

It looks like MobaXTerm does not copy to the destination that I selected when dragging and dropping files. Being impatient, I scanned my system for a file that MobaXTerm had copied; MobaXTerm saves downloads to “%Documents%\MobaXterm\splash\tmp\dragdrop\”.
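In hindsight, bundling the recovered files into a single archive on the server before downloading would probably have been much quicker than dragging 32,000 individual files (a sketch, assuming there is enough free space on the recovery disk):

# compress the recovered files into one archive, then download just that file
tar -czf /root/recovery-files.tar.gz /recovery
ls -lh /root/recovery-files.tar.gz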

SQL file contents

I opened some of the recovered SQL files and it looks like they were all partial and not whole database backups. A complete MySQL dump should be over 200 lines long. Dang.
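A quick way to check whether any recovered dump is complete is to look for the trailer comment that mysqldump normally appends (a sketch; this assumes the dumps were created without --skip-comments):

# list recovered SQL files that contain the mysqldump completion marker
grep -rl --include="*.sql" "Dump completed" /recovery
# see how long each recovered SQL file is (a complete dump should be hundreds of lines)
find /recovery -name "*.sql" -exec wc -l {} + | sort -n | tail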

I downloaded all the files I could from these folders

  • /mnt/etc/nginx
  • /mnt/var/lib/mysql
  • /mnt/Scripts

Disappointed, I sat on things for a few weeks; I thought I had lost my website.

Try 2 – Reinstall a fresh db.fearby.com

After I copied files from the old db.fearby.com server to recovery.fearby.com, I reattached the storage to the old db.fearby.com system.

In vain, I tried mending the broken MySQL service on db.fearby.com. I tried uninstalling and reinstalling MySQL on db.fearby.com (each time with no luck). I was having trouble with MySQL not starting.

There were too many rabbit holes (MySQL errors) to list. I thought my database was corrupt.

I was kicking myself for letting my access to db.fearby.com lapse and for not having more backups.

Try 3 – Reinstall MySQL

I tried deploying a new server and setting it up; perhaps out of frustration I did not do it correctly. I had HTTPS issues with Cloudflare. I gave up for a few months.

Try 4 – Check for Database Corruption

I downloaded DiskInternals – MySQL Recovery and scanned my database. To my amazement, it reported no database corruption.

All tables loaded

Maybe I can recover my website?

Try 5 – Using RunCloud.io to deploy a website

After I failed to set up a server from scratch, a friend (Hi Zach) said I should try RunCloud to deploy a server.

I tried to document everything from attaching RunCloud to UpCloud and Cloudflare’s API to deploying a server.

Long story short, RunCloud is not for me. I had too many issues: IPv6, Cloudflare API integration, no SSH access to my server and no NGINX editing capabilities with RunCloud.

RunCloud errors

I deleted the RunCloud deployed server.

Try 6 – 10 minutes deploying a server by hand

I ended up deploying a server in 10 minutes manually without taking notes.

Summary

I deployed a server (this time to Sydney).

I installed the UFW firewall (configured and started it)

sudo apt-get install ufw
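The rules I use look roughly like this (a sketch; adjust the ports and substitute your own whitelisted IP):

sudo ufw default deny incoming
sudo ufw allow 80/tcp
sudo ufw allow 443/tcp
# replace 203.0.113.10 with the IP you want to whitelist for SSH
sudo ufw allow from 203.0.113.10 to any port 22 proto tcp
sudo ufw enable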

Set the date and time

sudo timedatectl set-timezone Australia/Sydney

Installed ntp time server

sudo apt-get install ntp

I installed PHP 7.4 and PHP 7.4 FPM (I cheated and Googled: https://www.cloudbooklet.com/install-php-7-4-on-debian-10/)
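The short version of that guide is adding the sury.org PHP repository and installing the packages (a sketch for a recent Debian; check the linked guide for the exact, current steps):

sudo apt -y install apt-transport-https lsb-release ca-certificates wget
sudo wget -O /etc/apt/trusted.gpg.d/php.gpg https://packages.sury.org/php/apt.gpg
echo "deb https://packages.sury.org/php/ $(lsb_release -sc) main" | sudo tee /etc/apt/sources.list.d/php.list
sudo apt update
sudo apt -y install php7.4 php7.4-fpm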

I edited PHP Config

sudo nano /etc/php/7.4/fpm/php.ini

Installed NGINX webserver 

sudo apt-get install nginx

Configured NGINX (I got a Cloudflare-to-NGINX SSL certificate; I also signed up for a Cloudflare SSL certificate)

sudo nano /etc/nginx/nginx.conf

Contents

worker_cpu_affinity auto;
worker_rlimit_nofile 100000;
user www-data;
worker_processes auto;
pid /run/nginx.pid;
include /etc/nginx/modules-enabled/*.conf;

events {
        worker_connections 768;
         multi_accept on;
}

http {
        root /www-root;
        client_max_body_size 10M;

        proxy_connect_timeout 1200s;
        proxy_send_timeout 1200s;
        proxy_read_timeout 1200s;
        fastcgi_send_timeout 1200s;
        fastcgi_read_timeout 1200s;

        ##
        # Basic Settings
        ##
        sendfile on;
        tcp_nopush on;
        tcp_nodelay on;
        keepalive_timeout 65;
        types_hash_max_size 2048;

        # server_tokens off;
        # server_names_hash_bucket_size 64;
        # server_name_in_redirect off;

        include /etc/nginx/mime.types;
        default_type application/octet-stream;

        ##
        # SSL Settings
        ##

        ssl_protocols TLSv1.2 TLSv1.3;
        ssl_prefer_server_ciphers on;

        ##
        # Logging Settings
        ##

        access_log /var/log/nginx/access.log;
        error_log /var/log/nginx/error.log;

        ##
        # Gzip Settings
        ##


        gzip on;
        gzip_disable "msie6";

        gzip_vary on;
        gzip_proxied any;
        gzip_comp_level 6;
        gzip_buffers 16 8k;
        gzip_http_version 1.1;

        gzip_types text/plain application/xml;
        gzip_min_length 256;
        gzip_proxied no-cache no-store private expired auth;


        ##
        # Virtual Host Configs
        ##

        include /etc/nginx/conf.d/*.conf;
        include /etc/nginx/sites-enabled/*;
}

I edited the default site config

sudo nano /etc/nginx/sites-available/default

Contents

server {
        listen 80 default_server;
        listen [::]:80 default_server;

        # SSL configuration
        #
        listen 443 ssl default_server;
        listen [::]:443 ssl default_server;

        ssl on;
        ssl_certificate /path/to/ssl/certs/cert.pem;
        ssl_certificate_key /path/to/ssl/private/key.pem;

        root /path-to-www;

        # Add index.php to the list if you are using PHP
        
        index index.html index.php;

        server_name fearby.com;

        #Security Headers
        add_header Strict-Transport-Security "max-age=31536000; includeSubDomains" always;
        add_header X-Content-Type-Options "nosniff" always;
        add_header Referrer-Policy "no-referrer-when-downgrade";
        add_header X-XSS-Protection "1; mode=block" always;
        add_header X-Frame-Options SAMEORIGIN always;
        add_header Permissions-Policy "accelerometer=(), camera=(), geolocation=(), gyroscope=(), magnetometer=(), microphone=(), payment=(), usb=()";

        # Force HTTPS
        if ($scheme != "https") {
                return 301 https://$host$request_uri;
        }

        # DENY RULES
        location ~ /\.ht {
                deny all;
        }
        location ~ ^/\.user\.ini {
                deny all;
        }
        location ~ (\.ini) {
                return 403;
        }

        if ($http_referer ~* "laptrinhx.com") {
                return 404;
        }

        if ($http_referer ~* "bdev.dev") {
                return 404;
        }

        if ($http_referer ~* "raoxyz.com") {
                return 404;
        }

        if ($http_referer ~* "congtyaz.com") {
                return 404;
        }


        location / {
                try_files $uri $uri/ /index.php?$args;
        }

        location ~ \.php$ {
                include snippets/fastcgi-php.conf;
                fastcgi_pass unix:/run/php/php7.4-fpm.sock;
        }

        # DNS
        resolver 1.1.1.1 1.0.0.1 valid=60s;
        resolver_timeout 1m;
}

I tested Nginx and PHP.

I installed MySQL (I Googled a guide)

I created a database, database user and assigned permissions for my blog.
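Roughly like this (a sketch with hypothetical names; substitute your own database, user and password):

# create the blog database and user, then grant permissions
sudo mysql <<'SQL'
CREATE DATABASE blogdb;
CREATE USER 'bloguser'@'localhost' IDENTIFIED BY 'use-a-strong-password';
GRANT ALL PRIVILEGES ON blogdb.* TO 'bloguser'@'localhost';
FLUSH PRIVILEGES;
SQL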

I installed the WordPress CLI tool

I installed WordPress using the wp-cli tool

wp core install --url=example.com --title=Example --admin_user=supervisor --admin_password=strongpassword --admin_email=info@example.com

Once I had a blank WordPress, I uploaded the blog folder that I had backed up before.

I ran this command to allow Nginx to read the backed up website files

sudo chown -R www-data:www-data /path-to-www

I also uploaded the backup of my MySQL database to /var/lib/mysql/oldblogdatabase

I ran this command to allow MySQL to read the backed-up database

sudo chown -R mysql:mysql /var/lib/mysql/oldblogdatabase

I also uploaded the following files to /var/lib/mysql

TIP: These files are very important to restore. You cannot just copy a database subfolder on its own.

  • ibdata1
  • ib_logfile0
  • ib_logfile1
  • ibtmp1

I ran this command to allow MySQL to read the ib* files

sudo chown -R mysql:mysql /var/lib/mysql/ib*
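After copying the data files and fixing ownership, MySQL needs a restart to pick them up (a sketch; this assumes the service is named mysql):

sudo service mysql restart
sudo service mysql status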

I was able to load my old website

Blog Up

I still have some issues to solve but it is back.

Lessons Learned

  1. Save all passwords, have backup accounts, and roll back before your working backups are gone.
  2. Set up Dev, Test and Pre-Prod environments and do not test on production servers.
  3. Do not delay disaster recovery actions.
  4. Do not rely on automation.
  5. Have more than a week's worth of backups.

Get $25 free credit on UpCloud and deploy your own server: Use this link to get $25 credit (new UpCloud users only).

 

Change Log

Version 1.2


I moved my domain to UpCloud (on the other side of the world) from Vultr (Sydney) and could not be happier with the performance.

December 22, 2020 by Simon

I moved my domain to UpCloud (on the other side of the world) from Vultr (Sydney) and could not be happier with the performance. Here is what I did to set up a complete Ubuntu 18.04 system (NGINX, PHP, MySQL, WordPress etc). This is not a paid review (just me documenting my steps over 2 days).

Background (CPanel hosts)

In 1999 I hosted my first domain (www.fearby.com) on a host in Seattle (for $10 USD a month); the host used CPanel and all was good. After a decade I was using the domain more for online development and the website was now too slow (I think I was on dial-up or ADSL 1 at the time). I moved my domain to an Australian host (for $25 a month).

After 8 years the domain host was sold and performance remained mediocre. After another year the new host was sold again and performance was terrible.

I started receiving Resource Limit Is Reached warnings (basically this was a plot by the new CPanel host to say “Pay us more and this message will go away”).

Page load times were near 30 seconds.

cpanel_usage_exceeded

The straw that broke the camel’s back was their demand of $150/year for a dodgy SSL certificate.

I needed to move to a self-managed server where I was in control.

Buying a Domain Name

Buy a domain name from Namecheap here.

Domain names for just 88 cents!

Self Managed Server

I found a good web IDE ( http://www.c9.io/ ) that allowed me to connect to a cloud VM.  C9 allowed me to open many files and terminal windows and reconnect to them later. Don’t get excited, though, as AWS has purchased C9 and it’s not the same.

C9 IDE

C9 IDE

I spun up a Digital Ocean Server at the closest data centre in Singapore. Here was my setup guide creating a Digital Ocean VM, connecting to it with C9 and configuring it. I moved my email to G Suite and moved my WordPress to Digital Ocean (other guides here and here).

I was happy since I could now send emails via CLI/code, set up free SSL certs, add second domain email to G Suite and Secure G Suite. No more usage limit errors either.

Self-managing a server requires more work but is more rewarding (flexible, faster and cheaper). Page load times were now near 20 seconds (a 10-second improvement).

Latency Issue

Over 6 months, performance on Digital Ocean (in Singapore) from Australia started to drop (mentioned here).  I tried upgrading the memory but that did not help (latency was king).

Moved the website to Australia

I moved my domain to Vultr in Australia (guide here and here). All was good for a year until traffic growth started to increase.

Blog Growth

I tried upgrading the memory on Vultr, set up PHP child workers and set up Cloudflare.

GT Metrix scores were about a “B” and Google Page Speed Scores were in the lower 40’s. Page loads were about 14 seconds (5-second improvement).

Tweaking WordPress

I set up an image compression plugin in WordPress then set up a cloud image compression and CDN Plugin from the same vendor.  Page Speed info here.

GT Metrix scores were now occasionally an “A” and Page Speed scores were in the lower 20’s. Page loads were about 3-5 seconds (10-second improvement).

A mixed bag from Vultr (more optimisation and performance improvements were needed).

This screenshot shows poor www.gtmetrix.com scores, poor Google PageSpeed scores and the upgrade from 1GB to 2GB of memory on my server.

Google Chrome Developer Console audit results on the Vultr-hosted website were not very good (I stopped checking as nothing helped).

This is a screenshot showing poor site performance (screenshot taken in Google Dev tools audit feature)

The problem was that the Vultr server (400km away in Sydney) was offline (my issue), and everything above (adding more memory, adding 2x CDNs (EWWW and Cloudflare), adding PHP child workers etc.) did not seem to help.

Enter UpCloud…

Recently, a friend sent a link to a blog article about a host called “UpCloud” who promised “Faster than SSD” performance.  This can’t be right: “Faster than SSD”? I was intrigued. I wanted to check it out as I thought nothing was faster than SSD (well, maybe RAM).

I signed up for a trial and ran a disk IO test (read the review here) and I was shocked. It’s fast. Very fast.

Summary: UpCloud was twice as fast (Disk IO and CPU) as Vultr (+ an optional $4/m firewall and $3/m for 1x backup).

This is a screenshot showing Vultr.com servers getting half the read and write disk io performance compared to upcloud.com.

FYI: Labels above are kilobytes per second. iozone loops through all file sizes from 4 KB to 16,384 KB and measures the reads per second. To be honest, the meaning of the numbers doesn't interest me; I just want to compare apples to apples.

This is an image showing the iozone results breakdown chart (kbytes per sec on the vertical axis, file size on the horizontal axis and transfer size on the third axis)

(image snip from http://www.iozone.org/ which explains the numbers)

I might have to copy my website to UpCloud and see how fast it is.

Where to Deploy and Pricing

UpCloud Pricing: https://www.upcloud.com/pricing/

UpCloud Pricing

UpCloud does not have a data centre in Australia yet so why choose UpCloud?

Most of my site’s visitors are based in the US and UpCloud has disk IO twice as fast as Vultr (win-win?). I could deploy to Chicago?

This image shows most of my visitors are in the US

My site’s traffic is growing and I need to ensure the site is fast enough in the future.

This image shows that most of my site's visitors are hitting my site on weekdays.

Creating an UpCloud VM

I used a friend’s referral code and signed up to create my first VM.

FYI: use my referral code and get $25 free credit. Signing up only takes 2 minutes.

https://www.upcloud.com/register/?promo=D84793

When you click the link above you will receive $25 to try out servers for 3 days. You can exit the trial by depositing $10 into UpCloud.

Trial Limitations

The trial mode restrictions are as follows:

* Cloud servers can only be accessed using SSH, RDP, HTTP or HTTPS protocols
* Cloud servers are not allowed to send outgoing e-mails or to create outbound SSH/RDP connections
* The internet connection is restricted to 100 Mbps (compared to 500 Mbps for non-trial accounts)
* After your 72 hours free trial, your services will be deleted unless you make a one-time deposit of $10

UpCloud Links

The UpCloud support page is located here: https://www.upcloud.com/support/

  • Quick start: Introduction to UpCloud
  • How to deploy a Cloud Server
  • Deploy a cloud server with UpCloud’s API

More UpCloud links to read:

  • Two-Factor Authentication on UpCloud
  • Floating IPs on UpCloud
  • How to manage your firewall
  • Finalizing deployment

Signing up to UpCloud

Navigate to https://upcloud.com/signup and add your username, password and email address and click signup.

New UpCloud Signup Page

Add your address and payment details and click proceed (you don’t need to pay anything; $1 may be charged and instantly refunded to verify the card).

Add address and payment details

That’s it; check your email.

Signup Done

Look for the UpCloud email and click https://my.upcloud.com/

Check Email

Now login

Login to UpCloud

Now I can see a dashboard 🙂

UpCloud Dashboard

I was happy to see 24/7 support is available.

This image shows the www.upcloud.com live chat

I opted in for the new dashboard

UpCloud new dashboard

Deploy My First UpCloud Server

This is how I deployed a server.

Note: If you are going to deploy a server consider using my referral code and get $25 credit for free.

Under the “deploy a server” widget I named the server and chose a location (I think I was supposed to use an FQDN name, e.g. “fearby.com”). The deployment worked though. I clicked continue, then more options were made available:

  1. Enter a short server description.
  2. Choose a location (Frankfurt, Helsinki, Amsterdam, Singapore, London and Chicago)
  3. Choose the number of CPUs and the amount of memory
  4. Specify disk number/names and type (MaxIOPS or HDD).
  5. Choose an Operating System
  6. Select a Timezone
  7. Define SSH Keys for access
  8. Allowed login methods
  9. Choose hardware adapter types
  10. Where to send the login password

Deploy Server

FYI: How to generate a new SSH Key (on OSX or Ubuntu)

ssh-keygen -t rsa

Output

Generating public/private rsa key pair.
Enter file in which to save the key (/root/.ssh/id_rsa): /temp/example_rsa
Enter passphrase (empty for no passphrase): *********************************
Enter same passphrase again:*********************************
Your identification has been saved in /temp/example_rsa.
Your public key has been saved in /temp/example_rsa.pub.
The key fingerprint is:
SHA256:########################### [email protected]
Outputted public and private key

Did the key export? (yes)

> /temp# ls /temp/ -al
> drwxr-xr-x 2 root root 4096 Jun 9 15:33 .
> drwxr-xr-x 27 root root 4096 Jun 8 14:25 ..
> -rw------- 1 user user 1766 Jun 9 15:33 example_rsa
> -rw-r--r-- 1 user user 396 Jun 9 15:33 example_rsa.pub

“example_rsa” is the private key and “example_rsa.pub” is the public key.

  • The public key needs to be added to the server to allow access.
  • The private key needs to be added to any local ssh program used for remote access.

Initialisation script (after deployment)

I was pleased to see an initialization script section that calls actions after the server is deployed. I configured the initialisation script to pull down a few GB of backups from my Vultr website in Sydney (files now removed).

This was my Initialisation script:

#!/bin/bash
echo "Downloading the Vultr websites backups"
mkdir /backup
cd /backup
wget -O www-mysql-backup.sql https://fearby.com/.../www-mysql-backup.sql
wget -O www-blog-backup.zip https://fearby.com/.../www-blog-backup.zip

Confirm and Deploy

I clicked “Confirm and deploy” but I got an alert that said trial mode can only deploy servers with up to 1024MB of memory.

This image shows I can’t deploy servers with 2GB in trial mode

Exiting UpCloud Trial Mode

I opened the dashboard and clicked My Account then Billing; I could see the $25 referral credit but I guess I can’t use that in trial mode.

I exited trial mode by depositing $10 (USD).

View Billing Details

Make a manual 1-time deposit of $10 to exit trial mode.

Deposit $10 to exit the trial

FYI: Server prices are listed below (or view prices here).

UpCloud Pricing

Now I can go back and deploy the server with the same settings above (1x CPU, 2GB Memory, Ubuntu 18.04, MaxIOPS Storage etc)

Deployment takes a few minutes and, depending on what you specified, a password may be emailed to you.

UpCloud Server Deployed

The server is now deployed; now I can connect to it with my SSH program (vSSH).  Simply add the server’s IP, username, password and the SSH private key (generated above) to your ssh program of choice.

fyi: The public key contents start with “ssh-rsa”.
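From a plain terminal the equivalent is a one-liner (a sketch; substitute your server's IP and the path to the private key generated above):

ssh -i /temp/example_rsa root@<your-server-ip>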

This image shows me connecting to my server via SSH

I noticed that the initialisation script downloaded my 2+GB of files already. Nice.

UpCloud Billing Breakdown

I can now see on the UpCloud billing page in my dashboard that credit is deducted daily (68c); at this rate, I have 49 days of credit left.

Billing Breakdown

I can manually deposit funds or set up automatic payments at any time 🙂

UpCloud Backup Options

You do not need to set up backups, but in case you want to roll back (if things stuff up) it is a good idea. Backups are an additional charge.

I have set up automatic daily backups with auto deletion after 2 days.

To view the backup schedule, click on your deployed server then click Backup

List of UpCloud Backups

Note: Backups are charged at $0.056 for every GB stored – so $5.60 for every 100GB per month (half that for 50GB etc)

You can take manual backups at any time (and only be charged for the hour)

UpCloud Firewall Options

I set up a firewall at UpCloud to only allow the minimum number of ports (UpCloud DNS, HTTP, HTTPS and My IP to port 22).  The firewall feature is charged at $0.0056 an hour ($4.03 a month)

I love the ability to set firewall rules on incoming, destination and outgoing ports.

To view your firewall click on your deployed server then click firewall

UpCloud firewall

Update: I modified my firewall to allow inbound ICMP (IPv4/IPv6) and UDP (IPv4/IPv6) packets.

(Note: Old firewall screenshot)

Firewall Rules Allow port 80, 443 and DNS

Because my internet provider has a dynamic IP, I set up a VPN with a static IP and whitelisted it for backdoor access.

Local Ubuntu ufw Firewall

I duplicated the rules in my local ufw (2nd level) firewall (and blocked mail)

sudo ufw status numbered
Status: active

     To                         Action      From
     --                         ------      ----
[ 1] 80                         ALLOW IN    Anywhere
[ 2] 443                        ALLOW IN    Anywhere
[ 3] 25                         DENY OUT    Anywhere                   (out)
[ 4] 53                         ALLOW IN    93.237.127.9
[ 5] 53                         ALLOW IN    93.237.40.9
[ 6] 22                         ALLOW IN    REMOVED (MY WHITELISTED IP))
[ 7] 80 (v6)                    ALLOW IN    Anywhere (v6)
[ 8] 443 (v6)                   ALLOW IN    Anywhere (v6)
[ 9] 25 (v6)                    DENY OUT    Anywhere (v6)              (out)
[10] 53                         ALLOW IN    2a04:3540:53::1
[11] 53                         ALLOW IN    2a04:3544:53::1

UpCloud Download Speeds

I pulled down a 1.8GB Ubuntu 18.04 Desktop ISO 3 times from gigenet.com and the file downloaded in 32 seconds (57MB/sec) each time. Nice.

$/temp# wget http://mirrors.gigenet.com/ubuntu/18.04/ubuntu-18.04-desktop-amd64.iso
--2018-06-08 18:02:04-- http://mirrors.gigenet.com/ubuntu/18.04/ubuntu-18.04-desktop-amd64.iso
Resolving mirrors.gigenet.com (mirrors.gigenet.com)... 69.65.15.34
Connecting to mirrors.gigenet.com (mirrors.gigenet.com)|69.65.15.34|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1921843200 (1.8G) [application/x-iso9660-image]
Saving to: 'ubuntu-18.04-desktop-amd64.iso'

ubuntu-18.04-desktop-amd64.iso 100%[==================================================================>] 1.79G 57.0MB/s in 32s

2018-06-08 18:02:37 (56.6 MB/s) - 'ubuntu-18.04-desktop-amd64.iso' saved [1921843200/1921843200]

$/temp# wget http://mirrors.gigenet.com/ubuntu/18.04/ubuntu-18.04-desktop-amd64.iso
--2018-06-08 18:02:46-- http://mirrors.gigenet.com/ubuntu/18.04/ubuntu-18.04-desktop-amd64.iso
Resolving mirrors.gigenet.com (mirrors.gigenet.com)... 69.65.15.34
Connecting to mirrors.gigenet.com (mirrors.gigenet.com)|69.65.15.34|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1921843200 (1.8G) [application/x-iso9660-image]
Saving to: 'ubuntu-18.04-desktop-amd64.iso.1'

ubuntu-18.04-desktop-amd64.iso.1 100%[==================================================================>] 1.79G 57.0MB/s in 32s

2018-06-08 18:03:19 (56.6 MB/s) - 'ubuntu-18.04-desktop-amd64.iso.1' saved [1921843200/1921843200]

$/temp# wget http://mirrors.gigenet.com/ubuntu/18.04/ubuntu-18.04-desktop-amd64.iso
--2018-06-08 18:03:23-- http://mirrors.gigenet.com/ubuntu/18.04/ubuntu-18.04-desktop-amd64.iso
Resolving mirrors.gigenet.com (mirrors.gigenet.com)... 69.65.15.34
Connecting to mirrors.gigenet.com (mirrors.gigenet.com)|69.65.15.34|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1921843200 (1.8G) [application/x-iso9660-image]
Saving to: 'ubuntu-18.04-desktop-amd64.iso.2'

ubuntu-18.04-desktop-amd64.iso.2 100%[==================================================================>] 1.79G 57.0MB/s in 32s

2018-06-08 18:03:56 (56.8 MB/s) - 'ubuntu-18.04-desktop-amd64.iso.2' saved [1921843200/1921843200]

Install Common Ubuntu Packages

I installed common Ubuntu packages.

apt-get install zip htop ifstat iftop bmon tcptrack ethstatus speedometer iozone3 bonnie++ sysbench siege tree unzip jq ncdu pydf ntp rcconf ufw iperf nmap

Timezone

I checked the server’s time (I thought this was auto-set before I deployed).

$hwclock --show
2018-06-06 23:52:53.639378+0000

I reset the time to Australia/Sydney.

dpkg-reconfigure tzdata
Current default time zone: 'Australia/Sydney'
Local time is now: Thu Jun 7 06:53:20 AEST 2018.
Universal Time is now: Wed Jun 6 20:53:20 UTC 2018.

Now the timezone is set 🙂

Shell History

I increased the shell history.

HISTSIZE=10000
HISTCONTROL=ignoredups
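To make these stick between logins, I append them to the shell profile (a sketch, assuming bash):

echo 'HISTSIZE=10000' >> ~/.bashrc
echo 'HISTCONTROL=ignoredups' >> ~/.bashrc
source ~/.bashrc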

SSH Login

I created a ~/.ssh/authorized_keys file and added my SSH public key to allow password-less logins.

mkdir ~/.ssh
sudo nano ~/.ssh/authorized_keys

I added my public SSH key, then exited the SSH session and logged back in. I can now log in without a password.
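If key logins still prompt for a password, it is usually file permissions; sshd expects them to be locked down (a sketch):

chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys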

Install NGINX

apt-get install nginx

nginx/1.14.0 is now installed.

A quick GT Metrix test.

This image shows awesome static NGINX performance ratings of 99%

Install MySQL

Run these commands to install and secure MySQL.

apt install mysql-server
mysql_secure_installation

Securing the MySQL server deployment.
> Would you like to setup VALIDATE PASSWORD plugin?: n
> New password: **********************************************
> Re-enter new password: **********************************************
> Remove anonymous users? (Press y|Y for Yes, any other key for No) : y
> Disallow root login remotely? (Press y|Y for Yes, any other key for No) : y
> Remove test database and access to it? (Press y|Y for Yes, any other key for No) : y
> Reload privilege tables now? (Press y|Y for Yes, any other key for No) : y
> Success.

I disabled the validate password plugin because I hate it.

MySQL Ver 14.14 Distrib 5.7.22 is now installed.

Set MySQL root login password type

Set MySQL root user to authenticate via “mysql_native_password”. Run the “mysql” command.

mysql
SELECT user,authentication_string,plugin,host FROM mysql.user;
+------------------+-------------------------------------------+-----------------------+-----------+
| user | authentication_string | plugin | host |
+------------------+-------------------------------------------+-----------------------+-----------+
| root | | auth_socket | localhost |
| mysql.session | hiddden | mysql_native_password | localhost |
| mysql.sys | hiddden | mysql_native_password | localhost |
| debian-sys-maint | hiddden | mysql_native_password | localhost |
+------------------+-------------------------------------------+-----------------------+----------

Now let’s set the root password authentication method to “mysql_native_password”

ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password BY '*****************************************';
Query OK, 0 rows affected (0.00 sec)

Check authentication method.

mysql> SELECT user,authentication_string,plugin,host FROM mysql.user;
+------------------+-------------------------------------------+-----------------------+-----------+
| user | authentication_string | plugin | host |
+------------------+-------------------------------------------+-----------------------+-----------+
| root | ######################################### | mysql_native_password | localhost |
| mysql.session | hiddden | mysql_native_password | localhost |
| mysql.sys | hiddden | mysql_native_password | localhost |
| debian-sys-maint | hiddden | mysql_native_password | localhost |
+------------------+-------------------------------------------+-----------------------+-----------+

Now we need to flush permissions.

mysql> FLUSH PRIVILEGES;
Query OK, 0 rows affected (0.00 sec)

Done.

Install PHP

Install PHP 7.2

apt-get install software-properties-common
add-apt-repository ppa:ondrej/php
apt-get update
apt-get install -y php7.2
php -v

PHP 7.2.5, Zend Engine v3.2.0 with Zend OPcache v7.2.5-1 is now installed. Do update PHP frequently.

I made the following changes in /etc/php/7.2/fpm/php.ini

> cgi.fix_pathinfo=0
> max_input_vars = 1000
> memory_limit = 1024M
> max_file_uploads = 20M
> post_max_size = 20M

Install PHP Modules

sudo apt-get install php-pear php7.2-curl php7.2-dev php7.2-mbstring php7.2-zip php7.2-mysql php7.2-xml

Install PHP FPM

apt-get install php7.2-fpm

Configure PHP FPM config.

Edit /etc/php/7.2/fpm/php.ini

> cgi.fix_pathinfo=0
> max_input_vars = 1000
> memory_limit = 1024M
> max_file_uploads = 20M
> post_max_size = 20M

Reload the PHP service and check its status.

sudo service php7.2-fpm restart
sudo service php7.2-fpm status


Configuring NGINX

If you are not comfortable editing NGINX config files read here, here and here.

I made a new “www root” folder, set permissions and created a default html file.

mkdir /www-root
chown -R www-data:www-data /www-root
echo "Hello World" >> /www-root/index.html

I edited the “root” key in the “/etc/nginx/sites-enabled/default” file and set the root to a new location (e.g., “/www-root”)

I added these performance tweaks to /etc/nginx/nginx.conf

> worker_cpu_affinity auto;
> worker_rlimit_nofile 100000;

I added the following lines to the “http {” section in /etc/nginx/nginx.conf

client_max_body_size 10M;

gzip on;
gzip_disable "msie6";
gzip_comp_level 5;
gzip_min_length 256;
gzip_vary on;
gzip_types
application/atom+xml
application/ld+json
application/manifest+json
application/rss+xml
application/vnd.geo+json
application/vnd.ms-fontobject
application/x-font-ttf
application/x-web-app-manifest+json
application/xhtml+xml
font/opentype
image/bmp
image/x-icon
text/cache-manifest
text/vcard
text/vnd.rim.location.xloc
text/vtt
text/x-component
text/x-cross-domain-policy;
#text/html is always compressed by gzip module

gzip_proxied any;
gzip_buffers 16 8k;
gzip_http_version 1.1;
gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;

Check NGINX Status

service nginx status
* nginx.service - A high performance web server and a reverse proxy server
Loaded: loaded (/lib/systemd/system/nginx.service; enabled; vendor preset: enabled)
Active: active (running) since Thu 2018-06-07 21:16:28 AEST; 30min ago
Docs: man:nginx(8)
Main PID: # (nginx)
Tasks: 2 (limit: 2322)
CGroup: /system.slice/nginx.service
|- # nginx: master process /usr/sbin/nginx -g daemon on; master_process on;
`- # nginx: worker process

Install Open SSL that supports TLS 1.3

This is a work in progress. The steps work just fine for me on Ubuntu 16.04 but not on Ubuntu 18.04.

Installing Adminer MySQL GUI

I will use the PHP-based Adminer MySQL GUI to export and import my blog from one server to another. All I needed to do was install it on both servers (a simple one-file download).

cd /utils
wget -O adminer.php https://github.com/vrana/adminer/releases/download/v4.6.2/adminer-4.6.2-mysql-en.php

Use Adminer to Export My Blog (on Vultr)

On the original server open Adminer (http) and..

  1. Login with the MySQL root account
  2. Open your database
  3. Choose “Save” as the output
  4. Click on Export

This image shows the export of the wordpress adminer page

Save the “.sql” file.

I used Adminer on the UpCloud server to Import My Blog

FYI: Depending on the size of your database backup you may need to temporarily increase your upload and post size limits in PHP and NGINX before you can import your database.

Edit /etc/php/7.2/fpm/php.ini
> upload_max_filesize = 100M
> post_max_size = 100M

And Edit: /etc/nginx/nginx.conf
> client_max_body_size 100M;

Don’t forget to reload NGINX config and restart NGINX and PHP. Take note of the maximum allowed file size in the screenshot below. I temporarily increased my upload limits to 100MB in order to restore my 87MB blog.
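Something like this (a sketch):

sudo nginx -t && sudo nginx -s reload
sudo service php7.2-fpm restart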

Now I could open Adminer on my UpCloud server.

  1. Create a new database
  2. Click on the database and click Import
  3. Choose the SQL file
  4. Click Execute to import it

Import MySQL backup with Adminer

Don’t forget to create a user and assign permissions (as required – check your wp-config.php file).

Import MySQL Database

Tip: Don’t forget to lower the maximum upload file size and max post size after you import your database.

Cloudflare DNS

I use Cloudflare to manage DNS, so I need to tell it about my new server.

You can get your server’s IP details from the UpCloud dashboard.

Find IP

At Cloudflare update your DNS details to point to the server’s new IPv4 (“A Record”) and IPv6 (“AAAA Record”).

Cloudflare DNS

Domain Error

I waited an hour and my website was suddenly unavailable. At first, I thought this was Cloudflare forcing the redirection of my domain to HTTPS (which was not yet set up).

DNS Not Replicated Yet

I chatted with UpCloud support on their webpage and they kindly assisted me to diagnose all the common issues like DNS values, DNS replication and Cloudflare settings, and the error was pinpointed to my NGINX installation. All NGINX config settings were OK from what we could see. I uninstalled NGINX and reinstalled it (and that fixed it). Thanks, UpCloud Support 🙂

Reinstalled NGINX

sudo apt-get purge nginx nginx-common

I reinstalled NGINX and reconfigured /etc/nginx/nginx.conf (I downloaded my SSL cert from my old server just in case).

Here is my /etc/nginx/nginx.conf file.

user www-data;
worker_processes auto;
worker_cpu_affinity auto;
pid /run/nginx.pid;
include /etc/nginx/modules-enabled/*.conf;
error_log /var/log/nginx/www-nginxcriterror.log crit;

events {
        worker_connections 768;
        multi_accept on;
}

http {

        client_max_body_size 10M;
        sendfile on;
        tcp_nopush on;
        tcp_nodelay on;
        keepalive_timeout 65;
        types_hash_max_size 2048;
        server_tokens off;

        server_names_hash_bucket_size 64;
        server_name_in_redirect off;

        include /etc/nginx/mime.types;
        default_type application/octet-stream;

        ssl_protocols TLSv1.1 TLSv1.2;
        ssl_prefer_server_ciphers on;

        access_log /var/log/nginx/www-access.log;
        error_log /var/log/nginx/www-error.log;

        gzip on;

        gzip_vary on;
        gzip_disable "msie6";
        gzip_min_length 256;
        gzip_proxied any;
        gzip_comp_level 6;
        gzip_buffers 16 8k;
        gzip_http_version 1.1;
        gzip_types text/plain text/css application/json application/javascript text/xml application/xml application/xml+rss text/javascript;

        include /etc/nginx/conf.d/*.conf;
        include /etc/nginx/sites-enabled/*;
}

Here is my /etc/nginx/sites-available/default file (FYI, I have not fully re-set up TLS 1.3 yet, so I commented out those settings)

proxy_cache_path /tmp/nginx-cache keys_zone=one:10m;#
server {
        root /www-root;

        # Listen Ports
        listen 80 default_server http2;
        listen [::]:80 default_server http2;
        listen 443 ssl default_server http2;
        listen [::]:443 ssl default_server http2;

        # Default File
        index index.html index.php index.htm;

        # Server Name
        server_name www.fearby.com fearby.com localhost;

        # HTTPS Cert
        ssl_certificate /etc/nginx/ssl-cert-path/fearby.crt;
        ssl_certificate_key /etc/nginx/ssl-cert-path/fearby.key;
        ssl_dhparam /etc/nginx/ssl-cert-path/dhparams4096.pem;

        # HTTPS Ciphers
        
        # TLS 1.2
        ssl_protocols TLSv1.2;
        ssl_prefer_server_ciphers on;
        ssl_ciphers "EECDH+AESGCM:EDH+AESGCM:AES256+EECDH:AES256+EDH";

        # TLS 1.3			#todo
        # ssl_ciphers 
        # ECDHE-RSA-AES256-GCM-SHA512:DHE-RSA-AES256-GCM-SHA512:ECDHE-RSA-AES256-GCM-SHA384:DHE-RSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-SHA384:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES128-GCM-SHA256:DES-CBC3-SHA;
        # ssl_ecdh_curve secp384r1;

        # Force HTTPS
        if ($scheme != "https") {
                return 301 https://$host$request_uri;
        }

        # HTTPS Settings
        server_tokens off;
        ssl_session_cache shared:SSL:10m;
        ssl_session_timeout 30m;
        ssl_session_tickets off;
        add_header Strict-Transport-Security "max-age=63072000; includeSubdomains; preload";
        add_header X-Frame-Options DENY;
        add_header X-Content-Type-Options nosniff;
        add_header X-XSS-Protection "1; mode=block";
	#ssl_stapling on; 						# Requires nginx >= 1.3.7

        # Cloudflare DNS
        resolver 1.1.1.1 1.0.0.1 valid=60s;
        resolver_timeout 1m;

        # PHP Memory 
        fastcgi_param PHP_VALUE "memory_limit = 1024M";

	# pass the PHP scripts to FastCGI server listening on 127.0.0.1:9000
        location ~ .php$ {
            try_files $uri =404;
            # include snippets/fastcgi-php.conf;

            fastcgi_split_path_info ^(.+.php)(/.+)$;
            fastcgi_index index.php;
            fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
            include fastcgi_params;
            fastcgi_pass unix:/run/php/php7.2-fpm.sock;

            # NOTE: You should have "cgi.fix_pathinfo = 0;" in php.ini
            # fastcgi_pass 127.0.0.1:9000;
	    }

        location / {
            # try_files $uri $uri/ =404;
            try_files $uri $uri/ /index.php?q=$uri&$args;
            index index.php index.html index.htm;
            proxy_set_header Proxy "";
        }

        # Deny Rules
        location ~ /.ht {
                deny all;
        }
        location ~ ^/.user.ini {
            deny all;
        }
        location ~ (.ini) {
            return 403;
        }

        # Headers
        location ~* .(?:ico|css|js|gif|jpe?g|png|js)$ {
            expires 30d;
            add_header Pragma public;
            add_header Cache-Control "public";
        }

}

SSL Labs SSL Certificate Check

All good thanks to the config above.

SSL Labs

Install WP-CLI

I don’t like setting up FTP to auto-update WordPress plugins. I use the WP-CLI tool to manage WordPress installations by the command line. Read my blog here on using WP-CLI.

Download WP-CLI

mkdir /utils
cd /utils
curl -O https://raw.githubusercontent.com/wp-cli/builds/gh-pages/phar/wp-cli.phar

Move WP-CLI to the bin folder as “wp”

chmod +x wp-cli.phar
sudo mv wp-cli.phar /usr/local/bin/wp

Test wp

wp --info
OS: Linux 4.15.0-22-generic #24-Ubuntu SMP Wed May 16 12:15:17 UTC 2018 x86_64
Shell: /bin/bash
PHP binary: /usr/bin/php7.2
PHP version: 7.2.5-1+ubuntu18.04.1+deb.sury.org+1
php.ini used: /etc/php/7.2/cli/php.ini
WP-CLI root dir: phar://wp-cli.phar
WP-CLI vendor dir: phar://wp-cli.phar/vendor
WP_CLI phar path: /www-root
WP-CLI packages dir:
WP-CLI global config:
WP-CLI project config:
WP-CLI version: 1.5.1

Update WordPress Plugins

Now I can run “wp plugin update” to update all WordPress plugins

wp plugin update
Enabling Maintenance mode...
Downloading update from https://downloads.wordpress.org/plugin/wordfence.7.1.7.zip...
Unpacking the update...
Installing the latest version...
Removing the old version of the plugin...
Plugin updated successfully.
Downloading update from https://downloads.wordpress.org/plugin/wp-meta-seo.3.7.1.zip...
Unpacking the update...
Installing the latest version...
Removing the old version of the plugin...
Plugin updated successfully.
Downloading update from https://downloads.wordpress.org/plugin/wordpress-seo.7.6.1.zip...
Unpacking the update...
Installing the latest version...
Removing the old version of the plugin...
Plugin updated successfully.
Disabling Maintenance mode...
Success: Updated 3 of 3 plugins.
+---------------+-------------+-------------+---------+
| name | old_version | new_version | status |
+---------------+-------------+-------------+---------+
| wordfence | 7.1.6 | 7.1.7 | Updated |
| wp-meta-seo | 3.7.0 | 3.7.1 | Updated |
| wordpress-seo | 7.5.3 | 7.6.1 | Updated |
+---------------+-------------+-------------+---------+

Update WordPress Core

WordPress core files can be updated with “wp core update“

wp core update
Success: WordPress is up to date.

Troubleshooting: use the flag “--allow-root” if wp needs higher access (though this is an unsafe option).
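A few other WP-CLI commands I find handy for routine maintenance (a hedged sketch; run them from the WordPress root, or add the global --path=/your/www-root flag):

# List plugins that have an update available
wp plugin list --update=available

# Verify WordPress core files against the official checksums
wp core verify-checksums

# Update all themes
wp theme update --all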

Install PHP Child Workers

I edited the following file to set up PHP child workers: /etc/php/7.2/fpm/pool.d/www.conf

Changes

> pm = dynamic
> pm.max_children = 40
> pm.start_servers = 15
> pm.min_spare_servers = 5
> pm.max_spare_servers = 15
> pm.process_idle_timeout = 30s;
> pm.max_requests = 500;
> php_admin_value[error_log] = /var/log/www-fpm-php.www.log
> php_admin_value[memory_limit] = 512M
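One hedged sanity check before trusting pm.max_children = 40: make sure 40 workers actually fit in RAM. A rough way to size it is to measure the average resident memory of the running php-fpm workers and divide that into the memory you are happy to give PHP (the php-fpm7.2 process name is assumed below):

# Average resident memory (MB) per php-fpm worker, and the worker count
ps --no-headers -o rss -C php-fpm7.2 | awk '{sum+=$1; n++} END { if (n) printf "workers: %d, avg: %.1f MB\n", n, sum/n/1024 }'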

Restart PHP

sudo service php7.2-fpm restart

Test NGINX config, reload NGINX config and restart NGINX

nginx -t
nginx -s reload
/etc/init.d/nginx restart

Output (14 workers are ready)

Check PHP Child Worker Status

sudo service php7.2-fpm status
* php7.2-fpm.service - The PHP 7.2 FastCGI Process Manager
Loaded: loaded (/lib/systemd/system/php7.2-fpm.service; enabled; vendor preset: enabled)
Active: active (running) since Thu 2018-06-07 19:32:47 AEST; 20s ago
Docs: man:php-fpm7.2(8)
Main PID: # (php-fpm7.2)
Status: "Processes active: 0, idle: 15, Requests: 2, slow: 0, Traffic: 0.1req/sec"
Tasks: 16 (limit: 2322)
CGroup: /system.slice/php7.2-fpm.service
|- # php-fpm: master process (/etc/php/7.2/fpm/php-fpm.conf)
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
|- # php-fpm: pool www
- # php-fpm: pool www

Memory Tweak (set at your own risk)

sudo nano /etc/sysctl.conf

vm.swappiness = 1

Setting swappiness to 1 all but disables the swap file and tells the operating system to use RAM aggressively; a value of 10 is safer. Only set this if you have enough memory available (and free).

Possible swappiness settings:

> vm.swappiness = 0 Swap is disabled. In earlier versions, this meant that the kernel would swap only to avoid an out of memory condition when free memory will be below vm.min_free_kbytes limit, but in later versions, this is achieved by setting to 1.[2]
> vm.swappiness = 1 Kernel version 3.5 and over, as well as Red Hat kernel version 2.6.32-303 and over: Minimum amount of swapping without disabling it entirely.
> vm.swappiness = 10 This value is sometimes recommended to improve performance when sufficient memory exists in a system.[3]
> vm.swappiness = 60 The default value.
> vm.swappiness = 100 The kernel will swap aggressively.
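To apply the new value immediately (without a reboot) and confirm it took, something like this should do it:

# Reload values from /etc/sysctl.conf (or set the single value directly)
sudo sysctl -p
sudo sysctl -w vm.swappiness=1

# Confirm the running value
cat /proc/sys/vm/swappiness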

The “htop” tool is a handy memory monitoring alternative to “top”.

Also, you can use good old “watch” command to show near-live memory usage (auto-refreshes every 2 seconds)

watch -n 2 free -m

Script to auto-clear the memory/cache

As a precaution, I set up a cron job that checks free memory; when it falls below 100MB the cache is automatically cleared (freeing memory).

Script Contents: clearcache.sh

#!/bin/bash

# Script help inspired by https://unix.stackexchange.com/questions/119126/command-to-display-memory-usage-disk-usage-and-cpu-load
ram_use=$(free -m)
IFS=$'\n' read -rd '' -a ram_use_arr <<< "$ram_use"
ram_use="${ram_use_arr[1]}"
ram_use=$(echo "$ram_use" | tr -s " ")
IFS=' ' read -ra ram_use_arr <<< "$ram_use"
ram_total="${ram_use_arr[1]}"
ram_used="${ram_use_arr[2]}"
ram_free="${ram_use_arr[3]}"
d=`date '+%Y-%m-%d %H:%M:%S'`
if ! [[ "$ram_free" =~ ^[0-9]+$ ]]; then
    echo "Sorry ram_free is not an integer"
else
    if [ "$ram_free" -lt "100" ]; then
        echo "$d RAM LOW (Total: $ram_total MB, Used: $ram_used MB, Free: $ram_free MB) - Clearing Cache..."
        sync; echo 1 > /proc/sys/vm/drop_caches
        sync; echo 2 > /proc/sys/vm/drop_caches
        #sync; echo 3 > /proc/sys/vm/drop_caches    #Not advised in production
        # Read for more info https://www.tecmint.com/clear-ram-memory-cache-buffer-and-swap-space-on-linux/
        exit 1
    else
        if [ "$ram_free" -lt "256" ]; then
            echo "$d RAM ALMOST LOW (Total: $ram_total MB, Used: $ram_used MB, Free: $ram_free MB)"
            exit 1
        else
            if [ "$ram_free" -lt "512" ]; then
                echo "$d RAM OK (Total: $ram_total MB, Used: $ram_used MB, Free: $ram_free MB)"
                exit 1
            else
                # Plenty of free RAM
                echo "$d RAM GOOD (Total: $ram_total MB, Used: $ram_used MB, Free: $ram_free MB)"
                exit 1
            fi
        fi
    fi
fi

I set the cron job to run every 15 minutes by adding this entry to my crontab.

SHELL=/bin/bash
*/15  *  *  *  *  root /bin/bash /scripts/clearcache.sh >> /scripts/clearcache.log
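Because that schedule line includes a user field (root), it belongs in /etc/crontab or a drop-in file under /etc/cron.d rather than a per-user crontab. A hedged sketch of wiring it up (paths assumed from above):

# Make the script executable
sudo chmod +x /scripts/clearcache.sh

# Install the schedule as a cron drop-in file
echo 'SHELL=/bin/bash
*/15 * * * * root /bin/bash /scripts/clearcache.sh >> /scripts/clearcache.log' | sudo tee /etc/cron.d/clearcache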

Sample log output

2018-06-10 01:13:22 RAM OK (Total: 1993 MB, Used: 981 MB, Free: 387 MB)
2018-06-10 01:15:01 RAM OK (Total: 1993 MB, Used: 974 MB, Free: 394 MB)
2018-06-10 01:20:01 RAM OK (Total: 1993 MB, Used: 955 MB, Free: 412 MB)
2018-06-10 01:25:01 RAM OK (Total: 1993 MB, Used: 1002 MB, Free: 363 MB)
2018-06-10 01:30:01 RAM OK (Total: 1993 MB, Used: 970 MB, Free: 394 MB)
2018-06-10 01:35:01 RAM OK (Total: 1993 MB, Used: 963 MB, Free: 400 MB)
2018-06-10 01:40:01 RAM OK (Total: 1993 MB, Used: 976 MB, Free: 387 MB)
2018-06-10 01:45:01 RAM OK (Total: 1993 MB, Used: 985 MB, Free: 377 MB)
2018-06-10 01:50:01 RAM OK (Total: 1993 MB, Used: 983 MB, Free: 379 MB)
2018-06-10 01:55:01 RAM OK (Total: 1993 MB, Used: 979 MB, Free: 382 MB)
2018-06-10 02:00:01 RAM OK (Total: 1993 MB, Used: 980 MB, Free: 380 MB)
2018-06-10 02:05:01 RAM OK (Total: 1993 MB, Used: 971 MB, Free: 389 MB)
2018-06-10 02:10:01 RAM OK (Total: 1993 MB, Used: 983 MB, Free: 376 MB)
2018-06-10 02:15:01 RAM OK (Total: 1993 MB, Used: 967 MB, Free: 392 MB)

I will check the log (/scripts/clearcache.log) in a few days and view the memory trends.

After half a day, Ubuntu 18.04 is handling memory just fine; the script has not had to trigger a cache clear 🙂

Free memory over time

I used https://crontab.guru/every-hour to set the right schedule in crontab.

I rebooted the VM.

Update: I now use Nixstats monitoring

Swap File

FYI: there is a handy guide on viewing swap file usage here. I’m not using swap files, so this is only an aside.

After the system rebooted I checked if the swappiness setting was active.

sudo cat /proc/sys/vm/swappiness
1

Yes, swappiness is set.

File System Tweaks – Write Back Cache (set at your own risk)

First, check your disk name and file system

sudo lsblk -o NAME,FSTYPE,SIZE,MOUNTPOINT,LABEL

Take note of your disk name (e.g. vda1)

I used tune2fs to enable writing data to the disk before it is written to the journal. tune2fs is a great tool for setting ext file system parameters.

Warning (snip from here): “I set the mode to journal_data_writeback. This basically means that data may be written to the disk before the journal. The data consistency guarantees are the same as the ext3 file system. The downside is that if your system crashes before the journal gets written then you may lose new data — the old data may magically reappear.“

Warning this can corrupt your data. More information here.

I ran this command.

tune2fs -o journal_data_writeback /dev/vda1

I edited my fstab to append the “data=writeback,noatime,nodiratime” options to my volume so they persist across reboots.

Edit FS Tab:

sudo nano /etc/fstab

I added “writeback,noatime,nodiratime” flags to my disk options.

# /etc/fstab: static file system information.
#
# Use 'blkid' to print the universally unique identifier for a
# device; this may be used with UUID= as a more robust way to name devices
# that works even if disks are added and removed. See fstab(5).
#
# <file system> <mount point>   <type>  <options> <dump>  <pass>
# / was on /dev/vda1 during installation
#                <device>                 <dir>           <fs>    <options>                                             <dump>  <fsck>
UUID=xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx /               ext4    errors=remount-ro,data=writeback,noatime,nodiratime   0       1
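After rebooting, a hedged way to confirm that both the journal mode and the mount options actually applied (assuming the disk is /dev/vda1 mounted at /):

# The ext4 default mount options recorded by tune2fs should include journal_data_writeback
sudo tune2fs -l /dev/vda1 | grep -i 'default mount options'

# The live mount options for / should include data=writeback,noatime,nodiratime
findmnt -no OPTIONS /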

Updating Ubuntu Packages

Show updatable packages.

apt-get -s dist-upgrade | grep "^Inst"

Update Packages.

sudo apt-get update && sudo apt-get upgrade

Unattended Security Updates

Read more on Ubuntu 18.04 Unattended upgrades here, here and here.

Install Unattended Upgrades

sudo apt-get install unattended-upgrades

Enable Unattended Upgrades.

sudo dpkg-reconfigure --priority=low unattended-upgrades

Now I configure what packages not to auto-update.

Edit /etc/apt/apt.conf.d/50unattended-upgrades

Find “Unattended-Upgrade::Package-Blacklist” and add packages that you don’t want automatically updated, you may want to manually update these (and monitor updates).

I prefer not to auto-update critical system apps (I will do this myself).

Unattended-Upgrade::Package-Blacklist {
"nginx";
"nginx-common";
"nginx-core";
"php7.2";
"php7.2-fpm";
"mysql-server";
"mysql-server-5.7";
"mysql-server-core-5.7";
"libssl1.0.0";
"libssl1.1";
};

FYI: You can find installed packages by running this command:

apt list --installed

Enable automatic updates by editing /etc/apt/apt.conf.d/20auto-upgrades

Edit the number at the end of each line (the number is how many days between runs).

> APT::Periodic::Update-Package-Lists “1”;
> APT::Periodic::Download-Upgradeable-Packages “1”;
> APT::Periodic::AutocleanInterval “7”;
> APT::Periodic::Unattended-Upgrade “1”;

Set to “0” to disable automatic updates.

The results of unattended-upgrades will be logged to /var/log/unattended-upgrades

Update packages now.

unattended-upgrade -d
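A hedged way to confirm the unattended-upgrade timers are active and to see what the last run actually did:

# Show the apt-daily timers that drive unattended-upgrades
systemctl list-timers 'apt-daily*' --all

# Review the most recent unattended-upgrades activity
sudo tail -n 20 /var/log/unattended-upgrades/unattended-upgrades.log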

Almost done.

I Rebooted

GT Metrix Score

I almost fell off my chair. It’s an amazing feeling hitting refresh in GT Metrix and getting sub-2-second scores consistently (and that is with 17 assets loading and 361KB of HTML content).

0.9sec load times

WebPageTest.org Test Score

Nice. I am not sure why the effective use of CDN has an X rating as I have the EWWW CDN and Cloudflare. First Byte time is now a respectable “B”; this was always bad before.

Update: I found out the longer you set cache delays in Cloudflare the higher the score.

Web Page Test

GT Metrix has a nice historical breakdown of load times (night and day).

Upcloud Site Speed in GTMetrix

Google Page Speed Insight Desktop Score

I benchmarked with https://developers.google.com/speed/pagespeed/insights/

This will help with future SEO rankings. It is well known that Google is pushing fast servers.

100% Desktop page speed score

Google Chrome 70 Dev Console Audit (Desktop)

100% Chrome Audit Score

This is amazing; I never expected to get such a high score. I know Google likes (and is pushing) sub-1-second load times.

My site is loading so well that it is time I restored some old features that were too slow on other servers:

  • I disabled Lazy loading of images (this was not working on some Android devices)
  • I re-added the News Widget and news images.

GTMetrix and WebpageTest scores are still good (even after adding bloat)

Benchmarks are still good

My WordPress site is not really that small either

Large website

FYI: WordPress Plugins I use.

These are the plugins I use.

  • Autoptimize – Optimises your website, concatenating the CSS and JavaScript code, and compressing it.
  • BJ Lazy Load (Now Disabled) – Lazy image loading makes your site load faster and saves bandwidth.
  • Cloudflare – Cloudflare speeds up and protects your WordPress site.
  • Contact Form 7 – Just another contact form plugin. Simple but flexible.
  • Contact Form 7 Honeypot – Add honeypot anti-spam functionality to the popular Contact Form 7 plugin.
  • Crayon Syntax Highlighter – Supports multiple languages, themes, highlighting from a URL, local file or post text.
  • Democracy Poll – Allows creating democratic polls. Visitors can vote for more than one answer & add their own answers.
  • Display Posts Shortcode – Display a listing of posts using a shortcode.
  • EWWW Image Optimizer – Reduce file sizes for images within WordPress including NextGEN Gallery and GRAND FlAGallery. Uses jpegtran, optipng/pngout, and gifsicle.
  • GDPR Cookie Consent – A simple way to show that your website complies with the EU Cookie Law / GDPR.
  • GTmetrix for WordPress – GTmetrix can help you develop a faster, more efficient, and all-around improved website experience for your users. Your users will love you for it.
  • TinyMCE Advanced – Enables advanced features and plugins in TinyMCE, the visual editor in WordPress.
  • Wordfence Security – Anti-virus, Firewall and Malware Scan
  • WP Meta SEO – WP Meta SEO is a plugin for WordPress to fill meta for content, images and main SEO info in a single view.
  • WP Performance Score Booster – Speed-up page load times and improve website scores in services like PageSpeed, YSlow, Pingdom and GTmetrix.
  • WP SEO HTML Sitemap – A responsive HTML sitemap that uses all of the settings for your XML sitemap in the WordPress SEO by Yoast Plugin.
  • WP-Optimize – WP-Optimize is WordPress’s #1 most installed optimisation plugin. With it, you can clean up your database easily and safely, without manual queries.
  • WP News and Scrolling Widgets Pro – WP News Pro plugin with six different types of shortcode and seven different types of widgets. Display News posts with various designs.
  • Yoast SEO – The first true all-in-one SEO solution for WordPress, including on-page content analysis, XML sitemaps and much more.
  • YouTube – YouTube Embed and YouTube Gallery WordPress Plugin. Embed a responsive video, YouTube channel, playlist gallery, or live stream

How I use these plugins to speed up my site.

  • I use the EWWW Image Optimizer plugin to auto-compress my images and to provide a CDN for media asset delivery (pre-Cloudflare). Learn more about ExactDN and EWWW.io here.
  • I use the Autoptimize plugin to optimise HTML/CSS/JS and ensure select assets are on my EWWW CDN. This plugin also removes WordPress emojis, removes the use of Google Fonts, allows you to define pre-configured domains, async JavaScript files etc.
  • I use BJ Lazy Load to prevent all images in a post from loading on page load (they load only as the user scrolls down the page).
  • The GTmetrix for WordPress and Cloudflare plugins are for information only.
  • I use WP-Optimize to ensure my database is healthy and to disable comments/trackbacks and pingbacks.

Let’s Test UpCloud’s Disk IO in Chicago

Looks good to me; read IO is a little lower than UpCloud’s Singapore data centre, but it’s still faster than Vultr. I can’t wait for more data centres to become available around the world.

Why is UpCloud Disk IO so good?

I asked UpCloud on Twitter why the Disk IO was so good.

  • “MaxIOPS is UpCloud’s proprietary block-storage technology. MaxIOPS is physically redundant storage technology where all customer’s data is located in two separate physical devices at all times. UpCloud uses InfiniBand (!) network to connect storage backends to compute nodes, where customers’ cloud servers are running. All disks are enterprise-grade SSD’s. And using separate storage backends, it allows us to live migrate our customers’ cloud servers freely inside our infrastructure between compute nodes – whether it be due to hardware malfunction (compute node) or backend software updates (example CPU vulnerability and immediate patching).“

My Questions to Support (and their Answers)

Q1) What’s the difference between backups and snapshots (a Twitter user said Snapshots were a thing)

A1) Backups and snapshots are the same things with our infrastructure.

Q2) What are charges for backup of a 50GB drive?

A2) We charge $0.06 / GB of the disk being captured. But capture the whole disk, not just what was used. So for a 50GB drive, we charge $0.06 * 50 = $3/month. Even if 1GB were only used.

  • Support confirmed that each backup is charged (so 5 times manual backups are charged 5 times). Setting up a daily auto backup schedule for 2 weeks would create 14 billable backup charges.
  • I guess a 25GB server will be $1.50 a month

Q3) What are data charges if I go over my 2TB quota?

A3) Outgoing data charges are $0.056/GB after the pre-configured allowance.

Q4) What happens if my balance hits $0?

A4) You will get notification of low account balance 2 weeks in advance based on your current daily spend. When your balance reaches zero, your servers will be shut down. But they will still be charged for. You can automatically top-up if you want to assign a payment type from your Control Panel. You deposit into your balance when you want. We use a prepaid model of payment, so you need to top up before using, not billing you after usage. We give you lots of chances to top-up.

Support Tips

  • One thing to note, when deleting servers (CPU, RAM) instances, you get the option to delete the storages separately via a pop-up window. Choose to delete permanently to delete the disk, to save credit. Any disk storage lying around even unattached to servers will be billed.
  • Charges are in USD.

I think it’s time to delete my domain from Vultr in Sydney.

Deleted my Vultr domain

I deleted my Vultr domain.

Delete Vultr Server

Done.

More Reading on UpCloud

https://www.upcloud.com/documentation/faq/

UpCloud Server Status

http://status.upcloud.com

Check out my new guide on Nixstats for awesome monitoring

What I would like

  1. Ability to name individual manual backups (tag with why I backed up).
  2. Ability to push user-defined data from my VM to the dashboard
  3. Cheaper scheduled backups
  4. Sydney data centres (one day)

Update: Post UpCloud Launch Tweaks (Awesome)

I had a look at https://www.webpagetest.org/ results to see where else I can optimise webpage delivery.

Optimisation Options

Disable dashicons.min.css (for unauthenticated WordPress users).

Find functions.php in the www root

sudo find . -print |grep  functions.php

Edit functions.php

sudo nano ./wp-includes/functions.php

Add the following

// Remove dashicons in frontend for unauthenticated users
add_action( 'wp_enqueue_scripts', 'bs_dequeue_dashicons' );
function bs_dequeue_dashicons() {
    if ( ! is_user_logged_in() ) {
        wp_deregister_style( 'dashicons' );
    }
}
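One caveat: wp-includes/functions.php is a WordPress core file, so the next core update will overwrite this change. A hedged alternative is to drop the same snippet into a small must-use plugin instead (the wp-content path below is assumed; adjust it to your web root):

# Create a must-use plugin that survives WordPress core updates
sudo mkdir -p /www/wp-content/mu-plugins
sudo nano /www/wp-content/mu-plugins/dequeue-dashicons.php
# Paste the bs_dequeue_dashicons snippet above into this file (with an opening <?php tag)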

HTTP2 Push

  • Introducing HTTP/2 Server Push with NGINX 1.13.9 | NGINX
  • How To Set Up Nginx with HTTP/2 Support on Ubuntu 16.04 | DigitalOcean

I added http2 to my listening servers

server {
        root /www;

        ...
        listen 80 default_server http2;
        listen [::]:80 default_server http2;
        listen 443 ssl default_server http2;
        listen [::]:443 ssl default_server http2;
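        # Caution: browsers only negotiate HTTP/2 over TLS, so adding "http2" to the
        # plain port 80 listeners can break ordinary HTTP/1.1 requests on port 80.
        # Keeping "http2" on the 443 ssl listeners only is the safer option.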
        ...

I tested a http2 push page by defining this in /etc/nginx/sites-available/default 

location = /http2/push_demo.html {
        http2_push /http2/pushed.css;
        http2_push /http2/pushedimage1.jpg;
        http2_push /http2/pushedimage2.jpg;
        http2_push /http2/pushedimage3.jpg;
}

Once I tested that push (demo here) was working I then defined two files to push that were being sent from my server

location / {
        ...
        http2_push /wp-includes/js/jquery/jquery.js;
        http2_push /wp-content/themes/news-pro/images/favicon.ico;
        ...
}
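A quick hedged check from the command line that HTTP/2 is actually being negotiated (curl needs to be built with HTTP/2 support, which recent Ubuntu builds are):

# The first response line should read "HTTP/2 200" if HTTP/2 was negotiated
curl -sI --http2 https://fearby.com | head -n 1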

I used the WordPress Plugin Autoptimize to remove Google font usage (this removed a number of files being loaded when my page loads).

I used the WP-Optimize plugin to remove comments and disable pingbacks and trackbacks.

WordPress wp-config.php tweaks

# Memory
define('WP_MEMORY_LIMIT','1024M');
define('WP_MAX_MEMORY_LIMIT','1024M');
set_time_limit (60);

# Security
define( 'FORCE_SSL_ADMIN', true);

# Disable Updates
define( 'WP_AUTO_UPDATE_CORE', false );
define( 'AUTOMATIC_UPDATER_DISABLED', true );

# ewww.io
define( 'WP_AUTO_UPDATE_CORE', false );

Add 2FA Authentication to server logins.

I recently checked out YubiCo YubiKeys and I have secured my Linux servers with 2FA prompts at login. Read the guide here. I secured my WordPress too.

Tweaks Todo

  • Compress placeholder BJ Lazy Load Image (plugin is broken)
  • Solve 2x Google Analytics tracker redirects (done, switched to Matomo)

Conclusion

I love UpCloud’s fast servers, give them a go (use my link and get $25 free credit).

I love Cloudflare for providing a fast CDN.

I love ewww.io’s automatic Image Compression and Resizing plugin that automatically handles image optimisations and pre Cloudflare/first hit CDN caching.

Read my post about server monitoring with Nixstats here.

Let the results speak for themselves (sub-1-second load times).

Results

I hope this guide helps someone.

Please consider using my referral code and get $25 credit for free.

https://www.upcloud.com/register/?promo=D84793

2020 Update. I have stopped using Putty and WinSCP. I now use MobaXterm (a tabbed SSH client for Windows) as it is way faster than WinSCP and better than Putty. Read my review post of MobaXTerm here.

Ask a question or recommend an article

[contact-form-7 id=”30″ title=”Ask a Question”]

Revision History

v2.2 Converting to Blocks

v2.1 Newer GTMetrix scores

v2.0 New UpCloud UI Update and links to new guides.

v1.9 Spelling and grammar

v1.8 Trial mode gotcha (deposit money ASAP)

v1.7 Added RSA Private key info

v1.7 – Added new firewall rules info.

v1.6 – Added more bloat to the site, still good.

v1.5 Improving Accessibility

v1.4 Added Firewall Price

v1.3 Added wp-config and plugin usage descriptions.

v1.2 Added GTMetrix historical chart.

v1.1 Fixed free typos and added final conclusion images.

v1.0 Added final results

v0.9 added more tweaks (http2 push, removing unwanted files etc)

v0.81 Draft  – Added memory usage chart and added MaxIOPS info from UpCloud.

v0.8 Draft post.

Filed Under: CDN, Cloud, Cloudflare, Cost, CPanel, Digital Ocean, DNS, Domain, ExactDN, Firewall, Hosting, HTTPS, MySQL, MySQLGUI, NGINX, Performance, PHP, php72, Scalability, TLS, Ubuntu, UpCloud, Vultr, Wordpress Tagged With: draft, GTetrix, host, IOPS, Load Time, maxIOPS, MySQL, nginx, Page Speed Insights, Performance, php, SSD, ubuntu, UpCloud, vm

Setting up the free MySQL database server on Windows 10

April 20, 2019 by Simon

This guide assumes you are a beginner using Windows 10 and may already have a default website configured on Windows 10 using the built-in Internet Information Services (IIS) along with PHP (e.g. PHP 7.3.4 or greater).

If you have never used Internet Information Services (IIS), then XAMPP is a great way (a single install) to set up a web server, database server and PHP with little fuss.

In this case I will manually install MySQL Community Server to run alongside Internet Information Services (IIS) on Windows 10.

Downloading
MySQL Server (Community Edition)

Go to here and click MySQL Installer for Windows.

Screenshot of Download installer link

Click MySQL Installer for Windows.

FYI: I sent myself on a wild goose chase as I could only see a 32-bit installer (I spent days trying to find a 64-bit installer or manual 64-bit install binaries). I should have read the bit that said “MySQL Installer is 32 bit, but will install both 32 bit and 64 bit binaries.“

MySQL Installer is 32 bit, but will install both 32 bit and 64 bit binaries.

You can read the installer documentation here if you wish.

I downloaded the larger of the two available installers (one 16MB the other 300+ MB, same version etc.). I had to login with an Oracle ID to start the download.

Download MySQL Installer screenshot

Install file downloaded

Installing MySQL Server (Community Edition)

I started the installer (accepted the licence agreement)

I accepted the licence agreement

I selected “Full Install“

I could have selected server only or custom.

I selected Full Install

I downloaded and installed Python 3.7, Thanks MySQL Installer for adding a link to the Python download.

I Installed Python 3.7

After Python was installed I clicked refresh in MySQL and now MySQL can see Python 3.7.

Now MySQL can see Python 3.7.

I had a Visual Studio plugin install error (because Visual Studio was not installed).

Visual Studio plugin install error (Visual Studio is not installed)

Full Install (all components selected) reported the items are ready for install.

Full Install (all components selected)

Installation status complete.

Installation status list (Full Install)

List of items to configure (post install)

List of items to configure.

I setup a standard MySQL Server (not a Cluster)

I setup a standard MySQL Server (not a Cluster)

I setup MySQL as a standard Development computer on port 3306 over TCP/IP (no Named Pipe, Shared Memory etc).

I setup MySQL as a standard Development computer on port 3306 over TCP/IP.

I enforced strong passwords.

I enforced strong passwords.

I set a root password and added a few app usernames (with passwords).

I set a root password and a few app usernames (with passwords)

I named the MySQL Instance and set it to auto start when Windows starts, running as a standard system account.

I named the MySQL Instance and set it to auto start when windows starts as a standard account.

Post installation in progress.

Installation in progress.

I accepted the defaults for the next screen (configuring routers). I tested the connection to MySQL Server.

Connect to MySQL server test screen.

Installation complete.

Installation complete screen.

MySQL WorkBench

I opened the MySQL Workbench software and viewed the server instance information. Nice.

MySQL Workbench Instance information

MySQL Workbench server performance graphs are nice.

MySQL Workbench performance graphs.

I am used to Adminer for managing MySQL on Linux, so I will install that now.

Install Adminer (formerly phpMinAdmin)

I prefer the PHP-based Adminer database management tool from here.

https://adminer.net website  screenshot.

I downloaded a single PHP file and placed it in my IIS website root folder (as /adminer.php).

I tried to connect to my MySQL instance but received this error “The server requested authentication method unknown to the client”.

Unable to connect to MySQL with Adminer: Error "The server requested authentication method unknown to the client"

I googled and checked that I had a MySQL extension in php.ini (It did).

I opened MySQL Workbench, opened Manage Server Connections and located my my.ini file (“C:\ProgramData\MySQL\MySQL Server 8.0\my.ini“). I opened my.ini in a text editor and commented out the following line

#default_authentication_plugin=caching_sha2_password

and added

default_authentication_plugin= mysql_native_password

I saved the my.ini file, stopped and started the MySQL Service.

MySQL Workbench restarting database service UI screenshot.

I opened the MySQL Workbench and ran a query (File then “New Query Tab“) to set/reset the password.

ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password
BY 'your_password_goes_here';  

Screenshot

Reset root password SQL statement "ALTER USER

I then tested logging in to MySQL with Adminer.

Adminer view showing default databases installed by MySQL.

Success, I can now create databases and tables.

Adminer create database table screenshot showing a table with 4 fields.

I hope this helps someone.

Filed Under: Database, MySQL, PHP Tagged With: MySQL, php, Setup

How to install PHP 7.2.latest on Ubuntu 16.04

November 17, 2018 by Simon

How to install PHP 7.2.latest on Ubuntu 16.04 / Ubuntu 18.04 / Debian etc.

I have a number of guides on moving away from CPanel, Setting up VM’s on UpCloud, AWS, Vultr or Digital Ocean along with installing and managing WordPress from the command line. PHP is my programming language of choice.

PHP has a support page that declares the support date ranges and support types: http://php.net/supported-versions.php

PHP 7.0 going EOL

A version of PHP is either actively supported, security fix supported or end of life. Read this post to check WordPress for PHP compatibility.

From time to time vulnerabilities come up that require PHP updates to be applied.

Multiple flaw found in #PHP, most severe of which could allow arbitrary code execution

Affected Versions:
PHP 7.2 —prior to 7.2.5
PHP 7.1 —prior to 7.1.17
PHP 7.0 —prior to 7.0.30
PHP 5.0 —prior to 5.6.36https://t.co/TtiqXePoHu

Upgrade to the latest version of PHP immediately

— The Hacker News (@TheHackersNews) May 1, 2018

#PHP 7.2.12 has been released https://t.co/iNXGYTs0PX

— Neustradamus (@neustradamus) November 9, 2018

Source Link here

I have guides on setting up PHP 7 here on Digital Ocean, here on AWS and here on Vultr. I have tried upgrading to PHP 7.1 in the past with no luck (I forgot to change something and rolled back to 7.0).

FYI: I have a guide on setting up PHP child workers, so the output from some commands below may be different from yours. Here are the steps I performed to install PHP 7.2 alongside 7.0 and then switch to 7.2.

Backup your system

Do perform a Snapshot or Backup before proceeding. Nothing beats a quick restore if things fail.

Note: Use this information at your own risk.

Updating to PHP 7.2.12

Update your Ubuntu systems

apt-get update && apt-get upgrade

Updating from an older PHP (e.g. 5.x, 7.0 or 7.1) to, say, 7.2.12

Backup PHP

cd /etc/php
zip -r php7.0backup.zip 7.0/

Install Helper

This software provides an abstraction of the used apt repositories. It allows you to easily manage your distribution and independent software vendor software sources. More Info (on Ubuntu 18.04 and later the equivalent package is software-properties-common).

apt-get install python-software-properties

Add the main PHP repo (more information)

add-apt-repository ppa:ondrej/php

Update the package lists

“In a nutshell, apt-get update doesn’t actually install new versions of the software. Instead, it updates the package lists for upgrades for packages that need upgrading, as well as new packages that have just come to the repositories.” from here

apt-get update

List Installed Packages (optional)

dpkg -l

Install PHP 7.2

apt-get install php7.2

Install common PHP modules

apt-get install php-pear php7.2-curl php7.2-dev php7.2-mbstring php7.2-zip php7.2-mysql php7.2-xml

Install PHP FPM

apt-get install php7.2-fpm

Update all packages (may be needed to update from php 7.2.4 to 7.2.5)

sudo apt-get upgrade

Edit your NGINX sites-available config

sudo nano /etc/nginx/sites-available/default
# I set: fastcgi_pass unix:/run/php/php7.2-fpm.sock;

Edit your NGINX sites-enabled config

sudo nano /etc/nginx/sites-enabled/default
# I set: fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;

I edited these lines

location ~ \.php$ {
    ...
    fastcgi_pass unix:/var/run/php/php7.2-fpm.sock;
    ...
}

Edit your PHP config (and make desired changes)

sudo nano /etc/php/7.2/fpm/php.ini

Edit your PHP pool config file (as required). See this guide here.

e.g.

> cgi.fix_pathinfo=0
> max_input_vars = 1000
> memory_limit = 1024M
> upload_max_filesize = 8M
> post_max_size = 8M

sudo nano /etc/php/7.2/fpm/pool.d/www.conf

Make sure you set: listen = /run/php/php7.2-fpm.sock

Set PHP 7.2 as the default PHP

update-alternatives --set php /usr/bin/php7.2
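If several PHP versions are installed side by side, the related command-line tools can be switched the same way. A hedged sketch (these alternative names are registered by the Debian/Ubuntu PHP packages):

sudo update-alternatives --set phar /usr/bin/phar7.2
sudo update-alternatives --set phar.phar /usr/bin/phar.phar7.2
# Or pick interactively from the installed versions
sudo update-alternatives --config php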

Check your PHP version

php -v

Reload PHP

sudo service php7.2-fpm reload

Reload NGINX

nginx -t
nginx -s reload
/etc/init.d/nginx restart

Check the status of your PHP (and child workers)

sudo service php7.2-fpm status
● php7.2-fpm.service - The PHP 7.2 FastCGI Process Manager
   Loaded: loaded (/lib/systemd/system/php7.2-fpm.service; enabled; vendor preset: enabled)
   Active: active (running) since Fri 2018-05-04 19:02:27 AEST;
     Docs: man:php-fpm7.2(8)
  Process: 123456 ExecReload=/bin/kill -USR2 $MAINPID (code=exited, status=0/SUCCESS)
 Main PID: 123456 (php-fpm7.2)
   Status: "Processes active: 0, idle: 10, Requests: 0, slow: 0, Traffic: 0req/sec"
    Tasks: 11
   Memory: 30.5M
      CPU: 10.678s
   CGroup: /system.slice/php7.2-fpm.service
           ├─16494 php-fpm: master process (/etc/php/7.2/fpm/php-fpm.conf)
           ├─16497 php-fpm: pool www
           ├─16498 php-fpm: pool www
           ├─16499 php-fpm: pool www
           ├─16500 php-fpm: pool www
           ├─16501 php-fpm: pool www
           ├─16502 php-fpm: pool www
           ├─16503 php-fpm: pool www
           ├─16504 php-fpm: pool www
           ├─16505 php-fpm: pool www
           └─16506 php-fpm: pool www

Check your website.

Troubleshooting

Guides that helped me.

https://thishosting.rocks/install-php-on-ubuntu/

https://websiteforstudents.com/wordpress-supports-php-7-2-heres-how-to-install-with-nginx-and-mariadb-support/

Check your log files

tail /var/log/nginx/error.log

Debug FPM Service

systemctl status php7.2-fpm.service
● php7.2-fpm.service - The PHP 7.2 FastCGI Process Manager
   Loaded: loaded (/lib/systemd/system/php7.2-fpm.service; enabled; vendor preset: enabled)
   Active: active (running) since Sun 2018-05-06 00:18:55 AEST; 7min ago
     Docs: man:php-fpm7.2(8)
  Process: 123456 ExecReload=/bin/kill -USR2 $MAINPID (code=exited, status=0/SUCCESS)
 Main PID: 123 (php-fpm7.2)
   Status: "Processes active: 0, idle: 10, Requests: 44, slow: 0, Traffic: 0req/sec"
    Tasks: 11
   Memory: 212.6M
      CPU: 12.052s
   CGroup: /system.slice/php7.2-fpm.service
           ├─438 php-fpm: master process (/etc/php/7.2/fpm/php-fpm.conf)
           ├─441 php-fpm: pool www
           ├─442 php-fpm: pool www
           ├─443 php-fpm: pool www
           ├─444 php-fpm: pool www
           ├─445 php-fpm: pool www
           ├─446 php-fpm: pool www
           ├─447 php-fpm: pool www
           ├─449 php-fpm: pool www
           ├─450 php-fpm: pool www
           └─451 php-fpm: pool www

May 06 00:18:55 server systemd[1]: Stopped The PHP 7.2 FastCGI Process Manager.
May 06 00:18:55 server systemd[1]: Starting The PHP 7.2 FastCGI Process Manager...
May 06 00:18:55 server systemd[1]: Started The PHP 7.2 FastCGI Process Manager.

Remove PHP 7.0

sudo apt-get purge php7.0-common
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
  libaspell15 libauthen-pam-perl libc-client2007e libio-pty-perl libmcrypt4 librecode0 libtidy-0.99-0 libxmlrpc-epi0 linux-headers-4.4.0-109
  linux-headers-4.4.0-109-generic linux-headers-4.4.0-112 linux-headers-4.4.0-112-generic linux-headers-4.4.0-87
  linux-headers-4.4.0-87-generic linux-headers-4.4.0-96 linux-headers-4.4.0-96-generic linux-image-4.4.0-109-generic
  linux-image-4.4.0-112-generic linux-image-4.4.0-87-generic linux-image-4.4.0-96-generic linux-image-extra-4.4.0-109-generic
  linux-image-extra-4.4.0-112-generic linux-image-extra-4.4.0-87-generic linux-image-extra-4.4.0-96-generic mlock
Use 'sudo apt autoremove' to remove them.
The following packages will be REMOVED:
  php7.0-cli* php7.0-common* php7.0-curl* php7.0-fpm* php7.0-gd* php7.0-imap* php7.0-intl* php7.0-json* php7.0-mbstring* php7.0-mcrypt*
  php7.0-mysql* php7.0-opcache* php7.0-pspell* php7.0-readline* php7.0-recode* php7.0-sqlite3* php7.0-tidy* php7.0-xml* php7.0-xmlrpc*
  php7.0-xsl*

PHP 7.0 Removed 🙂

Remove other unused packages

sudo apt autoremove
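A hedged final check that nothing from PHP 7.0 is left behind:

# No output means no PHP 7.0 packages remain installed
dpkg -l | grep php7.0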

At the time of writing (November the 18th 2018) PHP 7.2.12 is the latest version of PHP and PHP 7.3 will be out at the end of the year.

Good luck and I hope this guide helps someone

Ask a question or recommend an article

[contact-form-7 id=”30″ title=”Ask a Question”]

Revision History

v1.4 Updated the post to mention PHP 7.0 EOL

v1.3 Updated to add PHP 7.2.12 information

v1.2 PHP 7.2.9 and PHP 7.2 updates

v1.1 Remove PHP 7.0 steps

v1.0 Initial post

Filed Under: Patch, PHP, php72, Security, Ubuntu Tagged With: 16.04, 7.2.latest, How, install, on, php, to, ubuntu

Check the compatibility of your WordPress theme and plugin code with PHP Compatibility Checker

November 7, 2018 by Simon

This is how I checked the compatibility of my WordPress theme and plugin (code) with PHP Compatibility Checker

Aside

I have a number of guides on moving away from CPanel, Setting up VM’s on AWS, Vultr or Digital Ocean along with installing and managing WordPress from the command line. PHP is my programming language of choice.

Now on with the post

Snip from: https://wordpress.org/plugins/php-compatibility-checker/

What is PHP Compatibility Checker

> The WP Engine PHP Compatibility Checker can be used by any WordPress website on any web host to check PHP version compatibility.

> This plugin will lint theme and plugin code inside your WordPress file system and give you back a report of compatibility issues for you to fix. Compatibility issues are categorized into errors and warnings and will list the file and line number of the offending code, as well as the info about why that line of code is incompatible with the chosen version of PHP. The plugin will also suggest updates to themes and plugins, as a new version may offer compatible code.

> This plugin does not execute your theme and plugin code, as such this plugin cannot detect runtime compatibility issues.
> Please note that linting code is not perfect. This plugin cannot detect unused code-paths that might be used for backwards compatibility, and thus might show false positives. We maintain a whitelist of plugins that can cause false positives. We are continuously working to ensure the checker provides the most accurate results possible.
> This plugin relies on WP-Cron to scan files in the background. The scan will get stuck if the site’s WP-Cron isn’t running correctly. Please see the FAQ for more information.

Install PHP Compatibility Checker

PHP Compatibility Checker

I installed it by SSH’ing to my server and opening my WP plugins folder

cd /www-root/wp-content/plugins/

I grabbed the latest download URL from here (hover over the download button), at the time of writing this was the latest version: https://downloads.wordpress.org/plugin/php-compatibility-checker.1.4.6.zip

I downloaded the plugin on my server (then unzipped it and deleted the zip)

wget https://downloads.wordpress.org/plugin/php-compatibility-checker.1.4.6.zip
unzip php-compatibility-checker.1.4.6.zip
rm php-compatibility-checker.1.4.6.zip
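If you already manage WordPress with WP-CLI (covered elsewhere on this blog), a hedged alternative to the manual download is to let WP-CLI fetch and activate the plugin in one step:

# Install and activate the plugin via WP-CLI (run from the WordPress root)
wp plugin install php-compatibility-checker --activate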

Enable PHP Compatibility Checker Plugin

I enabled the plugin

Enable the Plugin

I clicked on the following message

> You have just activated the PHP Compatibility Checker. Start scanning your plugins and themes for compatibility with the latest PHP versions now!

Start Scan

I already have PHP 7.2 installed so let’s scan my site. PHP 7.3 will be available in December and it is already being tested in beta.

Scan PHP 7.2

PHP Versions

PHP Versions

Site Scanning

PHP Compatibility Checker site scanning is very business-like

Site Scan Progress

PHP Compatibility Checker Scan Results

2 of 22 plugins I use were not PHP 7.2 compatible (WordFence and WP Meta SEO)?

PHP Compatibility Report

I read on twitter that Wordfence may be a false positive.

Clicking “toggle details” reveals why the scan failed. A Two Factor Auth plugin was all OK.

Scan Results

Your results will hopefully be…

> PHP 7.2 compatible

Or, if errors exist, it should explain why the plugin did not pass.

FILE: /www-root/wp-content/plugins/wp-meta-seo/jutranslation/jutranslation.php
> —————————————————————————————-
> FOUND 1 ERROR AFFECTING 1 LINE
> —————————————————————————————-
> 251 | ERROR | The function is_countable() is not present in PHP version 7.2 or earlier
> —————————————————————————————-

I can’t wait for PHP 7.3 scanning.  I will update this post in December 2018 after PHP 7.3 is released.

Good luck and I hope this guide helps someone

Ask a question or recommend an article

[contact-form-7 id=”30″ title=”Ask a Question”]

Revision History

v1.0 Initial post

Filed Under: Compatibility, PHP Tagged With: and, check, Checker, code, compatibility, of, php, plugin, the, theme, with, wordpress, your

PHP implementation to check a password exposure level with Troy Hunt’s pwnedpasswords API

March 1, 2018 by Simon

Developed by Simon Fearby https://www.fearby.com to allow PHP developers to integrate haveibeenpwned exposed-password checks into their websites’ sign-ups (or logins). Get the latest version of this code from https://github.com/SimonFearby/phphaveibeenpwned/.

Update 2018: For the best performing VM host (UpCloud) read my guide on the awesome UpCloud VM hosts (get $25 free credit by signing up here).

This demonstrates a framework-less PHP way (using HTML5, JavaScript and PHP) to validate a password by hashing (SHA-1) it before the HTML form is submitted. A part of the password hash is checked against the https://api.pwnedpasswords.com/range/{xxxxx} API (before a decision to save the form data is made). If the password has been exposed, the user is returned to the sign-up form; if there is no match, the submission process completes.

SHA-1 hashing is performed on the password entered in the HTML form in JavaScript (your password never leaves the browser), and the PHP submit receiver performs a partial hash check at api.pwnedpasswords.com. Only a fraction of your password hash is sent to api.pwnedpasswords.com, and only partial hashes are returned alongside other partial matches (making it hard for anyone listening to know what password you used).

This demo does not enforce SSL, sanitize or validate any form data, or save the password to a database, etc. The aim of this page is to demonstrate integration with api.pwnedpasswords.com. This demo displays a password strength meter. signup_submit.php allows you to enable debugging to see what is going on (detected errors are sent back to the sign-up form and alerts are shown).
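Before wiring it into PHP, the range API can be exercised from a shell to see the k-anonymity model in action. This hedged example uses the well-known SHA-1 of the string “password” (5BAA61E4C9B93F3F0682250B6CF8331B7EE68FD8): only the first five characters are sent, and the matching 35-character suffix should come back in the response with a large breach count.

# Send only the first 5 hex chars of the hash; grep the response for the remaining suffix
curl -s https://api.pwnedpasswords.com/range/5BAA6 | grep -i '1E4C9B93F3F0682250B6CF8331B7EE68FD8'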

Basics

signup.php – Main PHP file with a form with basic Javascript validation.

signup_submit.php – The form submit sends the form data here and calls the pwnedpasswords API

signup_ok.php – Is loaded if the password is not exposed

The initial HTML code was generated with the Platforma GUI web generator. I added the Javascript and relevant code. The HTML input field types are set to "text" (not "password") so you can see the passwords. A SHA1 hash of the password is generated on form submission, and the user's password never leaves the browser.

haveibeenpwned-001

A SHA1 hash of the password is updated and displayed (I am using the jsSHA library).

<script type="text/javascript" src="./js/sha/sha1.js"></script>

haveibeenpwned-002

After basic HTML/Javascript form validation is performed, the user's password is replaced with its hash and the form is submitted (to signup_submit.php).

// Generate SHA1 Hash
var shaObj = new jsSHA("SHA-1", "TEXT");
shaObj.update(document.forms["submitform"]["password1"].value);
var passwordhash = shaObj.getHash("HEX");
document.getElementById("sha135").value = passwordhash;

signup_submit.php then takes the password hash, gets the first 5 characters and fires up a curl connection to https://api.pwnedpasswords.com/range/$data. When the data returns, PHP checks the haveibeenpwned API body for a hash suffix that matches the rest of the submitted password hash. Read more about how the API works here.

The PHP function that does the API check is below

function sendPostToPwnedPasswordsCom($data) {

    $curl = curl_init();		// Init Curl Object

    if (defined('ENABLE_DEBUG_OUTPUT') && true === ENABLE_DEBUG_OUTPUT) {
    	echo "Data to Send: $data <br />";
    	echo "Sending Data to: https://api.pwnedpasswords.com/range/$data <br />";
    }

    // Set Curl Options: http://php.net/manual/en/function.curl-setopt.php
    curl_setopt($curl, CURLOPT_URL, "https://api.pwnedpasswords.com/range/$data");
    curl_setopt($curl, CURLOPT_RETURNTRANSFER, true); 
	curl_setopt($curl, CURLOPT_FRESH_CONNECT, true); 		// TRUE to force the use of a new connection instead of a cached one.
	curl_setopt($curl, CURLOPT_FORBID_REUSE, true); 		// TRUE to force the connection to explicitly close when it has finished processing, and not be pooled for reuse.
	curl_setopt($curl, CURLOPT_TIMEOUT, 10); 
    curl_setopt($curl, CURLOPT_MAXREDIRS, 10); 
    curl_setopt($curl, CURLOPT_FOLLOWLOCATION, true);		// TRUE to follow any "Location: " header that the server sends as part of the HTTP header (note this is recursive, 
    														// PHP will follow as many "Location: " headers that it is sent, unless CURLOPT_MAXREDIRS is set).

	// Make a request to the api.pwnedpasswords.com
    $http_request_result = curl_exec ($curl);
    $http_return_code = curl_getinfo($curl, CURLINFO_HTTP_CODE);

	// api.pwnedpasswords.com Response codes
	/*
	Semantic HTTP response code are used to indicate the result of the search:

	Code	Description
	200	Ok — everything worked and there's a string array of pwned sites for the account
	400	Bad request — the account does not comply with an acceptable format (i.e. it's an empty string)
	403	Forbidden — no user agent has been specified in the request
	404	Not found — the account could not be found and has therefore not been pwned
	429	Too many requests — the rate limit has been exceeded
	*/


	// Change the return code to debug
	//$http_return_code = 429;
	if (defined('ENABLE_DEBUG_OUTPUT') && true === ENABLE_DEBUG_OUTPUT) {
      	echo "Return HTTP CODE: $http_return_code <br />";
    }
	
    // What was the http response code from api.pwnedpasswords.com
	if ($http_return_code == 200) {
		// OK (All other return codes direct the user back with an error)
		if (defined('ENABLE_DEBUG_OUTPUT') && true === ENABLE_DEBUG_OUTPUT) {
	    	echo "Return HTTP Data site: " . strlen($http_request_result) . " bytes. <br />";
	    }

	} elseif ($http_return_code == 400) {
		// api.pwnedpasswords.com: API Bad Request
    	if (defined('ENABLE_DEBUG_OUTPUT') && true === ENABLE_DEBUG_OUTPUT) {
        	echo "API Bad Request <br />";
        }

        header("Location: signup.php?Error=PwnedpasswordsAPIBadRequest&code=" . $http_return_code);
		die();

	} elseif ($http_return_code == 403) {
    	// api.pwnedpasswords.com: API Bad User Agent
    	if (defined('ENABLE_DEBUG_OUTPUT') && true === ENABLE_DEBUG_OUTPUT) {
        	echo "API Bad User Agent <br />";
        }

        header("Location: signup.php?Error=PwnedpasswordsAPIBadUserAgent&code=" . $http_return_code);
		die();

	} elseif ($http_return_code == 404) {
		// api.pwnedpasswords.com: API User Not Found, not needed in this password hash check but we may as well catch it now
    	if (defined('ENABLE_DEBUG_OUTPUT') && true === ENABLE_DEBUG_OUTPUT) {
        	echo "API User Not Found <br />";
        }

        header("Location: signup.php?Error=PwnedpasswordsAPIuserNotFound&code=" . $http_return_code);
		die();

    } elseif ($http_return_code == 429) {
    	// api.pwnedpasswords.com: API Too Many Requests
    	if (defined('ENABLE_DEBUG_OUTPUT') && true === ENABLE_DEBUG_OUTPUT) {
        	echo "API Too Many Requests</ br>";
        }

        header("Location: signup.php?Error=PwnedpasswordsAPIuserTooManyRequests&code=" . $http_return_code);
		die();

    } else {
    	// api.pwnedpasswords.com: API Down

		if (defined('ENABLE_DEBUG_OUTPUT') && true === ENABLE_DEBUG_OUTPUT) {
        	echo "API Down!</ br>";
        }

        header("Location: signup.php?Error=PwnedpasswordsAPIDown&code=" . $http_return_code);
		die();
    } 

    // Tidy up the curl object and return the request api body
    curl_close ($curl); 
	return $http_request_result; 
}
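
The function above only returns the raw API body; the suffix comparison still has to happen back in signup_submit.php. A minimal sketch of that step (checkHashSuffix() is my own illustrative name, not part of the project code):

// $body is the raw text returned by sendPostToPwnedPasswordsCom($prefix).
// Each line has the form "HASHSUFFIX:COUNT" for hashes sharing the same 5 character prefix.
function checkHashSuffix($body, $suffix) {
    foreach (explode("\n", $body) as $line) {
        $parts = explode(':', trim($line));
        if (count($parts) === 2 && strcasecmp($parts[0], $suffix) === 0) {
            return (int) $parts[1];   // number of times this password appears in known breaches
        }
    }
    return 0;   // 0 means the password hash was not found
}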

You can enable debugging in signup_submit.php if you wish (but if an echo has already sent debug output and a header redirect then happens, PHP will report a "headers already sent" error).

// define('ENABLE_DEBUG_OUTPUT', true);
define('ENABLE_DEBUG_OUTPUT', false);

The echo statements are wrapped in a debug check

if (defined('ENABLE_DEBUG_OUTPUT') && true === ENABLE_DEBUG_OUTPUT) {
    echo "API Bad Request <br />";
}

signup_submit.php will redirect the browser back to signup.php if an error is found

header("Location: signup.php?Error=PwnedpasswordsAPIuserTooManyRequests&code=" . $http_return_code);
die();

On success (no password hash found) the user is sent to signup_ok.php

header("Location: signup_ok.php");
die();

signup.php will check for the Error query string
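
A minimal sketch of reading and escaping that query string (assuming the surrounding signup.php markup):

// Read the Error and code query strings sent back by signup_submit.php.
$error = isset($_GET['Error']) ? htmlspecialchars($_GET['Error'], ENT_QUOTES, 'UTF-8') : '';
$code  = isset($_GET['code'])  ? (int) $_GET['code'] : 0;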

Then display bootstrap alert errors for each error case

if ($error == "PwnedpasswordsAPIuserTooManyRequests") {
    echo '<div class="alert alert-danger" role="alert">';
    echo '<br /><a target="_blank" href="https://api.pwnedpasswords.com">https://api.pwnedpasswords.com</a> reported too many requests to the API from this IP, please wait a few seconds and try again.<br /> (E009)<br />';
    echo '<br />Please consider donating to Troy Hunt <a target="_blank" href="https://www.troyhunt.com/donations-why-i-dont-need-them-and-why/">https://www.troyhunt.com/donations-why-i-dont-need-them-and-why/</a> (<em>developer of <a target="_blank" href="https://haveibeenpwned.com">https://haveibeenpwned.com</a></em>).<br />';
    echo '</div>'; 
}

Sample Errors

Sample Password exposed to error.

haveibeenpwned-003

Sample API Offline alert

haveibeenpwned-005

If a form field needs attention, a JavaScript snippet is written out to set the focus etc.

if ($error == "PasswordExposed") {
    echo '<script>';
    echo 'document.getElementById("password1").focus();';
    echo 'document.getElementById("password1").select();';
    echo '</script>';
}

If no password hash has been matched by pwnedpasswords, the user is directed to signup_ok.php (not very exciting, but it's your job to integrate this with your system and harden it).
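
If your integration does go on to store credentials at this point, the usual PHP approach is password_hash() and password_verify(); a minimal sketch (the variable names are placeholders, this is not part of the demo code):

// Hash whatever secret you end up storing (never store it in plain text).
$stored = password_hash($secretToStore, PASSWORD_DEFAULT);

// Later, at login time, compare the supplied value against the stored hash.
if (password_verify($candidateSecret, $stored)) {
    // credentials match - continue the login
}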

haveibeenpwned-006

Sample debugging output

haveibeenpwned-007

Get the code: https://github.com/SimonFearby/phphaveibeenpwned/

More Reading

If you are using Ubuntu, don't forget to set up a free SSL cert; setting up an SSL cert on OSX is also a good idea. I have guides on setting up an Ubuntu server on AWS, Digital Ocean and Vultr. I love Vultr VM hosts and have blogged about setting up WordPress via the CLI, uploading files with SSH, restoring Vultr Snapshots etc.

I hope this guide helps someone.

Ask a question or recommend an article

[contact-form-7 id=”30″ title=”Ask a Question”]

Revision History

v1.0 Initial post

Filed Under: Pwned Tagged With: a, api, check, exposure, Hunts, implementation, level, password, php, pwnedpasswords, to, Troy, with

How to setup PHP FPM on demand child workers in PHP 7.x to increase website traffic

February 26, 2018 by Simon

This blog post will show you how to set up PHP-FPM on-demand child workers in PHP 7.x to increase website traffic.

My blog was experiencing a number of slow page loads and often running “sudo service php7.0-fpm restart” would resolve the problem.  I have blogged before about setting up Ubuntu Servers on AWS, Digital Ocean and Vultr but this post is about debugging and speeding up PHP on Ubuntu self-managed servers.

Background

I tried the normal tweaks in “/etc/php/7.0/fpm/php.ini” like

memory_limit = 512M

I set up servers like this.

Temporary Fix

I had even set up temporary cron jobs to restart NGINX every 5 minutes and php7.0-fpm every minute until I had time to look into this.

*/5 * * * * /etc/init.d/nginx restart
* * * * * sudo service php7.0-fpm restart

Debug

I checked the php7.0-fpm.log and found the following

[25-Feb-2018 16:35:35] WARNING: [pool www] server reached pm.max_children setting (5), consider raising it
[25-Feb-2018 17:02:26] WARNING: [pool www] server reached pm.max_children setting (5), consider raising it
[25-Feb-2018 17:51:09] WARNING: [pool www] server reached pm.max_children setting (5), consider raising it
[25-Feb-2018 18:18:51] WARNING: [pool www] server reached pm.max_children setting (5), consider raising it
[25-Feb-2018 20:58:12] WARNING: [pool www] server reached pm.max_children setting (5), consider raising it
[25-Feb-2018 21:02:57] WARNING: [pool www] server reached pm.max_children setting (5), consider raising it
[25-Feb-2018 21:30:58] WARNING: [pool www] server reached pm.max_children setting (5), consider raising it
[25-Feb-2018 21:35:10] WARNING: [pool www] server reached pm.max_children setting (5), consider raising it
[25-Feb-2018 23:36:28] WARNING: [pool www] server reached pm.max_children setting (5), consider raising it

Setting up a PHP-FPM pool

Read the official guide here on configuring PHP FPM pools etc.

I edited “/etc/php/7.0/fpm/pool.d/www.conf” and added the following to set up a pool of PHP-FPM servers.

; Note: This value is mandatory.
pm = dynamic

; The number of child processes to be created when pm is set to 'static' and the
; maximum number of child processes when pm is set to 'dynamic' or 'ondemand'.
; This value sets the limit on the number of simultaneous requests that will be
; served. Equivalent to the ApacheMaxClients directive with mpm_prefork.
; Equivalent to the PHP_FCGI_CHILDREN environment variable in the original PHP
; CGI. The below defaults are based on a server without much resources. Don't
; forget to tweak pm.* to fit your needs.
; Note: Used when pm is set to 'static', 'dynamic' or 'ondemand'
; Note: This value is mandatory.
pm.max_children = 40

; The number of child processes created on startup.
; Note: Used only when pm is set to 'dynamic'
; Default Value: min_spare_servers + (max_spare_servers - min_spare_servers) / 2
pm.start_servers = 10

; The desired minimum number of idle server processes.
; Note: Used only when pm is set to 'dynamic'
; Note: Mandatory when pm is set to 'dynamic'
pm.min_spare_servers = 5

; The number of seconds after which an idle process will be killed.
; Note: Used only when pm is set to 'ondemand'
; Default Value: 10s
pm.process_idle_timeout = 30s;

; The number of requests each child process should execute before respawning.
; This can be useful to work around memory leaks in 3rd party libraries. For
; endless request processing specify '0'. Equivalent to PHP_FCGI_MAX_REQUESTS.
; Default Value: 0
pm.max_requests = 250

You may need more or fewer child processes depending on your needs and free memory.
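
As a rough rule of thumb, size pm.max_children from the memory you can spare divided by the size of one PHP-FPM child. In the status output further down, ten pool workers use about 330 MB, so each child is roughly 33 MB; if I am happy to give PHP-FPM around 600 MB of the 1 GB on this server, 600 / 33 is about 18 to 20 children, so treat 40 as an upper limit rather than a target and adjust to what your own measurements support.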

After editing the PHP-FPM config file restart PHP-FPM

sudo service php7.0-fpm restart

Restart Nginx

sudo /etc/init.d/nginx restart

You will be able to view the PHP child process status by typing the following

service php7.0-fpm status
● php7.0-fpm.service - The PHP 7.0 FastCGI Process Manager
   Loaded: loaded (/lib/systemd/system/php7.0-fpm.service; enabled; vendor preset: enabled)
   Active: active (running) since Mon 2018-02-26 00:33:17 AEDT; 5min ago
     Docs: man:php-fpm7.0(8)
 Main PID: 1284 (php-fpm7.0)
   Status: "Processes active: 0, idle: 10, Requests: 56, slow: 0, Traffic: 0.2req/sec"
    Tasks: 11
   Memory: 330.1M
      CPU: 39.558s
   CGroup: /system.slice/php7.0-fpm.service
           ├─1284 php-fpm: master process (/etc/php/7.0/fpm/php-fpm.conf)
           ├─1503 php-fpm: pool www
           ├─1504 php-fpm: pool www
           ├─1505 php-fpm: pool www
           ├─1506 php-fpm: pool www
           ├─1507 php-fpm: pool www
           ├─1508 php-fpm: pool www
           ├─1509 php-fpm: pool www
           ├─1511 php-fpm: pool www
           ├─1512 php-fpm: pool www
           └─1513 php-fpm: pool www

Feb 25 10:33:16 servername systemd[1]: Starting The PHP 7.0 FastCGI Process Manager...
Feb 25 10:33:17 servername systemd[1]: Started The PHP 7.0 FastCGI Process Manager.

You can use htop (commands here) to see child PHP processes in the pool and to verify free memory.

php-pool

This command is good for watching free memory on a server

watch -n 1 'free -m'

I prefer to use up free memory (if available) and leave about 100 MB free.

Every 1.0s: free -m                                                                                                            Mon Feb 26 00:47:55 2018

              total        used        free      shared  buff/cache   available
Mem:            992         518         120          40         353         280
Swap:             0           0           0

Hope this helps someone.

Donate and make this blog better

Ask a question or recommend an article

[contact-form-7 id=”30″ title=”Ask a Question”]

Revision History

v1.0 Initial Post

Filed Under: PHP Tagged With: 7.x, child, demand, FPM, How, in, increase, on, php, Setup, to, traffic, website, workers

Updating PHP 7.0 to 7.1 on an Ubuntu 16.04 Vultr VM

November 21, 2017 by Simon

Here is how you can quickly update PHP 7.0 to 7.1 on a Vultr Ubuntu domain.

I have configured a number of Vultr domains with NGINX and PHP FPM, and today I realised I need to update PHP 7.0 to 7.1 to fix a few security exploits (read more here and here on securing Ubuntu in the cloud). PHP has a good page where you can keep up to date with PHP news: https://secure.php.net/. You can also view bugs on the PHP bug tracker here. The PHP news aggregation account @php_net on Twitter is good to follow; the official PHP Twitter account is @official_php.

I have not noticed an option to update PHP 7.0 to 7.1 in the daily Ubuntu package updates, so I will have to update manually.

WARNING: Back up your site and test this on a non-production server before doing it on a live server. I had an issue with PHP 7.1 breaking WordPress 3.9 (MySQL issues with some plugins) and I had to roll back to 7.0 (see rollback tips in Troubleshooting below). WordPress says it is PHP 7.1 compatible, but issues exist. WordPress 3.9 ditched "mysql" in favour of "mysqli", yet under PHP 7.1 WordPress could not find "mysqli"?

List packages with updates

sudo /usr/lib/update-notifier/apt-check -p
linux-libc-dev
python3-apport
python3-problem-report

You can run the following to view upgradable packages (TIP: Back up NGINX and other configuration files before any upgrades).

apt list --upgradable
Listing... Done
apport/xenial-updates,xenial-updates,xenial-security,xenial-security 2.20.1-0ubuntu2.13 all [upgradable from: 2.20.1-0ubuntu2.12]
linux-generic/xenial-updates,xenial-security 4.4.0.101.106 amd64 [upgradable from: 4.4.0.87.93]
linux-headers-generic/xenial-updates,xenial-security 4.4.0.101.106 amd64 [upgradable from: 4.4.0.87.93]
linux-image-generic/xenial-updates,xenial-security 4.4.0.101.106 amd64 [upgradable from: 4.4.0.87.93]
linux-libc-dev/xenial-updates,xenial-security 4.4.0-101.124 amd64 [upgradable from: 4.4.0-98.121]
nginx/xenial,xenial 1.13.6-2chl1~xenial1 all [upgradable from: 1.13.3-1chl1~xenial1]
nginx-common/xenial,xenial 1.13.6-2chl1~xenial1 all [upgradable from: 1.13.3-1chl1~xenial1]
nginx-core/xenial 1.13.4-1chl1~xenial1 amd64 [upgradable from: 1.13.3-1chl1~xenial1]
procmail/xenial-updates,xenial-security 3.22-25ubuntu0.16.04.1 amd64 [upgradable from: 3.22-25]
python-cryptography/xenial 1.9-1+ubuntu16.04.1+certbot+2 amd64 [upgradable from: 1.7.1-2+certbot~xenial+1]
python-openssl/xenial,xenial 17.3.0-1~0+ubuntu16.04.1+certbot+1 all [upgradable from: 17.0.0-0+certbot~xenial+1]
python-requests/xenial,xenial 2.18.1-1+ubuntu16.04.1+certbot+1 all [upgradable from: 2.12.4-1+certbot~xenial+1]
python-urllib3/xenial,xenial 1.21.1-1+ubuntu16.04.1+certbot+1 all [upgradable from: 1.19.1-1+certbot~xenial+1]
python3-apport/xenial-updates,xenial-updates,xenial-security,xenial-security 2.20.1-0ubuntu2.13 all [upgradable from: 2.20.1-0ubuntu2.12]
python3-problem-report/xenial-updates,xenial-updates,xenial-security,xenial-security 2.20.1-0ubuntu2.13 all [upgradable from: 2.20.1-0ubuntu2.12]
python3-requests/xenial,xenial 2.18.1-1+ubuntu16.04.1+certbot+1 all [upgradable from: 2.12.4-1+certbot~xenial+1]
python3-urllib3/xenial,xenial 1.21.1-1+ubuntu16.04.1+certbot+1 all [upgradable from: 1.19.1-1+certbot~xenial+1]

Update your server packages

sudo apt-get update && sudo apt-get upgrade

Reboot

sudo shutdown -r now

You should now see this on startup

0 packages can be updated.
0 updates are security updates.

You can view your installed PHP configuration file and version by typing the following on your server's command line.

# locate php.ini
/etc/php/7.0/apache2/php.ini
/etc/php/7.0/cli/php.ini
/etc/php/7.0/fpm/php.ini

Now let’s install a package viewer

sudo apt-get install apt-show-versions

Search installed (or not installed) PHP packages.

sudo apt-show-versions | grep php | more

libapache2-mod-php7.0:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
libapache2-mod-php7.0:i386 not installed
php-common:all/xenial 1:55+ubuntu16.04.1+deb.sury.org+1 uptodate
php-xdebug:amd64/xenial 2.5.5-3+ubuntu16.04.1+deb.sury.org+1 uptodate
php-xdebug:i386 not installed
php7.0:all/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-cli:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-cli:i386 not installed
php7.0-common:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-common:i386 not installed
php7.0-curl:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-curl:i386 not installed
php7.0-dev:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-dev:i386 not installed
php7.0-fpm:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-fpm:i386 not installed
php7.0-gd:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-gd:i386 not installed
php7.0-imap:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-imap:i386 not installed
php7.0-intl:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-intl:i386 not installed
php7.0-json:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-json:i386 not installed
php7.0-ldap:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-ldap:i386 not installed
php7.0-mbstring:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-mbstring:i386 not installed
php7.0-mysql:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-mysql:i386 not installed
php7.0-opcache:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-opcache:i386 not installed
php7.0-pgsql:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-pgsql:i386 not installed
php7.0-phpdbg:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-phpdbg:i386 not installed
php7.0-pspell:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-pspell:i386 not installed
php7.0-readline:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-readline:i386 not installed
php7.0-recode:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-recode:i386 not installed
php7.0-snmp:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-snmp:i386 not installed
php7.0-tidy:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-tidy:i386 not installed
php7.0-xml:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-xml:i386 not installed
php7.0-zip:amd64/xenial 7.0.25-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.0-zip:i386 not installed

Uninstall all local PHP related packages

sudo apt-get remove php* 
...
After this operation, 35.7 MB disk space will be freed.
Do you want to continue? [Y/n] y
(Reading database ... 139182 files and directories currently installed.)
Removing php7.0 (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php-xdebug (2.5.5-3+ubuntu16.04.1+deb.sury.org+1) ...
Removing libapache2-mod-php7.0 (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-zip (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-xml (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-mbstring (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-dev (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-fpm (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-curl (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-gd (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-imap (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-intl (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-phpdbg (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-ldap (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-mysql (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-pgsql (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-pspell (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-recode (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-snmp (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-tidy (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-cli (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-json (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-opcache (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-readline (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php7.0-common (7.0.25-1+ubuntu16.04.1+deb.sury.org+1) ...
Removing php-common (1:55+ubuntu16.04.1+deb.sury.org+1) ...
Processing triggers for man-db (2.7.5-1) ...

Confirm packages are uninstalled

sudo apt-show-versions | grep php
>

Install PHP 7.1 and common packages

sudo apt-get install php7.1 php7.1-cli php7.1-common libapache2-mod-php7.1 php7.1-mysql php7.1-fpm php7.1-curl php7.1-gd php7.1-bz2 php7.1-mcrypt php7.1-json php7.1-tidy php7.1-mbstring php-redis php-memcached

Verify PHP 7.1 installation

apt-show-versions | grep php
libapache2-mod-php7.1:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
libapache2-mod-php7.1:i386 not installed
php-common:all/xenial 1:55+ubuntu16.04.1+deb.sury.org+1 uptodate
php-igbinary:amd64/xenial 2.0.1-1+ubuntu16.04.1+deb.sury.org+2 uptodate
php-igbinary:i386 not installed
php-memcached:amd64/xenial 3.0.3+2.2.0-1+ubuntu16.04.1+deb.sury.org+3 uptodate
php-memcached:i386 not installed
php-msgpack:amd64/xenial 2.0.2+0.5.7-1+ubuntu16.04.1+deb.sury.org+3 uptodate
php-msgpack:i386 not installed
php-redis:amd64/xenial 3.1.4-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php-redis:i386 not installed
php7.1:all/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-bz2:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-bz2:i386 not installed
php7.1-cli:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-cli:i386 not installed
php7.1-common:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-common:i386 not installed
php7.1-curl:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-curl:i386 not installed
php7.1-fpm:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-fpm:i386 not installed
php7.1-gd:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-gd:i386 not installed
php7.1-json:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-json:i386 not installed
php7.1-mbstring:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-mbstring:i386 not installed
php7.1-mcrypt:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-mcrypt:i386 not installed
php7.1-mysql:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-mysql:i386 not installed
php7.1-opcache:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-opcache:i386 not installed
php7.1-readline:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-readline:i386 not installed
php7.1-tidy:amd64/xenial 7.1.11-1+ubuntu16.04.1+deb.sury.org+1 uptodate
php7.1-tidy:i386 not installed

Reboot

sudo shutdown -r now

See if the PHP 7.1 FPM service has started

sudo systemctl | grep php
> php7.1-fpm.service

Restart PHP 7.1 FPM Service

sudo systemctl restart php7.1-fpm.service

Edit your /etc/nginx/sites-enabled/default and change the fastcgi_pass from “7.0” to “7.1”

sudo nano /etc/nginx/sites-enabled/default

Edits:

location ~ \.php$ {
    ...
    fastcgi_pass unix:/var/run/php/php7.1-fpm.sock;
    ...
}

Reload NGINX configuration and restart NGINX

sudo nginx -t && sudo nginx -s reload && sudo /etc/init.d/nginx restart
nginx: the configuration file /etc/nginx/nginx.conf syntax is ok
nginx: configuration file /etc/nginx/nginx.conf test is successful
[ ok ] Restarting nginx (via systemctl): nginx.service.

Your website should now be back up and running PHP 7.1

PHP 7.1

Post Install Tasks

View this blog post on other useful linux commands.

Run a Lynis security scan.

Edit your php.ini file and make any required changes (e.g. upload sizes).

sudo nano /etc/php/7.1/fpm/php.ini
# upload_max_filesize = 2M
+ upload_max_filesize = 8M

Troubleshooting

View PHP configuration values (add this to a debug.php file and load it in a browser)

<?php

// Show all information, defaults to INFO_ALL
phpinfo();

// Show just the module information.
// phpinfo(8) yields identical results.
phpinfo(INFO_MODULES);

?>

I broke my WordPress 3.9 when I tried to update to PHP 7.1 so I rolled back to 7.0.

sudo apt-get remove php*
sudo apt-get -y install php7.0-fpm
sudo apt-get -y install php7.0-mysql php7.0-curl php7.0-gd php7.0-intl php-pear php-imagick php7.0-imap php7.0-mcrypt php-memcache  php7.0-pspell php7.0-recode php7.0-sqlite3 php7.0-tidy php7.0-xmlrpc php7.0-xsl php7.0-mbstring php-gettext
service php7.0-fpm reload

Google results had me stuck for a while when I had issues purging PHP 7.1.

Purge Error

Because my blog (with my install steps) was down, I used this site to help me find the commands to run.

Conclusion

Going with cutting-edge tech sometimes means going out on a limb; make sure you know how to restore a working site if need be.

Always have a backup and restore plan.

Hope this guide helps.

Donate and make this blog better


Ask a question or recommend an article
[contact-form-7 id=”30″ title=”Ask a Question”]

Revision History

v1.35 WordPress 3.9 error with PHP 7.1

Filed Under: PHP, Server, Ubuntu, VM, Vultr Tagged With: 16.04, a, on, php, ubuntu, Updating, vm, vultr

Caching MySQL queries in memory for xx seconds

February 10, 2016 by Simon Fearby

PHP OPcache is a good opcode caching extension for PHP, but what if you want a quicker, in-code way of selectively caching MySQL results in memory? More information on the underlying memcached is below.

Official Description:

Description: A high-performance memory object caching system. Danga Interactive developed memcached to enhance the speed of LiveJournal.com, a site which was already doing 20 million+ dynamic page views per day for 1 million users with a bunch of webservers and a bunch of database servers. memcached dropped the database load to almost nothing, yielding faster page load times for users, better resource utilization, and faster access to the databases on a memcache miss.


Memcached optimizes specific high-load serving applications that are designed to take advantage of its versatile no-locking memory access system. Clients are available in several different programming languages, to suit the needs of the specific application. Traditionally this has been used in mod_perl apps to avoid storing large chunks of data in Apache memory, and to share this burden across several machines. Caching other content is a good idea too.

Homepage: http://www.danga.com/memcached/

Follow this guide to install memcached.

Here is a simple script to query and cache MySQL data for 10 seconds.

$start = microtime(true);

$mem = new Memcached();
$mem->addServer("127.0.0.1", 11211);

mysql_connect("localhost", "dbuser", "dbpass") or die(mysql_error());
mysql_select_db("databasename") or die(mysql_error());

$query = "SELECT COUNT(*) AS TotalUsers FROM databasename WHERE 1";
$querykey = "TotalUsers" . md5($query);

$result = $mem->get($querykey);

if ($result) {

    print "Data was: " . $result[0] . "<br />";
    print "Caching success! Retrieved data from memcached!<br />";

} else {

    $result = mysql_fetch_array(mysql_query($query)) or die(mysql_error());
    $mem->set($querykey, $result, 10);
    print "Data was: " . $result[0] . "<br />";
    print "Data not found in memcached. Data retrieved from MySQL and stored in memcached for next time.<br />";
}

$end = microtime(true);
$time = number_format(($end - $start), 6);

echo 'This page loaded in ', $time, ' seconds';




For me, the cached page loaded 0.022076 seconds faster per page than without memcached. A small MySQL database is fast, but a cached database is faster.
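
Note that the mysql_* functions used in the script above were removed in PHP 7, so on a PHP 7 server the same idea looks like this with mysqli (a minimal sketch using the same placeholder credentials and query):

$start = microtime(true);

// Connect to memcached and MySQL (mysqli replaces the mysql_* extension removed in PHP 7).
$mem = new Memcached();
$mem->addServer('127.0.0.1', 11211);

$db = new mysqli('localhost', 'dbuser', 'dbpass', 'databasename');
if ($db->connect_error) {
    die('MySQL connection failed: ' . $db->connect_error);
}

$query    = 'SELECT COUNT(*) AS TotalUsers FROM databasename WHERE 1';
$querykey = 'TotalUsers' . md5($query);

$row = $mem->get($querykey);

if ($row !== false) {
    echo 'Data was: ' . $row['TotalUsers'] . ' (retrieved from memcached)';
} else {
    $res = $db->query($query) or die('Query failed: ' . $db->error);
    $row = $res->fetch_assoc();
    $mem->set($querykey, $row, 10);    // cache the row for 10 seconds
    echo 'Data was: ' . $row['TotalUsers'] . ' (retrieved from MySQL and stored in memcached)';
}

$end = microtime(true);
echo ' - this page loaded in ' . number_format($end - $start, 6) . ' seconds';
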
V1.2 added memcached link

Donate and make this blog better




Ask a question or recommend an article
[contact-form-7 id=”30″ title=”Ask a Question”]

Filed Under: Development, Linux, MySQL Tagged With: caching, memory, php
