
I thought my website was hacked. Here is how I hardened my Linux servers security with Lynis Enterprise

October 24, 2020 by Simon

Disclaimer

I have waited a year before posting this, and I have tried my best to hide the bank’s identity as I never got a good explanation back from them about why they were whitelisting my website.

Background

I was casually reading Twitter one evening and found references to an awesome service (https://publicwww.com/) that allows you to find string references in CSS, JS, CSP etc files on websites.

Search engine that searches the web for the source code of the sites, not the content of them: https://t.co/G7oYQZ4Cbp

— @mikko (@mikko) March 8, 2018

https://t.co/DUyxFD4QbV is one of my new favorite search tools. Finally I can search for html/css/js and see which websites are using it. Really powerful when you think of the right searches…

— Allan Thraen (@athraen) April 26, 2019

See how people are using the publicwww service on Twitter.

I searched https://publicwww.com/ for “https://fearby.com“. I was expecting to see only resources that were loading from my site.

I was shocked to see a bank in Asia was whitelisting my website and my website’s CDN (hosted via ewww.io) in its Content Security Policy.

Screenshot of publicwww.com scan of "fearby.com"

I was not hosting content for a bank, so why were they whitelisting my site?

Were they hacked? Was I hacked and delivering malware to their customers? Setting up a Content Security Policy (CSP) is not a trivial thing to do and I would suggest you check out https://report-uri.com/products/content_security_policy (by Scott Helme) for more information on setting up a good Content Security Policy (CSP).

Were we both hacked or was I serving malicious content?

Hacked Koala meme

I have written a few blog posts on creating Content Security Policies; maybe they copied my starter Content Security Policy and added it to their site?

I do have a lot of blog readers from their country.

Analytics map of Asia

I went to https://www.securityheaders.com and scanned their site, and yes, they had whitelisted my website and CDN. This was being sent in a header from their server to every connecting client.

I quickly double-checked the bank’s Content Security Policy (CSP) with https://cspvalidator.org/ and it too confirmed the bank was telling their customers that my website was OK to load files from.

I would not be worried if a florist’s website had whitelisted my website, but this was a bank with 250 physical branches and 2,500 employees in a country of 29 million people.

Below is the bank’s Content Security Policy.

https://cspvalidator.org/ screenshot of the bank's CSP

I thought I had been hacked, so I downloaded my Nginx log files (with MobaXterm) and scanned them for hits to my site from their website.

Screenshot of a year's Nginx logs.

After I scanned the logs I could see that I had zero traffic from their website.

I sent a direct message to Scott Helme on Twitter (CSP Guru) and he replied with advice on the CSP.

Blocking Traffic

As a precaution, I edited my /etc/nginx/sites-available/default file and added this to block all traffic from their site.

if ($http_referer ~* "##########\.com") {
        return 404;
}

I tested and reloaded my Nginx config and restarted my web server

nginx -t
nginx -s reload
/etc/init.d/nginx restart

I also emailed my website CDN’s admin at https://ewww.io/ and asked them to block traffic from the bank as a precaution. They responded quickly, said this was done, and enabled extra logging in case more information was needed.

If you need a good and fast WordPress Content Delivery Network (CDN) check out https://ewww.io/. They are awesome. Read my old review of ewww.io here.

I contacted the Bank

I searched the bank’s website for a way to contact them. Their website was slow, their contact page was limited, and their chat feature required logging in with Facebook (I don’t use Facebook).

I viewed their contact us web page and they had no dedicated security contacts listed. The CIO was contactable by phone only.

They did not have a security.txt file on their website.

http://www.bankdomain.com/.well-known/security.txt file not found

TIP: If you run a website, please consider creating a security.txt file, information here.
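
A minimal security.txt looks something like this (the contact address and expiry date below are placeholders, not real values):

# Served from https://yourdomain.com/.well-known/security.txt
Contact: mailto:security@yourdomain.com
Expires: 2025-12-31T23:59:59.000Z
Preferred-Languages: en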

I then viewed their contact us page and emailed everyone I could.

I asked if they could:

  • Check their logs for malicious files loaded from my site
  • Remove the references to my website and CDN from their CSP
  • Review their CI/CD logs to see why this happened

My Server Hardening (to date)

My website was already hardened but was my site compromised?

Hardening actions to date:

  • Using a VPS firewall and a Linux firewall (two software firewalls)
  • I have used the free Lynis scan
  • Whitelisting access to port 22 to limited IPs
  • Using hardware 2FA keys on SSH and WordPress logins
  • Using the WordFence Security Plugin
  • Locked down unwanted ports
  • I had a strong HTTPS certificate and website configuration (test here)
  • I have set up appropriate security headers (test here). I did need to re-set up a Content Security Policy (keep reading)
  • Performed many actions (some blogged a while ago) here: https://fearby.com/article/securing-ubuntu-cloud/
  • etc

I had used the free version of Lynis before, but now was the time to try Lynis Enterprise.

A free version of Lynis can be installed from Github here: https://github.com/CISOfy/lynis/

What is Lynis Enterprise?

Lynis Enterprise is a commercial auditing, system hardening, and compliance testing tool for AIX, FreeBSD, HP-UX, Linux, macOS, NetBSD, OpenBSD, Solaris, etc. The Enterprise version is paid (with a web portal) and has more features than the free version.

Snip from here: “Lynis is a battle-tested security tool for systems running Linux, macOS, or Unix-based operating system. It performs an extensive health scan of your systems to support system hardening and compliance testing. The project is open-source software with the GPL license and available since 2007.”

Visit the Lynis Enterprise site here: https://cisofy.com/solutions/#lynis-enterprise.

I created a Lynis Enterprise Trial

I have used the free version of Lynis in the past (read here), but the Enterprise version offers a lot of extra features (read here).

Screenshot of https://cisofy.com/lynis-enterprise/why-upgrade/

View the main Lynis Enterprise site here and the pricing page here

View a tour of features here: https://cisofy.com/lynis-enterprise/

Create a Cisofy Trial Account

You can request a trial of Lynis Enterprise here: https://cisofy.com/demo/

Request a Lynis Enterprise trial screenshot

After the trial account was set up I logged in here. Upon login, I was prompted to add a system to my account (also my licence key was visible)

Lynis portal  main screen

Install Lynis (Clone GIT Repo/latest features)

I am given 3 options to install Lynis from the add system page here.

  1. Add the software repository and install the client (The suggested and easiest way to install Lynis and keep it up-to-date).
  2. Clone the repository from Github (The latest development version, containing the most recent changes)
  3. Manually install or activate an already installed Lynis.

I will clone a fresh install from GitHub as I prefer to see the latest issues and changes via GitHub notifications (I like getting notifications about security).

I logged into my server via SSH and ran the following command(s).

sudo apt-get install git
mkdir /thefolder
cd /thefolder
git clone https://github.com/CISOfy/lynis

Cloning into 'lynis'...
remote: Enumerating objects: 7, done.
remote: Counting objects: 100% (7/7), done.
remote: Compressing objects: 100% (7/7), done.
remote: Total 10054 (delta 0), reused 1 (delta 0), pack-reused 10047
Receiving objects: 100% (10054/10054), 4.91 MiB | 26.60 MiB/s, done.
Resolving deltas: 100% (7387/7387), done.

I logged into https://portal.cisofy.com/ and clicked ‘Add system‘ to find my licence key

I noted my licence key.

I then changed to my Lynis folder

cd lynis

I then created a “custom.prf” file

touch custom.prf

I ran this command to activate my licence (I have replaced my licence with ########’s).

View the documentation here.

./lynis configure settings license-key=########-####-####-####-############:upload-server=portal.cisofy.com

Output:

Configuring setting 'license-key'
Setting changed
Configuring setting 'upload-server'
Setting changed

I performed my first scan and uploaded the report.

TIP: Make sure you have curl installed

./lynis audit system --upload

After the scan is complete, make sure you see the following.

Data upload status (portal.cisofy.com) [ OK ]

I logged into https://portal.cisofy.com/enterprise/systems/ and I could view my systems report.

You can read the basic Lynis documentation here: https://cisofy.com/documentation/lynis/

Manual Lynis Scans

I can run a manual scan at any time

cd /thefolder/lynis/
sudo ./lynis audit system --upload

To view the results I can log in to https://portal.cisofy.com/

Automated Lynis Scans

I have created a bash script that updates Lynis (basically running ‘sudo /usr/bin/git pull origin master’ in the lynis folder)

#!/bin/bash

sendemail -f [email protected] -t [email protected] -u "CRON: Updating Lynis (yourserver.com) START" -m "/folder/runlynis.sh" -s smtp.gmail.com:587 -o tls=yes -xu [email protected] -xp ***my*google*gsuite*email*app*password***

echo "Changing Directory to /folder/lynis"
cd /folder/lynis

echo "Updating Lynis"
sudo /usr/bin/git pull origin master

sendemail -f [email protected] -t [email protected] -u "CRON: Updated Lynis (yourserver.com) END" -m "/folder/runlynis.sh" -s smtp.gmail.com:587 -o tls=yes -xu [email protected] -xp ***my*google*gsuite*email*app*password***

This is my bash script that runs Lynis scans and emails the report

#!/bin/bash

sendemail -f [email protected] -t [email protected] -u "CRON: Run Lynis (yourserver.com) START" -m "/folder/runlynis.sh" -s smtp.gmail.com:587 -o tls=yes -xu [email protected] -xp ***my*google*gsuite*email*app*password***

echo "Running Lynis Scan"
cd /utils/lynis/
sudo /utils/lynis/lynis audit system --upload > /folder/lynis/lynis.txt

sendemail -f [email protected] -t [email protected] -u "CRON: Run Lynis (yourserver.com) END" -m "/folder/runlynis.sh" -s smtp.gmail.com:587 -o tls=yes -xu [email protected] -xp ***my*google*gsuite*email*app*password***  -a /folder/lynis/lynis.txt

I set up two cron jobs to update Lynis (from Git) and to scan with Lynis every day.

#Lynis Update 9:55PM
55 21 * * * /bin/bash /folder/runlynis.sh && curl -fsS --retry 3 https://hc-ping.com/########-####-####-####-############ > /dev/null

#Lynis Scan 2AM
0 2 * * * /bin/bash /folder/runlynis.sh && curl -fsS --retry 3 https://hc-ping.com/########-####-####-####-############ > /dev/null

Thanks to sendemail I get daily emails.

I have set up cron job monitoring and emails at the start and end of the bash scripts.

The attachment is not a pretty report, but at least I can see the output of the scan (without logging into the portal).

Maybe I will also attach the following log file

/var/log/lynis.log
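
If I do, it is just a matter of adding another -a attachment to the end-of-scan sendemail call (a sketch based on the script above):

sendemail -f [email protected] -t [email protected] -u "CRON: Run Lynis (yourserver.com) END" -m "/folder/runlynis.sh" -s smtp.gmail.com:587 -o tls=yes -xu [email protected] -xp ***my*google*gsuite*email*app*password*** -a /folder/lynis/lynis.txt -a /var/log/lynis.log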

Lynis Enterprise (portal.cisofy.com)

Best of all Lynis Enterprise comes with a great online dashboard available at
https://portal.cisofy.com/enterprise/dashboard/.

Lynis Enterprise Portal

Dashboard (portal.cisofy.com)

Clicking the ‘Dashboard‘ button in the toolbar at the top of the portal reveals a summary of your added systems, alerts, compliance, system integrity, Events and statistics.

Dashboard button

The dashboard has three levels

  • Business (less information)
  • Operational
  • Technical (more information)

Read about the differences here.

three dashboard breadcrumbs

Each dashboard has a limited number of elements, but the technical dashboard has all the elements.

Technical Dashboard

Lynis Enterprise Dashboard https://portal.cisofy.com/enterprise/dashboard/

From here you can click and open server scan results (see below)

Server Details

If you click on a server name you can see detailed information. I created 2 test servers (I am using the awesome UpCloud host)

A second menu appears when you click on a server

Lynis Menu

Test Server 01: Ubuntu 18.04 default Scan Results (66/100)

Ubuntu Server Score 66/100

Test Server 02: Debian 9.9 default Scan Results (65/100)

Server

It is interesting to see Debian is 1 point below Ubuntu.

The server page will give a basic summary and highlights like the current and previous hardening score, open ports, firewall status, installed packages, users.

When I click the server name to load the report, I can click to see ‘Warnings’ or ‘Suggestions’ to resolve.

Suggested System Hardening Actions

I had 47 system hardening recommendations on one system

Lynis identified quick wins.

Some of the suggested security hardening actions included the following:

  • Audit daemon is enabled with an empty ruleset. Disable the daemon or define rules
  • Incorrect permissions for file /root/.ssh
  • A reboot of the system is most likely needed
  • Found some information disclosure in SMTP banner (OS or software name)
  • Configure maximum password age in /etc/login.defs
  • Default umask in /etc/login.defs could be more strict like 027
  • Add a legal banner to /etc/issue.net, to warn unauthorized users
  • Check available certificates for expiration
  • To decrease the impact of a full /home file system, place /home on a separate partition
  • Install a file integrity tool to monitor changes to critical and sensitive files
  • Check iptables rules to see which rules are currently not used
  • Harden compilers like restricting access to root user only
  • Disable the ‘VRFY’ command
  • Add the IP name and FQDN to /etc/hosts for proper name resolving
  • Purge old/removed packages (59 found) with aptitude purge or dpkg --purge command. This will clean up old configuration files, cron jobs and startup scripts.
  • Remove any unneeded kernel packages
  • Determine if automation tools are present for system management
  • etc

Hardening Suggestion (Ignore or Solve)

If you click ‘Solve‘ Cisofy will provide a link to detailed information to help you solve issues.

Suggested fix: ACCT-9630 Audit daemon is enabled with an empty ruleset. Disable the daemon or define rules

I will not list every suggested problem and fix but here are some fixes below.

ACCT-9630 Audit daemon is enabled with an empty ruleset. Disable the daemon or define rules (fixed)

TIP: If you don’t have auditd installed, run the commands below to install and start it

sudo apt-get install auditd
/etc/init.d/auditd start
/etc/init.d/auditd status

I added the following to ‘/etc/audit/rules.d/audit.rules‘ (thanks to the solution recommendations on the Cisofy portal).

# This is an example configuration suitable for most systems
# Before running with this configuration:
# - Remove or comment items which are not applicable
# - Check paths of binaries and files

###################
# Remove any existing rules
###################

-D

###################
# Buffer Size
###################
# Might need to be increased, depending on the load of your system.
-b 8192

###################
# Failure Mode
###################
# 0=Silent
# 1=printk, print failure message
# 2=panic, halt system
-f 1

###################
# Audit the audit logs.
###################
-w /var/log/audit/ -k auditlog

###################
## Auditd configuration
###################
## Modifications to audit configuration that occur while the audit collection functions are operating (check your paths)
-w /etc/audit/ -p wa -k auditconfig
-w /etc/libaudit.conf -p wa -k auditconfig
-w /etc/audisp/ -p wa -k audispconfig

###################
# Monitor for use of audit management tools
###################
# Check your paths
-w /sbin/auditctl -p x -k audittools
-w /sbin/auditd -p x -k audittools

###################
# Special files
###################
-a exit,always -F arch=b32 -S mknod -S mknodat -k specialfiles
-a exit,always -F arch=b64 -S mknod -S mknodat -k specialfiles

###################
# Mount operations
###################
-a exit,always -F arch=b32 -S mount -S umount -S umount2 -k mount
-a exit,always -F arch=b64 -S mount -S umount2 -k mount

###################
# Changes to the time
###################
-a exit,always -F arch=b32 -S adjtimex -S settimeofday -S stime -S clock_settime -k time
-a exit,always -F arch=b64 -S adjtimex -S settimeofday -S clock_settime -k time
-w /etc/localtime -p wa -k localtime

###################
# Use of stunnel
###################
-w /usr/sbin/stunnel -p x -k stunnel

###################
# Schedule jobs
###################
-w /etc/cron.allow -p wa -k cron
-w /etc/cron.deny -p wa -k cron
-w /etc/cron.d/ -p wa -k cron
-w /etc/cron.daily/ -p wa -k cron
-w /etc/cron.hourly/ -p wa -k cron
-w /etc/cron.monthly/ -p wa -k cron
-w /etc/cron.weekly/ -p wa -k cron
-w /etc/crontab -p wa -k cron
-w /var/spool/cron/crontabs/ -k cron

## user, group, password databases
-w /etc/group -p wa -k etcgroup
-w /etc/passwd -p wa -k etcpasswd
-w /etc/gshadow -k etcgroup
-w /etc/shadow -k etcpasswd
-w /etc/security/opasswd -k opasswd

###################
# Monitor usage of passwd command
###################
-w /usr/bin/passwd -p x -k passwd_modification

###################
# Monitor user/group tools
###################
-w /usr/sbin/groupadd -p x -k group_modification
-w /usr/sbin/groupmod -p x -k group_modification
-w /usr/sbin/addgroup -p x -k group_modification
-w /usr/sbin/useradd -p x -k user_modification
-w /usr/sbin/usermod -p x -k user_modification
-w /usr/sbin/adduser -p x -k user_modification

###################
# Login configuration and stored info
###################
-w /etc/login.defs -p wa -k login
-w /etc/securetty -p wa -k login
-w /var/log/faillog -p wa -k login
-w /var/log/lastlog -p wa -k login
-w /var/log/tallylog -p wa -k login

###################
# Network configuration
###################
-w /etc/hosts -p wa -k hosts
-w /etc/network/ -p wa -k network

###################
## system startup scripts
###################
-w /etc/inittab -p wa -k init
-w /etc/init.d/ -p wa -k init
-w /etc/init/ -p wa -k init

###################
# Library search paths
###################
-w /etc/ld.so.conf -p wa -k libpath

###################
# Kernel parameters and modules
###################
-w /etc/sysctl.conf -p wa -k sysctl
-w /etc/modprobe.conf -p wa -k modprobe
###################

###################
# PAM configuration
###################
-w /etc/pam.d/ -p wa -k pam
-w /etc/security/limits.conf -p wa -k pam
-w /etc/security/pam_env.conf -p wa -k pam
-w /etc/security/namespace.conf -p wa -k pam
-w /etc/security/namespace.init -p wa -k pam

###################
# Puppet (SSL)
###################
#-w /etc/puppet/ssl -p wa -k puppet_ssl

###################
# Postfix configuration
###################
#-w /etc/aliases -p wa -k mail
#-w /etc/postfix/ -p wa -k mail
###################

###################
# SSH configuration
###################
-w /etc/ssh/sshd_config -k sshd

###################
# Hostname
###################
-a exit,always -F arch=b32 -S sethostname -k hostname
-a exit,always -F arch=b64 -S sethostname -k hostname

###################
# Changes to issue
###################
-w /etc/issue -p wa -k etcissue
-w /etc/issue.net -p wa -k etcissue

###################
# Log all commands executed by root
###################
-a exit,always -F arch=b64 -F euid=0 -S execve -k rootcmd
-a exit,always -F arch=b32 -F euid=0 -S execve -k rootcmd

###################
## Capture all failures to access on critical elements
###################
-a exit,always -F arch=b64 -S open -F dir=/etc -F success=0 -k unauthedfileacess
-a exit,always -F arch=b64 -S open -F dir=/bin -F success=0 -k unauthedfileacess
-a exit,always -F arch=b64 -S open -F dir=/home -F success=0 -k unauthedfileacess
-a exit,always -F arch=b64 -S open -F dir=/sbin -F success=0 -k unauthedfileacess
-a exit,always -F arch=b64 -S open -F dir=/srv -F success=0 -k unauthedfileacess
-a exit,always -F arch=b64 -S open -F dir=/usr/bin -F success=0 -k unauthedfileacess
-a exit,always -F arch=b64 -S open -F dir=/usr/local/bin -F success=0 -k unauthedfileacess
-a exit,always -F arch=b64 -S open -F dir=/usr/sbin -F success=0 -k unauthedfileacess
-a exit,always -F arch=b64 -S open -F dir=/var -F success=0 -k unauthedfileacess

###################
## su/sudo
###################
-w /bin/su -p x -k priv_esc
-w /usr/bin/sudo -p x -k priv_esc
-w /etc/sudoers -p rw -k priv_esc

###################
# Poweroff/reboot tools
###################
-w /sbin/halt -p x -k power
-w /sbin/poweroff -p x -k power
-w /sbin/reboot -p x -k power
-w /sbin/shutdown -p x -k power

###################
# Make the configuration immutable
###################
-e 2

# EOF

I reloaded my audit daemon config

auditctl -R /etc/audit/rules.d/audit.rules

Further configuration can be added (read this), you can read the auditd man page here, or to read the logs you can use the ‘ausearch‘ tool (read the Ubuntu man page here).

Here is a great guide on viewing audit events.

Because we have this rule (‘-w /etc/passwd -p wa -k etcpasswd‘) to monitor the password file, if I read the contents of /etc/passwd it will show up in the audit logs.

We can verify access to this file by running this command

ausearch -f /etc/passwd

Output

ausearch -f /etc/passwd
----
time->Mon Jun 10 16:58:13 2019
type=PROCTITLE msg=audit(##########.897:3639): proctitle=##########################
type=PATH msg=audit(##########.897:3639): item=1 name="/etc/passwd" inode=1303 dev=fc:01 mode=0100644 ouid=0 ogid=0 rdev=00:00 nametype=NORMAL cap_fp=0000000000000000 cap_fi=0000000000000000 cap_fe=0 cap_fver=0
type=PATH msg=audit(##########.897:3639): item=0 name="/etc/" inode=12 dev=fc:01 mode=040755 ouid=0 ogid=0 rdev=00:00 nametype=PARENT cap_fp=0000000000000000 cap_fi=0000000000000000 cap_fe=0 cap_fver=0
type=CWD msg=audit(##########.897:3639): cwd="/root"
type=SYSCALL msg=audit(##########.897:3639): arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=556241ea9650 a2=441 a3=1b6 items=2 ppid=1571 pid=1572 auid=0 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=pts0 ses=446 comm="nano" exe="/bin/nano" key="etcpasswd"

I might write a list of handy ausearch commands and blog about this in the future.

SSH Permissions (fixed)

To fix the SSH permissions warning, I first ran this command to show the issue on my server

./lynis show details FILE-7524
2019-05-25 23:00:04 Performing test ID FILE-7524 (Perform file permissions check)
2019-05-25 23:00:04 Test: Checking file permissions
2019-05-25 23:00:04 Using profile /utils/lynis/default.prf for baseline.
2019-05-25 23:00:04 Checking /etc/lilo.conf
2019-05-25 23:00:04   Expected permissions:
2019-05-25 23:00:04   Actual permissions:
2019-05-25 23:00:04   Result: FILE_NOT_FOUND
2019-05-25 23:00:04 Checking /root/.ssh
2019-05-25 23:00:04   Expected permissions: rwx------
2019-05-25 23:00:04   Actual permissions: rwxr-xr-x
2019-05-25 23:00:04   Result: BAD
2019-05-25 23:00:04 Warning: Incorrect permissions for file /root/.ssh [test:FILE-7524] [details:-] [solution:-]
2019-05-25 23:00:04 Using profile /utils/lynis/custom.prf for baseline.
2019-05-25 23:00:04 Checking permissions of /utils/lynis/include/tests_homedirs
2019-05-25 23:00:04 File permissions are OK
2019-05-25 23:00:04 ===---------------------------------------------------------------===

I tightened permissions on the /root/.ssh folder with this command

chmod 700 /root/.ssh

Configure minimum/maximum password age in /etc/login.defs (fixed)

I set a maximum and minimum password age in ‘/etc/login.defs‘

Defaults

PASS_MAX_DAYS   99999
PASS_MIN_DAYS   0
PASS_WARN_AGE   7
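
I then set stricter values. Something like this would do (the exact numbers are my own choice, pick values that suit your policy):

PASS_MAX_DAYS   90
PASS_MIN_DAYS   1
PASS_WARN_AGE   7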

Add a legal banner to /etc/issue, to warn unauthorized users (fixed)

I edited ‘/etc/issue’ on Ubuntu and Debian

Ubuntu 18.04 default

Ubuntu 18.04.2 LTS \n \l

Debian Default

Debian GNU/Linux 9 \n \l

Cisofy said this “Define a banner text to inform both authorized and unauthorized users about the machine and service they are about to access. The purpose is to share your policy before an access attempt is being made. Users should know that there privacy might be invaded, due to monitoring of the system and its resources, to protect the integrity of the system. Also unauthorized users should be deterred from trying to access it in the first place.“

Done

Default umask in /etc/login.defs could be more strict like 027 (fixed)

Related files..

  • /etc/profile
  • /etc/login.defs
  • /etc/passwd

I edited ‘/etc/login.defs’ and set

UMASK           027

I also set a stricter default umask for login shells by adding this line to ‘/etc/profile’

umask 027

Check iptables rules to see which rules are currently not used (fixed)

I ran the following command to review my firewall settings

iptables --list --numeric --verbose

TIP: Scan for open ports with ‘nmap’

Watch this handy video if you are not sure how to use nmap

Install nmap

sudo apt-get install nmap

I set my firewall rules with ufw (guide here); ufw is a front end for iptables.
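
Because ufw manages the iptables rules for me, reviewing (and pruning) rules is easiest with numbered ufw rules. A quick sketch (the rule number is just an example):

# List firewall rules with their numbers
sudo ufw status numbered

# Delete a rule that is no longer needed (e.g. rule 3)
sudo ufw delete 3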

Scan for open ports with nmap

nmap -v -sT localhost

Starting Nmap 7.60 ( https://nmap.org ) at 2019-06-12 22:09 AEST
Initiating Connect Scan at 22:09
Scanning localhost (127.0.0.1) [1000 ports]
Discovered open port 443/tcp on 127.0.0.1
Discovered open port 22/tcp on 127.0.0.1
Discovered open port 8080/tcp on 127.0.0.1
Discovered open port 25/tcp on 127.0.0.1
Discovered open port 80/tcp on 127.0.0.1
Completed Connect Scan at 22:09, 0.02s elapsed (1000 total ports)
Nmap scan report for localhost (127.0.0.1)
Host is up (0.00012s latency).
Not shown: 994 closed ports
PORT     STATE SERVICE
22/tcp   open  ssh
25/tcp   open  smtp
80/tcp   open  http
443/tcp  open  https
8080/tcp open  http-proxy

Everything looked good.

Harden compilers like restricting access to root user only (fixed)

Cisofy said

Compilers turn source code into binary executable code. For a production system a compiler is usually not needed, unless package upgrades are performed by means of their source code (like FreeBSD ports collection). If a compiler is found, execution should be limited to authorized users only (e.g. root user).

To solve this finding, remove any unneeded compiler or change the file permissions. Usually chmod 700 or chmod 750 will be enough to prevent normal users from using a compiler. Related compilers are as, cc, ld, gcc, go etc. To determine what files are affected, check the Lynis log file, then chmod these files.

I ran

chmod 700 /usr/bin/as
chmod 700 /usr/bin/gcc

Turn off PHP information exposure (fixed)

Cisofy said

Disable the display of version information by setting the expose_php option to 'Off' in php.ini. As several instances of PHP might be installed, ensure that all related php.ini files have this setting turned off, otherwise this control will show up again.

This was already turned off, but an unused php.ini may have been detected.

I searched for all php.ini files

find / -name php.ini

Output

/etc/php/7.3/apache2/php.ini
/etc/php/7.3/fpm/php.ini
/etc/php/7.3/cli/php.ini

Yes, the CLI version of php.ini had the following

expose_php = On

I set this to Off.
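
A quick way to flip that setting in the CLI php.ini and verify it (the path matches the PHP 7.3 install found above):

sudo sed -i 's/^expose_php = On/expose_php = Off/' /etc/php/7.3/cli/php.ini

# Verify the CLI setting
php -i | grep expose_php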

Purge old/removed packages (59 found) with aptitude purge or dpkg --purge command. This will clean up old configuration files, cron jobs and startup scripts. (fixed)

Cisofy said

While not directly a security concern, unpurged packages are not installed but still have remains left on the system (e.g. configuration files). In case software is reinstalled, an old configuration might be applied. Proper cleanups are therefore advised.

To remove the unneeded packages, select the ones marked with the 'rc' status. This means the package is removed, but the configuration files are still there.

I ran the following recommended command

dpkg -l | grep "^rc" | cut -d " " -f 3 | xargs dpkg --purge

Done

Install debsums utility for the verification of packages with known good database (fixed)

Cisofy said

Install the debsums utility to do more in-depth auditing of your packages.

I ran the following suggested command

apt-get install debsums

I googled and found this handy page

I scanned packages and asked ‘debsums‘ to only show errors with this command

sudo debsums -s

The only error was..

debsums: missing file /usr/bin/pip (from python-pip package)

I did not need pip so I removed it

apt-get remove --purge python-pip

Install a PAM module for password strength testing like pam_cracklib or pam_passwdqc (fixed)

I ignored this as I do not allow logins via password and only I have an account (it’s not a multi-user system).

I whitelist logins to specific IPs.

I only allow SSH access with a private key and long passphrase.

I have 2FA OTP enabled for logins.

I have Cloudflare over my domain.

I set up fail2ban to auto-block logins using this guide.

Reboot (fixed)

I restarted the server

shutdown -r now

Done

Check available certificates for expiration (fixed)

I tested my SSL certificate with https://dev.ssllabs.com

https://dev.ssllabs.com/ scan of my site

Add legal banner to /etc/issue.net, to warn unauthorized users (fixed)

Cisofy said…

Define a banner text to inform both authorized and unauthorized users about the machine and service they are about to access. The purpose is to share your policy before an access attempt is being made. Users should know that there privacy might be invaded, due to monitoring of the system and its resources, to protect the integrity of the system. Also unauthorized users should be deterred from trying to access it in the first place.

Do not reveal sensitive information, like the specific goal of the machine, or what can be found on it. Consult with your legal department, to determine appropriate text.

I edited the file ‘/etc/issue.net’ and added a default pre login message (same as ‘/etc/issue’).

Install Apache mod_evasive to guard webserver against DoS/brute force attempts (ignored)

I ignored this message as I don’t use Apache (I use the Nginx web server). I have blocked Apache from being installed.

I clicked Ignore in the Cisofy portal.

Ignore Button

Install Apache modsecurity to guard webserver against web application attacks (ignored)

I clicked Ignore for this one too

Ignore Button

Check your Nginx access log for proper functioning (reviewed)

Cisofy said…

Disabled logging:
Check in the Lynis log for entries which are disabled, or in the nginx configuration (access_log off).

Missing logging:
Check for missing log files. They are references in the configuration of nginx, but not on disk. The Lynis log will reveal to what specific files this applies.

I checked my Nginx config (‘/etc/nginx/nginx.conf‘) for all log references and ensured the logs were writing to disk (OK).

I checked my ‘/etc/nginx/sites-available/default‘ config and I did have 2 instances of ‘access_log off‘ (these were added during setup for two reporting subfolders for the Nixstats agent).

I restarted Nginx

nginx -t
nginx -s reload
/etc/init.d/nginx restart

Check what deleted files are still in use and why. (fixed)

Cisofy said..

Why it matters
Deleted files may sometimes be in use by applications. Normally this should not happen, as an application should delete a file and release the file handle. This test might discover malicious software, trying to hide its presence on the system. Investigate the related files by determining which application keeps it open and the related reason.

Details
The following details have been found as part of the scan.

/lib/systemd/systemd-logind(systemd-l)
/tmp/ib1ekCtf(mysqld)
/tmp/ibhuK1At(mysqld)
/tmp/ibmTO5F5(mysqld)
/tmp/ibR0dkxD(mysqld)
/tmp/ibvf69KH(mysqld)
/tmp/.ZendSem.gq3mnz(php-fpm7.)
/usr/bin/python3.6(networkd-)
/usr/bin/python3.6(unattende)
/var/log/mysql/error.log.1(mysqld)

I ran the following command to show deleted files in use

lsof | grep deleted

I noticed on my database server a php-fpm service was using files. I don’t have a webserver enabled on this server, so I uninstalled the web-based services.

I have separate web and database servers.

sudo apt-get remove apache*
sudo apt-get remove -y --purge nginx*
sudo apt-get remove -y --purge php7*
sudo apt autoremove

Check DNS configuration for the dns domain name (fixed)

Cisofy said..

Some software can work incorrectly when the system can't resolve itself. 
Add the IP name and fully qualified domain name (FQDN) to /etc/hosts. Usually this is done with an entry of 127.0.0.1, or 127.0.1.1 (to leave the localhost entry alone). 

I edited my ‘/etc/hosts’ file.

I added the domain name to the end of the localhost entry and added a new line with my server’s IP and domain name.
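
As a rough sketch, the resulting ‘/etc/hosts’ entries look something like this (the hostname and IP below are placeholders):

127.0.0.1      localhost yourserver.yourdomain.com
203.0.113.10   yourserver.yourdomain.com yourserver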

Disable the ‘VRFY’ command (fixed)

I was advised to run this command

postconf -e disable_vrfy_command=yes

(Debian) Enable sysstat to collect accounting (no results) (fixed)

Cisofy said..

The sysstat is collection of utilities to provide system information insights. While one should aim for the least amount of packages, the sysstat utilities can be a good addition to help recording system details. They can provide insights for performance monitoring, or guide in discovering unexpected events (like a spam run). If you already use extensive system monitoring, you can safely ignore this control.

I ran the suggested commands

apt-get install sysstat
sed -i 's/^ENABLED="false"/ENABLED="true"/' /etc/default/sysstat

More info on sysstat here.
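
Once sysstat is enabled you can query it with the standard sysstat tools, for example:

# CPU usage, 3 samples 1 second apart
sar -u 1 3

# Memory usage, 3 samples 1 second apart
sar -r 1 3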

Consider running ARP monitoring software (arpwatch,arpon) (fixed)

Cisofy said

Networks are very dynamic, often with devices come and go as they please. For sensitive machines and network zones, you might want to know what happens on the network itself. An utility like arpwatch can help tracking changes, like new devices showing up, or others leaving the network.

I read this page to set up and configure arpwatch

sudo apt-get install arpwatch
/etc/init.d/arpwatch start

I will add more on how to use arpwatch soon

Disable drivers like USB storage when not used, to prevent unauthorized storage or data theft (fixed)

Cisofy said..

Disable drivers like USB storage when not used. This helps preventing unauthorized storage, data copies, or data theft.

I ran the suggested fix

echo "# Block USB storage" >> /etc/modprobe.d/disable-usb-storage.conf
echo "install usb-storage /bin/false" >> /etc/modprobe.d/disable-usb-storage.conf

Determine if automation tools are present for system management (ignored)

I ignored this one

Ignore Button

One or more sysctl values differ from the scan profile and could be tweaked

Cisofy said..

By means of sysctl values we can adjust kernel related parameters. Many of them are related to hardening of the network stack, how the kernel deals with processes or files. This control is a generic test with several sysctl variables (configured by the scan profile).

I was advised to adjust these settings

  • net.ipv4.conf.all.send_redirects=0
  • net.ipv4.conf.default.accept_source_route=0
  • kernel.sysrq=0
  • net.ipv4.conf.all.log_martians=1
  • net.ipv4.conf.default.log_martians=1
  • kernel.core_uses_pid=1
  • kernel.kptr_restrict=2
  • fs.suid_dumpable=0
  • kernel.dmesg_restrict=1

I edited ‘/etc/sysctl.conf‘ and made the advised changes above (I Googled each item first).
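
For reference, this is what the additions to ‘/etc/sysctl.conf‘ look like (the values are exactly the ones Lynis suggested above), followed by reloading the settings:

net.ipv4.conf.all.send_redirects = 0
net.ipv4.conf.default.accept_source_route = 0
kernel.sysrq = 0
net.ipv4.conf.all.log_martians = 1
net.ipv4.conf.default.log_martians = 1
kernel.core_uses_pid = 1
kernel.kptr_restrict = 2
fs.suid_dumpable = 0
kernel.dmesg_restrict = 1

sudo sysctl -p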

Install a file integrity tool to monitor changes to critical and sensitive files (fixed)

Cisofy said..

To monitor for unauthorized changes, a file integrity tool can help with the detection of such event. Each time the contents or the properties of a file change, it will have a different checksum. With regular checks of the related integrity database, discovering changes becomes easy. Install a tool like AIDE, Samhain or Tripwire to monitor important system and data files. Additionally configure the tool to alert system or security personnel on events.

It also gave a solution

# Step 1: Install package with appropriate command
apt-get install aide
yum install aide

# Step 2: Initialise database
aide --init
# If this fails: try aideinit

# Step 3: Copy newly created database (/var/lib/aide)
cp /var/lib/aide/aide.db.new.gz /var/lib/aide/aide.db.gz

# Step 4:
aide --check

I installed ‘aide’ (read the guide here).

TIP: Long story, but the steps above were not exactly correct. Thanks to this post I was able to set up aide without seeing this error:

Couldn't open file /var/lib/aide/please-dont-call-aide-without-parameters/aide.db.new for writing

This is how I installed aide

apt-get install aide
apt-get install aide-common

I initialised aide.

aideinit

This was the important part (I was stuck for hours on this one)

aide.wrapper --check

I can run the check again at any time to see which files have changed.

I could see that many files had changed since the initial scan (e.g. MySQL data files, log files, nano search history).

Nice

Now let’s schedule daily checks with a cron job.

cat /folder/runaide.sh
#!/bin/bash

sendemail -f [email protected] -t [email protected] -u "CRON: AIDE Run (yourserver.com) START" -m "/folder/runaide.sh" -s smtp.gmail.com:587 -o tls=yes -xu [email protected] -xp ***my*google*gsuite*email*app*password***

MYDATE=`date +%Y-%m-%d`
MYFILENAME="Aide-"$MYDATE.txt
/bin/echo "Aide check !! `date`" > /tmp/$MYFILENAME
/usr/bin/aide.wrapper --check > /tmp/myAide.txt
/bin/cat /tmp/myAide.txt|/bin/grep -v failed >> /tmp/$MYFILENAME
/bin/echo "**************************************" >> /tmp/$MYFILENAME
/usr/bin/tail -100 /tmp/myAide.txt >> /tmp/$MYFILENAME
/bin/echo "****************DONE******************" >> /tmp/$MYFILENAME

#/usr/bin/mail -s"$MYFILENAME `date`" [email protected] < /tmp/$MYFILENAME

sendemail -f [email protected] -t [email protected] -u "CRON: AIDE Run (yourserver.com) END" -m "/folder/runaide.sh" -s smtp.gmail.com:587 -o tls=yes -xu [email protected] -xp ***my*google*gsuite*email*app*password*** -a /tmp/$MYFILENAME -a /tmp/myAide.txt

The script above is thanks to this post.

I set up a cron job to run this daily

#Run AIDE
0 6 * * * /folder/runaide.sh && curl -fsS --retry 3 https://hc-ping.com/######-####-####-####-############> /dev/null

ACCT-9622 – Enable process accounting. (fixed)

Solution:

Install “acct” process and login accounting.

sudo apt-get install acct

Start the “acct” service

/etc/init.d/acct start
touch /var/log/pacct
chown root /var/log/pacct
chmod 0644 /var/log/pacct
accton /var/log/pacct 

Check the status

/etc/init.d/acct status
* acct.service - LSB: process and login accounting
   Loaded: loaded (/etc/init.d/acct; generated)
   Active: active (exited) since Sun 2019-05-26 19:42:15 AEST; 4min 42s ago
     Docs: man:systemd-sysv-generator(8)
    Tasks: 0 (limit: 4660)
   CGroup: /system.slice/acct.service

May 26 19:42:15 servername systemd[1]: Starting LSB: process and login accounting...
May 26 19:42:15 servername acct[27419]: Turning on process accounting, file set to '/var/log/account/pacct'.
May 26 19:42:15 servername systemd[1]: Started LSB: process and login accounting.
May 26 19:42:15 servername acct[27419]:  * Done.

Run CISOfy recommended commands

touch /var/log/pacct
chown root /var/log/pacct
chmod 0644 /var/log/pacct
accton /var/log/pacct 
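
With process accounting running, the acct package’s standard tools can be used to review activity, for example:

# Summarise commands executed on the system
sa

# Show recent commands executed by a user
lastcomm root

# Show total login time per day
ac -d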

Manual Scan of Lynis

I re-ran an audit of the system (and uploaded the report to the portal) so I could see how I was progressing.

./lynis audit system --upload

I then checked the error status and the warnings were resolved.

Progress?

I rechecked my servers and all warnings were solved; now I just need to work on the information-level issues.

Warning level errors fixed, and informational items to go

Cisofy Portal Overview

Quick breakdown of the Cisofy Portal

Overview Tab (portal.cisofy.com)

The Overview tab displays any messages, the change log, API information, an ‘add a new system‘ link, settings, etc.

Lynis Overview tab

Dashboard Tab (portal.cisofy.com)

The dashboard tab displays compliant systems, any outdated systems, alerts and events.

Lynis Dashboard screenshot https://portal.cisofy.com/enterprise/dashboard/

TIP: If you have a system that reports “Outdated” run the following command.

./lynis audit system --upload

Systems Tab (portal.cisofy.com)

The systems tab shows all systems, OS version, warnings, information counts, the date the system’s client last uploaded a report and the client version.

Systems tab shows all systems, OS version, warnings, information counts, date client last uploaded a report update and client version

If you are making many changes and manual Lynis scans, keep an eye on your upload credits. You can see from the images above and below that I have lowered the number of suggested actions to harden my servers (red text).

Lynis scans reached

Clicking a host name reveals a summary of the system.

Clicking a system reveals a summary of the system.

Remaining information level issues are listed.

I can click Solve and see more information about the issue to resolve.

TIP: I thought it would be a good idea to copy this list to a spreadsheet for detailed tracking.

Spreadsheet listing issues to complete and done

I had another issue appear a few days later.

Compliance Tab (portal.cisofy.com)

A lot of information is listed here.

Compliance Tab

Best practice guides are available

Best practice guides at https://portal.cisofy.com/compliance/

I could go on and on, but https://cisofy.com/ is awesome.

TIP: Manually updating Lynis

From the command line I can view the Lynis version with this command

./lynis --version
2.7.4

To update the Lynis git repository from the Lynis folder run this command

git pull
Already up to date.

Automatically updating and running Lynis scans

I added the following commands to my crontab to update then scan and report Lynis results to the portal.

TIP: Use https://crontab.guru/ to choose the right time to run commands (I chose 5 minutes past 1 AM every day to update and 5 minutes past 2 AM to run a scan).


#Lynis Update
5 1 * * * root -s /bin/bash -c 'cd /utils/lynis && /usr/bin/git pull origin master'

#Lynis Scan
5 2 * * * root -s /bin/bash -c '/utils/lynis/lynis audit system --upload'

Troubleshooting

FYI: Lynis log file location: /var/log/lynis.log

Cisofy Enterprise Conclusion

Pros:

  • I can learn so much about securing Linux just from the Cisofy Fix recommendations.
  • I have secured my server beyond what I thought possible.
  • Very active development on Github: https://github.com/CISOfy/lynis/
  • Cisofy has a very good interface and updates often.
  • New security issues are synced down and included in new scans (if you update)

Cons:

  • I am unable to pay for this for my servers here in Australia (European legal issues).
  • Needs Hardware 2FA

Tips

Make sure you have curl installed to allow reports to upload. Without curl I had this error on Debian 9.4:

Fatal: can’t find curl binary. Please install the related package or put the binary in the PATH. Quitting..

View the latest repository version information here.

I added my Lynis folder to the Linux $PATH variable

export PATH=$PATH:/folder/lynis
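
To make that PATH change survive new logins, it can be appended to ~/.profile (just a convenience; the folder matches the export above):

echo 'export PATH=$PATH:/folder/lynis' >> ~/.profile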

Lynis Enterprise API

View the Lynis Enterprise API documentation here

Lynis Enterprise Support

Support can be found here, or email support at [email protected].

Getting started guide is found here.

Bonus: Setting Up Content Security Policy and reporting violations to https://report-uri.com/

I have a few older posts on Content Security Policies (CSP) but they are a bit dated.

  • 2016 – Beyond SSL with Content Security Policy, Public Key Pinning etc
  • 2018 – Set up Feature-Policy, Referrer-Policy and Content Security Policy headers in Nginx

Wikipedia Definition of a Content Security Policy

Content Security Policy (CSP) is a computer security standard introduced to prevent cross-site scripting (XSS), clickjacking and other code injection attacks resulting from execution of malicious content in the trusted web page context.[1] It is a Candidate Recommendation of the W3C working group on Web Application Security,[2] widely supported by modern web browsers.[3] CSP provides a standard method for website owners to declare approved origins of content that browsers should be allowed to load on that website—covered types are JavaScript, CSS, HTML frames, web workers, fonts, images, embeddable objects such as Java applets, ActiveX, audio and video files, and other HTML5 features.

If you want to learn how to set up CSPs, head over to https://report-uri.com/products/content_security_policy or https://report-uri.com/home/tools and read more.

I did have Content Security Policies (CSP) set up a few years back, but I had issues with broken resources. A lack of time on my part to investigate the issues forced me to disable the Content Security Policy (CSP). I should have changed the “Content-Security-Policy” header to “Content-Security-Policy-Report-Only.”

I will re-add the Content Security Policy (CSP) to my site but this time I will not disable it and will report to https://report-uri.com/, and if need be I will change the header from “content-security-policy” to “content-security-policy-report-only”. That way a broken policy won’t take down my site in future.

If you want to set up a Content Security Policy header and with good reporting of any violations of your CSP policy simply head over to https://report-uri.com/ and create a new account.

Read the official Report URI help documents here: https://docs.report-uri.com/.

Create a Content Security Policy

The hardest part of creating a Content Security Policy is knowing what to add where.

You could generate your own Content Security Policy by heading here (https://report-uri.com/home/generate) but that will take a while.

Create a CSP

TIP: Don’t make your policy live straight away by using the “Content-Security-Policy” header, instead use the “Content-Security-Policy-Report-Only” header.

To create a Content Security Policy faster, I recommend using this Firefox plugin to generate a starter policy.

Screenshot of https://addons.mozilla.org/en-US/firefox/addon/laboratory-by-mozilla/

Install this plugin in Firefox, enable it, click the plugin’s icon and ensure “Record this site…” is ticked.

Laboratory plugin in Firefox

Then simply browse to your site (browse as many pages as possible) and a Content Security Policy will be generated based on the content on the page(s) loaded.

TIP: Always review the generated CSP to ensure it allows everything needed to display your site.

Export the CSP from the Firefox plugin to the clipboard

This is the policy that was generated for me in 5 minutes browsing 20 pages.

default-src 'none'; connect-src 'self' https://onesignal.com/api/v1/apps/772f27ad-0d58-494f-9f06-e89f72fd650b/icon https://onesignal.com/api/v1/notifications https://onesignal.com/api/v1/players/67a2f360-687f-4513-83e8-f477da085b26 https://onesignal.com/api/v1/players/67a2f360-687f-4513-83e8-f477da085b26/on_session https://yoast.com/feed/widget/; font-src 'self' data: https://fearby-com.exactdn.com https://fonts.gstatic.com; form-action 'self' https://fearby.com https://syndication.twitter.com https://www.paypal.com; frame-src 'self' https://en-au.wordpress.org https://fearby.com https://googleads.g.doubleclick.net https://onesignal.com https://platform.twitter.com https://syndication.twitter.com https://www.youtube.com; img-src 'self' data: https://a.impactradius-go.com https://abs.twimg.com https://fearby-com.exactdn.com https://healthchecks.io https://pagead2.googlesyndication.com https://pbs.twimg.com https://platform.twitter.com https://secure.gravatar.com https://syndication.twitter.com https://ton.twimg.com https://www.paypalobjects.com; script-src 'self' 'unsafe-eval' 'unsafe-inline' https://adservice.google.com.au/adsid/integrator.js https://adservice.google.com/adsid/integrator.js https://cdn.onesignal.com/sdks/OneSignalPageSDKES6.js https://cdn.onesignal.com/sdks/OneSignalSDK.js https://cdn.syndication.twimg.com/tweets.json https://fearby-com.exactdn.com/wp-content/cache/fvm/1553589606/out/footer-45a3439e.min.js https://fearby-com.exactdn.com/wp-content/cache/fvm/1553589606/out/footer-e6604f67.min.js https://fearby-com.exactdn.com/wp-content/cache/fvm/1553589606/out/footer-f4213fd6.min.js https://fearby-com.exactdn.com/wp-content/cache/fvm/1553589606/out/header-1583146a.min.js https://fearby-com.exactdn.com/wp-content/cache/fvm/1553589606/out/header-823c0a0e.min.js https://fearby-com.exactdn.com/wp-content/piwik.js https://onesignal.com/api/v1/sync/772f27ad-0d58-494f-9f06-e89f72fd650b/web https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js https://pagead2.googlesyndication.com/pagead/js/r20190610/r20190131/show_ads_impl.js https://pagead2.googlesyndication.com/pub-config/r20160913/ca-pub-9241521190070921.js https://platform.twitter.com/js/moment~timeline~tweet.a20574004ea824b1c047f200045ffa1e.js https://platform.twitter.com/js/tweet.73b7ab8a56ad3263cad8d36ba66467fc.js https://platform.twitter.com/widgets.js https://s.ytimg.com/yts/jsbin/www-widgetapi-vfll-F3yY/www-widgetapi.js https://www.googletagservices.com/activeview/js/current/osd.js https://www.youtube.com/iframe_api; style-src 'self' 'unsafe-inline' https://fonts.googleapis.com/ https://onesignal.com/sdks/ https://platform.twitter.com/css/ https://ton.twimg.com/tfw/css/; worker-src 'self' 

I can truncate the starter Content Security Policy and remove some elements. Where there are duplicate entries for separate files on the same remote server, I can replace them with a wildcard (if I trust the server).

I truncated the policy with the help of the sublime text editor and Report URI CSP Generator.

I added this to the file ‘/etc/nginx/sites-available/default’ (inside the server block).

add_header "Content-Security-Policy-Report-Only" "default-src 'self' https://fearby.com/; script-src 'self' 'unsafe-inline' 'unsafe-eval' https://adservice.google.com.au https://adservice.google.com https://cdn.onesignal.com https://cdn.syndication.twimg.com https://fearby-com.exactdn.com https://onesignal.com https://pagead2.googlesyndication.com https://platform.twitter.com https://s.ytimg.com https://www.googletagservices.com https://www.youtube.com; style-src 'self' 'unsafe-inline' https://fonts.googleapis.com https://onesignal.com https://platform.twitter.com https://ton.twimg.com; img-src 'self' data: https://a.impactradius-go.com https://abs.twimg.com https://fearby-com.exactdn.com https://healthchecks.io https://pagead2.googlesyndication.com https://pbs.twimg.com https://platform.twitter.com https://secure.gravatar.com https://syndication.twitter.com https://ton.twimg.com https://www.paypalobjects.com; font-src 'self' data: https://fearby-com.exactdn.com https://fonts.gstatic.com; connect-src 'self' https://onesignal.com https://yoast.com; object-src https://fearby.com/; frame-src 'self' https://en-au.wordpress.org https://fearby.com https://googleads.g.doubleclick.net https://onesignal.com https://platform.twitter.com https://syndication.twitter.com https://www.youtube.com; worker-src 'self'; form-action 'self' https://fearby.com https://syndication.twitter.com https://www.paypal.com; report-uri https://fearby.report-uri.com/r/d/csp/reportOnly";

Any issues with the Content Security policy will be reported to my web browsers development console and to https://report-uri.com/.

My Chrome development console reports an issue with a graphic not loading from Namecheap.

Namecheap icon not loading

The event was also reported to the Report URI server.

Screenshot of reports at https://report-uri.com/account/reports/csp/

Don’t forget to check the reports often. When you have no more issues left you can make the Policy live by renaming the “Content-Security-Policy-Report-Only” header to “Content-Security-Policy”.

FYI: I had directive reports of ‘script-src-elem’ and it looks like this is a new directive added in Chrome 75.

Don’t forget to visit the Report URI setup page and get a URL for where live reports get sent to.

Screenshot of https://report-uri.com/account/setup/

If you go to the Generate CSP page and import your website’s policy you can quickly add new exclusions to your policy

After a few months of testing and tweaking the policy, I can make it live (‘Content-Security-Policy’).

Lynis Enterprise

I have learned so much by using Lynis Enterprise from https://cisofy.com/

I am subscribed to issues notifications at https://github.com/CISOfy/lynis/issues/ and observe about 20 notifications a day in this GitHub community. Maybe one day I will contribute to this project?

Finally, Did the Bank reply?

Yes, but it was not very informative.

Dear Simon,

Thank you very much  for the information and we have completely removed the reference that you have raised concern.
We are extremely sorry and apology for the inconvenience caused due to this mistake.

We are thankful for the information and support you have extended.

I tried to inquire how this happened and each time the answer was vague.

Thank you for your support. This was mistakenly used during the testing and we have warned the vendor as well.
I like to request you to close the ticket for this as we have already removed this.

We like to assure such things won’t happen in future.

It looks like the bank used my blog post to create their CSP.

Oh well, at least I have secured my servers.

Thanks for reading.

 

 

Version:

v1.1 – Changed the URL, Removed Ads and added a Lynis Enterprise Conclusion

v1.01 – Fixed the URL

v1.0 – Initial Version

Filed Under: 2nd Factor, CDN, Content Security Policy, Cron, Database, Debian, NGINX, One Signal, PHP, Security, Ubuntu, Vulnerabilities, Vulnerability, Weakness, Website Tagged With: Bank, Cisofy, Content Security Policy, Hacked, Linus

Goodbye Dropbox, One Drive, iCloud and Hello Nextcloud private cloud on UpCloud

June 14, 2020 by Simon

I recently came across NextCloud Hub server (free on self-hosted servers) and I wanted to set up my own private cloud server to store my own files.

I want to be able to access my files on Windows, Mac, Android and iOS.

Most of all I want a place in the cloud (that I own) where I can upload my Acronis backup of my C drive, as the Backblaze client (read my review of Backblaze here) is a bit slow at uploading a 150GB backup file to the USA.

To create my own Nextcloud server I will need to login to these services.

  • I logged into my Domain Name provider porkbun.com (to ensure I had a domain name)
  • I logged into Cloudflare.com (to manage my DNS for a subdomain (redirected from PorkBun)).
  • I logged into my UpCloud.com account. (to deploy a new virtual machine)

Fyi: If you don’t have a favourite virtual machine provider you can use my referral link to obtain $25 free credit (only if you are new to UpCloud). Every new user who signs up with my referral link will receive a $25 bonus to get started. That’s 5 months free server (1 CPU and 1GB memory Linux server) 

Post Index

  1. NextCloud System Requirements
  2. Creating a new Virtual Machine at UpCloud
  3. Updating Ubuntu
  4. Installing Common Software Packages
  5. Securing SSH with the Google Authentication PAM module
  6. Installing a Firewall
  7. Installing NGINX and DNS
  8. Installing PHP/PHP-FPM
  9. Installing MySQL
  10. Nixstats
  11. CronTab Updates
  12. Misc Security Stuff

1. NextCloud System Requirements

I checked the NextCloud version 18  system requirements and it needs the following to deploy.

  • Ubuntu 18.04 LTS (recommended)
  • MySQL 5.7+ or MariaDB 10.2+ (recommended)
  • Nginx with php-fpm
  • PHP 7.3 or 7.4 (recommended)

Minimum Memory Requirements 

Nextcloud needs a minimum of 128MB RAM, and they recommend a minimum of 512MB.

I can deploy a server with at least 512MB memory free. The minimum UpCloud server I can deploy comes with 1GB of memory for $5 a month.

Time to create a new server.

2. Creating a new Virtual Machine at UpCloud

I logged into UpCloud and clicked “Deploy Server“

Deploy Server Button

I selected Singapore as the place to deploy my new server (as it was closest to me here in Australia). UpCloud does not have servers here in Australia yet.

I chose to deploy a server in Singapore

I checked https://wondernetwork.com/pings/ to ensure Singapore is the fastest location near me. My server https://fearby.com is located in Chicago as it’s closer to my average readers and search engines.

Ping Speeds

I would like my NextCloud server to be as fast as possible for me. Singapore is the fastest UpCloud datacenter near me.

I selected a server with 1 CPU Core, 1GB of Memory, 25GB of storage and 1TB of network traffic. I will add a 500GB drive to this server for additional storage.

If the server needs more resources I will upgrade it later.

Server tiers: $5 a month to $640 a month

The only downside of a $5/m server is the 1TB network quota. If I overuse the network (downloads) I will get an extra charge. 

I reached out to support to verify the costs if I go over my quota.

Long (from UpCloud support) answered my questions.

My question to UpCloud chat support.

Q1) With a $5/m server with a 1TB quota, what are the overage costs if I go over 1TB?
Q2) Is 1TB quota up and down or just down?

Prompt Answer

Hi Simon,

Good to speak to you again.

A1) Only Simple plans include monthly allowance of outgoing network traffic. After the allowance, the cost is $0.01/GB. It was a lot higher, but we reduced it to make it more competitive.

A2) The quota is for outgoing network traffic from your servers, all incoming and private traffic between your UpCloud servers is free of charge.
Regards,
~Long Lam

I hope this is helpful, let us know if you have any further questions. 

Based on this information, if I use all of my 1TB monthly quota and transfer an extra 150GB (e.g. a 150GB Acronis backup image) it will cost $1.50 extra (150GB × $0.01/GB). That’s not bad.

UpCloud Chat Support

Before I selected a server type (Simple or Flexible) or storage type  (MAX IOPS or HDD) I jumped onto the UpCloud chat and asked a few questions.

Q1) Hello, When deploying a server is there a cost difference between MAX IOPS and HDD storage? I am looking at a 500GB drive

A1) Storage (MaxIOPS), per GB $0.00031/ hourly $0.22/ monthly, Storage (HDD), per GB $0.000078/hourly  $0.06 / monthly 

Q2) What’s the difference between Simple and Flexible?

A2) Flexible will/turn out more expensive depending on your use case, generally, it is more suited for short term deployments.

> With our flexible plans, you decide yourself how much CPU, memory and block storage your cloud server is allocated. This gives you incredible flexibility and allows you to fully customise your cloud server according to your specific needs.
 
>Do also note when flexible plans are shutdown we only charge you for allocated storages and IPv4. Whereas in simple plans, it will be charged fully even when shutdown.
 
> Our simple plans are billed by the hour, up to a limit of 672 hours per month. Should you decide to use your fewer hours, you will only be billed for the hours you actually used.
Question 1 to UpCloud chat

UpCloud has very responsive and helpful chat staff.  I never had this level of help with Vultr, Digital Ocean or AWS.

Question 2 to UpCloud

After I chatted with UpCloud support I decided to deploy a simple (Ubuntu 18.04) Server with 1 CPU Core, 1TB network traffic, 1GB of memory, 25GB system drive and an extra 500GB storage device.

When you create a server you can add an extra storage device. Nice.

Add a new device to the main storage device.

When adding an extra storage device you can choose faster MaxIOPS storage or slower HDD based storage. 

I will choose HDD storage as it will be cheaper for a 500GB device.

Second storage MaxIOPS or HDD storage

I created a 500GB storage device for a Nextcloud data drive.

You can create up to 2TB storage devices with UpCloud.

Name of the second storage device

I selected Ubuntu 18.04 LTS as the operating system.

I chose Ubuntu as the operating system

I configured a login method as “Only SSH Keys” as I have already added my SSH key with a passphrase.

Login method SSH Keys only

I selected my SSH key.

If you have not previously added an SSH Key to UpCloud then click Add new. Read more here.

I selected an Initialisation script I previously created (that just outputs a “Hello World” to a text file). One day I will create an Ansible or Terraform script to set up a server.

Select SSH Key and choose an init script

I clicked Deploy

Fyi: If you don’t have a favourite virtual machine provider you can use my referral link to obtain $25 free credit (only if you are new to UpCloud). Every new user who signs up with my referral link will receive a $25 bonus to get started. That’s 5 months free server (1 CPU and 1GB memory Linux server).

I entered my desired hostname 

Deploying a server at UpCloud

I had a notification that the UpCloud server was being deployed.

Deploy Underway

I could see in my UpCloud dashboard that the server was being deployed.

List of all my servers at UpCloud

Server deploy is underway

Wow, that took a whole minute to deploy a 525GB server.

Deploy log said it took 1 minute to deploy

Wow UpCloud are fast

Configuring the server with Putty

Now it is time to connect to the Ubuntu server’s CLI and configure it. I grabbed the IP address that was listed at UpCloud.

I opened Putty  and added the IP address for the server.

New Putty connection

Under the Auth section in Putty I added the path to my SSH private key (the same one that I configured in the new server).

Putty add ppk file

I saved the connection and clicked Open. I clicked Yes to the SSH fingerprint when I verified it was correct.

SSH Connect Verify

I now had root access to my new server.

Default login

Time to update Ubuntu.

3. Updating Ubuntu

I ran this command to update Ubuntu.

sudo apt-get update && sudo apt-get upgrade

Confirming the 2x storage disks

I ran this command to verify I had the 2 storage devices I selected at server deploy.

sudo lsblk |grep disk
vda    252:0    0   25G  0 disk
vdb    252:16   0  500G  0 disk

Yes, I have a 25GB disk and a 500GB disk

4. Installing Common Software Packages

I installed these packages

sudo apt-get install htop
sudo apt-get install lshw
sudo apt-get install ufw
sudo apt-get install ncdu
sudo apt-get install nmap
sudo apt-get install iozone3
sudo apt install pydf
sudo apt install mc
sudo apt install nnn

5. Securing SSH with the Google Authentication PAM module

Before I carry on any further I need to enable hardware 2FA login protections to all SSH logins. I will follow the guide I created here (Setup two factor authenticator protection at login on Ubuntu or Debian).

Warning: Take a backup of your server first. If you set this up wrong say bye-bye to your server. If I lose my YubiCo YubiKey and forget my backup codes I will have a hard time getting back in.

I will force all SSH logins to require my Hardware YubiCo YubiKey to be inserted (to generate a temporary One Time Password (OTP)).

You don’t need a YubiCo YubiKey, a generic software authentication app is OK but I prefer hardware devices as they are more secure.

YubiKey In USB Port Photo

I set the timezone to Australia/Sydney. TOTP codes are generated from the system clock, so if the server’s clock is wrong (or badly out of sync with my local PC) the one-time codes will not match and I will not be able to log in to my server.

I ran this command to set the time in Ubuntu.

sudo dpkg-reconfigure tzdata

I then checked the time

sudo hwclock --show
2020-05-31 23:17:02.873751+1000
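Fyi: if you prefer a non-interactive way to set the timezone, this also works on Ubuntu (a sketch using systemd’s timedatectl):

sudo timedatectl set-timezone Australia/Sydney
timedatectl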

I installed the Google Authentication PAM Module (read more)

sudo apt install libpam-google-authenticator

I ran this command to configure the Google PAM Module

google-authenticator

I was presented with these questions

Do you want authentication tokens to be time-based (y/n) y

I was presented with a secret key, verification code and backup codes (I saved these somewhere safe)

Do you want me to update your “/root/.google_authenticator” file? (y/n) y

Do you want to disallow multiple uses of the same authentication
token? This restricts you to one login about every 30s, but it increases
your chances to notice or even prevent man-in-the-middle attacks (y/n) y

By default, a new token is generated every 30 seconds by the mobile app.
In order to compensate for possible time-skew between the client and the server, we allow an extra token before and after the current time. This allows for a time skew of up to 30 seconds between authentication server and client. If you experience problems with poor time synchronization, you can increase the window from its default size of 3 permitted codes (one previous code, the current code, the next code) to 17 permitted codes (the 8 previous codes, the current
code, and the 8 next codes). This will permit for a time skew of up to 4 minutes between client and server.

Do you want to do so? (y/n) y

If the computer that you are logging into isn’t hardened against brute-force
login attempts, you can enable rate-limiting for the authentication module.
By default, this limits attackers to no more than 3 login attempts every 30s.

Do you want to enable rate-limiting? (y/n) y

I can review all config values later with this command

sudo nano ~/.google_authenticator

Now I will enable 2FA at login by editing this file

sudo nano /etc/pam.d/sshd

I searched for “@include common-auth” then added this line after it.

auth required pam_google_authenticator.so

I then commented out the following line (this is the most important step, this forces 2FA)

#@include common-auth

Picture of my /etc/pam.d/sshd changes

pam changes

I saved the file /etc/pam.d/sshd 

Now I can enable the PAM Module by editing this file

sudo nano /etc/ssh/sshd_config

I searched for

ChallengeResponseAuthentication

And change the value to “yes”

I ensured the following line exists

UsePAM yes

I added this line then saved /etc/ssh/sshd_config

AuthenticationMethods publickey,password publickey,keyboard-interactive

Now I edited /etc/pam.d/common-auth

sudo nano /etc/pam.d/common-auth

I added the following line before the line that says “auth [success=1 default=ignore] pam_unix.so nullok_secure”

auth required pam_google_authenticator.so

Now I can restart the SSH service and test the 2FA login.

/etc/init.d/ssh restart
[ ok ] Restarting ssh (via systemctl): ssh.service.

I restarted my putty session and reconnected to my server and I was prompted for the password for my private key and the randomly generated one-time password that was linked to my YubiCo YubiKey. Nice

Now I need to restrict my SSH port to select IPs.

6. Installing a Firewall

I installed the UFW firewall by typing this command

sudo apt-get install ufw

I configured UFW to rate limit SSH logins by typing this command

sudo ufw limit ssh comment 'Rate limit hit for openssh server'
Rules updated
Rules updated (v6)

I configured some common ports

sudo ufw allow ssh/tcp
sudo ufw logging on
sudo ufw allow http
sudo ufw allow https
sudo ufw allow 22
sudo ufw allow 53
sudo ufw allow 80
sudo ufw allow 443
sudo ufw allow 873

I added Cloudflare firewall rules (as my domain is behind their firewall and I will remove all direct IP access to my server later)

sudo ufw allow from 173.245.48.0/20
sudo ufw allow from 103.21.244.0/22
sudo ufw allow from 103.22.200.0/22
sudo ufw allow from 103.31.4.0/22
sudo ufw allow from 141.101.64.0/18
sudo ufw allow from 108.162.192.0/18
sudo ufw allow from 190.93.240.0/20
sudo ufw allow from 188.114.96.0/20
sudo ufw allow from 197.234.240.0/22
sudo ufw allow from 198.41.128.0/17
sudo ufw allow from 162.158.0.0/15
sudo ufw allow from 104.16.0.0/12
sudo ufw allow from 172.64.0.0/13
sudo ufw allow from 2400:cb00::/32
sudo ufw allow from 2405:8100::/32
sudo ufw allow from 2405:b500::/32
sudo ufw allow from 2606:4700::/32
sudo ufw allow from 2803:f800::/32
sudo ufw allow from 2c0f:f248::/32
sudo ufw allow from 2a06:98c0::/29

I added appropriate whitelisted IPs that can connect to port 22 (SSH), removed the blanket port 22 access rule and configured my firewall with 91 incoming and outgoing rules (the exact rules are a secret). An example rule is shown below.
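A whitelist rule looks something like this (a sketch with an example IP; my real rules are not published):

sudo ufw allow from 203.0.113.50 to any port 22 proto tcp comment 'Whitelisted SSH'
sudo ufw delete allow 22
sudo ufw status numbered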

I reloaded and enabled the firewall.

sudo ufw reload
sudo ufw disable
sudo ufw enable

7. Installing NGINX and DNS

I updated Ubuntu again

sudo apt-get update && sudo apt-get upgrade

I installed Nginx

sudo apt-get install nginx

I edited my NGINX config and changed the default www folder location.

I also configured the log file location, mime types, max body size, gzip, default ports, ssl cert paths, security headers, default page, server name, sensitive file block rules, dns server, cache headers etc.

Read more here on how to configure Nginx.

Fyi: Nginx config file locations

sudo nano /etc/nginx/nginx.conf
sudo nano /etc/nginx/sites-available/default
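A cut-down sketch of the kind of server block I ended up with (the domain, paths and values here are examples only, not my real config):

server {
    listen 443 ssl http2;
    server_name example.fearby.com;
    root /web-root;                              # custom www folder location
    index index.php index.html;
    client_max_body_size 2048M;                  # allow large uploads
    ssl_certificate     /etc/nginx/https-cert/cert.pem;
    ssl_certificate_key /etc/nginx/https-cert/key.pem;
    add_header X-Content-Type-Options "nosniff";
    add_header X-Frame-Options "SAMEORIGIN";
    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;
}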

I typed my server’s IP address into a web browser

Nginx installed

I created an index.html file in the www folder and added “Hello World” to the file.

If I type my server’s IP address into a browser I can see this file.
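Something like this creates that test page (a one-liner sketch, assuming the web root is /web-root as used later in this post):

echo "Hello World" | sudo tee /web-root/index.html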

My DNS is with Cloudflare so I needed to point a subdomain at the new server. To obtain the IP addresses I logged into UpCloud, clicked my server, then clicked Network and noted my IPv4 and IPv6 addresses.

I then went to Cloudflare and added DNS records for IPv4 and IPv6 pointing to my server’s IPs. I enabled Cloudflare proxying to allow Cloudflare to try and hide the IP of the server. I then configured my firewall to block access to the IP except via Cloudflare and my whitelist.

I then checked for worldwide DNS propagation with https://www.whatsmydns.net/. After 3 minutes my DNS changes were all around the world. Thanks, Cloudflare.

I tried loading my site but Cloudflare said it was down.

Site won’t load.

I created a new HTTPS certificate at Cloudflare just to be sure and added it to my site.

Generated new SSL cert

After investigating further I found this was because my primary website has a “Strict-Transport-Security” header and I had enabled Full (Strict) SSL/TLS encryption. I changed this to Full at Cloudflare.

Cloudflare HTTPS section

My site was now working.

Site works

8. Installing PHP/PHP-FPM

To install PHP 7.4 I ran these commands to add the repository that provides the latest version of PHP

sudo apt-get update
sudo apt -y install software-properties-common
sudo add-apt-repository ppa:ondrej/php
sudo apt-get update

I installed PHP 7.4 with this command

sudo apt -y install php7.4

I checked that PHP is installed by running 

php -v
PHP 7.4.6 (cli) (built: May 14 2020 10:02:44) ( NTS )
Copyright (c) The PHP Group
Zend Engine v3.4.0, Copyright (c) Zend Technologies
    with Zend OPcache v7.4.6, Copyright (c), by Zend Technologies

I installed some PHP modules

sudo apt install php7.4-common php7.4-mysql php7.4-xml php7.4-xmlrpc php7.4-curl php7.4-gd php7.4-imagick php7.4-cli php7.4-dev php7.4-imap php7.4-mbstring php7.4-soap php7.4-zip php7.4-bcmath php7.4-tidy 

I noticed apache2 was installed (and it broke my Nginx) so I uninstalled it.

 sudo apt-get remove apache2

I also blocked apache from installing again

apt-mark hold apache2
apache2 set on hold.

I then put a hold on all apache* packages to make sure Apache stays blocked from installing

apt-mark hold apache*

apache2 was already set on hold.
apache2-bin set on hold.
apache2-utils set on hold.
apache2-data set on hold.
apache2-doc set on hold.
apache2-suexec-pristine set on hold.
apache2-suexec-custom set on hold.
apache2-dbg set on hold.
apache2-dev set on hold.
apache2-ssl-dev set on hold.
apachedex set on hold.
apacheds set on hold.
apachetop set on hold.

Now I will install PHP-FPM.

FPM is a FastCGI process manager for PHP.

sudo apt-get install php7.4-fpm

I checked the status of the PHP FPM service with

sudo service php7.4-fpm status

Output

php7.4-fpm.service - The PHP 7.4 FastCGI Process Manager
   Loaded: loaded (/lib/systemd/system/php7.4-fpm.service; enabled; vendor preset: enabled)
   Active: active (running) since Sat 2020-06-06 21:34:31 AEST; 1min 54s ago
     Docs: man:php-fpm7.4(8)
  Process: 7767 ExecStopPost=/usr/lib/php/php-fpm-socket-helper remove /run/php/php-fpm.sock /etc/php/7.4/fpm/pool.d/www.conf 74 (code=exited, status=0/SUCCESS)
  Process: 7772 ExecStartPost=/usr/lib/php/php-fpm-socket-helper install /run/php/php-fpm.sock /etc/php/7.4/fpm/pool.d/www.conf 74 (code=exited, status=0/SUCCESS)
 Main PID: 7769 (php-fpm7.4)
   Status: "Processes active: 0, idle: 2, Requests: 0, slow: 0, Traffic: 0req/sec"
    Tasks: 3 (limit: 1147)
   CGroup: /system.slice/php7.4-fpm.service
           |-7769 php-fpm: master process (/etc/php/7.4/fpm/php-fpm.conf)
           |-7770 php-fpm: pool www
           `-7771 php-fpm: pool www

I might add some PHP child workers if I add more CPUs to this server later; a sketch of the settings is below.
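If I do add CPUs, the child worker settings live in the PHP-FPM pool config; a sketch of the values I would tune (the numbers are examples only):

sudo nano /etc/php/7.4/fpm/pool.d/www.conf

pm = dynamic
pm.max_children = 10
pm.start_servers = 3
pm.min_spare_servers = 2
pm.max_spare_servers = 4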

I edited my php.ini

sudo nano /etc/php/7.4/fpm/php.ini

I made these changes to php.ini

file_uploads = On
allow_url_fopen = On
memory_limit = 512M
post_max_size = 50M
upload_max_filesize = 50M
cgi.fix_pathinfo = 0
max_execution_time = 360
date.timezone = Australia/Sydney

I read this page (Nginx Configuration) and edited my /etc/nginx/sites-enabled/default so Nginx passes PHP requests to PHP-FPM.
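The key part of that file is the PHP location block; a minimal sketch (assuming the default php7.4-fpm socket path):

location ~ \.php$ {
    include snippets/fastcgi-php.conf;
    fastcgi_pass unix:/run/php/php7.4-fpm.sock;
}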

I tested and reloaded the Nginx config and restarted NGINX and PHP

nginx -t
nginx -s reload

sudo systemctl restart nginx.service
sudo systemctl restart php7.4-fpm

sudo systemctl status nginx.service
sudo systemctl status php7.4-fpm

To test PHP FPM I created a php file in my website root and added the following text

<?php phpinfo( ); ?>

I loaded this file in a browser and I confirmed that PHP-FPM was installed.

The test was OK (I deleted this test file). I then deleted index.html and created an index.php file.

PHP-FPM test ok

9. Installing MySQL

To install MySQL I ran the following command

fyi: All usernames and database names are for example only.

sudo apt install mysql-server

I configured MySQL With this command

sudo mysql_secure_installation
Securing the MySQL server deployment.

Connecting to MySQL using a blank password.

...
Would you like to setup VALIDATE PASSWORD plugin?
y


There are three levels of password validation policy:
STRONG

Please set the password for root here.
New password:
**************************************************

Re-enter new password:
**************************************************

Estimated strength of the password: 100

Do you wish to continue with the password provided?
y


Remove anonymous users?
y

Disallow root login remotely?
y

Remove test database and access to it?
y

Reload privilege tables now?
y

Now to test MySQL I will log in to it

sudo mysql -u root -p
************************************************************

Now I ran the following to create a database for Nextcloud

mysql> CREATE DATABASE databasename CHARACTER SET utf8mb4 COLLATE utf8mb4_general_ci;
Query OK, 1 row affected (0.00 sec)

I verified the database was created

mysql> SHOW DATABASES;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| mysql              |
| databasename       |
| performance_schema |
| sys                |
+--------------------+
5 rows in set (0.00 sec)

I created a database user 

mysql> CREATE USER 'username'@'localhost' IDENTIFIED BY '************************************';
Query OK, 0 rows affected (0.00 sec)

I verified the user was created with this command

mysql> SELECT User,Host FROM mysql.user;
+------------------+-----------+
| User             | Host      |
+------------------+-----------+
| **************** | localhost |
| **************** | localhost |
| **************** | localhost |
| username         | localhost |
| **************** | localhost |
+------------------+-----------+
5 rows in set (0.00 sec)

I granted the user permissions on the database

mysql> GRANT ALL PRIVILEGES ON `databasename`.* TO 'username'@'localhost';
Query OK, 0 rows affected (0.00 sec)

I verified the permissions with this command

mysql> SHOW GRANTS FOR 'username'@'localhost';
+--------------------------------------------------------------------------+
| Grants for username@localhost                                           |
+--------------------------------------------------------------------------+
| GRANT USAGE ON *.* TO 'username'@'localhost'                       |
| GRANT ALL PRIVILEGES ON `databasename`.* TO 'username'@'localhost' |
+--------------------------------------------------------------------------+
2 rows in set (0.00 sec)

Finally I flushed permissions

mysql> FLUSH PRIVILEGES;

Now the database is ready for Nextcloud

10. Nixstats

If you do not know what Nixstats is, check out my post here: Monitor server performance with NixStats and receive alerts by SMS, Push, Email, Telegram etc

I logged into Nixstats and clicked Add Server. I ran the provided install command.

wget -q -N --no-check-certificate https://nixstats.com/nixstatsagent.sh && bash nixstatsagent.sh ################## ##########################

Todo: Configure Nixstats PHP-FPM and NGINX Reporting (work in progress). My firewall rules are too tight for this install.

Handy Links

  • Monitoring Nginx with Nixstats
  • https://help.nixstats.com/en/article/monitoring-php-fpm-1tlyur6/

11. CronTab Updates

I created an update.sh file that I can call from a crontab entry to update Ubuntu and other software every xx hours.

I added an entry to my crontab to run it; a sketch of both is below.
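A minimal sketch of what the script and crontab entry could look like (the schedule, script path and log path are examples only):

#!/bin/bash
# update.sh - update Ubuntu packages and clean up
apt-get update -y
apt-get upgrade -y
apt-get autoremove -y

Crontab entry (run the script every 12 hours and log the output):

0 */12 * * * /bin/bash /root/update.sh >> /var/log/update.log 2>&1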

12. Misc Security Stuff

I made sure my firewall only allowed traffic to my server from Cloudflare IPs and whitelisted IPs.

Cloudflare IPs can be found here.

https://www.cloudflare.com/ips-v4/
https://www.cloudflare.com/ips-v6/

At the time of writing the IP’s are 

173.245.48.0/20
103.21.244.0/22
103.22.200.0/22
103.31.4.0/22
141.101.64.0/18
108.162.192.0/18
190.93.240.0/20
188.114.96.0/20
197.234.240.0/22
198.41.128.0/17
162.158.0.0/15
104.16.0.0/12
172.64.0.0/13
131.0.72.0/22
2400:cb00::/32
2606:4700::/32
2803:f800::/32
2405:b500::/32
2405:8100::/32
2a06:98c0::/29
2c0f:f248::/32

I blocked access to my webserver (port 80 and 443) to anyone but Cloudflare.

I whitelisted DNS traffic to UpCloud only. Thanks, Long.

UpCloud support is awesome.

UpCloud Support

Installing NextCloud

Finally I can install Nextcloud. I navigated to https://nextcloud.com/install/ and clicked Download for Server.

Download Nextcloud

I will use the Web installer to Install Nextcloud.

Web Installer Tab

Nextcloud web installer instructions

Setup Instructions

Snip about the Nextcloud Installer from the download page

The Web Installer is the easiest way to install Nextcloud on a web space. It checks the dependencies, downloads Nextcloud from the official server, unpacks it with the right permissions and the right user account. Finally, you will be redirected to the Nextcloud installer.

1) Right-click here and save the file to your computer
2) Upload setup-nextcloud.php to your web space
3) Point your web browser to setup-nextcloud.php on your webspace
4) Follow the instructions and configure Nextcloud
5) Login to your newly created Nextcloud instance!

You can find further instructions in the Nextcloud Admin Manual.

Note that the installer uses the same Nextcloud version as available for the built in updater in Nextcloud. After a major release it can take up to a month before it becomes available through the web installer and the updater. This is done to spread the deployment of new major releases out over time.

I used WinSCP to upload setup-nextcloud.php to my Nginx web root folder

WinSCP uploading

I loaded the setup-nextcloud.php file from my web browser.

Loading setup-nextcloud.php

I entered “.” to install Nextcloud to the website root.

Install Next cloud to .

There is no way Nextcloud installed in 2 seconds, so I checked the disk usage in my website root.

sudo du -hs /web-root
313M

Nextcloud took about 10 seconds to download 313MB onto my UpCloud Server.

Fyi: I installed the SpeedTest CLI app and ran a benchmark; UpCloud Chicago can download at 937Mbps and UpCloud Singapore can download at 717Mbps.
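If you want to run the same benchmark this is roughly what I used (a sketch, assuming the speedtest-cli package from the Ubuntu repositories):

sudo apt install speedtest-cli
speedtest-cli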

Nextcloud is installed.

Now I need to enter the data root folder for Nextcloud. I installed lshw to be able to see my 500GB disk.

sudo apt-get install lshw

I ran the following to see my disks

sudo lshw -class disk -short
H/W path        Device     Class      Description
=================================================
**********      /dev/vda   disk       26GB Virtual I/O device
**********      /dev/vdb   disk       536GB Virtual I/O device

I formatted my disk

sudo mkfs.ext4 /dev/vdb

I created a new folder under /mnt to mount the disk on. The folder name is a made-up sample

sudo mkdir -p /mnt/foldername

I mounted the partition to the folder

sudo mount /dev/vdb /mnt/foldername
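To make the mount survive a reboot I would also add it to /etc/fstab (a sketch using the same made-up folder name):

echo '/dev/vdb /mnt/foldername ext4 defaults 0 2' | sudo tee -a /etc/fstab
sudo mount -a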

I made sure Nginx can access the folder

sudo chown -R www-data:www-data /mnt/foldername

I changed to the partition mount

cd /mnt/foldername

I created a test 490GB file

fallocate -l 490G test.file

I checked the file

ls -al
-rw-r--r-- 1 username username     526133493760 Jun  9 19:38 test.file

I deleted this test file and set this mount point as the data folder in the Nextcloud setup.

I added a new Nextcloud admin username and password, the mount folder for the Nextcloud data folder, the MySQL database user/password/database name and host, and clicked Finish Setup

Nextcloud details

Nextcloud was set up.

Misc Setup

I ran the /settings/admin/overview report to see if I needed to perform any final setup steps. I had a few missing PHP modules and a few optimisation tasks that needed resolving.

Links to resolve.

  • Path Fixes
  • PHP Memory Limit
  • PHP Server Tuning

Nextcloud External Security Scan

I loaded https://scan.nextcloud.com/ to perform an external security scan.

Security Scan

Scan Results

All good so far.

Adding Two-Factor Authentication (YubiKeys)

I noticed on the Nextcloud security settings page that I can set up a YubiKey as a passwordless login device.

Web AuthN device

This would allow me to insert my YubiKey to log in automatically

Auto login.

I added my YubiKey and gave it a name.

Name a YubiKey

The password-less login method is a bit insecure as anyone that has my YubiKey can access my site.

I think I will set up a Two-Factor Authentication/OTP login method and link that to my YubiKey.  I visited the /settings/apps/security page and installed the Two-Factor TOTP Provider app.

Install the TOTP app

I clicked the checkbox next to TOTP

Enable TOTP

The app generated a QR code that my Yubico Authenticator app can use to link to Nextcloud.

I scanned the QR code and entered the 6-digit verification code from my Yubico Authenticator app to verify it.

Scan the QR Code

Two Factor logins are now enabled.

2FA Enabled

Now after I log in I have to enter a temporary 6-digit number that is only valid for 30 seconds (and only after inserting my YubiCo YubiKey into my PC and entering its password).

2FA enabled at login

Nice

Nextcloud Overview

I logged into Nextcloud and was greeted with a wizard.

Welcome screen

The sample images in the welcome screen are a bit small.

welcome screen summary

I can add native apps to Windows, Mac, iOS and Android or I can log in via the web page.

App download options

Pointers to the manual, community help and forums.

Help options

Main screen is clean.

Main Screen

A user context menu is in the top right.

Drop down menu.

I set up email alerts (I allowed outgoing SMTP ports in my firewall)

sudo ufw allow out 465/tcp
sudo ufw allow out 465/udp

I used my GSuite account to send emails.

email settings

Syncing Files from my PC to Nextcloud

I tried uploading my 150GB Acronis backup image file to Nextcloud via the web interface, but this would fail for sure as it would take many hours.

Acronis image uploading.

I decided to configure Acronis True Image to split backups into 100MB chunks.

100MB file sizes

I created another Acronis image of my Windows Drive.

Nextcloud Windows App

I visited https://nextcloud.com/install/ and installed the Nextcloud Windows app to sync files.

Download windows app

I clicked Windows

Windows Download

Click Next

Click Next

Click Next

Click Next

Click Install

Click Install

Nextcloud sync app is now installing

Installing Wizard

Nextcloud sync is now installed.

Run Nextcloud

Click Log in

Login Screen

Enter your Nextcloud server https address and click Next

Enter https server

A web browser login screen appeared and I logged in 

Login to the web app.

After I logged in Nextcloud sync was connected

Sync Connected

I was prompted to sync everything online to my local PC or choose folders to sync.

Sync File dialog.

All files that were in Nextcloud (that I selected) synced down.

Nextcloud sync

I set Nextcloud to start when Windows starts.

Start at startup.

I reviewed Download and Upload limits

I decided to add my U:\AcronisBackup folder to my Nextcloud server.

U:\AcronisBackup added to sync

I was asked to add this to a remote Nextcloud folder.

add to destination folder dialog

Files were backing up.

I had 150GB of Acronis backup files backing up.

I could see each 100MB section of the Acronis Backup appearing in the Nextcloud web app.

Nextcloud Web site

I noticed that the raw file system list of files was about 30 seconds ahead of the web list.

ls -al list of the file system

I had an alert from my Acronis backup software that new backup files were downloading.

The Acronis backup folder started backing up but I noticed it was also re-downloading to a new folder. I didn’t want this.

I allowed Nextcloud to access backup files

I paused the Nextcloud sync as my 150GB backup was re-downloading to a new folder.

pause backup

It looks like U:\AcronisBackup was backing up then downloading to U:\Nextcloud\Simon\AcronisBackup.

File dialog

I moved my Acronis backup from U:\AcronisBackup to U:\Nextcloud\Simon\ZENigma (ZENigma is the name of my PC)

I moved my 150GB backup files into the Nextcloud folder.

I deleted the old sync of U:\AcronisBackup and started the Nextcloud Sync again

Sync restarted

Now my Acronis backup (150GB) was backing up to Nextcloud.

Backup working

It took 24 hours to backup 150GB from my PC to my server in Singapore.

I can see a handy summary of synced files and disk space used/free.

Done

I can control the sync with a System Tray App.

Sys Tray App

Nextcloud Conclusion

Pros

  • Free
  • Works well.
  • I have an offsite location for backups and an area for file sharing with my family
  • Faster than Backblaze and Dropbox

Cons

  • Needs better Hardware 2FA support
  • Some Nextcloud web pages are not mobile-friendly (e.g add new user)
  • Needs better post install security checks
  • Web view of files could be updated more often, there is a 30-second delay between the web list of files and a CLI list in Putty of /mnt/foldername/username/files/

Troubleshooting

NGINX website is not loading

Check to see if a package has installed apache (this will take out Nginx).

Also, make sure you have set permissions on the folder that holds your SSL Certificates and allow your Nginx www-data user read access.

sudo chown -R www-data:www-data /etc/nginx/https-cert/

Deleting a MySQL Database

I had an issue where Nextcloud did not like the database I created, so I ran the following to revoke the database user’s permissions, remove the user and delete the database.

Command to revoke the user’s MySQL permissions

sudo mysql -u root -p
*************************************
mysql> REVOKE ALL PRIVILEGES, GRANT OPTION FROM 'databaseusername'@'localhost';

Delete the MySQL user

sudo mysql -u root -p
Enter password: *************************************
mysql> DROP USER 'databaseusername'@'localhost';

I flushed permissions

sudo mysql -u root -p
Enter password: *************************************
mysql> 
FLUSH PRIVILEGES;

To delete the database run the following.

mysqladmin -u root -p drop databasename
Enter password: *************************************
Dropping the database is potentially a very bad thing to do.
Any data stored in the database will be destroyed.

Do you really want to drop the 'databasename' database [y/N] y
Database "databasename" dropped

Thanks for Reading

Fyi: If you don’t have a favourite virtual machine provider you can use my referral link to obtain $25 free credit (only if you are new to UpCloud). Every new user who signs up with my referral link will receive a $25 bonus to get started. That’s 5 months free server (1 CPU and 1GB memory Linux server) 

v1.1

Filed Under: 2nd Factor, Backblaze, Backup, Database, Domain, Google, Nextcloud, Putty, SSH, UpCloud Tagged With: backblaze, Dropbox, Google One, Nextcloud

How to backup and restore a MySQL database on Windows and Linux

April 21, 2019 by Simon

Why backup and restore

This is a quick guide demonstrating how you can backup and restore a MySQL database on Windows and Linux using Adminer.

You may need to know how to backup and restore a database for a number of reasons.

e.g

  • Send the database to someone to debug or give feedback while learning.
  • Move the database from a local machine to the cloud
  • Move the database from cloud vendor A to cloud vendor B
  • etc.

Having a backup of the VM is good but having a backup of the database too is better. I use UpCloud for hosting my VMs and setting up backups is easy, but I cannot download those backups.

UpCloud Backup Screen

Murphy’s Law

“If anything can go wrong, it will”

The most important reason for taking a backup and knowing how to restore it is for disaster recovery reasons.

Backup (the easiest way) with Adminer

Adminer is a free PHP-based management tool for MySQL and other databases. Simply download Adminer and save the file to your local computer or a remote web server directory.

FYI: The Adminer author Jakub Vrana has a patron page, I am a patron of this awesome software.

Snip from Adminer’s website. “Adminer (formerly phpMinAdmin) is a full-featured database management tool written in PHP. Conversely to phpMyAdmin, it consist of a single file ready to deploy to the target server. Adminer is available for MySQL, MariaDB, PostgreSQL, SQLite, MS SQL, Oracle, Firebird, SimpleDB, Elasticsearch and MongoDB.”

adminer.php file icon screenshot

TIP: The file will be publicly accessible to anyone, so don’t save it to a common area; obfuscate the file name, protect it or delete the file when you are done using it.
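For example, after downloading the single adminer.php file from the Adminer site, something like this moves it into a web root under a non-obvious name (the path and file name here are made up):

sudo mv adminer.php /web-root/db-tool-8c1f.php
sudo chown www-data:www-data /web-root/db-tool-8c1f.php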

Once Adminer is installed, load it in a web browser and log in with your MySQL credentials. Once you log in you will see all databases and an Import and Export menu.

Adminer main screen, all databases and import and export menu.

tbtest is a simple database with one table and 4 fields (ID, Key, Value and Modified)

Click Export to open the export screen.

Export screen showing a list of databases and export options

Click Export and a SQL file will be generated (this is the export of the database).

Here is a save of the file:
https://fearby.com/wp-content/uploads/export.txt

Exported view of https://dev.mysql.com/doc/workbench/en/wb-admin-export-import-management.html

It’s that simple.

If I add a binary blob field to the table and upload a PNG file, let’s see how the export looks.

Screenshot of the new table with a blob field in the Adminer UI

Let’s export the database again in Adminer and check out the output. I used the Sublime Text editor to view the export file.

New Export shows the binary file in the Backup SQL file

Restore (the easiest way) with Adminer

OK, let’s delete the tbtest database and then restore it with Adminer. I used Adminer to delete (DROP) the database.

Database dropped with Adminer

Database “dbtest” deleted.

Now let’s create a blank database to restore to (same name).

Create database screen.

Database created.

dbtest created.

Now let’s import the database backup using Adminer.

Click Import, select the backup file and un-tick Stop on errors.

Import screenshot, dbtest selected, restore file selected, stop on errors disabled

TIP: The 2MB limit next to the choose file button is defined by your web server and PHP configuration. If you are trying to import a larger database (e.g. 80MB), first increase the limits in your web server and PHP (via php.ini); a sketch of the settings is below.
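The limits to raise are roughly these (a sketch for an Nginx + PHP-FPM setup; the values are examples only):

# php.ini
upload_max_filesize = 100M
post_max_size = 100M

# Nginx server block
client_max_body_size 100M;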

The import (restore) should take seconds.

Import Success

The database was imported from a backup, all tables and records imported just fine.

The database was imported from a backup

Bonus methods.

On Ubuntu use this guide to backup from the command line. If you use the Oracle MySQL Workbench read this.
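For reference, the command-line equivalent is mysqldump (a minimal sketch; the database and file names are examples):

mysqldump -u root -p tbtest > tbtest-backup.sql
mysql -u root -p tbtest < tbtest-backup.sql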

I hope this helps someone.

Filed Under: Adminer, Backup, Database, MySQL, Restore Tagged With: and, Backup, How, Linux, MySQL, on, restore, to, windows

Setting up the free MySQL database server on Windows 10

April 20, 2019 by Simon

This guide assumes you are a beginner using Windows 10 and maybe have a default website configured on Windows 10 using the built-in Internet Information Services (IIS) along with PHP (e.g. PHP 7.3.4 or greater).

If you have never used Internet Information Services (IIS) then XAMPP is a great way (single install) to set up a web server, database server and PHP with little fuss.

In this case I will manually install MySQL Community Server to add alongside Internet Information Server (IIS) on Windows 10.

Downloading
MySQL Server (Community Edition)

Go here and click MySQL Installer for Windows.

Screenshot of Download installer link

Click MySQL Installer for Windows.

fyi: I sent myself on a wild goose chase as I could only see a 32-bit installer (I spent days trying to find a 64-bit installer or manual install binaries that were 64 bit). I should have read the bit that said “MySQL Installer is 32 bit, but will install both 32 bit and 64 bit binaries.“

MySQL Installer is 32 bit, but will install both 32 bit and 64 bit binaries.

You can read the installer documentation here if you wish.

I downloaded the larger of the two available installers (one 16MB, the other 300+ MB, same version etc.). I had to log in with an Oracle ID to start the download.

Download MySQL Installer screenshot

Install file downloaded

Installing MySQL Server (Community Edition)

I started the installer (accepted the licence agreement)

I accepted the licence agreement

I selected “Full Install“

I could have selected server only or custom.

I selected Full Install

I downloaded and installed Python 3.7. Thanks, MySQL Installer, for adding a link to the Python download.

I Installed Python 3.7

After Python was installed I clicked refresh in MySQL and now MySQL can see Python 3.7.

Now MySQL can see Python 3.7.

I had a Visual Studio plugin install error (because Visual Studio was not installed).

Visual Studio plugin install error (Visual Studio is not installed)

Full Install (all components selected) reported the items are ready for install.

Full Install (all components selected)

Installation status complete.

Installation status list (Full Install)

List of items to configure (post install)

List of items to configure.

I setup a standard MySQL Server (not a Cluster)

I setup a standard MySQL Server (not a Cluster)

I setup MySQL as a standard Development computer on port 3306 over TCP/IP (no Named Pipe, Shared Memory etc).

I setup MySQL as a standard Development computer on port 3306 over TCP/IP.

I enforced strong passwords.

I enforced strong passwords.

I set a root password and added a few app usernames (with passwords).

I set a root password and a few app usernames (with passwords)

I named the MySQL instance and set it to auto start when Windows starts, running as a standard system account.

I named the MySQL Instance and set it to auto start when windows starts as a standard account.

Post installation in progress.

Installation in progress.

I accepted the defaults for the next screen (configuring routers). I tested the connection to MySQL Server.

Connect to MySQL server test screen.

Installation complete.

Installation complete screen.

MySQL WorkBench

I opened the MySQL Workbench software and viewed the server instance information. Nice.

MySQL Workbench Instance information

MySQL Workbench server performance graphs are nice.

MySQL Workbench performance graphs.

I am used to Adminer for managing MySQL on Linux so I will install that now.

Install Adminer (formerly phpMinAdmin)

I prefer the PHP-based Adminer database management tool from here.

https://adminer.net website  screenshot.

I downloaded a single PHP file and placed it in my IIS website root folder (as /adminer.php).

I tried to connect to my MySQL instance but received this error “The server requested authentication method unknown to the client”.

Unable to connect to MySQL with Adminer: Error "The server requested authentication method unknown to the client"

I googled and checked that I had a MySQL extension in php.ini (It did).

I opened MySQL Workbench and opened Manage Server Connections and located my my.ini file location (“C:\ProgramData\MySQL\MySQL Server 8.0\my.ini“). I opened my my.ini in a text editor and commented out the following line

#default_authentication_plugin=caching_sha2_password

and added

default_authentication_plugin=mysql_native_password

I saved the my.ini file, stopped and started the MySQL Service.

MySQL Workbench restarting database service UI screenshot.

I opened the MySQL Workbench and ran a query (File then “New Query Tab“) to set/reset the password.

ALTER USER 'root'@'localhost' IDENTIFIED WITH mysql_native_password
BY 'your_password_goes_here';  

Screenshot

Reset root password SQL statement "ALTER USER

I then tested logging in to MySQL with Adminer.

Adminer view showing default databases installed by MySQL.

Success, I can now create databases and tables.

Adminer create database table screenshot showing a table with 4 fields.

I hope this helps someone.

Filed Under: Database, MySQL, PHP Tagged With: MySQL, php, Setup
