MySQL speed improvements and query optimisation

January 19, 2023

laravel

Yesterday, I came across an issue where a Laravel spider I had running crawled to a snail's pace. It worked by selecting a URL from the database, using cURL to fetch the HTML, extracting the title, meta tags and links, and then storing these in two tables. In all it was six queries against MySQL/MariaDB, and I went around the houses to optimise and solve the problem.
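
For context, a stripped-down sketch of that crawl cycle looks something like the following (illustrative only: the model, column names and parsing details are not the exact code from the project):

// Illustrative sketch of the crawl cycle, not the project's exact code.
$website = websites::where('dns', '!=', 'failed')->first();  // 1. pick a URL to crawl

$ch = curl_init($website->url);                              // 2. fetch the page with cURL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);

$doc = new \DOMDocument();                                   // 3. extract the title, meta tags and links
libxml_use_internal_errors(true);
$doc->loadHTML($html);
$title = optional($doc->getElementsByTagName('title')->item(0))->textContent;
$links = [];
foreach ($doc->getElementsByTagName('a') as $a) {
    $links[] = $a->getAttribute('href');
}

// 4. store the results across the two tables and mark the row as crawled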

The first thing I did was look into slow query logging, so I edited the my.cnf file with a text editor and added the following block under the [mysqld] section:

slow_query_log = 1
slow_query_log_file = /var/log/mysql-slow.log
long_query_time = 2

Then restart the MariaDB server:

sudo systemctl restart mariadb.service
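
As a quick sanity check after the restart (a sketch using Laravel's DB facade, e.g. from a tinker session), you can confirm the variables took effect:

use Illuminate\Support\Facades\DB;

// List the slow query log settings currently active on the server.
$vars = DB::select("SHOW VARIABLES LIKE 'slow_query%'");
foreach ($vars as $var) {
    echo $var->Variable_name . ' = ' . $var->Value . PHP_EOL;
}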

However, after monitoring it for a while, it never logged any slow queries. This got me thinking the problem was something in the MySQL configuration.

So, secondly, I installed 'mysqltuner' and it suggested the following optimisations in the my.cnf file.
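
(On Debian/Ubuntu it is usually available as a distro package; otherwise the script can be fetched from its GitHub repository.)

sudo apt install mysqltuner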

Running

mysqltuner --host 127.0.0.1 --port 3306 --user root --pass 'password'

gave good feedback and suggestions like so:

Variables to adjust:
    query_cache_size (=0)
    query_cache_type (=0)
    query_cache_limit (> 1M, or use smaller result sets)
    tmp_table_size (> 16M)
    max_heap_table_size (> 16M)
    performance_schema = ON enable PFS
    innodb_buffer_pool_size (>= 765.8M) if possible.
    innodb_log_file_size should be (=16M) if possible, so InnoDB total log files size equals to 25% of buffer pool size.
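
For reference, acting on a few of those suggestions means adding lines like these under the [mysqld] section of my.cnf (the exact values depend on the server's available RAM, so treat them as illustrative rather than prescriptive):

tmp_table_size          = 32M
max_heap_table_size     = 32M
performance_schema      = ON
innodb_buffer_pool_size = 768M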

I also checked the database tables with

mysqlcheck -u root -p  --check  --all-databases

and I also checked the logs for obvious errors:

/var/log/mysql/error.log
/var/log/apache/error.log
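
For example, a quick look at the most recent entries (assuming the default log locations above):

sudo tail -n 50 /var/log/mysql/error.log
sudo tail -n 50 /var/log/apache/error.log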

I also added indexes and primary keys to the tables, and although this helped a little, it didn't fix the problem.
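
A sketch of how such indexes can be added in a Laravel migration (the table and column names are taken from the query shown further down, and may not match your schema; this is illustrative, not the exact migration I ran):

use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

// Index the columns the crawler filters on.
Schema::table('websites', function (Blueprint $table) {
    $table->index('updated_at');
    $table->index('dns');
});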

Finally, I wanted to get live performance stats for the queries, so I installed the Laravel Debugbar, and this gave me the answer. The following query was taking over 10 seconds!

$data['website'] = websites::where('updated_at', '=', '2022-12-12 12:34:44')
    ->where('dns', '!=', 'failed')
    ->inRandomOrder()
    ->take(1)
    ->first();

The inRandomOrder() I had added was slowing it down dramatically, and once I removed it, the query was back to running in 25 milliseconds!
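
If you still need a random row from a large table, one common workaround (a sketch, not what I ended up doing) is to count the matching rows once and jump to a random offset, which avoids MySQL sorting the whole result set with ORDER BY RAND():

// Count the matching rows once, then pick a random offset instead of ORDER BY RAND().
// Note: without an ORDER BY the row order is not guaranteed, but for picking an
// arbitrary row this is usually good enough.
$count = websites::where('updated_at', '=', '2022-12-12 12:34:44')
    ->where('dns', '!=', 'failed')
    ->count();

$data['website'] = $count > 0
    ? websites::where('updated_at', '=', '2022-12-12 12:34:44')
        ->where('dns', '!=', 'failed')
        ->skip(random_int(0, $count - 1))
        ->first()
    : null;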

I have found and am using a number of Laravel packages in recent days that solve small problems and improve my project. I will write about each one in turn.


If you would like to contact me, you can use the contact form on londinium.com or ilminster.net, or reach me on Twitter @andylondon.