Speeding up local Postgres databases for development
Local DB performance tip: when running projects locally (for development or running tests), you can often speed up the database considerably by relaxing its data-durability settings. This translates into faster setUp()/tearDown() between tests, as well as faster database migrations.
The default Postgres configuration favors data safety and reliability, but a local development database shouldn't contain anything you care about, so you can trade durability for speed. If the database gets corrupted as a result (power failure, etc.), it's easy enough to drop it and start from scratch.
Assuming you are running Postgres with the default configuration, you can change these settings straight from a database shell as a superuser (the postgres user) and then restart the server. Under the hood, ALTER SYSTEM writes to the postgresql.auto.conf file, which Postgres reads after its main configuration file.
Run the following:

```sql
ALTER SYSTEM SET fsync = off;
ALTER SYSTEM SET synchronous_commit = off;
ALTER SYSTEM SET full_page_writes = off;
ALTER SYSTEM SET checkpoint_timeout = '30min';
-- use more RAM per operation to avoid sorts spilling to disk
ALTER SYSTEM SET work_mem = '32MB';
-- if running on an SSD, this is a good idea as well
ALTER SYSTEM SET random_page_cost = '1.1';
-- if you want to log all queries - useful for debugging, etc.
ALTER SYSTEM SET log_statement = 'all';
```
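If you'd rather apply these from a script (say, a dev-machine bootstrap step), the same statements can be piped through psql. A sketch, assuming psql is on your PATH and a local postgres superuser:

```shell
# Apply the durability-related settings in one non-interactive session;
# ON_ERROR_STOP makes psql exit non-zero if any statement fails.
psql -U postgres -v ON_ERROR_STOP=1 <<'SQL'
ALTER SYSTEM SET fsync = off;
ALTER SYSTEM SET synchronous_commit = off;
ALTER SYSTEM SET full_page_writes = off;
ALTER SYSTEM SET checkpoint_timeout = '30min';
ALTER SYSTEM SET work_mem = '32MB';
ALTER SYSTEM SET random_page_cost = '1.1';
SQL
```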
Then restart the server. If you're on a Mac and installed Postgres via Homebrew:

```shell
brew services restart postgresql
```

(For versioned Homebrew installs, the service may be named postgresql@&lt;version&gt; instead.)
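If Postgres wasn't installed via Homebrew, a restart sketch for other common setups (service names and the data-directory path vary by platform):

```shell
# Debian/Ubuntu with systemd
sudo systemctl restart postgresql

# Or directly with pg_ctl, pointing -D at your data directory
pg_ctl restart -D "$PGDATA"
```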
You can confirm your changes were applied by issuing SHOW fsync; in a database shell.
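To see everything ALTER SYSTEM changed in one place, you can query pg_settings for entries sourced from postgresql.auto.conf. A sketch, again assuming a local postgres superuser:

```shell
# Check a single setting from the command line
psql -U postgres -c "SHOW fsync;"

# List every setting that ALTER SYSTEM wrote to postgresql.auto.conf
psql -U postgres -c \
  "SELECT name, setting FROM pg_settings WHERE sourcefile LIKE '%postgresql.auto.conf';"
```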
If the above didn't work for whatever reason, you can also merge the settings above into your server's main configuration file manually (its location depends on how you installed Postgres).
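As a sketch of the manual route: find the data directory (e.g. via SHOW data_directory; in a database shell) and append the equivalent lines to postgresql.conf. The snippet below falls back to a scratch directory so it's safe to dry-run; point PGDATA at your real data directory to actually apply it.

```shell
# PGDATA is assumed to point at your Postgres data directory;
# fall back to a throwaway directory so a dry run touches nothing real.
PGDATA="${PGDATA:-$(mktemp -d)}"
touch "$PGDATA/postgresql.conf"
cat >> "$PGDATA/postgresql.conf" <<'EOF'
# local development only: trade durability for speed
fsync = off
synchronous_commit = off
full_page_writes = off
checkpoint_timeout = 30min
work_mem = 32MB
random_page_cost = 1.1
EOF
```

Restart the server afterwards for the changes to take effect.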