Can you run Dawarich against an external PostgreSQL instance?

I already have a PostgreSQL instance running on my network that gets backed up on a schedule. So when I installed Dawarich, I set it up to use a database on that server.

When I try to import data, the file upload bar reaches 100% and there are no errors in the logs that I can see, but at some point an error flashes up in the UI, and I never see an import job in Sidekiq.

After an import attempt, a row gets added to the active_storage_blobs table, but not to the imports table. I'm wondering whether it's trying to use the PostgreSQL COPY command to load the data, and that's why it fails when the database doesn't share disk space with the main app.

Any hints would be appreciated.

It’s possible to run Dawarich with an external DB, of course.
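For reference, pointing Dawarich at an external database is typically done through the database environment variables on the app service in docker-compose.yml. A minimal sketch — the service and variable names here are from memory and may differ between releases, so check the docker-compose.yml shipped with your version:

```yaml
services:
  dawarich_app:
    environment:
      DATABASE_HOST: 192.168.1.50   # external PostgreSQL host (example address)
      DATABASE_PORT: 5432
      DATABASE_USERNAME: dawarich   # placeholder credentials
      DATABASE_PASSWORD: changeme
      DATABASE_NAME: dawarich
```

With these pointing at the external server, the bundled database service (and its volumes) can be dropped from the compose file.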

Check the browser console for errors during and after an upload attempt.

Great!

Thanks for the info. I’ll play around some more. I just wanted to make sure it didn’t need access to the PostgreSQL disk. I thought maybe you were using COPY to speed up the import, since I noticed the dawarich_db_data and dawarich_shared volumes were used by more than just the db service in the docker-compose file.

Doesn’t it need to have PostGIS installed? That might not be the case for an already existing DB.


Yes, there is a DB defined in the docker-compose.yml. I only put its database on another instance because I already have a fairly large Postgres instance set up, with automated backups already in place.

Just wanted to follow up. I could never find an error in the logs or the console, but I re-exported my timeline from Google Maps, and the new file imports and works perfectly with the external PostgreSQL instance. The previous timeline file must have been corrupted somehow. Thanks for confirming it can run without the shared volume — that gave me the hint to try recreating the timeline file!


Yes, the PostGIS extension is a requirement.
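If the existing server was built from the stock postgres image rather than a PostGIS-enabled one, the extension can usually be activated in the Dawarich database with superuser access. A sketch, assuming the PostGIS packages are already installed on the server (host, user, and database names are placeholders):

```sql
-- Run as a superuser while connected to the Dawarich database:
CREATE EXTENSION IF NOT EXISTS postgis;

-- Confirm it is active:
SELECT PostGIS_version();
```

Note that `CREATE EXTENSION` only enables PostGIS per database; the PostGIS binaries themselves must be installed on the server first (e.g. via your distribution's packages or the postgis/postgis Docker image).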