Web application
At its core Nextcloud is a PHP application; setting one up isn’t complicated and there’s a good guide for doing so on a Raspberry Pi. I broke with it and opted for PHP 7 instead, for performance reasons. I used part of Andy Miller’s guide for that (ignoring the Nginx stuff), but I found I needed more PHP modules:
sudo apt-get install -t stretch php7.0 php7.0-gd php7.0-sqlite3 php7.0-curl php7.0-opcache php7.0-zip php7.0-xml php7.0-mbstring
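The -t stretch flag only works if the stretch repository is configured alongside jessie, which is the part of Andy Miller’s guide I leaned on. As a sketch (the file paths and mirror URL are assumptions; adjust for your setup), you’d add a source for stretch, say in /etc/apt/sources.list.d/stretch.list:

deb http://mirrordirector.raspbian.org/raspbian/ stretch main contrib non-free rpi

And then pin jessie as the default, say in /etc/apt/preferences.d/stretch, so the rest of the system doesn’t upgrade itself:

Package: *
Pin: release n=jessie
Pin-Priority: 900

Package: *
Pin: release n=stretch
Pin-Priority: 100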
Data storage
I went down a number of blind alleys with the backend. If you’re planning to have the bulk of your storage live external to the Pi, you should still keep Nextcloud’s data directory local. The first thing I tried was simply putting it on an NFS share, as I’d done with Plex; that was a bad idea and didn’t work. Nextcloud supports a concept of external storage: users can choose to add Samba shares, Google Drive folders, and so on. That’s the proper way to attack the problem.
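If you’d rather not click through the web UI, the External Storage app can also be enabled from the command line with Nextcloud’s occ tool (the /var/www/nextcloud path is an assumption; adjust for your install):

cd /var/www/nextcloud
sudo -u www-data php occ app:enable files_external
sudo -u www-data php occ files_external:list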
I tried Samba/CIFS first. Mounting a share from the Synology NAS worked fine, but after a successful initial sync I started getting an error every couple of minutes notifying me of remote changes and prompting me to resolve them, although no such changes had taken place. I ensured that the clocks were synced between my laptop, the Pi, and the NAS, but that didn’t solve it. I think the root cause is this issue in the ownCloud client; it’s fixed in master but not yet in a compiled release. I ran into environment problems trying to compile the client myself and decided to try a different method.
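For the clock check, Debian jessie ships systemd, so on the Pi it’s something like this (the NAS and laptop have their own NTP settings):

timedatectl status
sudo timedatectl set-ntp true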
Happily, I encountered no problems setting up WebDAV. As with NFS, you do have to enable it on the Synology NAS, but once you’ve done that you can just add it as external storage. I found that I needed to re-do the sync with the client after changing external storage methods, even though they had the same directory structure. I’m still getting some spurious notifications from the Nextcloud client but they’re infrequent, don’t require action, and don’t steal focus.
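The same occ tool can create a WebDAV mount non-interactively. This is just a sketch with made-up host and credential values, and occ files_external:backends will list the exact backend names and config keys your version supports:

cd /var/www/nextcloud
sudo -u www-data php occ files_external:create /NAS dav password::password -c host=nas.local -c root=files -c secure=true -c user=ncuser -c password=secret
sudo -u www-data php occ files_external:list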
External access
I did all this as a proof-of-concept with HTTP access only and a local IP. To make this truly useful you need to be able to access your files from offsite. To make that practical and secure you need a domain name and an SSL certificate.
I registered a domain through Hover; a friend recommended them and they seemed reasonably priced. I pointed an A record at my current IP, which changes rarely enough to be effectively static. I didn’t bother with any dynamic DNS solutions; I can accept the 24 hours it may take for the record to propagate when my IP eventually changes.
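You can confirm the record took with dig, querying a public resolver to sidestep any local caching (example.com standing in for the real domain):

dig +short example.com A @8.8.8.8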
The Pi is behind two NATs: my cable modem and my wireless router. As with Plex, I forwarded port 443 through both to the Pi. I think port 80 is blocked by my provider, and I’m not serving anything on it anyway.
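A quick way to verify the forwarding from a machine outside your network (-k skips certificate validation, which matters at this stage since HTTPS isn’t properly set up yet; example.com is a placeholder):

curl -kI https://example.com/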
For the SSL certificate I went with Let’s Encrypt, which I’ve used on other small projects. The instructions for Certbot on Debian 8 (jessie) mostly worked; I found that I needed to import two repository keys first. Once I did that I was able to run Certbot, which handled all the SSL configuration on the host.
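For reference, here’s the shape of that setup on jessie. The two key IDs shown are the usual Debian archive signing keys that Raspbian lacks, but treat them as assumptions and use whatever IDs your own NO_PUBKEY errors name; this also assumes Apache, since I skipped the Nginx parts of the earlier guide:

sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys 7638D0442B90D010 8B48AD6246925553
echo 'deb http://ftp.debian.org/debian jessie-backports main' | sudo tee /etc/apt/sources.list.d/backports.list
sudo apt-get update
sudo apt-get install -t jessie-backports certbot python-certbot-apache
sudo certbot --apache -d example.com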
Cleaning up
With an HTTPS-fronted domain live, I added that domain to the server’s list of trusted domains, then swapped out the client configuration on my laptop. Nextcloud appeared to recognize that it was dealing with the same server and didn’t need to re-sync anything.
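The trusted domain can go in via config.php or occ; this sketch assumes the local IP already sits at index 0 of the trusted_domains array and that example.com stands in for the real domain:

cd /var/www/nextcloud
sudo -u www-data php occ config:system:set trusted_domains 1 --value=example.com
sudo -u www-data php occ config:system:get trusted_domains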
I’ve been running in this configuration for close to a week and it’s been smooth sailing.
Featured image by fir0002 | flagstaffotos.com.au [GFDL 1.2], via Wikimedia Commons