Note: It seems this no longer works (see the comments).
Synology Hyper Backup can back up to WebDAV servers, and Stack supports WebDAV, but the two don't seem to like each other. If you follow a few Dutch forum threads (1 2), users mention that the use of sparse files is the problem: Hyper Backup creates a sparse file in WebDAV backups at nas_1.hbk/Pool/bucketID.counter.1
The filename tells us that it's used as a counter. After creating the backup task it seems to be empty (I tested this with another WebDAV server 😉). If we manually try to upload this file to Stack, we receive the error "Method not allowed".
So, let's see what is actually in this file. Well, it's empty:
gijs@Carbonite:~$ hexdump -C bucketID.counter.1
00000000  00 00 00 00 00 00 00 00                          |........|
00000008
After creating the backup task this file is empty, but as soon as the backup runs, the file is replaced with another one. When I checked the contents of this new file, it was not empty:
gijs@Carbonite:~$ hexdump -C bucketID.counter.3
00000000  00 00 00 00 00 00 01 5e                          |.......^|
00000008
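Out of curiosity: the eight bytes of the second dump look like a single big-endian unsigned integer. A minimal sketch to decode such a counter file, assuming that interpretation (the helper name is my own, not anything from Hyper Backup):

```python
import struct

def read_counter(path):
    """Read an 8-byte bucketID.counter file and decode it as one
    big-endian unsigned 64-bit integer (an assumption about the format)."""
    with open(path, "rb") as f:
        data = f.read(8)
    (value,) = struct.unpack(">Q", data)
    return value

# The non-empty dump above, 00 00 00 00 00 00 01 5e, decodes to 350:
(value,) = struct.unpack(">Q", bytes([0, 0, 0, 0, 0, 0, 0x01, 0x5E]))
print(value)  # 350
```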
Stack accepts the upload of this new file. Below I will guide you through the steps to get Hyper Backup to use Stack as a backup destination.
Step 1: Have another WebDAV server
Before we can create a backup to Stack, we need a separate WebDAV server that supports sparse files.
This can be remote or local with the WebDAV server package.
I choose to use the local package.
Step 2: Create a WebDAV backup task to this server
Create a new backup task and select the server from step 1 as the destination.
- The server address should be the address of your WebDAV server (in my case the Synology itself)
- The folder needs to be the folder you would like to use on Stack (you may need to create a temporary folder on your WebDAV server)
After clicking Next, you can choose the folders you want to back up and all the options you would like to use. This is not important for now, so I won't cover it here.
Hyper Backup will ask if you would like to create a backup now. Click No.
Step 3: Copy files to Stack
Upload the complete nas_1.hbk directory to Stack, into the folder with the same name as in step 2. Stack will not accept the file nas_1.hbk/Pool/bucketID.counter.1, but you can skip it.
In my situation my folder in Stack was: stack_backup > nas_1.hbk
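If you prefer to script this copy step instead of using a WebDAV client, the skip rule can be expressed in a few lines. A hedged sketch, assuming a local copy of the .hbk directory; the helper name and the skip pattern are my own illustration:

```python
import fnmatch
from pathlib import Path

def iter_upload_paths(root, skip_pattern="bucketID.counter.*"):
    """Walk an .hbk directory and yield (local_path, remote_path) pairs,
    skipping the sparse counter file that Stack refuses to accept."""
    root = Path(root)
    for path in sorted(root.rglob("*")):
        if not path.is_file():
            continue
        if fnmatch.fnmatch(path.name, skip_pattern):
            continue  # Stack rejects this empty sparse file; skip it as described above
        yield path, f"{root.name}/{path.relative_to(root).as_posix()}"
```

Each yielded remote path is relative to the Stack folder chosen in step 2; how you then PUT the files (WinSCP, curl, a mounted WebDAV share) is up to you.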
Step 4: Edit the backup task
Go to your Stack settings page and copy the WebDAV URL: https://YOUR-STACK.stackstorage.com/remote.php/webdav/
In Hyper Backup: Edit the backup task by clicking on the menu and then Edit.
Open the Target tab and replace the current values with those from your Stack account.
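Before clicking OK, you can verify the new address and credentials outside Hyper Backup. A minimal sketch using only the Python standard library; it sends a depth-0 PROPFIND with Basic auth, which is how WebDAV describes a resource. The function is my own illustration, not part of Hyper Backup or Stack:

```python
import base64
import http.client
from urllib.parse import urlsplit

def webdav_probe(url, username, password):
    """Send a depth-0 PROPFIND to a WebDAV URL and return the HTTP status.
    207 (Multi-Status) means the path exists and the credentials are accepted."""
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=10)
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    headers = {"Authorization": f"Basic {token}", "Depth": "0"}
    conn.request("PROPFIND", parts.path or "/", headers=headers)
    status = conn.getresponse().status
    conn.close()
    return status

# Example (hypothetical account):
# webdav_probe("https://YOUR-STACK.stackstorage.com/remote.php/webdav/", "user", "pass")
```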
Click OK and Hyper Backup will recognize the backup task on your Stack storage 🙂
I have tested the following and they all work fine:
- Initial backup (17GB)
- Adding a file and doing an incremental backup
- Recovering 1 file
- Recovering the complete folder (17GB)
Note: Having a decent backup plan is important. Test your backups and have multiple destinations, including an offline one.
Nice Work! Thx
Thanks this seems to work!
Doesn't work for me, unfortunately. After changing it to my Stack URL, it stays offline.
I had the same problem. It went wrong while copying BACKUP.hbk to Stack. What I did: I downloaded the .hbk folder from my NAS, deleted the bucketID.counter.1 file from the Pool folder, and uploaded the .hbk folder to Stack. After that, my backup came online again.
Maybe it helps…
Thanks Gijs. Works for me.
It seems the local NAS username/password and the remote TransIP username/password combination need to be the same; otherwise you cannot modify the Hyper Backup task, I found.
They can be changed later in the process (in step 4).
Thanks Gijs, an excellent addition to my use of Cloud Sync to STACK's WebDAV.
Works great, thanks for your help!!!
Thanks for sharing!
I cannot seem to get the connection to work. I can create a folder, but when I want to apply the task, I get the error "Insufficient privileges to access this destination shared folder".
Make a new share on your Synology NAS with File Manager.
Thanks, works fine for me!
Does not work for me, not even after using the same user and password, deleting the mentioned counter file, and even deleting all 0-byte files.
Thanks, worked like a charm 🙂
Thanks for sharing @Gijs
In my case, even though the destination is online, the backup fails at start.
I must say that apart from the mentioned file, I also have trouble copying the NAS_1.hbk/Control/@Writer/sequence_id file.
It's also that file for which I found an error message in the messages log file.
Has anybody solved that problem?
Works, great!
Thank you! Works!
Works great. Thanks!
I had trouble getting this to work. The file I downloaded from the temporary WebDAV server was a zip file; it was not clear to me that I needed to extract it and place the extracted folder onto Stack.
After doing that it worked immediately.
Thank you for the information.
Thanks for your feedback! I'll try to make that clearer.
For the past two years I have checked every few months whether there is a solution. It's disappointing that TransIP hasn't taken any action on this, but great that you hacked the problem and are sharing it! Many thanks Gijs, I'm very happy with this.
Works perfectly. Thank you very much, Gijs!
I missed some information about how to set up the temporary WebDAV server, so perhaps this is helpful for others:
I just installed it from the Package Center and enabled the HTTP server.
Next, when you create the backup task, you should log in with an account that has administrative rights (perhaps not strictly needed, but it is only for the temporary WebDAV server, so it doesn't matter much).
Remember that the last field, called Directory (in Dutch both fields are called "folder", blehh), will eventually be the target directory on your Stack storage.
I miss a step 5, which is removing the WebDAV server: go back into the Package Center and remove the WebDAV server, as it is no longer needed.
Did anyone succeed in restoring files from their Windows workstation using the Hyper Backup Windows client? I was testing the scenario where my NAS would be stolen or fried and I wanted to recover my stuff directly to my Windows PC. Unfortunately, this does not seem to work. I don't know whether this has to do with the Stack storage or is just a general bug in the Hyper Backup Windows client.
Thanks for the info, makes STACK much more useful for Synology backups!
Thanks Gijs!!
Has anyone gotten this working in 2019? Together with TransIP I can't figure it out.
It turns out that not only the sparse files are a problem, but the lock files as well.
Were those lock-file problems there in 2018 too?
Error message:
failed to create lock keepalive: keepalive file: [lock/lock_keep_alive.@writer_version_0.f4f67b15-d29c-163e-7125-0240846ec14d]
TransIP says:
"The error message you refer to concerns 'locking' a file. This locks the file in question to prevent it from being modified by another user or another process. Because STACK does not support file locking, creating the lock fails.
As indicated, we unfortunately have no other solution at the moment if the steps on gijs.io don't work."
In reply to myself:
After days of receiving error mails from my NAS saying the backup 'FAILED', since the night of March 6 I suddenly get mails saying the backups succeeded!
(I changed nothing in those days.)
On the NAS of an acquaintance, the backup also suddenly started running that same night (according to the logging).
So the problem seems to have been solved!
TransIP says: "I double-checked this for you to be sure, but nothing happened on our side either. So unfortunately I can't say whether this will last."
René
Hi,
Thank you for the guide. I followed the steps and everything seems OK, but once I start the backup I get the following error message:
"Exception occurred while backing up data. (No permission to access the backup destination [backup]. Please check you have the right permission to the backup destination.)"
I can see Hyper Backup changes files on my Stack, but the backup fails.
I tried the setup several times but it keeps failing; does anybody have suggestions on how to fix this?
Regards,
Jeroen
Thanks Gijs, works very well!
Kind regards,
Henk
I've been busy trying to follow this tutorial today and somehow I didn't get it to work. After I followed all the steps and tried to start an initial backup, it simply failed. I gave it a last shot this evening with my Windows PC (at first I had tried to copy the files in step 3 using a MacBook) and it worked: I copied the files with WinSCP using the WebDAV protocol.
Maybe it's a good idea to mention this in the article!
Thanks Ronald – using WinSCP in combination with WebDAV did the trick for me.
Thank you, sir! That was my missing link: I used Commander One on Mac to copy the files to STACK via WebDAV. Copying them through the web interface didn’t work, and using a WebDAV connection in Synology DSM File Manager didn’t work either.
Awesome! Took 5 minutes to set up and initial backup to my Stack is running. Thank you so very much!!
Works completely, perfect! Thanks for the detailed explanation!
The steps I did to make it work:
- Make a WebDAV backup to the NAS itself (install the packages, make a shared folder, and make a user with the same credentials as for Stack; this just saves time when trying to get it working).
- Select a small folder to back up.
- Make a full backup.
- After this, use the web browser to download the .hbk.
- Use the TransIP desktop client to upload the .hbk to Stack (using the web page failed).
- After the upload, change the target in the job to Stack.
The guide states not to back up immediately, as it is not necessary. For me it didn't work and I had to perform the first backup. My target medium wasn't large enough, so I just started it, waited until Synology had created all the database and config files, and then aborted the backup task. At that moment I copied the .hbk folder to Stack. Then I had to remove the lock file that Synology had placed under /Control/lock. And now everything works fine!
For those having difficulties uploading a folder: you don't need the client app or the web browser. Under Linux, a WebDAV filesystem can be mounted directly.
Ubuntu (Nautilus): davs://username@username.stackstorage.com/remote.php/webdav
Kubuntu (Dolphin): webdav://username@username.stackstorage.com/remote.php/webdav
More info: https://doc.owncloud.com/server/user_manual/files/access_webdav.html#accessing-files-using-linux
I have found that the backup only works after I briefly start and cancel the backup, as @Thomas Devoogdt mentions. Also be sure to generate a one-time password if you have two-factor authentication enabled.
Make sure you actually copy ALL the files from XXX.hbk to Stack, preferably via the web interface, since it will accept all the files, unlike the desktop app (syncing will cause at least one file not to be uploaded to Stack), which causes the backup to fail.
Once you've got that going, Synology will run a fine backup of whatever you want to Stack. Which is great, since Stack was (or still is) 1000 GB for free at the time being.
What worked for me:
• Connect to your WebDAV with https://127.0.0.1:5006
• Let a backup run in Hyper Backup for a pretty small folder on your local WebDAV
• Use WinSCP to copy everything (the "directory") over to Stack, into the "folder" name on Stack
• Change the address and credentials in Hyper Backup to those of Stack
Works like a charm ❤
With these extra steps my backup works!
How does restoring work with a directory named .hbk? Do I need to zip it, or how?
It worked for me, but only after taking the useful comments into consideration, especially the mention by @Thomas Devoogdt. Key is to choose Run the Backup = Yes, and after that to copy the .hbk directory to your Stack WebDAV.
Thanks for this guide. It took me some tinkering, but it works. I was a bit confused about folders, directories, and shares. In one of the first screenshots you see stack_backup and nas_1 (the first one is an existing "share", the second one you can name whatever you want).
Folders: when logging into Stack I created a directory stack_backup in the root, and in that directory I uploaded the folder nas_1.hbk (when you download it, you will see the .hbk part has been added, contrary to what you see in the first screenshot when you name it). If your names are different, substitute as you like; I did, but thought it would be easiest to explain using the names in the screenshots here.
- I did run the backup; just select a very small folder, as you can easily change what needs to be backed up when all is done.
- I used WinSCP; the server is YOURSTACKNAME.stackstorage.com. Make sure that under Advanced you don't point to the login directory (that will give an error) but under Directories point to /remote.php/webdav. It went fine the first time, but the second time it somehow pointed to the wrong folder.
- For up- and downloading I used WinSCP, got no errors, saw no zip files, and could download and upload everything. As said, I downloaded the nas_1.hbk folder completely and uploaded it completely.
- I wanted encryption. You can't add that when you are done, so it needs to be activated when you create the first local Hyper Backup job.
Many, many thanks to the author and for the tips above!
Hi,
It worked like a charm for a very long time: thanks!
But since 3 days ago: backup failed.
No access to the destination folder on Stack?!
"Make sure you have access to the destination folder…"
What can I do to solve this issue? And what is the reason?
There is still 600+ GB free on Stack.
I tried many things, including the above hints, but I cannot get it working (no connection).
I've created a new user on the Synology that can only be accessed via FTP, and I use Déjà Dup to back up my files to this special user. Subsequently, Synology Cloud Sync transfers the Déjà Dup backup files to Stack. So in the end I have achieved the same as with Synology's Hyper Backup. There is one advantage to this solution: I can use Déjà Dup from my Linux system to restore the files straight from Stack if necessary, i.e. I'm not dependent on a Synology server in between.
I've also contacted Stack about this issue, but they are not doing anything to resolve it (they referred me to this blog), and as it is free, I cannot complain.
Thanks for the guide and all the very helpful comments!
For me, running an empty (no files/apps) local backup before copying the complete folder (via WinSCP!) to Stack did the trick.
TL;DR: follow the guide and read the comments by Thomas Devoogdt, Smokingcrop, and Jaspov.
This guide no longer works, and neither do the comments help. As for what I have tried:
1. Created a job for the local WebDAV. Made sure that the folder names are the same on Stack and locally.
2. Added the Hyper Backup config to have something in the job; otherwise bucketID.counter and other files won't even upload, as they are empty (you would get a "Method not allowed" error).
3. The job completed locally and uploaded files to Stack with no errors.
4. Credentials and URL are correct, using a password manager and copy-paste to avoid errors.
5. Enough space on the Stack drive.
6. Stack goes offline as soon as I reroute from local to Stack, and won't come online anymore.
7. Tried removing the bucketID files and the Lock folder.
8. Tried adding more folders to the backup; still frozen in offline.
9. Did the same steps with a job that has encryption and one that has none.
10. Rebooted the NAS.
11. Tried moving the .hbk folder up one level, as the structure is folder.hbk -> folder.hbk -> data.
12. Made sure there are no files in the lock folder.
I think either TransIP or Synology patched or changed something, and now it no longer works. I have a local backup disk that is recognized immediately.
So, as a warning to newcomers on this site: don't waste your time. It won't work anymore.
Thanks for your comment. I decided to add a note at the top of this post to point it out.
It just works for me (21-11-2020). However, contrary to the article here, you DO have to make a backup to your local WebDAV (I took a small folder of 20 MB).
If you then upload the files to Stack and do an integrity check and a backup, it just works.
When initially creating the Hyper Backup files, not all files are created anymore, so do a small backup first.
Good luck!!
I did what marc0janssen suggested, but when I transferred the folder to Stack, the backup status changed from "Success" to "Restore Only – Destination corrupted". This sucks; I will do some additional testing.
I can confirm that marc0janssen's method works, with a little extra effort.
After making an initial backup (in my case around 200 MB) to the local WebDAV server, manually copy (upload) all files to the directory on Stack. I first copied the backup.hbk file/folder to my normal computer and then uploaded it to Stack using the web interface.
Some files keep giving error messages. What I did was add '.txt' to every file that doesn't voluntarily want to be uploaded; as a .txt file they will upload. After that, remove the '.txt' extension through the rename option in the web interface of Stack.
Then follow the procedure above, change the destination to the Stack WebDAV, and first run an integrity check. After that it works!
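The rename workaround above can be scripted for the local copy before uploading. A hedged sketch (both helpers are my own, and blindly stripping '.txt' assumes none of the backup files legitimately end in .txt):

```python
from pathlib import Path

def add_txt_suffix(folder):
    """Append '.txt' to every file so a picky uploader accepts it."""
    renamed = []
    for p in sorted(Path(folder).rglob("*")):
        if p.is_file() and p.suffix != ".txt":
            target = p.with_name(p.name + ".txt")
            p.rename(target)
            renamed.append(target)
    return renamed

def strip_txt_suffix(folder):
    """Undo add_txt_suffix by removing the trailing '.txt' again."""
    for p in sorted(Path(folder).rglob("*.txt")):
        if p.is_file():
            p.rename(p.with_name(p.name[: -len(".txt")]))
```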
I moved away from Stack in the meantime.
I moved to Wasabi: great support, cheaper, based on Amazon S3, pay as you go.
Good luck with Stack.