Note 1: It seems this no longer works (see comments).

Note 2: I currently use Hetzner as a backup location. They have a product called Storage Box. It supports WebDAV and comes in different sizes, from €9,56 for 1TB up to €48,28 for 10TB, which brings it down to €4,83 per TB per month. It’s fast and reliable. I’ve been using it for some time now and it’s definitely worth the money.

Synology Hyper Backup can back up to WebDAV servers. Stack has support for WebDAV, but the two don’t seem to like each other. According to a few Dutch forum threads (1, 2), users mention that the use of sparse files is the problem. Hyper Backup creates a sparse file in WebDAV backups at nas_1.hbk/Pool/bucketID.counter.1

The filename tells us that it’s used as a counter. After creation of the backup task it seems to be empty (I tested this with another WebDAV server 😉). If we manually try to upload this file to Stack, we receive a “Method not allowed” error.

So, let’s see what is actually in this file. Well, it’s empty…

gijs@Carbonite:~$ hexdump -C bucketID.counter.1
 00000000 00 00 00 00 00 00 00 00 |........|
 00000008
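For anyone who wants to reproduce this locally: a file with the same apparent contents can be created with truncate(1), which also makes it sparse (the eight zero bytes are a hole, not allocated data). The filename here is just borrowed from Hyper Backup’s layout; this is a sketch, not anything Hyper Backup itself runs.

```shell
# Create an 8-byte sparse file: truncate extends the file with a hole,
# so it reads back as all zeros but occupies no data blocks on disk.
truncate -s 8 bucketID.counter.1

wc -c bucketID.counter.1          # apparent size: 8 bytes
hexdump -C bucketID.counter.1     # same dump as above: eight 0x00 bytes
```

On most local filesystems `du bucketID.counter.1` will report 0 allocated blocks, which is exactly the kind of file some WebDAV servers choke on.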

After creating the backup task this file is empty, but as soon as the backup runs, the file is replaced with another one. When I checked the contents of this new file, it was not empty.

gijs@Carbonite:~$ hexdump -C bucketID.counter.3
 00000000 00 00 00 00 00 00 01 5e |.......^|
 00000008
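Those eight bytes look like a 64-bit big-endian integer. That is only my guess based on the filename — nothing documented by Synology — but decoding the dump above is easy to sketch (the byte values below are taken from the hexdump, written as octal escapes for portability):

```shell
# Recreate the observed counter file: seven zero bytes followed by
# 0x01 0x5e (octal 001 and 136). Big-endian counter is an assumption.
printf '\000\000\000\000\000\000\001\136' > bucketID.counter.3

# Join the hex bytes and let printf interpret them as one number
printf '%d\n' "0x$(od -An -tx1 bucketID.counter.3 | tr -d ' \n')"
# → 350  (0x15e)
```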

Stack accepts the upload of this new, non-empty file. Below I will guide you through the steps to get Hyper Backup to use Stack as a backup destination.

Step 1: Have another WebDAV server

Before we can create a backup to Stack, we need a separate WebDAV server that supports sparse files.
This can be remote, or local using the WebDAV Server package.
I chose to use the local package.

Step 2: Create a WebDAV backup task to this server

Create a new backup task and select the server from step 1 as the destination.

  • The server address should be the address of your WebDAV server (in my case the Synology itself)
  • The folder needs to be the folder you’d like to use on Stack (you may need to create a temporary folder on your WebDAV server)

[screenshot: step-21]

After clicking Next you can choose the folders you want to back up and all the options you’d like to use. It’s not important for now, so I won’t cover it here.

Hyper Backup will ask if you’d like to create a backup now. Click No.

[screenshot: step-22]

Step 3: Copy files to Stack

Upload the complete nas_1.hbk directory to Stack, into the folder with the same name as in step 2. Stack will not accept the file nas_1.hbk/Pool/bucketID.counter.1, but you can skip it.

In my situation my folder in Stack was:  stack_backup > nas_1.hbk
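If you prefer scripting this upload over clicking through a WebDAV client, something like the sketch below can work. It only prints the curl commands (one MKCOL per directory, one PUT per file, skipping the sparse counter file); the URL and credentials are placeholders for your own account, so review the output before running it.

```shell
#!/bin/sh
# Sketch: print curl commands that would upload nas_1.hbk to Stack over
# WebDAV. STACK_URL and AUTH are placeholders -- fill in your own account.
STACK_URL='https://YOUR-STACK.stackstorage.com/remote.php/webdav/stack_backup'
AUTH='your-username:your-password'

# WebDAV has no recursive upload: create each directory first (sort puts
# parents before their children, which MKCOL requires).
find nas_1.hbk -type d | sort | while read -r d; do
  printf 'curl -u %s -X MKCOL "%s/%s"\n' "$AUTH" "$STACK_URL" "$d"
done

# Then one PUT per file, skipping the sparse counter file Stack rejects.
find nas_1.hbk -type f ! -name 'bucketID.counter.1' | while read -r f; do
  printf 'curl -u %s -T "%s" "%s/%s"\n' "$AUTH" "$f" "$STACK_URL" "$f"
done
```

Run it from the directory that contains the downloaded nas_1.hbk folder; nothing is uploaded until you actually execute the printed commands.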

Step 4: Edit the backup task

Go to your Stack settings page and copy the WebDAV URL: https://YOUR-STACK.stackstorage.com/remote.php/webdav/

In Hyper Backup: Edit the backup task by clicking on the menu and then Edit.

[screenshot: step-41a]

Open the Target tab and replace the current values with those from your Stack account.

[screenshot: step-42a]

Click OK and Hyper Backup will recognize the backup task at your Stack storage 🙂

I have tested the following and they all work fine:

  • Initial backup (17GB)
  • Adding a file and doing an incremental backup
  • Recovering 1 file
  • Recovering the complete folder (17GB)

Note: Having a decent backup plan is important. Test your backups and have multiple destinations, including an offline one.

63 thoughts on “Make Synology <3 TransIP Stack”

    1. I had the same problem. It went wrong while copying BACKUP.hbk to Stack. What I did was download the .hbk folder from my NAS, delete the bucketID.counter.1 file from the Pool folder, and upload the .hbk folder to Stack. After that, my backup came online again.
      Maybe it helps…


  1. It seems the local NAS username/password and the remote TransIP username/password combination need to be the same. Otherwise, I found, you cannot modify the Hyper Backup task.


  2. I cannot seem to get the connection to work. I can create a folder, but when I want to apply the task, I get the error: Insufficient privileges to access this destination shared folder


  3. It does not work for me, not even after using the same user and password, deleting the mentioned counter file, and even deleting all 0-byte files.


    1. Thanks for sharing @Gijs

      In my case even though the destination is online, the backup fails at start.
      I must say that apart from the mentioned file, I also have trouble copying the NAS_1.hbk/Control/@Writer/sequence_id file.
      It’s also that file for which I found an error message in the messages log file.

      Anybody solved that problem?


  4. I had trouble getting this to work. The file I downloaded from the temporary WebDAV server was a zip file. It was not clear to me that I needed to extract it and place the extracted folder on Stack.
    After doing that it worked immediately.
    Thank you for the information.


  5. For the past two years I have checked every few months whether there is a solution. Disappointing that TransIP has not taken any action on this, but great that you hacked the problem and are sharing it! Many thanks Gijs, I’m very happy with this.


    1. I happened to drop this problem at TransIP recently as well.
      There I was referred to this blog.

      For years I’ve had a HyperBackup session straight to a HiDrive account (the required protocols, I think), plus a HyperBackup to a 2nd Synology which then went to the HiDrive cloud with Cloud Sync.
      But for HiDrive there is a native driver in Synology for both.

      With the 2nd NAS temporarily unavailable, my idea was to go straight from HyperBackup via WebDAV to Stack… So, unfortunately.

      I’m going to try it again via the 2nd NAS and then Cloud Sync to Stack. I’m curious.


  6. Works perfectly. Thank you very much, Gijs!

    I missed some information about how to set up the temporary WebDAV server, so perhaps this is helpful for others:
    I just installed it from the Package Center and enabled the HTTP server.
    Next, when you create the backup task you should log in with an account that has administrative rights (perhaps not strictly needed, but it is only for the temporary WebDAV server, so it doesn’t matter much).
    Remember that the last field, called directory (in Dutch both fields are called folder, blehh), will eventually be the target directory on your Stack storage.
    I missed step 5, which is removing the WebDAV server: go back into the Package Center and remove the WebDAV server, as it is not needed anymore.


  7. Did anyone succeed in restoring files from their Windows workstation using the hyperbackup windows client? I was testing the scenario where my nas would be stolen or fried and I wanted to recover my stuff directly to my Windows pc. And unfortunately this does not seem to work. Don’t know whether or not this has to do with the stack storage or just a general bug in the hyperbackup Windows client.


  8. Thanks for sharing @Gijs

    Sorry for posting this twice, but I just noticed that my previous reply was in the wrong position within the thread ;(

    In my case even though the destination is online, the backup fails at start.
    I must say that apart from the mentioned file, I also have trouble copying the NAS_1.hbk/Control/@Writer/sequence_id file.
    It’s also that file for which I found an error message in the messages log file.

    Anybody solved that problem?


  9. Thanks Gijs!!

    Has anyone gotten this working in 2019?
    Together with TransIP I can’t figure it out.

    It turns out that not only the sparse files are a problem, but also the lock files.
    Were those lock file problems there in 2018 as well?

    Error message:
    failed to create lock keepalive: keepalive file: [lock/lock_keep_alive.@writer_version_0.f4f67b15-d29c-163e-7125-0240846ec14d]

    TransIP says:
    “The error message you refer to concerns the ‘locking’ of a file. This locks the file in question to prevent it from being modified by another user or another process. Because STACK does not support file locking, creating the lock fails.
    As indicated, we unfortunately have no other solution at the moment if the steps on gijs.io do not work.”


    1. In reply to myself:

      After days of receiving error emails from my NAS saying the backup ‘FAILED’, since the night of March 6 I suddenly get emails that the backups succeeded!
      (I changed nothing during those days.)
      On an acquaintance’s NAS the backup also suddenly started running that same night (according to the logs).
      So the problem seems to have been solved!

      TransIP says: “I checked this for you just to be sure, but nothing happened on our side either. So unfortunately I can’t say whether this will last.”

      René


  10. Hi,

    Thank you for the guide. I followed the steps and everything seems OK, but once I start the backup I get the following error message:

    Exception occurred while backing up data. (No permission to access the backup destination [backup]. Please check you have the right permission to the backup destination

    I can see HyperBackup changes files on my stack but the backup fails.

    Tried the setup several times but it keeps failing, does anybody have suggestions on how to fix this ?

    Regards,

    Jeroen


  11. I’ve been busy trying to follow this tutorial today and somehow I didn’t get it to work. After I followed all the steps and tried to start an initial backup, it simply failed. I gave it a last shot this evening with my Windows PC (first I had tried to copy the files (step 3) using a MacBook) and it worked. I copied the files using WinSCP over the WebDAV protocol.

    Maybe it’s a good idea to mention this in the article!


    1. Thank you, sir! That was my missing link: I used Commander One on Mac to copy the files to STACK via WebDAV. Copying them through the web interface didn’t work, and using a WebDAV connection in Synology DSM File Manager didn’t work either.


  12. Awesome! Took 5 minutes to set up and initial backup to my Stack is running. Thank you so very much!!


  13. The steps I did to make it work:

    Make a WebDAV backup to the NAS itself (install the packages, make a shared folder, and make a user with the same credentials as for Stack; this just saves time when trying to get it working).
    Select a small folder to back up.
    Make a full backup.
    After this, use the web browser to download the .hbk.
    Use the TransIP desktop client to upload the .hbk to Stack (using the webpage failed).
    After the upload, change the target in the job to Stack.


  14. In the user guide it’s stated not to back up immediately, as it is not necessary. For me it didn’t work, and I had to perform the first backup. My target medium wasn’t large enough, so I just started it, waited until Synology created all the database and config files, and then aborted the backup task. At that moment I copied the .hbk folder to Stack. Then I had to remove the lock file that Synology had placed under /Control/lock. And now everything works fine!

    For those having difficulties uploading a folder: it’s not necessary to get the client app or use the web browser. Under Linux, a WebDAV filesystem can be mounted directly.
    Ubuntu (Nautilus): davs://username@username.stackstorage.com/remote.php/webdav
    Kubuntu (Dolphin): webdav://username@username.stackstorage.com/remote.php/webdav
    More info: https://doc.owncloud.com/server/user_manual/files/access_webdav.html#accessing-files-using-linux


  15. I have found that the backup only works after I briefly start and cancel the backup, as @Thomas Devoogdt mentions. Also be sure to generate a one-time password if you have 2-factor authentication enabled.


  16. Make sure you actually copy ALL the files from XXX.hbk to Stack, preferably via the web interface, since it will accept all the files, unlike the desktop app (syncing will cause at least one file not to be uploaded to Stack), which causes the backup to fail.

    Once you’ve got that going, Synology will run a fine backup of whatever you want to Stack. Which is great, since Stack was (or still is) 1000GB for free at the time being.


  17. What worked for me:
    • connect to your webdav with https://127.0.0.1:5006
    • let a backup run on hyper-backup for a pretty small folder on your local webdav
    • Use WinSCP to copy everything (the “directory”) over to Stack into the “folder” name on Stack
    • Change the address and credentials to the ones of Stack in Hyperbackup

    Works like a charm ❤


  18. It worked for me, but only after taking the useful comments into consideration! Especially the mention by @Thomas Devoogdt. Key is to Run the Backup = Yes, and after that to copy the .hbk directory to your Stack WebDAV.


  19. Thanks for this guide. Took me some tinkering, but it works. I was a bit confused about folders, directories and shares. In one of the first screenshots you see stack_backup and nas_1 (the first one is an existing “share”, the second one you can name what you want).

    Folders: when logging into Stack I created a directory start_backup in the root, and in that directory I uploaded the folder nas_1.hbk (when you download it you will see the .hbk part has been added, contrary to what you see in the first screenshot when you name it). Also, if your names are different, substitute as you like; I did, but thought it would be easiest to explain using the names in the screenshots here.

    - I did run the backup; just select a very small folder, as you can easily change what needs to be backed up when all is done.
    - I used WinSCP; the server is YOURSTACKNAME.stackstorage.com. Make sure under Advanced you don’t point to the login directory (that will give an error) but under Directories point to /remote.php/webdav. It went fine the first time, but the second time it somehow pointed to the wrong folder.
    - For up- and downloading I used WinSCP, got no errors, saw no zip files, and could down- and upload everything. As said, I downloaded the nas_1.hbk folder completely and uploaded it completely.
    - I wanted encryption. You can’t add that when you are done, so it needs to be activated when you create the first local Hyper Backup job.

    Many, many thanks to the author and the tips above!


  20. Hi,

    It worked like a charm for a very long time: thanks!
    But since 3 days ago: backup failed.
    No access to the destination folder on Stack…?!
    “Make sure you have access to the destination folder…”

    What can I do to solve this issue? And what is the reason?
    There is still 600+ GB free on Stack.


  21. Tried many things, including the hints above, but I cannot get it working (no connection).
    I’ve created a new user on the Synology that can only be accessed via FTP and use Deja-Dup to back up my files to this special user. Subsequently, Synology Cloud Sync transfers the Deja-Dup backup files to Stack storage. So in the end I have achieved the same as with Synology’s Hyper Backup. There is one advantage to this solution: I can use Deja-Dup from my Linux system to restore the files straight from Stack if necessary, i.e. I’m not dependent on a Synology server in between.
    I’ve also contacted Stack about this issue, but they are not doing anything to resolve it (they referred me to this blog), and as it is free, I cannot complain.


  22. Thanks for the guide and all the very helpful comments!
    For me, running an empty (no files/apps) local backup before copying the complete folder (via WinSCP!) to Stack did the trick.
    TL;DR: follow the guide and read the comments by Thomasdevoogdt, Smokingcrop and Jaspov.


  23. Thanks for the guide and all the very helpful comments (thomas dv, smokingcrop and jaspov)!
    For me, creating and actually running an empty (no files/apps) local backup before copying the complete .hbk folder (via WinSCP!) to Stack did the trick.


  24. This guide no longer works, and the comments don’t help either. Here is what I have tried:

    1. Create job for local webdav. Made sure that folder names are the same on Stack and locally.
    2. Added the Hyper backup config to have something in the job, else bucketid.counter and other files won’t even upload as they are empty. You would get method not allowed error.
    3. Job locally completed and uploaded files to Stack with no errors.
    4. Credentials and URL are correct using a password manager and copy paste to avoid errors.
    5. Enough space on Stack drive.
    6. Stack goes offline after I have rerouted from local to Stack. It won’t come online anymore.
    7. Tried removing bucketid files and Lock folder.
    8. Tried adding more folders to backup, still frozen in offline.
    9. Did same steps with a job that has encryption and one that has none.
    10. Rebooting NAS.
    11. Tried moving the .hbk folder up one level, as the layout is folder.hbk -> folder.hbk -> data.
    12. Made sure there are no files in the lock folder.

    I think either Transip or Synology patched or changed something and now it no longer works. I have a local backup disk that is recognized immediately.

    So as a warning to newcomers on this site: Don’t waste your time. It won’t work anymore.


  25. It just works for me (21-11-2020); however, contrary to the article, here you DO have to make a backup to your local WebDAV (I took a small folder of 20MB).

    If you then upload the files to Stack and do an integrity check and a backup, it just works.

    When initially creating the Hyper Backup files, not all files are created anymore, so do a small backup first.

    Good luck!!


  26. I did what marc0janssen suggested, but when I transferred the folder to Stack, the backup status changed from ‘success’ to ‘Restore Only – Destination corrupted’. This sucks; I will do some additional testing.


  27. I can confirm that the method of marc0janssen works, with a little extra effort.

    After making an initial backup (in my case around 200MB) to the local WebDAV server, manually copy (upload) all files to the directory in Stack. I first copied the backup.hbk file/folder to my normal computer and then uploaded it to Stack using the web interface.

    Some files keep getting error messages. What I did was add ‘.txt’ to every file that did not voluntarily want to be uploaded. As a .txt file they will upload. After that, remove the ‘.txt’ extension through the rename option in the web interface of Stack.

    Then follow the procedure above, change the destination to the Stack WebDAV and first run an integrity check. After that it works!


    1. I moved away from stack in the meantime.

      Moved to Wasabi. Great support. Cheaper. Based on Amazon S3. Pay as you go.

      Good luck with stack.


  28. Strangely enough, after reading the comments and trying every option, when changing the backup target over to the stackstorage.com WebDAV URI, the Hyper Backup job immediately jumps to “Off-line”. Does anyone have a clue?


  29. August 2021, DSM 7.0

    DSM 7.0-41890
    Hyper Backup version 3.0.1-2412
    WebDAV Server 2.4.1-10108
    I followed the Marco Janssen method (21-11-2020) by making a backup to the local WebDAV of a relatively small folder (homes/user/photo).
    Make sure you enter the correct port number: 5005 (as with Niek). Then perform an integrity check.
    The local shared folder and the new folder on Stack have identical names (as already indicated by Niek).
    I synced this created backup to Stack, to the new folder, with the Windows desktop Stack client.
    Check whether the local folder looks identical, with the same folders/files, on Stack.
    Then break the PC – Stack sync link and delete the Hyper Backup task.
    After that, create a new task (on the Synology) that points to the new Stack folder.
    Incidentally, on some Synology models the trailing slash must be removed (as with the DS118): https://.stackstorage.com/remote.php/webdav/ thus becomes https://.stackstorage.com/remote.php/webdav

    Unfortunately HyperBackupExplorer (Windows desktop) cannot be used when the Hyper Backup is on Stack.
    By the way, once my Stack subscription ends I will back up somewhere else; this whole procedure is not robust enough for me and I cannot trust that a restore will go without problems.
    Incidentally, I did manage to perform a Hyper Backup ‘restore’ based on this initial backup.



  31. Well, unfortunately. I too briefly thought of putting a HyperBackup into Stack directly via the WebDAV option, just like the session I have to a HiDrive account. The latter uses a native driver.

    My plan now is to set up a 2nd HyperBackup to the backup Synology (a rickety DS413j, a bad buy) and then see whether the files can be synchronized to the Stack account via Cloud Sync and WebDAV, so that I have two (encrypted) backups in the cloud.
    For spreading the risk, two different cloud accounts is nice, but if need be the 2nd one can also go to HiDrive with Cloud Sync.

    Incidentally, it was TransIP that pointed me to this blog.

    The KeePass database via WebDAV in the Stack account doesn’t work from Android phones either.
    So it now simply sits on the Synology NAS with WebDAV and is included in the HyperBackup as well.


    1. I have indeed now set up the 2nd HyperBackup session to go from the DS10190+ to the DS413j, which runs HyperBackup Vault.
      The data in the share for this is synchronized to Stack without problems via Cloud Sync and WebDAV.

      The data there appears completely identical to the data on the DS413j, so in case of a problem I should be able to write that data back to the DS413j and access the stored files there through the normal route.

      HyperBackup writes everything encrypted here. Both to the DS413j and to the HiDrive account.


      1. What a hassle, all of this. I simply left Stack.

        I opened an account at Wasabi. For 6 dollars. In practice just under 5 euros.

        It’s a pay-as-you-go service once you go past 1TB. You then pay extra for every GB.

        The nice thing is that this service is compatible with Amazon S3. You get more out of it than Stack, and it’s cheaper.


      2. OK, Marco.

        I see $5.99 for 1TB per month.
        I’ve been with TransIP for a very long time and pay 3.03 euros per month. So for now it’s cheaper for me, unless I go to more than 4x the amount of data I currently have in Stack and need an upgrade.

        That Wasabi is American doesn’t really matter for an encrypted backup, since with good encryption it is fairly safe.
        I saw a Carbonite background. That one was familiar. Apparently started in 2017.

        I used to have the backup NAS in between as well (a DS115j back then), but after I replaced it with the DS413j, that 413j turned out to be an even more hopeless case than I had already noticed in stand-alone use. In short, at that point I only had the HiDrive backup copy left.
        Now that, unlike before with the DS115j, no cameras hang off that DS413j anymore, it does work.
        Advantage: one of the backups is close by (no, they are not next to each other).

        So it’s now a backup on that backup NAS, and a backup of the backup in Stack.


  32. Finally put some time into this today, and fortunately it now works without problems again. The trick for me was to first let a successful backup run to the local WebDAV; once it succeeded, I copied NAS_1.hbk to Stack, changed the settings in the backup task from 127.0.0.1 to Stack, and after that it works like a charm. Thanks!

