Backup your Delicious Bookmarks!

[UPDATED 2012.07.05 : Apparently Delicious changed their API to automatically limit results to 1000 entries unless you say otherwise via the “results” URL argument.  Updated the script below, but you should customize it to your own limit. ]

If you haven’t been paying attention, Yahoo’s social bookmarking service Delicious is in trouble. Although no one can know for sure what will happen, you can bet Delicious will never be the same, one way or another.  So while we’re all looking for alternatives (so far nothing awesome found), you should think about backing up your bookmarks just in case the worst happens.

So you’ve got plenty of options for backing up your cloud’d bookmarks. You can import your bookmarks into another service, or you can copy/backup (or export/edit) the ybookmarks.sqlite file in your Library/Application\ Support/Mozilla/Firefox/Profiles/BLAH dir. Here are your other two options:
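If you go the ybookmarks.sqlite route, one copy command does it; the randomly-named profile dir is handled by a glob. A minimal sketch, assuming the default OS X Firefox location (the /tmp paths below just stand in for a real profile so the commands run anywhere):

```shell
# the real thing (assumes default OS X Firefox profile location):
# cp ~/Library/Application\ Support/Mozilla/Firefox/Profiles/*/ybookmarks.sqlite ~/Desktop/

# runnable demonstration of the same glob-copy against a throwaway profile dir
mkdir -p /tmp/Profiles/abc123.default
touch /tmp/Profiles/abc123.default/ybookmarks.sqlite
cp /tmp/Profiles/*/ybookmarks.sqlite /tmp/ybookmarks-backup.sqlite
ls -l /tmp/ybookmarks-backup.sqlite
```

The glob matters because Firefox generates the profile directory name per-install, so you can’t hardcode it.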

The Dummies Way

If you’re a Dummy (or the new politically correct term: “Lifehacker”), you can just use Delicious’ web-based export function to download a standard “bookmarks.html” file, which you can then directly import (Cold Turkey!) into most browsers. Of course, this means you now only have a local copy in your browser which is no longer sync’d with Delicious (or your other infoputers).  Your mileage may vary regarding browser support, especially for all your metadata such as notes and tags, since browsers usually don’t support these.  Here’s the HOWTO.
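For reference, that export is in the old Netscape bookmark-file format, so even a dumb grep can sanity-check how many entries made it out. A sketch against a made-up sample file (the filename and contents are illustrative, not a real export):

```shell
# tiny sample of the Netscape bookmark format the export produces
cat > /tmp/bookmarks_sample.html <<'EOF'
<!DOCTYPE NETSCAPE-Bookmark-file-1>
<DL><p>
<DT><A HREF="http://example.com/" TAGS="demo">Example</A>
<DD>notes end up here
</DL></p>
EOF

# each bookmark lives on a <DT> line; tags and notes are in TAGS= and <DD>
grep -c '<DT>' /tmp/bookmarks_sample.html
```

Comparing that count against the count on your Delicious profile page is a cheap way to spot a truncated export.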

The Dorkers Way

If you’re a Dorker, you can automate backups of your Delicious data over their various APIs.  Here’s a pretty simple set of scripts to do this.  I’ve made some very minor changes for my own use:

#!/bin/sh
# backup delicious to xml
# FYI: notes go into "extended" field
# modified to add paths to all binaries, since sometimes cron doesn't have proper paths (old habit)

# fill in your own credentials and paths
USER=yourusername
PASSWORD=yourpassword
BACKUP_DIR=/Users/infoputer/Backups/delicious
BACKUP_PREFIX=delicious_posts_
DATE_FORMAT="+%Y%m%d"   # one file per day; add %H%M to keep more frequent snapshots
BACKUP_FILE=${BACKUP_DIR}/${BACKUP_PREFIX}`/bin/date ${DATE_FORMAT}`.xml
KEEP_DAYS=30

# I always print some timing/debug info into my log
START_DATE=`/bin/date "+%Y%m%d_%H%M%S"`
/bin/echo --- ${START_DATE} - START ---


# the "results" argument increases the limit from the default of 1000
/usr/bin/curl --user $USER:$PASSWORD -o $BACKUP_FILE 'https://api.del.icio.us/v1/posts/all?results=10000'

# added gzip compression to save some space/bandwidth for my offsite backups. Not a huge deal, but this could allow me to keep more backups if I wanted.
/opt/local/bin/gzip -f $BACKUP_FILE

# prune backups older than KEEP_DAYS (double quotes so $BACKUP_PREFIX actually expands)
/usr/bin/find $BACKUP_DIR -name "${BACKUP_PREFIX}*" -mtime +$KEEP_DAYS | xargs rm -f

/bin/echo --- ${START_DATE} - END - `/bin/date "+%Y%m%d_%H%M%S"` ---

And my crontab (which adds logging):

45      *       *       *       *       /Users/infoputer/Documents/docs/bin/ >>/Users/infoputer/Library/Logs/backup_delicious.log  2>&1

I run it at 45 minutes past every hour since my offsite backups run a few minutes later, on the hour. I didn't want to run it just once a day since I never know when my infoputer is up.  But I still only save one XML file per day (see the DATE_FORMAT code in the above script), although this can be made more frequent if you do a lot of bookmark activity. I'm not sure what Delicious' frequency limits are on their API, but that's probably moot these days.

So what can you do with this new XML file? Who knows. But SAFETY FIRST, and let's hope Delicious finds a good home and we never have to find out.  Stay tuned for my research/review on the alternative options.
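If you do end up needing that XML, it's just a flat list of <post/> elements, one per bookmark, so even without a real XML parser you can claw your URLs back out. A sketch against a made-up sample (the sample contents are illustrative; a real backup would be gunzip'd first):

```shell
# sample of what the posts XML looks like: one <post/> element per bookmark,
# with the notes in the "extended" attribute as mentioned above
cat > /tmp/delicious_sample.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<posts user="infoputer">
<post href="http://example.com/" description="Example" extended="my notes" tag="demo test" time="2012-07-05T12:00:00Z"/>
</posts>
EOF

# pull the URLs out (crude, but fine for a flat attribute-style format)
grep -o 'href="[^"]*"' /tmp/delicious_sample.xml | sed 's/href="//;s/"//'
```

From there you can diff backups day-to-day, or feed the URL list into whatever service ends up replacing Delicious.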
