Backing-up my LJ ScrapBook pics, redux
2017-Jan-06, Friday 02:58 am
You are Good People, right? You just want to get what's yours (albeit in the laziest way possible), right? You don't want to misuse any tools to cause any damage, right? Okay, great, listen up.
A while back I came up with some cmd line calls using wget to back up my LJ Scrapbook. That method stopped working as LJ restructured Scrapbook a few times. Me, I kept using it as free image hosting. But, for some reason, I've been remotivated to get a backup.

Scrapbook now uses Flash. Flash does not play nice with, well, anything. So I am left with the not-so-elegant brute-force approach.
Log in to LJ. Go to your Scrapbook, view a photo, click on the little share icon. Grab your $usernumber.
Format:
http://ic.pics.livejournal.com/$username/$usernumber/$photonumber/$photonumber_original.jpg
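To make the pattern concrete, here is a sketch with obviously made-up placeholder values (substitute your own). Note the braces around the second occurrence of the photo number: without them, the shell would read $photonumber_original as one variable name.

```shell
# All values here are hypothetical placeholders -- use your own.
username=exampleuser
usernumber=12345678
photonumber=90123
# Braces stop the shell from treating "photonumber_original" as one variable.
echo "http://ic.pics.livejournal.com/$username/$usernumber/$photonumber/${photonumber}_original.jpg"
```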
Have a look at your newest upload, note the $photonumber.
While logged in, export your cookies.txt (see previous post, basically find plugin for your browser).
Plug $username and $usernumber into the script, and set $maxphotonumber to a number greater than your newest $photonumber.
Run script.
#!/bin/bash
# Brute-force every photo number from 0 up to maxphotonumber and grab
# whatever _original image exists at that number.
username=your_user_name
usernumber=your_user_number
maxphotonumber=your_max_number

for (( c=0; c<=maxphotonumber; c++ )); do
    wget --load-cookies cookies.txt -erobots=off -nd -np -r \
        "http://ic.pics.livejournal.com/$username/$usernumber/$c/${c}_original.jpg"
done
The script simply checks every single number between zero and your maximum number. If an image exists at that number, it saves it under the same name. It only checks for the _original images, and it saves everything into one directory. It is not optimised, but it should get everything. Most of the flags aren't needed, but I think they show just how much patience I lost with this whole thing.
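Since the loop fetches one number at a time, one possible speed-up (not the method above, just a hedged sketch) is to fan the fetches out in parallel with xargs. Shown as a dry run with placeholder values: each wget command is echoed rather than executed; drop the echo to download for real.

```shell
# Hypothetical sketch: same brute-force idea, but up to 8 fetches at once
# via xargs -P. Dry run: commands are printed, not executed.
username=exampleuser
usernumber=12345678
maxphotonumber=10
seq 0 "$maxphotonumber" | xargs -P 8 -I{} echo \
    wget --load-cookies cookies.txt -nd \
    "http://ic.pics.livejournal.com/$username/$usernumber/{}/{}_original.jpg"
```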
*edit* About 14 hours to get over 1,000 pictures in a 280,000 number range.
*edit2* If you get a few unopenable files, try different format extensions.
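For those unopenable files, the retry might look like the sketch below: take the one photo number that came down broken and loop it over a few candidate extensions (jpg, png, gif are my guesses; all values are placeholders). It prints the URLs it would fetch; uncomment the wget line to actually download.

```shell
# Hypothetical sketch: retry one photo number across several extensions.
username=exampleuser
usernumber=12345678
c=90123   # the photo number whose saved file would not open
for ext in jpg png gif; do
    echo "http://ic.pics.livejournal.com/$username/$usernumber/$c/${c}_original.$ext"
    # wget --load-cookies cookies.txt -nd "http://ic.pics.livejournal.com/$username/$usernumber/$c/${c}_original.$ext"
done
```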