Matchu
35713069fa
I like running the full `archive:create` to help us be _confident_ we've got the whole darn thing, but it takes multiple days to run on my machine and its slow HDD, which… I'm willing to do _sometimes_, but not frequently. But if we had a version of the script that ran faster, and only on URLs we still _need_, we could run that more regularly and keep our live archive relatively up-to-date. This would enable us to build reliable fallback infra for when images.neopets.com isn't responding (like today lol)! Anyway, I stopped early in this process because images.neopets.com is bad today, which means I can't really run updates today, lol :p But the delta-ing stuff seems to work, and takes closer to 30min to get the full state from the live archive, which is, y'know, still slow, but will make for a MUCH faster process than multiple days, lol
3 lines
No EOL
163 B
Bash
Executable file
# Run archive:create:download-urls, but using our delta URLs file specifically.
URLS_CACHE="$(dirname "$0")/urls-cache-delta.txt" \
  yarn archive:create:download-urls
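The "delta-ing stuff" mentioned in the commit message isn't shown here; the script above only consumes the resulting `urls-cache-delta.txt`. As a minimal sketch of one way such a delta could be computed with `comm`, assuming a full `urls-cache.txt` and a hypothetical `urls-archived.txt` listing URLs already present in the live archive (both names are assumptions, not the repo's actual files):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Sketch only: urls-cache.txt and urls-archived.txt are assumed names.
# Only urls-cache-delta.txt appears in the actual script above.
workdir="$(mktemp -d)"
trap 'rm -rf "$workdir"' EXIT

# Stand-in sample data: the full URL cache, and URLs already archived.
printf '%s\n' \
  'https://images.neopets.com/a.png' \
  'https://images.neopets.com/b.png' \
  'https://images.neopets.com/c.png' > "$workdir/urls-cache.txt"
printf '%s\n' 'https://images.neopets.com/b.png' > "$workdir/urls-archived.txt"

# comm needs sorted input; -23 keeps lines unique to the first file,
# i.e. the URLs we still need to download.
sort -u "$workdir/urls-cache.txt" > "$workdir/all-sorted.txt"
sort -u "$workdir/urls-archived.txt" > "$workdir/have-sorted.txt"
comm -23 "$workdir/all-sorted.txt" "$workdir/have-sorted.txt" \
  > "$workdir/urls-cache-delta.txt"

cat "$workdir/urls-cache-delta.txt"
```

With the sample data above, the delta comes out as just `a.png` and `c.png`: a much smaller download set than re-fetching the whole cache, which is what makes the faster, more frequent runs possible.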