Not quite what you're asking for, but I wrote a Python 3 script, dynscrape, that uses the Beautiful Soup 4 module to build a URL list for "wget -i".
#!/usr/bin/env python3
import bs4, re, sys, urllib.request
# Get the URL argument.
if len(sys.argv) != 2:
    print("usage: ./dynscrape <url>")
    raise SystemExit
url = sys.argv[1]
# Set cookie to bypass age check: "Cookie: ageVerify=1"
req = urllib.request.Request(url)
req.add_header('Cookie', 'ageVerify=1')
# Download the page text.
page_text = urllib.request.urlopen(req).read()
# Parse the page, naming the parser explicitly so bs4 doesn't guess.
soup = bs4.BeautifulSoup(page_text, 'html.parser')
# In the contents div, find anchors whose href ends in .zip.
zipre = re.compile(r'\.zip$')
for div in soup.body.find_all('div', 'contents'):
    for link in div.find_all('a'):
        href = link.get('href', '')
        if zipre.search(href):
            print("http://dynasty-scans.com", href, sep='')
Run it with a URL:
$ ./dynscrape http://dynasty-scans.com/reader/series/blarg >blarg-list
$ wget -i blarg-list
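If a download gets interrupted, you can re-run the same list with "wget -c -i blarg-list"; -c makes wget resume partial files instead of starting over.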