RipMeApp / ripme

Downloads albums in bulk


Submit requests for new sites to support here

metaprime opened this issue · comments

Assume all links are NSFW (unless marked SFW).

Commenters: Please mark your NSFW links anyway.

Please open a new issue if you're reporting a problem in the ripper for a site we already support. This issue is to request support for new sites.

PLEASE include an example URL of the type of album to rip.

This issue replaces 4pr0n/ripme#8 and 4pr0n/ripme#502 which had gotten too huge to load or peruse in a reasonable amount of time.
This will give an opportunity to actually look at the older requests without new requests coming in to make that harder, and will help surface the more recent requests.

4pr0n/ripme#510 (now #43) will track received requests and aggregate links to the work in progress on each request.

Already-supported sites: https://github.com/4pr0n/ripme/wiki/Supported-Sites

NSFW. Copy/pasting a previous request:

Site Suggestion: http://cfake.com/ (NSFW)
Example: http://cfake.com/picture/Olivia-Munn/969

Things I've noted already that should help make implementation easier:

- The actress's name is completely irrelevant; it can be changed to anything as long as the ID at the end stays the same.
- The first page's URL can either end after the actress's ID, or with a /p0 after the ID.
- Every subsequent page is specified with /p{INTEGER}, seemingly always in increments of 30 (the number of thumbnails displayed).
- Every thumbnail's IMG tag has an A tag as its parent. The A tag's link is something like:
  javascript:showimage('/big.php?show=12843193425f90a741_cfake.jpg&id_picture=97478&id_name=969&p_name=Olivia Munn','546','800')
- The show parameter from the A tag's link can be appended to cfake.com/photos/ to get a direct link to the image.
- The site also has a "slideshow" feature, which would be another (possibly easier) method to implement:
  - The links look like this: http://cfake.com/show/Olivia%20Munn/969/p{INTEGER}#here, where INTEGER is the index of the picture (starting at 0).
  - This mode also seems to link directly to all images, so there would be no need for extra steps to get the link as in the other method.
- It would seem that while the slideshow method as a whole would be simpler to implement, the main method would likely be faster, because only one page needs to be loaded and scraped per 30 images, as opposed to loading an entire page for every image.

This one should be relatively easy to implement (compared to some other, more complicated sites).
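Under the assumptions above, the page URLs and the showimage() extraction could be sketched like this (Python; the name segment in page_url is an arbitrary placeholder, since the server reportedly ignores it):

```python
import re
from typing import Optional

def direct_image_url(href: str) -> Optional[str]:
    """Pull the `show` parameter out of a javascript:showimage(...)
    link and append it to /photos/ to get a direct image URL."""
    m = re.search(r"show=([^&']+)", href)
    return "http://cfake.com/photos/" + m.group(1) if m else None

def page_url(actress_id: int, page: int) -> str:
    """Build the URL of page `page`; pages advance in steps of 30
    (the thumbnail count), i.e. p0, p30, p60, ..."""
    # The name segment is irrelevant to the server; only the ID matters.
    return f"http://cfake.com/picture/x/{actress_id}/p{page * 30}"
```

direct_image_url() returns None when an anchor doesn't carry a show parameter, so non-thumbnail links can be skipped.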

Site Suggestion: http://www.bobx.com/ (NSFW)
http://www.bobx.com/dark/ | http://www.bobx.com/av-idol/index_all_pictures.html | etc.
All of these end up at http://www.bobx.com/av-idol/[name]

Example: http://www.bobx.com/av-idol/ruru-anoa/

Each av-idol has photosets:
Example : http://www.bobx.com/av-idol/ruru-anoa/photoset/graphis-gals-_194---exclaiming-0-2-8.html

Photo link: http://www.bobx.com/av-idol/ruru-anoa/large-ruru-anoa-471704/
Direct download when clicking HI-RES (if available): http://www.bobx.com/av-idol/ruru-anoa/large-ruru-anoa-471704/ -> http://www.bobx.com/av-idol/ruru-anoa/ruru-anoa-00471704.jpg

What I want is to download all images from a photoset link and rename them to <title>
Example : Ruru Anoa in Graphis Gals #194 - Exclaiming - @0 of 96

Thanks

Does this program work with only the sites on the list?

For example, I was able to pull the images from http://www.ericryananderson.com/ with Bulk Image Downloader. Can we force the program to detect pictures on any site?

@forsomefun

Does this program work with only the sites on the list?

Yup

Can we force the program to detect pictures on any site?

Nope

Is 4archive feasible anymore? Thanks!

Site: http://4archive.org/
Example: https://4archive.org/board/hr/thread/2770629 (NSFW)

^

Is 4archive feasible anymore? Thanks!

< under breath > Damn, look at the size of that place O.o ... It's not a 4chan archive for nothing.

site: imageshack.com
example: https://imageshack.com/a/vsWl/1

@Fff95 Support for theyiffgallery albums was added in 1.6.11

@handsomebananas Support for pichunter albums was added in 1.6.11

@cyian-1756 Thanks, I was using 1.6.10, which was the latest update when I tried it out :P

@bvcyt we support vidble albums but apparently not video links. We should probably add support for that.

(NSFW i guess)

Site: https://www.wikifeet.com
Folder images for celebrities example: https://www.wikifeet.com/Nathalie_Kelley

Each image has its own directory
Example: https://pics.wikifeet.com/Nathalie-Kelley-Feet-303524.jpg

This is the website I log into every day to save images manually; it would be a miracle if RipMe supported this.

Thank you!!

Is it possible to add support for downloading albums from http://imgsrc.ru/main/pic.php?ad=1861831?

Any chance of adding support for Behance or Dribbble? I req this in another thread, sorry for the duplicate.

Examples:

https://dribbble.com/typogriff
https://www.behance.net/illuleo

SFW http://www.cartoon3rbi.net (in Arabic)

Examples:

The good thing is that the links always follow the same pattern as the examples, so it's very easy to build a regex to validate them.

Site Suggestion: https://img.yt/
Example (NSFW): https://img.yt/gallery-216050.html
Every image opens to a new page, example: https://img.yt/img-5a1ef3ef2fefa.html

Any chance of adding support for i.thechive?

https://i.thechive.com/
https://i.thechive.com/jesseh89

Are there any plans to fix the Tumblr rip capabilities?

Whenever I try to rip a Tumblr site, I get the following error (the file links vary but just including one as an example):
NSFW

Downloading http://78.media.tumblr.com/678b5dcc8e4f4efd253f7c5a32b4ff6c/tumblr_p0x71gD5AK1wkeb5no9_1280.jpg
http://78.media.tumblr.com/22c8a8330bc33e53045f1fa1eaa7d92c/tumblr_p0wtfrizjW1wb6kqqo8_1280.jpg : Non-retriable status code 400 while downloading http://78.media.tumblr.com/22c8a8330bc33e53045f1fa1eaa7d92c/tumblr_p0wtfrizjW1wb6kqqo8_1280.jpg

@MrBiggz007

Are there any plans to fix the Tumblr rip capabilities?

There's already a PR for it #340

Site: artstation.com (sometimes NSFW)
Example (user profile): https://www.artstation.com/roldan
Example (individual artwork): https://www.artstation.com/artwork/baOvG

@m2afs Support for pornpics.com was added in 1.7.10

@keriorangejuice I'll start working on that.

@cyian-1756 thx for the update.
Site: https://www.pornpics.com/
There are two layers to the site; I'd appreciate it if you added support for the upper layer as well, e.g. categories, pornstars, search queries.
Is it possible to fetch those lists and then get the albums?

Suggestion: https://clips.twitch.tv/
Example Collection on Reddit: https://www.reddit.com/r/LivestreamFail
Example Video: https://clips.twitch.tv/FaithfulIncredulousPotTBCheesePull

Thanks for the great tool by the way!

@waqas
I gotchu fam. It's a quarter done already.

Site: https://8ch.net
Example: https://8ch.net/wg/index.html

I've noticed that 8ch will work with RipMe when downloading a single thread, since the URL contains /res/, but trying to download a whole board at a time won't work because the URL doesn't contain "chan".
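A sketch of that URL distinction (the board-name character set and the exact path shapes are assumptions):

```python
import re

# Threads carry /res/ in the path; board indexes don't.
THREAD_RE = re.compile(r"^https?://8ch\.net/([a-z0-9]+)/res/(\d+)\.html")
BOARD_RE = re.compile(r"^https?://8ch\.net/([a-z0-9]+)/(?:index\.html)?$")

def classify(url: str) -> str:
    """Label an 8ch URL as a single thread, a whole board, or unknown."""
    if THREAD_RE.match(url):
        return "thread"
    if BOARD_RE.match(url):
        return "board"
    return "unknown"
```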

Keep up the good work

NSFW
Site: http://dites.bonjourmadame.fr/
example: http://dites.bonjourmadame.fr/image/170374587324

The site appears to be tumblr based or very similar, but isn't supported.

Team, thanks immensely for the updates.

If possible, can you add Behance to the lineup of sites to rip?

Site: https://www.behance.net/
Ex: https://www.behance.net/gallery/61432777/PAUSEFEST-2018

Some sites have mixed formats like .gif, .mpeg, and .png/.jpeg. I would love it if RipMe could rip the whole gallery.

Thanks for the hard work, keep it up!!

NSFW
Site: https://girlsreleased.com
Example (Set): https://girlsreleased.com/?h=set/22517#set/22517
Example (Model): https://girlsreleased.com/#site/hegre-art.com/model/1308/Yara

Hidden in the page there is a div (id="share_html") with a textarea containing all the image links (imagetwist).
I wanted to download the sets organized into separate folders from the Model page.
On the Model page, additional sets are loaded on scroll (AJAX calls).

This is a template for another bulk image downloader program, it could be useful:
http://www.webimagedownloader.com/templates/2-GirlsReleased.com-sets-and-models-template/
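Scraping that hidden textarea could look roughly like this (a sketch; it assumes the id="share_html" div wraps a single textarea holding one link per line, as described above):

```python
import re

def extract_share_links(page_html: str) -> list[str]:
    """Return the image-host links stored in the hidden
    share_html div's textarea, or [] if it isn't present."""
    m = re.search(
        r'id="share_html".*?<textarea[^>]*>(.*?)</textarea>',
        page_html, re.S)
    if not m:
        return []
    return re.findall(r'https?://[^\s"<]+', m.group(1))
```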

Thanks ;)

SFW/NSFW
Site: https://pawoo.net/
Based on https://github.com/pixiv/mastodon/tree/pawoo

What is seen on the site is built by JS, so the actual content is in the MediaGallery div:
<div data-component='MediaGallery' data-props='{image;path;is;somewhere;here;&quot;}'>
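Decoding that attribute could be sketched as follows (the data-props payload is HTML-escaped JSON; the media/url key names inside it are assumptions, since the snippet above only paraphrases the real structure):

```python
import json
import re
from html import unescape

def media_urls(page_html: str) -> list[str]:
    """Collect media URLs from every MediaGallery component's
    data-props attribute (assumed layout: {"media": [{"url": ...}]})."""
    urls = []
    pattern = r"data-component='MediaGallery' data-props='([^']+)'"
    for m in re.finditer(pattern, page_html):
        props = json.loads(unescape(m.group(1)))
        urls.extend(item["url"] for item in props.get("media", [])
                    if "url" in item)
    return urls
```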

NSFW
Website: http://pornreactor.cc
Example: http://pornreactor.cc/tag/Porn+Art
Download all images with that tag.

Thanks ;)

@MachinaeWolf Both https://www.hairfinder.com/amp/hairstyles.htm and http://animals.timduru.org/dirlist/bear/ are open directories, which ripme has no plans to support; try downloading them using wget.

Could you please add support for http://pixiv.net?

NSFW
Site: https://exhentai.org/
Example: https://exhentai.org/g/1127876/4b75354336/
You need to be logged into E-Hentai.org before going to the site, otherwise you get a "sad panda" instead. The format should be similar to e-hentai.org.

NSFW
Site https://sta.sh
Example: https://sta.sh/21uap7sqyyg2

It's basically a dA reskin that just doesn't have public art; you need a link to access an album. They're usually posted in the description of art on dA. Any chance ripme can check descriptions and just add them to the queue automatically?

Site https://sta.sh

I can tell it'll be a fairly simple task, as I've checked the source code.

@thepulls

Can you post the link to a sta.sh album with more than 1 page?

@thepulls

Are any sta.sh albums divided into pages (Where you have to click a link to get the next page of thumbnails) or are all just one page?

@cyian-1756

No, from what I've seen they are all on one page. I've never seen a sta.sh album with multiple pages.

Also, out of curiosity, what is the practicality of ripme adding sta.sh albums to the queue from image descriptions?

Also, out of curiosity, what is the practicality of ripme adding sta.sh albums to the queue from image descriptions?

I have no idea. Taking a quick look, it seems it would be possible to add any links from the description to the queue, but I'd have to take a more in-depth look at the code. I'll check it out and get back to you once the sta.sh ripper is finished.

One thing you could do is check the page for a ZIP DL option (https://sta.sh/zip/2hn9rtavr1g) in the case of multiple uploads per sta.sh link. If it exists, DL the ZIP and extract it; otherwise get the file (pic, GIF, ZIP/RAR/etc.) through the DL button, but be prepared for pages without a DL button.

One thing you could do is check the page for a ZIP DL option (https://sta.sh/zip/2hn9rtavr1g)

Ripme can't handle zip files at this point and I don't think that's a feature that needs to be added

otherwise get the file (pic, GIF, ZIP/RAR/etc., ...) through the DL button

That's what I'm planning on

but be prepared for pages without a DL button.

Are there albums without DL buttons? If so could you link one?

^ Even if RipMe won't get ZIP handling support you can still DL the ZIP.

Nah, I haven't seen (AFAIR) a sta.sh upload which doesn't have a DL button.

@thepulls

I finished the ripper and made a PR for it (#460); it should be out in ripme 1.7.24

Site: vk.com
They have their own API: https://vk.com/dev/wall
The requested feature is to download all photos from a group/public page (this is an example of a public page: https://vk.com/siberiawear). All groups and publics on VK have a feed structure.
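A request against the documented wall.get method might be built like this (a sketch; the access token has to be obtained separately, and the API version string is an assumption, so use whatever is current). Photos would then be pulled from each post's attachments, paging with offset since wall.get returns at most 100 posts per call:

```python
from urllib.parse import urlencode

API_VERSION = "5.131"  # assumed; use whatever VK API version is current

def wall_get_url(domain: str, access_token: str,
                 offset: int = 0, count: int = 100) -> str:
    """Build a wall.get request URL for a group/public page;
    wall.get returns at most 100 posts per call, so callers
    advance `offset` to walk the whole feed."""
    params = {
        "domain": domain,  # e.g. "siberiawear"
        "offset": offset,
        "count": count,
        "access_token": access_token,
        "v": API_VERSION,
    }
    return "https://api.vk.com/method/wall.get?" + urlencode(params)
```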

@nyaasi Can you provide an example album for pawoo.net?

@linuxftw100

The femjoyhunter ripper is done and should be in ripme 1.7.24

@momoe

The sinfest ripper is done and should be in ripme 1.7.24

@cyian-1756 Thank you for the sinfest update. It's much appreciated.

NSFW
site: imgpile.com
example: https://imgpile.com/album/LZG
The site requires you to log in to view NSFW content, however.

Please could you consider adding the following sites to the list of supported ones?
NSFW
sites: Rule34.xxx and Rule34hentai
https://rule34.xxx/
Examples: https://rule34.xxx/index.php?page=post&s=list
https://rule34.xxx/index.php?page=post&s=list&tags=legoman

https://rule34hentai.net/
Examples: https://rule34hentai.net/post/list/Legoman/

NSFW image board request

Site: http://anon-ib.su/
Topic Example: http://anon-ib.su/c/
Thread Example: http://anon-ib.su/c/res/49217.html

This image board has multiple domains, but .su and .to seem to be the most reliable.

For me, I would like to download everything under the /c topic without having to specify its individual threads, since they come and go, but some people may want to download specific threads only.

For example, entering the URL http://anon-ib.su/c/ would download everything under it into a similar folder structure and naming scheme.

Can you please add support for
Site: 9gag.com
It's a very popular site that many of you will already know; support would be greatly appreciated by 9gag's vast community.
Thanks in advance

Hello, can you add https://hiveworkscomics.com/, http://jb2448.info/, and https://www.canterlotcomics.com/ please? They are webcomic sites. Thanks.

Example pages: http://www.blackbrickroadofoz.com/comic/cover/, http://jb2448.info/index.php?/category/1231, and https://www.canterlotcomics.com/chap/en/friendship_is_magic/introduction-609

It would be nice to have support for manga/comic books websites if possible as well :)

I'd like to ask for https://www.artstation.com/

Here are a couple of random galleries with thumbnails that in some cases have mini sub-galleries:
https://www.artstation.com/duongnguyenminhthuan
https://www.artstation.com/yankyohara

Would it be possible to add support for
Site: ( NSFW) https://realsexclips.com/
Example: (NSFW) https://realsexclips.com/video/asmr-pussy-close-masturbation-sounds/

@night2819

It would be nice to have support for manga/comic books websites if possible as well :)

Post the sites and I'll add support

@night2819

Support for http://www.blackbrickroadofoz.com was added in 1.7.34

Requesting a pretty popular site, with fairly straightforward format:

Hitomi.la
example: https://hitomi.la/galleries/1212321.html
and check out how straightforward the image links themselves are:
https://0a.hitomi.la/galleries/1212321/0008.jpg
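Given a gallery ID and page count, the image links could then be enumerated like this (a sketch; the 4-digit zero padding, the fixed .jpg extension, and the stable 0a subdomain are all assumptions drawn from that single example):

```python
def hitomi_image_urls(gallery_id: int, page_count: int,
                      subdomain: str = "0a") -> list[str]:
    """Enumerate direct image URLs for a gallery, following the
    pattern https://<sub>.hitomi.la/galleries/<id>/<NNNN>.jpg."""
    return [
        f"https://{subdomain}.hitomi.la/galleries/{gallery_id}/{i:04d}.jpg"
        for i in range(1, page_count + 1)
    ]
```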

Site: https://www.wikifeet.com
Folder images for celebrities example: https://www.wikifeet.com/Nathalie_Kelley

Each image has its own directory
Example: https://pics.wikifeet.com/Nathalie-Kelley-Feet-303524.jpg

Is there a way to download a full folder link, with all images? Would it be possible to add this to RipMe?

Thank you

@stalwartmonkey Support for Hitomi.la was added in 1.7.38

I would definitely like to see some compatibility for other websites that host large amounts of images. For example;

https://www.alpha.wallhaven.cc (SFW yet NSFW also)
https://www.artsfon.com (SFW yet NSFW also)
https://www.metarthunter.com (NSFW)

Also, is there any way of adding a script that downloads all video content from a specific YouTube channel?
https://www.youtube.com

EDIT: Also a possible addition to download mass media from social media such as Facebook pages/ messages

Sites that host hundreds of thousands of wallpapers

I just want to bring up another SFW design oriented website, Behance.com, something I've requested earlier.

The addition of the dribbble.com domain was fucking epic!

Any progress on adding this domain to the ripper would be AMAZING!

Just a question
Any chance of ripme adding whentai to the list? And, if so, could it download paid art?

@Nextshot

Any chance of ripme adding whentai to the list?

Can you link the site

And, if so, could it download paid art?

I imagine it could if you paid for the art

Also, would you be able to add a wordlist option so a bunch of links can be added at one time? It would benefit everyone, I'd think. Thanks!

When trying to download a tumblr archive for example

http://tumblraccountnamegoeshere.tumblr.com/archive

ripme returns the error shown in the attached "ripme error" screenshot.

@JustAnotherUserAccount If you're willing to use command line, you can already use the "-f" flag to specify a file.

@sierrax if you're wanting that specific audio file then just look through the source code, here it is any way straight from the code itself - https://soundgasm.net/sounds/8d18db576d00bbbe542646380d2ae5ce474bb819.m4a

@JustAnotherUserAccount Oh, I know, that's easy enough. I'm more looking for a way to rip subreddits with a lot of content from there. That may be a different issue entirely.

@sierrax should be easy enough to implement. You'll have to wait about a week though.

So I mentioned this a very long time ago in a comment or PM. I was wondering what it would take to implement a sign-in session for an Instagram account so I could rip every single account I follow automatically? I follow over 2 thousand accounts, and I am only going to keep going.

My envisioned project is to archive much of who I follow. I want a historical record. I want to be one of those rare people who has a reference point in time, so I can see who deleted their accounts and who wiped all their images.

Is there any way this could be implemented?

Also, thank you guys for releasing and creating a tool like this. Absolutely phenomenal.

I was wondering what it would take to implement a sign-in session for an Instagram account so I could rip every single account I follow automatically?

It would require ripme to use the Instagram API

Is there any way this could be implemented?

It could be