iheanyi / bandcamp-dl

Simple python script to download Bandcamp albums

Download full discography.

arnauldb opened this issue

Hello,

Is it possible to download a full discography? E.g., if I wish to download all the albums from https://hotcasarecords.bandcamp.com/music

Thank you.

It is a feature I plan to add in the near future as a sort of final feature update.

I'd like to add one extra request here if possible. It would be great to keep a file (txt, whatever) of artist pages to monitor.

Then run bandcamp-dl in a screen session, as a service, or from cron.

Once launched, the txt file is processed; if anything new was posted on the followed artists' page(s), it is automatically downloaded.

This would automate the above process quite a bit.
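
In the meantime, something along these lines could serve as a stopgap. It's an untested sketch: the watchlist file, the seen-URLs file, and the grep for /album links are all my own assumptions, not anything bandcamp-dl provides.

#!/bin/bash
# Sketch of a watcher: read artist pages from a watchlist file, find their
# album links, and download anything not seen before. Run it from cron, e.g.:
#   0 * * * * /path/to/bandcamp-watch.sh
# WATCHLIST and SEEN are made-up names; one URL per line in each.
WATCHLIST="$HOME/.bandcamp-watchlist"   # e.g. https://hotcasarecords.bandcamp.com/music
SEEN="$HOME/.bandcamp-seen"
touch "$SEEN"

while read -r artist; do
  [ -z "$artist" ] && continue
  base="${artist%/music}"               # strip a trailing /music, if present
  base="${base%/}"                      # and a trailing slash
  # Pull the relative /album/... links off the artist page.
  curl -s "$artist" | grep -o 'href="/album/[^"]*"' | cut -d'"' -f2 | sort -u |
  while read -r path; do
    url="$base$path"
    if ! grep -qxF "$url" "$SEEN"; then
      echo "New release: $url"
      bandcamp-dl "$url" && echo "$url" >> "$SEEN"
    fi
  done
done < "$WATCHLIST"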

I'd also like to see this. Having issues with the other scripts that try to accomplish this. Keep up the great work!

Meanwhile, you can get all the album URLs from an artist page with this JavaScript code. Run it in your browser console while on the artist page, then put the resulting lines in a file and call bandcamp-dl in a loop.

JavaScript code to get album/track URLs

// Collect the first link of every album/track tile on the artist page.
var x = document.querySelectorAll("li.music-grid-item > a:nth-child(1)");
var i, text = "";
for (i = 0; i < x.length; i++) {
  text += x[i].href + "\n";   // .href yields the full, absolute URL
}
console.log(text);            // copy the output into a file, e.g. "download"

Bash code to loop over file

for i in $(cat download); do bandcamp-dl "$i"; done

@Evolution0 Any progress? I'd also use the feature and could add it but if you already have something started I'll wait.

It would be great just to have something like "youtube-dl -a URLS.txt" to run it on a file full of album URLs. I will often collect album URLs in a file while sampling, and then grab them all in one swoop. I like bcdl better than ytdl, but really wish it had this feature!
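
Until something like that lands, a tiny wrapper gets pretty close (just a sketch; urls.txt and failed.txt are made-up names, one album URL per line):

#!/bin/bash
# Batch mode in the spirit of "youtube-dl -a": read album URLs from a file and
# download each one. Blank lines and lines starting with # are skipped, and
# failures are logged so one bad URL doesn't abort the whole run.
while read -r url; do
  case "$url" in ''|"#"*) continue ;; esac
  bandcamp-dl "$url" || echo "failed: $url" >> failed.txt
done < "${1:-urls.txt}"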

@silkyclouds @alej0varas Great ideas. This batch script can be run from the Windows Task Scheduler, etc.

A simpler solution, but only tested on Linux. I think it can work on other platforms too, though you may need some adjustments:

#!/bin/bash
# Download every album linked from an artist page.

if [ -z "$1" ]; then
  echo "usage:"
  echo "    $0 <url>"
  echo ""
  echo "<url> can be domain.tld or domain.tld/music"
  exit 1
fi

# Strip any trailing slash from the argument.
base=$(echo "$1" | sed -e "s/\/$//")

# Fetch the artist page and pull out the relative /album/... links.
urls=$(curl -s "$base" -o /tmp/bandcamp_out; grep "href=\"/album" /tmp/bandcamp_out | cut -d"\"" -f2)

# Strip a trailing /music so the bare domain can be prepended to each link.
base=$(echo "$base" | sed -e "s/\/music$//")

for url in $urls; do
  echo "Downloading $(echo "$url" | sed -e "s/^\/album\///") album..."
  if ! bandcamp-dl "$base$url"; then
    echo "Are you sure that $base$url is a valid album url?"
    exit 1
  fi
done

EDIT: More specific sed rule.

I'd be happy to try implementing this, but I don't want to duplicate work. Would anyone else who's started on this like to open a PR, even a partial one?

In progress. I'm also adding wishlist support and possibly collections support, but since I don't have any content on Bandcamp I have no way of really seeing what the HTML looks like for collections, other than what the wishlist HTML suggests (likely the exact same HTML, just under the user's profile URL).

Looks like I'll be making a few new functions and generalizing others; the upside is I should also be able to add a feature to spit out just the URLs.

Progress has been made: I can now identify the content type from the URLs provided (collection, discography, wishlist, track, album). From there I can fetch the page data and send it to the right parsing function.
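
For what it's worth, the split by URL shape presumably looks something like this (a bash sketch of my guess at the patterns, not the actual implementation; the wishlist and collection paths in particular are assumptions):

#!/bin/bash
# Guess at classifying a Bandcamp URL by content type -- not the project's code.
classify() {
  case "$1" in
    *.bandcamp.com/album/*)    echo album ;;
    *.bandcamp.com/track/*)    echo track ;;
    *.bandcamp.com/music|*.bandcamp.com|*.bandcamp.com/) echo discography ;;
    *bandcamp.com/*/wishlist*) echo wishlist ;;    # assumed fan wishlist path
    *bandcamp.com/*)           echo collection ;;  # assumed fan profile path
    *)                         echo unknown ;;
  esac
}

classify "https://hotcasarecords.bandcamp.com/music"       # -> discography
classify "https://hotcasarecords.bandcamp.com/album/foo"   # -> album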

Any update here?

Another way: get all the album URLs with Lynx (https://wiki.debian.org/Lynx). Adjust as you prefer.

#!/bin/bash
## Dependencies: sudo apt install -y lynx && sudo pip3 install bandcamp-downloader ##
echo "Paste a Bandcamp URL, e.g. https://ARTIST.bandcamp.com/music"
read -r url
# Dump every link on the page and keep only the album URLs.
lynx -dump -listonly -nonumbers "$url" | grep "bandcamp.com/album" > bandcamp-link.txt ; sleep 1
for album in $(cat bandcamp-link.txt); do bandcamp-dl -r --template "%{artist} - %{album}/%{track} - %{title}" "$album"; done

massive-bandcamp-dl.sh

#!/bin/bash
# Download every album/track linked from a Bandcamp page.
listfilename=$(date +%F%H%M)
finalfilename=$RANDOM
url=$1
if ! [[ $url =~ https?://.*\.bandcamp\.com.* ]]; then
  echo "Use a valid Bandcamp URL..."
else
  # Save the page, then extract the href targets of the album/track links.
  wget --convert-links "$url" -O "/tmp/$listfilename.txt"
  grep -e "/album" -e "/track/" "/tmp/$listfilename.txt" | sed -n 's/.*href="\([^"]*\).*/\1/p' > "/tmp/$finalfilename.txt"
  wc -l "/tmp/$listfilename.txt"
  rm "/tmp/$listfilename.txt"
  wc -l "/tmp/$finalfilename.txt"
  while read -r LINE; do
    bandcamp-dl "$LINE"
  done < "/tmp/$finalfilename.txt"
  rm "/tmp/$finalfilename.txt"
fi

bandcamp-downloader.sh


#!/bin/bash
# Download a single album with yt-dlp into an "Artist-Album" folder.
if ! [[ $1 =~ https?://.*\.bandcamp\.com.* ]]; then
  echo "Use a valid Bandcamp URL..."
else
  # Build the directory name from the first track's metadata.
  AAName=$(yt-dlp --get-filename --playlist-items 1 -o '%(artist)s-%(album)s' --output-na-placeholder '' "$1")
  FixedAAName=${AAName// /-}
  echo "$FixedAAName"
  mkdir -p "$FixedAAName"
  #yt-dlp -o "$FixedAAName/%(artist)s-%(album)s-%(playlist_index)s-%(track)s.%(ext)s" "$1"
  yt-dlp -o "$FixedAAName/%(playlist_index)s-%(track)s.%(ext)s" "$1"
fi

Awesome. Finally managed to download all of my purchased albums. Thanks, dudes.