packt-cli / Packt-Publishing-Free-Learning

Scripts that automatically claim and download free daily eBooks from https://www.packtpub.com/packt/offers/free-learning

Fails to grab a book

ranrinc opened this issue · comments

Hi there,

It seems that the script fails to grab the book even though it says it completed.
It fails to find the book that is supposed to have been grabbed already.

After grabbing the book manually and running the script again, it finds the right book and downloads it.

Maybe Packt changed the HTML again?

(screenshot attached: error)

I tested it again today and can confirm that the current script is unable to grab the new book. It says it is getting your book data and reports a successful fetch, but then fails, so it ends up not finding the current book.

I had this problem too - it seems there is a delay at Packt between claiming an ebook and it becoming available for download.
I applied a quick'n'dirty fix by adding a time.sleep before the "ebook.download_books" call in packtPublishingFreeEbook.py, but there might be a better way to deal with this.
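
For reference, the workaround amounts to something like the sketch below; the 60-second delay is an arbitrary guess, and the download call shown is only a placeholder for the one already in the script (its real arguments stay unchanged):

import time

# Wait before downloading so Packt's backend has time to make the freshly
# claimed ebook available; 60 seconds is a guess at the propagation delay.
time.sleep(60)
ebook.download_books()  # placeholder for the existing call in packtPublishingFreeEbook.py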

I've never encountered this problem since the script modification in January (I have a cron job set up to run every day). Since it happens to you, we can pass product_id (the ID of the claimed product) directly to download_books instead of the titles parameter; we'll see whether it helps, and it's a more elegant solution anyway. I'll make a pull request.
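
Roughly what I have in mind (a sketch only - the helper that claims the book and the exact download_books parameters are assumptions, and the real names in the script may differ):

# Download by the ID returned when the book is claimed, instead of looking the
# book up again by title (the title lookup is what can lag behind the claim).
product_id = claim_free_ebook()              # hypothetical helper returning the claimed product's ID
ebook.download_books(product_id=product_id)  # pass the ID directly instead of the titles parameter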

Does the problem you're describing happen often? Was it a single occurrence, or does it happen regularly?

@mjenczmyk On my Ubuntu machine it fails to retrieve the book every time. No luck so far. The only way is to grab the book manually and then run the script; then it is able to find the book and download it. So I would like to get some assistance with this issue.

Try installing the script using pip install git+https://github.com/mjenczmyk/Packt-Publishing-Free-Learning.git@id. If that doesn't help, what is the output of pip freeze from the virtualenv? Which version are you using currently?

@mjenczmyk I did pip install and I believe it is the latest version. I will try pip freeze.

I've merged the pull request (it won't break anything, but it may not resolve your problem).

Any news, did it help? (I don't think it will have helped.)

I dropped my local time.sleep(60) patch and checked out master (7c0b264); unfortunately, it did not work:

2019-03-25 01:00:04,855 - api - [INFO] - JWT token has been fetched successfully!
2019-03-25 01:00:04,856 - packtPublishingFreeEbook - [INFO] - Start grabbing ebook...
2019-03-25 01:00:05,163 - packtPublishingFreeEbook - [INFO] - Started solving ReCAPTCHA on Packt Free Learning website...
2019-03-25 01:00:06,694 - utils.anticaptcha - [INFO] - Waiting for completion of the xxx task...
2019-03-25 01:00:49,096 - utils.anticaptcha - [SUCCESS] - Solution found for xxx task.
2019-03-25 01:00:49,446 - packtPublishingFreeEbook - [INFO] - A new Packt Free Learning ebook "Mastering Kubernetes" has been grabbed!
2019-03-25 01:00:49,513 - packtPublishingFreeEbook - [INFO] - Title: "Mastering Kubernetes"
2019-03-25 01:00:49,517 - packtPublishingFreeEbook - [INFO] - Downloading ebook: "Mastering Kubernetes" in epub format...
2019-03-25 01:00:49,963 - packtPublishingFreeEbook - [ERROR] - Invalid URL 'None': No schema supplied. Perhaps you meant http://None?
2019-03-25 01:00:49,964 - packtPublishingFreeEbook - [INFO] - Title: "Mastering Kubernetes"
2019-03-25 01:00:49,965 - packtPublishingFreeEbook - [INFO] - Downloading ebook: "Mastering Kubernetes" in pdf format...
2019-03-25 01:00:50,264 - packtPublishingFreeEbook - [ERROR] - Invalid URL 'None': No schema supplied. Perhaps you meant http://None?
2019-03-25 01:00:50,264 - packtPublishingFreeEbook - [INFO] - 0 ebooks have been downloaded!
2019-03-25 01:00:50,265 - packtPublishingFreeEbook - [SUCCESS] - Good, looks like all went well! :-)
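
The two "Invalid URL 'None'" errors look like what the requests library raises when it is given None instead of a real URL, which suggests the script never received a usable download link for the claimed ebook. A minimal reproduction, just to show where the message comes from (exact wording may vary slightly across requests versions):

import requests

# Passing None as a URL makes requests raise MissingSchema with the same
# message seen in the log above, so the download link was apparently None.
try:
    requests.get(None)
except requests.exceptions.MissingSchema as exc:
    print(exc)  # Invalid URL 'None': No schema supplied. Perhaps you meant http://None?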

Any idea? Thanks.

Did you run pip install -e '.[dev]' after checking out? What command did you run on the command line?

Yes, I did run pip install -e '.[dev]'. I am running packt-cli -gd via cron.

I have no idea what the problem is. Sorry, I'm afraid I'm unable to help with this.

It's a duplicate of #162; I'll try to resolve both issues soon.

Resolved by #166.