amueller / introduction_to_ml_with_python

Notebooks and code for the book "Introduction to Machine Learning with Python"

ImportError: `load_boston` has been removed from scikit-learn since version 1.2.

matmuttt opened this issue

Hello,
when I run this:

import mglearn
import pandas as pd
# X_train, y_train, and iris_dataset come from the book's earlier load_iris / train_test_split step (sketched below)
iris_dataframe = pd.DataFrame(X_train, columns=iris_dataset.feature_names)
pd.plotting.scatter_matrix(iris_dataframe, c=y_train, figsize=(15, 15), marker='o', hist_kwds={'bins': 20}, s=20, alpha=.8, cmap=mglearn.cm3)

I get an error in my terminal:
ImportError:
load_boston has been removed from scikit-learn since version 1.2.

So I can't reproduce the plot in Figure 1.3 (page 22) of the book...
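
(For completeness, a minimal sketch of the earlier book step that defines iris_dataset, X_train, and y_train:)

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split

# load the iris data and split it the same way the book does before plotting the scatter matrix
iris_dataset = load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris_dataset['data'], iris_dataset['target'], random_state=0)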

And I also have a problem when I import mglearn:
Traceback (most recent call last):
  File "f:/PROJECT/LIVRE/main.py", line 107, in <module>
    import mglearn
  File "C:\Users\starc\AppData\Local\Programs\Python\Python38\lib\site-packages\mglearn\__init__.py", line 1, in <module>
    from . import plots
  File "C:\Users\starc\AppData\Local\Programs\Python\Python38\lib\site-packages\mglearn\plots.py", line 5, in <module>
    from .plot_knn_regression import plot_knn_regression
  File "C:\Users\starc\AppData\Local\Programs\Python\Python38\lib\site-packages\mglearn\plot_knn_regression.py", line 7, in <module>
    from .datasets import make_wave
  File "C:\Users\starc\AppData\Local\Programs\Python\Python38\lib\site-packages\mglearn\datasets.py", line 5, in <module>
    from sklearn.datasets import load_boston
  File "C:\Users\starc\AppData\Local\Programs\Python\Python38\lib\site-packages\sklearn\datasets\__init__.py", line 156, in __getattr__
    raise ImportError(msg)

Hope you can help me!

It's unfortunate about the racist data. A quick workaround is to just heed the version warning:

load_boston has been removed from scikit-learn since version 1.2.

pip install "scikit-learn<1.2"
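
A quick sanity check after reinstalling (a minimal sketch; run it in a fresh interpreter or restarted kernel):

import sklearn
print(sklearn.__version__)   # should now report a version below 1.2, e.g. 1.1.3

import mglearn               # mglearn's "from sklearn.datasets import load_boston" works again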

One of the authors left some comments here: #163

If you're using Anaconda, run this:
conda install scikit-learn=1.1.3

and restart your kernel if you're using Jupyter notebooks.

If you update mglearn, this should be resolved now.
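
A minimal way to apply and verify that fix (assuming pip installs into the same environment your notebook uses):

pip install -U mglearn

Then, in a fresh Python session or restarted kernel:

import mglearn               # should import cleanly; the traceback above no longer reaches load_boston
import sklearn
print(sklearn.__version__)   # a current scikit-learn (>= 1.2) is fine once mglearn is updated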