zie225 / -machine-learning-on-streaming-data-using-Kafka-and-Docker

I have completed my first project: machine learning on streaming data using Kafka and Docker. You can check out my GitHub repository for the code.


Steps of the process:

1- install VS Code
2- install Anaconda
3- create a new environment with conda
4- install Docker
5- install docker-compose
6- download docker-compose.yaml (https://github.com/cigdemkadakoglu/apache-kafka)
7- docker-compose up -d
8- you can check the containers with docker ps -a or docker stats
9- create a topic
  • Creating the topic was the hardest part for me, because sh/bash and related concepts are far from my domain.
  • However, our container named kafdrop also has a create-topic function.
  • Open https://localhost:9000/ in your browser.
  • Creating a new topic there is the easier way.
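If you prefer to skip the Kafdrop UI, the topic can also be created programmatically. This is a minimal sketch, assuming the kafka-python package and a broker reachable at localhost:9092 (the topic name "iris-topic" is my own choice — match it to your setup):

```python
# Sketch: create a Kafka topic in code instead of through the Kafdrop UI.
# Assumes kafka-python is installed and a broker listens on localhost:9092.

def topic_spec(name, partitions=1, replication=1):
    """Validate and describe the topic before creating it (pure helper)."""
    if not name or "/" in name:
        raise ValueError("invalid topic name")
    return {"name": name,
            "num_partitions": partitions,
            "replication_factor": replication}

def create_topic(spec, bootstrap="localhost:9092"):
    # Imported lazily so topic_spec() stays usable without Kafka installed.
    from kafka.admin import KafkaAdminClient, NewTopic
    admin = KafkaAdminClient(bootstrap_servers=bootstrap)
    admin.create_topics([NewTopic(**spec)])
    admin.close()

if __name__ == "__main__":
    create_topic(topic_spec("iris-topic"))
```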
10- I love the pycaret library
   * setup()
   * create_model()
   * tune_model()
   * save_model()
     * The API docs are confusing here: you have to save the model with a path name, just like the pandas save functions.
   * load_model()
     * Likewise, when loading the model you must pass the path name.
   * How easy is that?
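The pycaret calls above can be sketched end to end. This is a sketch under assumptions, not the repo's exact notebook: I use pycaret's bundled iris dataset, a decision tree as the example estimator, and "iris_model" as the file name. The one pure helper shows the path-name gotcha mentioned above:

```python
# Sketch of the pycaret workflow described above (classification on iris).
# Assumes pycaret is installed; "iris_model" is a name I made up.

def pycaret_path(path):
    """save_model()/load_model() take the path WITHOUT the .pkl extension;
    pycaret appends it automatically, which is the confusing part."""
    return path[:-4] if path.endswith(".pkl") else path

if __name__ == "__main__":
    from pycaret.classification import (setup, create_model, tune_model,
                                        save_model, load_model)
    from pycaret.datasets import get_data

    data = get_data("iris")
    setup(data=data, target="species", session_id=42)
    model = create_model("dt")        # e.g. a decision tree
    model = tune_model(model)
    save_model(model, pycaret_path("iris_model.pkl"))  # writes iris_model.pkl
    reloaded = load_model("iris_model")                # again: no .pkl here
```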
11 - I created a random data generator named data_generator.py
      * I used the iris dataset; you can change this.
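A generator like this can be sketched in pure Python. The feature ranges below are rough iris-like bounds I picked for illustration — they are not the repo's data_generator.py:

```python
# Sketch of a data_generator.py: emits random iris-like observations as JSON.
# The ranges are illustrative approximations of the iris feature ranges.
import json
import random

FEATURES = ["sepal_length", "sepal_width", "petal_length", "petal_width"]
RANGES = {"sepal_length": (4.3, 7.9), "sepal_width": (2.0, 4.4),
          "petal_length": (1.0, 6.9), "petal_width": (0.1, 2.5)}

def generate_row(rng=random):
    """One random observation with the four iris feature columns."""
    return {f: round(rng.uniform(*RANGES[f]), 1) for f in FEATURES}

if __name__ == "__main__":
    for _ in range(5):
        print(json.dumps(generate_row()))
```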
12 - I used this module in producer.py
        * same topic, proper ports
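The producer side can be sketched as below, assuming kafka-python, a broker on localhost:9092, and the topic name "iris-topic" (both assumptions — use the same topic and ports as your compose setup):

```python
# Sketch of producer.py: serialize observations and send them to Kafka.
# Assumes kafka-python, a broker at localhost:9092, and topic "iris-topic".
import json

def encode(row):
    """Serialize one observation dict to the bytes Kafka expects."""
    return json.dumps(row).encode("utf-8")

if __name__ == "__main__":
    import time
    from kafka import KafkaProducer

    producer = KafkaProducer(bootstrap_servers="localhost:9092")
    while True:
        # Stand-in row; in the project this comes from data_generator.py.
        row = {"sepal_length": 5.1, "sepal_width": 3.5,
               "petal_length": 1.4, "petal_width": 0.2}
        producer.send("iris-topic", encode(row))
        producer.flush()
        time.sleep(1)
```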
13 - Then, I used consumer.py
        * same topic, proper ports
        * load_model
        * print result
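The consumer side mirrors the producer: read from the same topic, load the saved model, and print the prediction. Another hedged sketch — "iris-topic" and "iris_model" are the names assumed in the earlier sketches, and predict_model is pycaret's scoring function:

```python
# Sketch of consumer.py: read rows from the topic, score with the saved
# pycaret model, and print the result.
import json

def decode(raw):
    """Deserialize Kafka message bytes back into a feature dict."""
    return json.loads(raw.decode("utf-8"))

if __name__ == "__main__":
    import pandas as pd
    from kafka import KafkaConsumer
    from pycaret.classification import load_model, predict_model

    model = load_model("iris_model")  # path name, without the .pkl extension
    consumer = KafkaConsumer("iris-topic",
                             bootstrap_servers="localhost:9092",
                             auto_offset_reset="earliest")
    for message in consumer:
        row = decode(message.value)
        result = predict_model(model, data=pd.DataFrame([row]))
        print(result)
```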

Coming soon....

Send data to MongoDB

About


License: Apache License 2.0


Languages

Jupyter Notebook 91.6%, Python 8.4%