navaneet-rao / news-api-kafka-producer-consumer

An ETL pipeline that fetches articles from the News API for the keyword "jobs", writes them to a CSV file, and loads that file into a Hive table via HDFS. The loaded data is then queried to extract insights such as the most frequent words and the day-wise distribution of news articles.
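The repository name indicates that a Kafka producer/consumer pair sits between the News API and the CSV file. Below is a minimal sketch of what the producer side could look like, assuming the kafka-python client, a `news-jobs` topic, a broker at `localhost:9092`, and a `NEWS_API_KEY` environment variable; none of these names are confirmed by the repository.

```python
"""Producer sketch: fetch News API articles for "jobs" and publish them to Kafka.
Topic name, broker address, and env var are assumptions, not the repo's actual values."""
import json
import os

import requests
from kafka import KafkaProducer  # kafka-python

NEWS_API_URL = "https://newsapi.org/v2/everything"
NEWS_API_KEY = os.environ["NEWS_API_KEY"]   # hypothetical env var
TOPIC = "news-jobs"                         # hypothetical topic name


def fetch_articles(keyword: str = "jobs") -> list:
    """Pull articles matching the keyword from the News API."""
    resp = requests.get(
        NEWS_API_URL,
        params={"q": keyword, "pageSize": 100, "apiKey": NEWS_API_KEY},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("articles", [])


def main() -> None:
    producer = KafkaProducer(
        bootstrap_servers="localhost:9092",  # assumed broker address
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )
    for article in fetch_articles("jobs"):
        # Publish each article as a JSON message for the consumer to pick up.
        producer.send(TOPIC, article)
    producer.flush()


if __name__ == "__main__":
    main()
```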

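The consumer side would then drain the topic and flatten each article into CSV rows. A sketch under the same assumptions (topic name, broker address) with an illustrative column set:

```python
"""Consumer sketch: read article messages from Kafka and write them to a CSV file.
The output path and column names are illustrative, not taken from the repository."""
import csv
import json

from kafka import KafkaConsumer  # kafka-python

TOPIC = "news-jobs"        # hypothetical topic name
CSV_PATH = "news_jobs.csv"  # hypothetical output file


def main() -> None:
    consumer = KafkaConsumer(
        TOPIC,
        bootstrap_servers="localhost:9092",  # assumed broker address
        auto_offset_reset="earliest",
        consumer_timeout_ms=10000,           # stop iterating once the topic is drained
        value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    )
    with open(CSV_PATH, "w", newline="", encoding="utf-8") as f:
        writer = csv.writer(f)
        writer.writerow(["source", "author", "title", "description", "published_at", "url"])
        for message in consumer:
            a = message.value
            writer.writerow([
                (a.get("source") or {}).get("name"),
                a.get("author"),
                a.get("title"),
                a.get("description"),
                a.get("publishedAt"),
                a.get("url"),
            ])


if __name__ == "__main__":
    main()
```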

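The CSV is then pushed to HDFS, loaded into a Hive table, and queried for the insights mentioned above (most frequent words, day-wise article counts). A sketch of that step, shelling out to `hdfs dfs` and `beeline`; the table name, HDFS path, JDBC URL, and the exact queries are assumptions:

```python
"""Load-and-query sketch: put the CSV on HDFS, load it into a Hive table,
then run two example insight queries. Paths and names are illustrative."""
import subprocess

CSV_PATH = "news_jobs.csv"
HDFS_DIR = "/user/hive/news_jobs"             # hypothetical HDFS directory
BEELINE_URL = "jdbc:hive2://localhost:10000"  # assumed HiveServer2 URL

CREATE_AND_LOAD = f"""
CREATE TABLE IF NOT EXISTS news_jobs (
  source STRING, author STRING, title STRING,
  description STRING, published_at STRING, url STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
STORED AS TEXTFILE;
LOAD DATA INPATH '{HDFS_DIR}/news_jobs.csv' INTO TABLE news_jobs;
"""

# Example insight queries: day-wise article counts and most frequent title words.
INSIGHT_QUERIES = """
SELECT substr(published_at, 1, 10) AS day, COUNT(*) AS articles
FROM news_jobs GROUP BY substr(published_at, 1, 10) ORDER BY day;

SELECT word, COUNT(*) AS freq
FROM news_jobs LATERAL VIEW explode(split(lower(title), ' ')) t AS word
GROUP BY word ORDER BY freq DESC LIMIT 20;
"""


def main() -> None:
    # Push the CSV into HDFS so Hive can load it.
    subprocess.run(["hdfs", "dfs", "-mkdir", "-p", HDFS_DIR], check=True)
    subprocess.run(["hdfs", "dfs", "-put", "-f", CSV_PATH, HDFS_DIR], check=True)
    # Create the table, load the file, then run the insight queries.
    subprocess.run(["beeline", "-u", BEELINE_URL, "-e", CREATE_AND_LOAD], check=True)
    subprocess.run(["beeline", "-u", BEELINE_URL, "-e", INSIGHT_QUERIES], check=True)


if __name__ == "__main__":
    main()
```

Note that a plain comma-delimited table will misparse titles that contain commas; the actual project may use a different delimiter or SerDe.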

Demo: video1912587707.mp4



Languages

Python 100.0%