aws-samples / amazon-kinesis-data-analytics-blueprints


Blueprints: Amazon Managed Service for Apache Flink

🚨 August 30, 2023: Amazon Kinesis Data Analytics has been renamed to Amazon Managed Service for Apache Flink.


โ›”๏ธ This blueprints application is obsolete. Please refer to the new repository


Kinesis Data Analytics Blueprints are a curated collection of Apache Flink applications. Each blueprint walks you through solving a practical stream-processing problem with Apache Flink, and each is designed to be extensible, so you can build on it to create more complex applications that solve your own business challenges. We feature examples for both the DataStream and Table API where possible.

Get started with Blueprints

Within this repo, you will find examples of Apache Flink applications that can be run locally, on an open source Apache Flink cluster, or on Amazon Kinesis Data Analytics for Apache Flink (now Amazon Managed Service for Apache Flink). Clone the repository to get started.

| Description | Flink API | Language |
| --- | --- | --- |
| Reading from KDS and writing to Amazon S3 | DataStream | Java |
| Reading from MSK Serverless and writing to Amazon S3 | DataStream | Java |
| Reading from MSK Serverless and writing to MSK Serverless | DataStream | Java |
| Reading from MSK Serverless and writing to Amazon S3 | Table | Python |
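
To give a sense of what these blueprints contain, below is a minimal sketch (not the repo's actual code) of the pattern the first blueprint follows: a DataStream job that reads string records from a Kinesis Data Stream and writes them to Amazon S3 with a FileSink. The stream name, Region, and bucket path are placeholders.

```java
// Minimal sketch of a KDS-to-S3 DataStream job; stream name, Region, and bucket are placeholders.
import java.util.Properties;

import org.apache.flink.api.common.serialization.SimpleStringEncoder;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.file.sink.FileSink;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.connectors.kinesis.FlinkKinesisConsumer;
import org.apache.flink.streaming.connectors.kinesis.config.AWSConfigConstants;

public class KdsToS3Sketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        env.enableCheckpointing(60_000); // the FileSink finalizes files on checkpoints

        // Source: consume string records from a Kinesis Data Stream.
        Properties consumerConfig = new Properties();
        consumerConfig.setProperty(AWSConfigConstants.AWS_REGION, "us-east-1"); // placeholder Region
        DataStream<String> events = env.addSource(
                new FlinkKinesisConsumer<>("my-input-stream", new SimpleStringSchema(), consumerConfig));

        // Sink: write the records to S3 as row-encoded text files.
        FileSink<String> sink = FileSink
                .forRowFormat(new Path("s3a://my-output-bucket/data/"), new SimpleStringEncoder<String>("UTF-8"))
                .build();
        events.sinkTo(sink);

        env.execute("kds-to-s3-sketch");
    }
}
```

When run on the managed service, checkpointing is typically enabled through the application configuration rather than in code.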

Prerequisites

Ensure that npm packages associated with CDK are up to date

  1. In the shared CDK folder, run npm update.
  2. In the CDK folder of your blueprint, run npm update.

For example, let's say you want to deploy the MSK to S3 blueprint. Here are the steps you would follow:

Navigate to the shared CDK folder (from the root of this repo)

> cd cdk-infra/shared
> npm update

up to date, audited 457 packages in 12s

30 packages are looking for funding
  run `npm fund` for details

found 0 vulnerabilities

Navigate to your blueprint folder (from the root of this repo)

> cd apps/java-datastream/msk-serverless-to-s3-datastream-java
> npm install
...
> npm update

up to date, audited 457 packages in 12s

30 packages are looking for funding
  run `npm fund` for details

found 0 vulnerabilities

Now, you're ready to deploy blueprints.
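
For reference, the read side of the MSK Serverless blueprints boils down to a Kafka source; the sketch below (again, not the repo's actual code) shows the general shape. The bootstrap server, topic, and consumer group are placeholders, and the IAM-auth client properties that MSK Serverless requires are omitted.

```java
// Sketch of the MSK read path only; broker endpoint, topic, and group id are placeholders,
// and the IAM authentication properties MSK Serverless requires are omitted.
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.common.serialization.SimpleStringSchema;
import org.apache.flink.connector.kafka.source.KafkaSource;
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MskSourceSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Consume string records from an MSK topic, starting at the earliest offset.
        KafkaSource<String> source = KafkaSource.<String>builder()
                .setBootstrapServers("boot-xxxx.kafka-serverless.us-east-1.amazonaws.com:9098") // placeholder
                .setTopics("my-input-topic")                                                    // placeholder
                .setGroupId("blueprint-sketch")
                .setStartingOffsets(OffsetsInitializer.earliest())
                .setValueOnlyDeserializer(new SimpleStringSchema())
                .build();

        DataStream<String> events =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "msk-source");

        events.print(); // replace with the S3 FileSink shown earlier
        env.execute("msk-source-sketch");
    }
}
```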

NOTE: If npm update doesn't actually update your dependency versions, you may need to run npm-check-updates (ncu) and manually update the dependency versions in the package.json files in each of the above locations.

How do I use these blueprints?

  • To get started with a blueprint, first ensure you have the necessary prerequisites installed.
  • Then clone this repo using the command shown below.
git clone https://github.com/aws-samples/amazon-kinesis-data-analytics-blueprints
  • Open a terminal session and navigate to the blueprint of your choice within the project structure; once there, follow the blueprint-specific instructions.

Experimentation

  • Once you have data flowing through your blueprint, you have successfully launched and tested it!
  • You can now take the blueprints in this repo, copy them into your own project structure, and begin to modify them for your specific needs, for example by inserting your own transformations into the pipeline, as sketched below.
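
As one example of such a modification (purely illustrative, with an assumed JSON record shape and an assumed eventType field), you might filter records between a blueprint's source and sink:

```java
// Hypothetical customization of a blueprint pipeline: drop records that are not valid JSON
// and keep only one event type before the records reach the sink.
// ObjectMapper comes from Jackson; the record shape and "eventType" field are assumptions.
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.flink.streaming.api.datastream.DataStream;

public class PipelineCustomization {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    // Keep only records that parse as JSON and whose (assumed) "eventType" field is "purchase".
    public static DataStream<String> purchasesOnly(DataStream<String> events) {
        return events.filter(value -> {
            try {
                return "purchase".equals(MAPPER.readTree(value).path("eventType").asText());
            } catch (Exception e) {
                return false; // silently drop malformed records in this sketch
            }
        });
    }
}
```

You would call purchasesOnly(events) on the stream returned by the source before attaching the sink.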

License

MIT No Attribution

