This Flutter project performs real-time emotion detection using computer vision and machine learning. It can identify emotions such as happiness, sadness, and anger from a live camera feed or from still images.
- Real-time emotion detection using the device's camera.
- Support for multiple emotion classes, including happiness, sadness, and anger.
- User-friendly interface with live emotion feedback (see the sketch below).
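As a rough illustration of how that live feedback can be wired up, here is a minimal sketch, assuming the detection loop publishes its latest prediction through a `ValueNotifier` (the `emotion` notifier and `EmotionOverlay` widget are illustrative names, not part of this codebase):

```dart
import 'package:flutter/material.dart';

// Hypothetical shared state: the detection loop writes its latest label here.
final ValueNotifier<String> emotion = ValueNotifier<String>('none');

// Rebuilds whenever a new prediction arrives, so the label can be
// overlaid on the camera preview.
class EmotionOverlay extends StatelessWidget {
  const EmotionOverlay({super.key});

  @override
  Widget build(BuildContext context) {
    return ValueListenableBuilder<String>(
      valueListenable: emotion,
      builder: (context, label, _) => Text(
        label,
        style: const TextStyle(fontSize: 24, fontWeight: FontWeight.bold),
      ),
    );
  }
}
```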
Before you begin, ensure you have met the following requirements:
- Flutter installed on your development machine.
- A compatible Android or iOS device or emulator for testing.
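If you are unsure whether your machine is set up correctly, Flutter ships with built-in checks:

```bash
flutter doctor   # verifies the SDK, platform toolchains, and IDE setup
flutter devices  # lists connected devices and running emulators
```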
To get started with this project, follow these steps:
- Clone this repository to your local machine:
  `git clone https://github.com/your-username/emotion-detection-flutter.git`
- Navigate to the project directory:
  `cd emotion-detection-flutter`
- Install dependencies:
  `flutter pub get`
- Run the app:
  `flutter run`
To use the app:
- Open the app on your device or emulator.
- Grant camera permissions when prompted.
- Point the camera at a face; the app detects and displays the emotion in real time (see the sketch below).
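The repository's detection loop is not reproduced here, but the sketch below shows one way such a loop can be assembled with the `camera` and `tflite_flutter` packages. The package choice, model asset path, 48x48 grayscale input, label order, and the `_preprocess` helper are all assumptions for illustration, not this project's actual configuration:

```dart
import 'package:camera/camera.dart';
import 'package:tflite_flutter/tflite_flutter.dart';

// Hypothetical label order; it must match the model's output layer.
const labels = ['anger', 'happiness', 'sadness', 'surprise', 'neutral'];
const inputSize = 48; // assumed model input: 48x48 grayscale

bool _busy = false; // skip frames while an inference is still in flight

Future<void> startDetection() async {
  final cameras = await availableCameras();
  final controller = CameraController(cameras.first, ResolutionPreset.low);
  await controller.initialize();

  // Hypothetical asset path; the model must be declared in pubspec.yaml.
  final interpreter =
      await Interpreter.fromAsset('assets/emotion_model.tflite');

  await controller.startImageStream((CameraImage frame) {
    if (_busy) return; // a real app would throttle or offload to an isolate
    _busy = true;
    final input = _preprocess(frame).reshape([1, inputSize, inputSize, 1]);
    final output = List.filled(labels.length, 0.0).reshape([1, labels.length]);
    interpreter.run(input, output);
    final scores = (output[0] as List).cast<double>();
    var best = 0;
    for (var i = 1; i < scores.length; i++) {
      if (scores[i] > scores[best]) best = i;
    }
    // Push the label into app state (e.g. a ValueNotifier) to drive the UI.
    print('Detected: ${labels[best]}');
    _busy = false;
  });
}

// Nearest-neighbour downsample of the Y (luminance) plane to a normalized
// grayscale grid; production code would first crop to a detected face.
List<double> _preprocess(CameraImage frame) {
  final y = frame.planes[0];
  final out = List<double>.filled(inputSize * inputSize, 0);
  for (var r = 0; r < inputSize; r++) {
    for (var c = 0; c < inputSize; c++) {
      final srcR = r * frame.height ~/ inputSize;
      final srcC = c * frame.width ~/ inputSize;
      out[r * inputSize + c] = y.bytes[srcR * y.bytesPerRow + srcC] / 255.0;
    }
  }
  return out;
}
```

Note that the Y-plane trick above assumes Android's YUV420 image stream; on iOS the camera plugin delivers BGRA8888 frames, so a portable preprocessor would branch on `frame.format`.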
This project is built with:
- Flutter: The UI framework used for building the mobile app.
- TensorFlow Lite: For running the machine learning model for emotion detection.
- Camera Plugin: For accessing the device's camera.
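Assuming the stack above, the dependency section of `pubspec.yaml` would look roughly like this (versions are illustrative, and `tflite_flutter` is only one common TensorFlow Lite binding; the model asset name is hypothetical):

```yaml
dependencies:
  flutter:
    sdk: flutter
  camera: ^0.10.0         # live preview and per-frame image streams
  tflite_flutter: ^0.10.0 # on-device TensorFlow Lite inference

flutter:
  assets:
    - assets/emotion_model.tflite # hypothetical model file
```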
Contributions are welcome! If you'd like to contribute to this project, please follow these steps:
- Fork the repository.
- Create a new branch for your feature:
  `git checkout -b feature-name`
- Make your changes and commit them:
  `git commit -m 'Add some feature'`
- Push to the branch:
  `git push origin feature-name`
- Create a pull request.
- This project was inspired by the idea of using AI for emotion detection in real-time applications.
- Special thanks to the open-source community for providing resources and libraries.