Repositories under the attitude-estimation topic:
Convolutional Neural Networks for Denoising Gyroscopes of Low-Cost IMUs
Benchmark on Attitude Estimation with Smartphones (datasets & scripts)
Quaternion-based Kalman filter for attitude estimation from IMU data
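Quaternion-based filters like the one above share a common prediction step: propagating the attitude quaternion with the gyroscope's angular rate. As a hedged sketch (not this repository's actual code), the exact quaternion exponential for a constant body-frame rate looks like this:

```python
import math

def quat_mult(p, q):
    """Hamilton product of two (w, x, y, z) quaternions."""
    pw, px, py, pz = p
    qw, qx, qy, qz = q
    return (pw * qw - px * qx - py * qy - pz * qz,
            pw * qx + px * qw + py * qz - pz * qy,
            pw * qy - px * qz + py * qw + pz * qx,
            pw * qz + px * qy - py * qx + pz * qw)

def propagate(q, omega, dt):
    """Propagate attitude quaternion q by a body-frame angular
    rate omega (rad/s) over dt seconds, assuming the rate is
    constant during the step (a typical Kalman prediction step)."""
    wx, wy, wz = omega
    w_norm = math.sqrt(wx * wx + wy * wy + wz * wz)
    if w_norm < 1e-12:
        return q  # negligible rotation
    angle = w_norm * dt
    s = math.sin(angle / 2.0) / w_norm
    dq = (math.cos(angle / 2.0), wx * s, wy * s, wz * s)
    qn = quat_mult(q, dq)  # body rates: right-multiply
    n = math.sqrt(sum(c * c for c in qn))
    return tuple(c / n for c in qn)
```

For example, integrating a 90 deg/s rate about the z axis for one second should end at a 90-degree yaw quaternion.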
Study and implementations of different attitude estimation algorithms for spacecraft.
A general ROS package for C++ or Python that fuses the accelerometer and gyroscope of an IMU in an EKF to estimate orientation.
Book Website: Dynamic System Modelling & Analysis with MATLAB & Python
Navigation filters, transforms, and utilities
Uses accelerometer, magnetometer, and gyroscope data with an error-state Kalman filter (ESKF) to estimate attitude.
An IMU-based Attitude Estimator, implementing the Madgwick filter
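The Madgwick filter named above corrects gyro-integrated attitude by gradient descent on an accelerometer objective. The following is a minimal illustrative sketch of one IMU update step (gyro plus accelerometer, no magnetometer), assuming a (w, x, y, z) quaternion and that a level sensor reads +1 g on z; it is not the repository's exact implementation:

```python
import math

def madgwick_imu_update(q, gyro, accel, dt, beta=0.1):
    """One Madgwick-style IMU update: integrate the gyro rate
    (rad/s), then nudge the quaternion down the gradient of the
    gravity-direction error measured by the accelerometer."""
    qw, qx, qy, qz = q
    gx, gy, gz = gyro

    # Quaternion rate from gyroscope: qdot = 0.5 * q (x) (0, gyro)
    qdw = 0.5 * (-qx * gx - qy * gy - qz * gz)
    qdx = 0.5 * ( qw * gx + qy * gz - qz * gy)
    qdy = 0.5 * ( qw * gy - qx * gz + qz * gx)
    qdz = 0.5 * ( qw * gz + qx * gy - qy * gx)

    norm_a = math.sqrt(sum(a * a for a in accel))
    if norm_a > 0.0:
        ax, ay, az = (a / norm_a for a in accel)
        # Objective: predicted gravity direction minus measured one
        f1 = 2.0 * (qx * qz - qw * qy) - ax
        f2 = 2.0 * (qw * qx + qy * qz) - ay
        f3 = 2.0 * (0.5 - qx * qx - qy * qy) - az
        # Gradient = J^T f of the objective above
        grw = -2.0 * qy * f1 + 2.0 * qx * f2
        grx =  2.0 * qz * f1 + 2.0 * qw * f2 - 4.0 * qx * f3
        gry = -2.0 * qw * f1 + 2.0 * qz * f2 - 4.0 * qy * f3
        grz =  2.0 * qx * f1 + 2.0 * qy * f2
        gn = math.sqrt(grw * grw + grx * grx + gry * gry + grz * grz)
        if gn > 0.0:
            # Feedback step, weighted by the gain beta
            qdw -= beta * grw / gn
            qdx -= beta * grx / gn
            qdy -= beta * gry / gn
            qdz -= beta * grz / gn

    # Integrate and renormalize
    qw += qdw * dt; qx += qdx * dt; qy += qdy * dt; qz += qdz * dt
    n = math.sqrt(qw * qw + qx * qx + qy * qy + qz * qz)
    return (qw / n, qx / n, qy / n, qz / n)
```

With zero gyro input and a tilted accelerometer reading, repeated updates pull the quaternion toward the orientation whose predicted gravity matches the measurement.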
This repository contains different algorithms for attitude estimation (roll, pitch, and yaw angles) from IMU sensor data: accelerometer, magnetometer, and gyroscope measurements.
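The simplest of these estimators uses the accelerometer alone: at rest it measures only gravity, so roll and pitch follow directly from the measured gravity direction. A hedged sketch, assuming the convention that a level sensor reads (0, 0, +1 g):

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Static roll and pitch (rad) from one accelerometer sample.
    Assumes the sensor is at rest so the reading is pure gravity.
    Yaw is not observable from the accelerometer alone; that is
    why a magnetometer is typically fused in as well."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch
```

This is the measurement model the filter-based methods correct their gyro integration against; on its own it is noisy and breaks down under linear acceleration.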
This is the main repository for the code for the Spacecraft Attitude Control course final project - Politecnico di Milano
Multiplicative Extended Kalman Filter Satellite Attitude Determination
High-precision Attitude Estimation for a Satellite
A very simple matrix library in C, in addition to a recreation of DCM attitude estimation in matrix form.
Attitude estimation for iNEMO-M1
Python implementation of the Madgwick filter using Cython
C++ implementation of the Quaternion Multiplicative Extended Kalman Filter (Q-MEKF).
Attitude estimation using MPU9250 DMP (M5Stack) and Madgwick Filter
Estimating Attitude using iPhone's IMU in Matlab
Attitude and its covariance estimation using raw gyroscope data
Towards real-time and robust star trackers.
Stata code for our group's Social Economics paper at BSE.
:compass: Visualize an attitude-tracking algorithm using mobile phone sensors
Files created for the Sistemi di Guida e Navigazione project. I simulate a VTOL (drone) trajectory and attitude and implement various static and dynamic filters to estimate the attitude.
Transparency control framework for a series elastic exoskeleton joint, based on inertial measurement units fixed on the user. This approach guarantees the robot's transparent behavior without explicitly relying on force or human-robot interaction sensors.