tuomaseerola / metaMER

meta-analysis of MER


README

How well can we predict emotions in music? What evidence does the published literature provide for explaining which emotions listeners perceive in music when the source material consists of audio examples? To what degree do the results depend on the models, emotions, musical/acoustic features, musical materials, or participants?

To answer these questions, we have set out to record and analyse the current state of the art in the literature using a meta-analysis paradigm. We focus on Music Emotion Recognition (MER), hence the acronym metaMER.

The public-facing version of the repository is available at https://tuomaseerola.github.io/metaMER/

Plan

We define the aims and methods in the preregistration plan.

Study Search and Selection

Search databases and criteria are documented in studies/search_syntax.qmd.

Data Extraction and Coding

Data coding and extraction are described in the data template studies/extraction_details.qmd.
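
To illustrate the kind of information such a template captures, here is a hypothetical sketch of a single coded study record in R. The field names and values are illustrative assumptions only; the actual coding scheme is defined in studies/extraction_details.qmd.

```r
# Hypothetical coded study record (field names and values are illustrative,
# not the actual template in extraction_details.qmd).
study <- data.frame(
  study_id      = "author2020",
  model_class   = "random forest",         # regression/classification model used
  emotion_frame = "valence-arousal",       # dimensional vs. categorical emotions
  feature_set   = "audio features (spectral, dynamics)",
  stimulus_type = "film soundtracks",
  n_stimuli     = 110,                     # number of audio excerpts
  n_raters      = 30,                      # number of participants rating emotions
  r_valence     = 0.62,                    # reported model-to-rating correlation
  r_arousal     = 0.78
)
```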

Analysis

Data analysis is covered in the analysis/analysis.qmd document.
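
As a minimal sketch of the general approach, a random-effects meta-analysis of reported model-to-rating correlations could be run in R with the metafor package, pooling Fisher z-transformed correlations. This is an assumption-laden illustration with made-up values, not the repository's actual analysis pipeline.

```r
# Minimal sketch of a random-effects meta-analysis of correlations,
# assuming the metafor package; values are illustrative only.
library(metafor)

dat <- data.frame(
  study = c("a2019", "b2020", "c2021"),
  r     = c(0.55, 0.70, 0.62),   # reported correlations (hypothetical)
  n     = c(80, 120, 96)         # sample sizes (hypothetical)
)

# Fisher z-transform the correlations and compute sampling variances
dat <- escalc(measure = "ZCOR", ri = r, ni = n, data = dat)

# Fit a random-effects model and back-transform the pooled estimate to r
res <- rma(yi, vi, data = dat)
predict(res, transf = transf.ztor)
```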

Manuscript

The study report is available in the manuscript/manuscript.qmd document.
