Abstract
This research investigates the use of emotion data, derived from analyzing changes in autonomic nervous system (ANS) activity as reflected in brainwave measurements, to support the creative music compositional intelligence of an adaptive interface. A relational model of the influence of musical events on the listener's affect is first induced using an inductive logic programming paradigm, with the emotion data and musical score features as inputs to the induction task. Compositional components such as interval, scale, instrumentation, chord progression, and melody are then combined automatically using a genetic algorithm and melodic transformation heuristics that depend on the predictive knowledge and character of the induced model. The empirical results reported here show that, for any one of the four targeted basic emotional states, namely stress, joy, sadness, and relaxation, the system is able to compose tunes that successfully convey the intended affect.
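To make the described pipeline concrete, the following is a minimal, hypothetical sketch of the compose-by-genetic-algorithm step: candidate tunes are scored against a target emotion and evolved under mutation and crossover. The `induced_affect_model` function, its contour-based heuristic, the pitch encoding, and the operators are all illustrative placeholders standing in for the paper's ILP-induced relational model and melodic transformation heuristics, not the authors' actual implementation.

```python
# Hypothetical sketch: evolve a short tune toward a target emotion, using a
# stand-in scoring function in place of the ILP-induced affect model.
import random

EMOTIONS = ["stress", "joy", "sadness", "relaxation"]
PITCHES = list(range(60, 73))          # one octave of MIDI pitches (C4-C5)
TUNE_LEN = 16                          # notes per candidate tune

def induced_affect_model(tune, target):
    """Placeholder for the induced relational model: returns a score in
    [0, 1] estimating how strongly `tune` conveys `target`. Uses a toy
    heuristic over interval size and melodic contour, purely illustrative."""
    steps = [b - a for a, b in zip(tune, tune[1:])]
    smooth = sum(1 for s in steps if abs(s) <= 2) / len(steps)
    rising = sum(1 for s in steps if s > 0) / len(steps)
    if target == "joy":
        return 0.5 * smooth + 0.5 * rising
    if target == "sadness":
        return 0.5 * smooth + 0.5 * (1 - rising)
    if target == "relaxation":
        return smooth
    return 1 - smooth                   # "stress": jagged contours score high

def mutate(tune, rate=0.1):
    """Mutation in the spirit of melodic transformation: re-sample notes."""
    return [random.choice(PITCHES) if random.random() < rate else p
            for p in tune]

def crossover(a, b):
    """Single-point crossover between two parent tunes."""
    cut = random.randrange(1, TUNE_LEN)
    return a[:cut] + b[cut:]

def compose(target, pop_size=40, generations=100):
    """Evolve a population of random tunes toward the target emotion."""
    pop = [[random.choice(PITCHES) for _ in range(TUNE_LEN)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: induced_affect_model(t, target), reverse=True)
        elite = pop[: pop_size // 4]    # keep the best quarter as parents
        pop = elite + [mutate(crossover(random.choice(elite),
                                        random.choice(elite)))
                       for _ in range(pop_size - len(elite))]
    return pop[0]

if __name__ == "__main__":
    print("Best tune (MIDI pitches):", compose("joy"))
```

In the paper's setting, the fitness function would instead query the model induced from the EEG-derived emotion data, so the same evolutionary loop can be retargeted to any of the four affective states.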
| Original language | English |
| --- | --- |
| Pages (from-to) | 200-208 |
| Number of pages | 9 |
| Journal | Knowledge-Based Systems |
| Volume | 21 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 2008 Apr |
| Externally published | Yes |
Keywords
- Adaptive user interface
- Automated reasoning
- EEG-based emotion spectrum analysis
- Machine learning
- User modelling
ASJC Scopus subject areas
- Management Information Systems
- Software
- Information Systems and Management
- Artificial Intelligence