
# Feel the data


A machine-learning-based data sonification and visualization

The app maps data to music and images, generating a multisensory data experience. It is meant for anyone who can interact with a web app, from very young users (the visuals alone make it almost self-explanatory) to older ones.

## Goals

- Educate as many people as possible: inform younger generations and get disengaged people interested in today's issues
- Build an extensible platform
- Make data understandable without requiring knowledge of specialized concepts

## Challenges

- Not being able to work together in person, and dividing the work so that it could be merged easily
- Automatic music composition
- Merging our ideas while also adopting the latest technologies

## Technologies

- map: p5.js, mappa.js, and the Mapbox API
- music: magenta.js, specifically the Magenta RNN and Magenta MusicVAE models; we applied the valence-arousal plane concept to map the data to music
- data: the OpenWeather API, specifically the Current Weather Data API and the Air Pollution API
- visuals: JavaScript, the OpenAI DALL·E API, and the concepts of particle systems and Perlin noise

The whole project is hosted in a Node.js application. The sketches below illustrate, in the order of the list above, how each of these layers can be wired together.
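
A minimal sketch of the map layer, assuming mappa.js with its Mapbox provider; the token, center coordinates, and style URL are placeholders, not the project's actual configuration:

```js
// p5.js + mappa.js: overlay a p5 canvas on a Mapbox tile map.
const mappa = new Mappa('Mapbox', 'YOUR_MAPBOX_TOKEN'); // placeholder token
const options = {
  lat: 45.46, lng: 9.19, zoom: 4,
  style: 'mapbox://styles/mapbox/dark-v10',
};
let canvas;
let tiles;

function setup() {
  canvas = createCanvas(640, 360);
  tiles = mappa.tileMap(options); // create the tile map
  tiles.overlay(canvas);          // pin the p5 canvas on top of it
}

function draw() {
  clear();
  // Convert a geographic coordinate to canvas pixels and mark it.
  const pos = tiles.latLngToPixel(45.46, 9.19);
  fill(255, 0, 0);
  noStroke();
  ellipse(pos.x, pos.y, 10, 10);
}
```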
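
A minimal sketch of the valence-arousal idea using `@magenta/music` (MusicVAE): the mapping from weather readings to the plane, and the way arousal drives the sampling temperature, are illustrative assumptions, not the project's actual formulas:

```js
import * as mm from '@magenta/music';

// Map normalized readings (0..1) onto the valence-arousal plane.
// Hypothetical mapping: warm, clean air -> high valence; wind -> high arousal.
function toValenceArousal({ temperature, windSpeed, pollution }) {
  const valence = Math.min(1, Math.max(0, temperature * (1 - pollution)));
  const arousal = Math.min(1, Math.max(0, windSpeed));
  return { valence, arousal };
}

async function sonify(weather) {
  const { valence, arousal } = toValenceArousal(weather);

  // 2-bar melody MusicVAE checkpoint published by the Magenta team.
  const vae = new mm.MusicVAE(
    'https://storage.googleapis.com/magentadata/js/checkpoints/music_vae/mel_2bar_small'
  );
  await vae.initialize();

  // Higher arousal -> higher sampling temperature -> more erratic melodies.
  const [sequence] = await vae.sample(1, 0.5 + arousal);

  // Assumption: valence steers tempo, so "happier" data plays faster.
  const player = new mm.Player();
  player.start(sequence, 60 + valence * 60);
}
```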
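
A minimal sketch of pulling the two OpenWeather endpoints named above; the API key and coordinates are placeholders:

```js
const API_KEY = 'YOUR_OPENWEATHER_KEY'; // placeholder

async function fetchWeatherData(lat, lon) {
  const base = 'https://api.openweathermap.org/data/2.5';
  const [weather, air] = await Promise.all([
    fetch(`${base}/weather?lat=${lat}&lon=${lon}&units=metric&appid=${API_KEY}`)
      .then((r) => r.json()),
    fetch(`${base}/air_pollution?lat=${lat}&lon=${lon}&appid=${API_KEY}`)
      .then((r) => r.json()),
  ]);
  return {
    temperature: weather.main.temp, // °C, because of units=metric
    windSpeed: weather.wind.speed,  // m/s
    aqi: air.list[0].main.aqi,      // air quality index, 1 (good) .. 5 (very poor)
  };
}
```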
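
A minimal p5.js sketch of the particle-system and Perlin-noise concept: particles drift along a noise-driven flow field, leaving trails. The field scale and particle count are illustrative values:

```js
const particles = [];

function setup() {
  createCanvas(640, 360);
  for (let i = 0; i < 200; i++) {
    particles.push(createVector(random(width), random(height)));
  }
  background(0);
}

function draw() {
  stroke(255, 40); // faint white points accumulate into trails
  for (const p of particles) {
    // Sample Perlin noise to get a flow angle at this position.
    const angle = noise(p.x * 0.005, p.y * 0.005) * TWO_PI * 2;
    p.x += cos(angle);
    p.y += sin(angle);
    point(p.x, p.y);
    // Wrap around the edges so particles stay on screen.
    if (p.x < 0) p.x = width;
    if (p.x > width) p.x = 0;
    if (p.y < 0) p.y = height;
    if (p.y > height) p.y = 0;
  }
}
```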

## Contributors