Feel the data


A machine learning-based data sonification and visualization web app that maps big cities' atmospheric data to music and images.

Feel the Data maps data to music and images, generating a multisensory data experience. The project is meant for anyone who can interact with a web app, from very young users (it is almost usable just by looking at the images) to older ones.

Goals

  • Educate as many people as possible: inform the younger generations and get detached people interested in today's issues
  • Build an extensible platform
  • Make data understandable without requiring knowledge of specialist concepts

Challenges

  • Not being able to work together physically, and dividing the work so that it could be merged easily
  • Automatic music composition
  • Merging our ideas together while also using the latest technologies

Technologies

  • map: p5.js, mappa.js, and the Mapbox API
  • music: magenta.js, specifically the Magenta RNN and Magenta MusicVAE models. We applied the valence-arousal plane concept to map the data to music
  • data: OpenWeather API, specifically the Current Weather Data API and Air Pollution API
  • visuals: JavaScript, the OpenAI DALL·E API, and the concepts of particle systems and Perlin noise

The whole project is hosted in a Node.js application.
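As a sketch of how the two OpenWeather endpoints could be queried: the endpoint paths below follow OpenWeather's public documentation for the Current Weather Data and Air Pollution APIs, but the helper names and the way the app actually composes its requests are assumptions for illustration.

```javascript
// Build request URLs for the two OpenWeather endpoints named above.
// Endpoint paths follow OpenWeather's docs; helper names are illustrative.
const OWM_BASE = "https://api.openweathermap.org/data/2.5";

function currentWeatherUrl(city, apiKey) {
  return `${OWM_BASE}/weather?q=${encodeURIComponent(city)}&units=metric&appid=${apiKey}`;
}

function airPollutionUrl(lat, lon, apiKey) {
  return `${OWM_BASE}/air_pollution?lat=${lat}&lon=${lon}&appid=${apiKey}`;
}

// Fetch weather and pollution data in parallel once the
// city's coordinates are known.
async function fetchAtmosphericData(city, lat, lon, apiKey) {
  const [weather, pollution] = await Promise.all([
    fetch(currentWeatherUrl(city, apiKey)).then((r) => r.json()),
    fetch(airPollutionUrl(lat, lon, apiKey)).then((r) => r.json()),
  ]);
  return { weather, pollution };
}
```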
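The README states that the valence-arousal plane concept drives the data-to-music mapping; one way such a mapping could look is sketched below. The specific formula (temperature and air quality driving valence, wind speed driving arousal, and the chosen ranges) is an illustrative assumption, not the project's actual code.

```javascript
// Hypothetical mapping from atmospheric readings to a point on the
// valence-arousal plane. valence = pleasantness (-1..1),
// arousal = energy/intensity (-1..1). The formula is an assumption.

function clamp(x, lo, hi) {
  return Math.min(hi, Math.max(lo, x));
}

// Linearly rescale a value from [inMin, inMax] to [-1, 1], clamped.
function toUnit(value, inMin, inMax) {
  return clamp((2 * (value - inMin)) / (inMax - inMin) - 1, -1, 1);
}

function weatherToValenceArousal({ tempC, windMs, aqi }) {
  // Warm temperatures and clean air read as pleasant -> higher valence;
  // OpenWeather's AQI runs 1 (good) .. 5 (very poor) and pushes valence down.
  const valence = clamp(toUnit(tempC, -10, 40) - (aqi - 1) / 4, -1, 1);
  // Wind speed drives arousal: calm -> low energy, storm -> high energy.
  const arousal = toUnit(windMs, 0, 25);
  return { valence, arousal };
}
```

A point produced this way could then select or condition a MusicVAE/RNN generation, with valence steering mode or harmony and arousal steering tempo or density.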
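For the visuals, a particle system driven by a noise-based flow field typically advances each particle along an angle sampled from the field. A minimal step function of that kind is sketched below; in the app the angle would come from Perlin noise (e.g. p5.js `noise()`), so here the field function is injected to keep the sketch self-contained, and all names are illustrative.

```javascript
// Minimal flow-field particle step. angleAt(x, y) returns the field
// direction in radians; in a Perlin-noise field it would be something
// like noise(x * scale, y * scale) * TWO_PI. Names are illustrative.

function makeParticle(x, y) {
  return { x, y };
}

// Advance every particle one step along the flow field,
// wrapping around the canvas edges.
function stepParticles(particles, angleAt, speed, width, height) {
  for (const p of particles) {
    const a = angleAt(p.x, p.y);
    p.x = (p.x + speed * Math.cos(a) + width) % width;
    p.y = (p.y + speed * Math.sin(a) + height) % height;
  }
  return particles;
}
```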

Contributors
