Music, language, and Artificial Intelligence (AI). Three fascinating subjects in which logic and creativity play a big role. We use language and music in our daily lives to express ourselves. But is this uniquely human? Some AI models can ‘understand’ the things we say, and some compose music in the style of Bach. How does this work? In this blog, I will try to find answers to general questions about music, language, and AI.
Music influences our emotions. It can make us feel either happy or sad. There is something about vibrations, beats, and melody that can reach right through us.
Music is based on math. Duration, rhythm, and intonation are measurable. And some chords just always work well together. But to compose good music, you need a creative mindset. Is this something unique to humankind?
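The measurable side of music can be made concrete with a small sketch. Assuming the common equal-temperament tuning with A4 = 440 Hz (a convention, not the only one), each semitone step multiplies the frequency by the twelfth root of two:

```python
# A minimal sketch of the math behind musical pitch, assuming
# equal temperament and the common reference A4 = 440 Hz.

def note_frequency(semitones_from_a4: int, a4_hz: float = 440.0) -> float:
    """Frequency of a note n semitones above (or below) A4."""
    return a4_hz * 2 ** (semitones_from_a4 / 12)

# One octave up (12 semitones) doubles the frequency: 440 Hz -> 880 Hz.
print(round(note_frequency(12), 2))   # 880.0
# Middle C (C4) sits 9 semitones below A4.
print(round(note_frequency(-9), 2))   # 261.63
```

That octave doubling is one reason certain intervals "just always work well together": their frequencies stand in simple ratios.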
Language is a structured system of communication. Through speech or signs (writing or sign language), we can convey a message and indicate meaning to thoughts and physical things around us.
Language is our way to connect. But to successfully understand each other, we need common ground and social context. Are we ever really able to communicate with bots? How can they understand human utterances if we do not share the same experiences?
Artificial Intelligence (AI) is intelligence demonstrated by machines. In a certain way, it mimics human cognitive functions like learning, problem-solving, and natural language processing.
AI is all around us. We use it to search the web (Google), discover new songs (Spotify), and call people (Siri). But AI is only as good as we make it. What (biased) data do we use to train AI models? And how do we develop AI in an ethically responsible way?