It is clear that AI is changing the music industry. While studying at Berklee College of Music, I saw firsthand how much it does for students: automatic composition assistants, AI mastering tools, and practice software now put capabilities that were once reserved for professionals within easy reach. In that sense AI clearly has a positive impact, lowering the barriers to creation and opening music up to more people.
However, when I look at recent AI-generated music, I wonder whether this technology is quietly moving beyond being a simple creation tool and into the realm of ‘manipulation.’ For example, many AI-generated songs contain content that promotes hatred toward specific groups or glorifies self-destructive behavior, and since March of last year such songs have been spreading in large numbers. The problem is not that someone made them for fun; it is that they are being mass-produced and distributed automatically. In the past, anyone who wanted to spread such a message through music at least had to compose it and write the lyrics. Now AI can automatically generate emotionally charged music and deliver it in a form tailored to each listener’s taste.
What concerns me more is that AI could go beyond simply creating songs with a certain mood and quietly manipulate our emotions and beliefs. Researchers have long observed that the music we listen to every day is not just something we “like”; it subtly reinforces certain thoughts and regulates our emotions. So what happens if an AI analyzes our moods and tastes and keeps serving us music that reinforces particular messages? Whether the goal is political propaganda or steering consumer behavior, music could become a tool for moving people without their ever realizing it.
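To make that concern concrete, here is a minimal, purely hypothetical sketch of the kind of feedback loop I have in mind. Nothing in it reflects any real service: the track fields, the `mood_match` scoring, and the `message_weight` knob are all invented for illustration. The point is only how little it takes to tilt an ordinary taste-based recommendation loop toward tracks that carry a particular message.

```python
import random
from dataclasses import dataclass

@dataclass
class Track:
    title: str
    mood: str             # e.g. "melancholy", "upbeat"
    message_score: float  # 0.0 = neutral, 1.0 = heavily carries the target message

def mood_match(track: Track, listener_mood: str) -> float:
    """Crude stand-in for a real taste/mood model."""
    return 1.0 if track.mood == listener_mood else 0.3

def pick_next(tracks: list[Track], listener_mood: str, message_weight: float) -> Track:
    """Ordinary mood-based ranking, plus a hidden bias toward message-bearing tracks.

    With message_weight = 0 this is a plain mood-matching recommender;
    raising it quietly shifts what the listener hears, song after song.
    """
    def score(t: Track) -> float:
        return mood_match(t, listener_mood) + message_weight * t.message_score
    # Toy choice rule: sample proportionally to the (biased) score.
    weights = [score(t) for t in tracks]
    return random.choices(tracks, weights=weights, k=1)[0]

if __name__ == "__main__":
    catalog = [
        Track("Quiet Evening", "melancholy", 0.0),
        Track("Same Old Story", "melancholy", 0.9),  # hypothetical message-bearing track
        Track("Morning Run", "upbeat", 0.0),
    ]
    for _ in range(5):
        print(pick_next(catalog, listener_mood="melancholy", message_weight=2.0).title)
```

From the outside, the listener just experiences this as “the app knows my taste,” which is exactly why a bias like this would be so hard to notice.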
Watching AI help students create, I have seen many of this technology’s bright sides, but I also think that if this kind of unregulated use continues, music could stop being a free art form and become a tool for injecting “manipulated emotions.” Precisely because music has the power to move us, attempts to exploit that power will keep emerging. So what kind of balance should we strike? Is there a way to prevent abuse without holding back the free development of AI music? What do you think?