
S4 E6
Music of the Future: Tech Innovations and Inventions
Episode Transcript
What's up?
My name is the Bul Bey.
And I'm Kirsten Michelle Cills.
We're your hosts, and this is So Curious!
A podcast presented by the Franklin Institute, and today, we have a really exciting episode for you.
Because we're looking into the future!
Oh, damn, I didn't know we could do that.
This whole season is on the science of music, and today we're exploring a bunch of new, innovative technologies in the world of music.
First, we'll be sitting down with Dr.
Jason Freeman from the Guthman Musical Instrument Competition, a long-running competition for new musical inventions.
And then we're going to take this show on the road for our first ever field trip for the So Curious!
Podcast.
And we're going to get a tour of Drexel University's Music and Entertainment Lab with the director, Dr.
Youngmoo Kim.
Bey, I'm curious - ding, ding, ding!
- About your personal experience with music tech.
So, what is the first audio format that you remember listening to?
Cassette.
Cassette tape, ummmm...
Okay.
Cassette tape, CDs, obviously.
Torrents!
Yeah, right!
I was not in the cassette era, but I did have the HitClips, if you remember those.
That was like - Throwback! - the little baby thing, and you put it in and you just get, like, 90 seconds of the song.
And for some reason, we were like, "that's all I need.
It's perfect!" Okay, that's enough nostalgia, because we're going to actually talk about the future of music today.
Enough with the past.
We are talking to Dr.
Jason Freeman.
So.
Welcome to the So Curious! Podcast, Dr.
Freeman.
Thanks so much for having me!
Absolutely!
Awesome, real quick, let's jump into it.
Introduce yourself, and let us know what it is you do.
So, I am a professor of music at Georgia Tech here in Atlanta, and I also direct the Guthman Musical Instrument Competition, which is an event that we hold on campus every year, identifying the newest, most innovative, and kind of creative musical instruments in the world.
So let's get into this musical instrument competition.
What is the history behind it?
How did it get started?
What's the mission behind it?
So the competition actually started in the 1990s as a piano competition.
There was an alum of Georgia Tech, Richard Guthman.
He and his wife are huge fans of the piano, and they really wanted to start a piano competition at Georgia Tech.
And we did that for many years, and had a wonderful piano competition here, jazz and classical, and people from all over the Southeast came to compete.
But Georgia Tech was really changing, and we now have Bachelor's, Master's, Doctoral programs in music technology, where students are really learning how to create new products and services that change the music industry.
And so we talked to the Guthmans about how we might re-envision the competition.
And so we arrived at this idea of doing a musical instrument competition, where the competitors don't play a musical instrument, but they invent musical instruments.
And so we have an international call that comes out every year.
People from all around the world submit their instruments that they've created.
We have a panel that reviews and selects a group of finalists, who we invite to campus for two days to share their work with our community here and with judges that we bring in who help us to pick the winners each year.
And this all culminates in a big public concert that we do at the Performing Arts Center on campus, where we match up each of these finalists with their instruments, with a musician from the Atlanta area, and they present a performance together that's showcasing what that instrument can do.
Wow.
Yeah this is so cool.
And that must be hard to judge, too, because it's truly like apples and oranges, right?
Everything is so different.
It is very hard to judge!
I was a judge once in one of the early years.
But every year there is kind of a debate about - what are the criteria that matter, how do we compare these things to each other?
And over time, we've arrived at three kind of broad-based criteria that really matter to us in the competition, and that we carry from one year to the next.
One is musicality.
So what kind of music can the instrument make?
How expressive is it?
Is it something that rewards practice?
Is it something where you can develop virtuosity over time?
What is that potential to make new and different kinds of music and to be expressive with it?
The second thing is engineering.
So, what are the ideas that went into building this thing?
Are there new kinds of sensors, or new kinds of software or artificial intelligence that are helping it make new kinds of sounds, or create new kinds of tactile interfaces, or things that haven't been done before?
And then the third thing is design.
What does it look like?
What is the form factor?
Is it something that's beautiful to look at?
One of our judges a few years ago was the chief curator of the musical instrument collection at the Metropolitan Museum of Art.
So he brought this incredible perspective to our panel, looking at a history that goes back thousands of years - thinking about the collection of instruments in the Met and the study of musical instruments historically, and about how these new instruments really offer different perspectives and different designs that iterate on and move beyond the instruments we've been using for centuries.
One of the things that surprised me most with the competition, we expected that all the instruments would be kind of high tech electronic musical instruments, and the majority of the ones in the competition certainly are.
But a lot of instruments have actually been acoustic instruments.
Many of our winners have been acoustic musical instruments.
In 2022, the winner was this instrument called the Glissotar, which looks kind of like a clarinet or a soprano saxophone, but it has no keys on it.
It has a ribbon that you move your fingers up and down on, giving you continuous control over pitch.
You put a soprano saxophone mouthpiece on it.
It was an incredible instrument that made sounds I had never heard before in my life, but completely acoustic.
(Glissotar sample plays) Can you walk us through the different categories of instruments?
I mean, you have string, you have wind, and are there some other ones that we may overlook or just not pay much attention to?
So, if you were studying orchestration or something like that and looking at instruments in a traditional way, you'd look at wind instruments, you'd look at brass instruments, you'd look at percussion instruments, string instruments, and so on.
And so they'd be classified by, kind of the mechanism by which sound is produced.
But when we look at the competition and we try to classify them, those categories don't help all that much.
So we've seen many guitars over the years, for example.
So, these are guitars that in some way extend what it means to be a guitar or to play a guitar.
I'll give you a couple of quick examples of this category.
So, one of our winners from a couple of years ago was the Lego microtonal guitar.
So this enabled musicians to play all kinds of microtonal scales that are common in different musical practices around the world, by having a 3D-printed fretboard where you could actually put Legos on in different spots, to retune the frets very quickly and on the fly.
(Lego microtonal guitar sample plays) One of our winners this past year, the Hitar, was actually just a regular guitar that sent its signal through a computer running machine learning.
When you hit the body of the guitar and make sounds with it in a particular style of playing, it can transform that into all kinds of different sounds - so it could sound metallic, or could sound like you were in the middle of a tunnel, or all kinds of different things.
(Hitar sample plays) So both of those instruments, very different approaches, but they're both extensions of a guitar in some way.
They're trying to take this traditional instrument and enable you to do more with it.
And so a lot of instruments that we see in the competition, whether they're inspired by a guitar, or a piano, or a trumpet, are extensions of some traditional instrument.
And someone coming to play them can use a lot of the technique that they've learned through years and years of practice on the original instrument, and bring that immediately into playing the new one.
So is there any criteria around - does it have to be able to play, like, a traditional twelve note scale?
It can really be anything.
What we're really looking for from our competitors is a compelling story about why they created this instrument and what they're trying to do with it, what kind of music, what kind of musicianship they're trying to enable through the instrument.
What we see is that people are driven by all kinds of different reasons to make new musical instruments.
Sometimes there's a natural phenomenon like the movement of water that they want to explore and use that as an inspiration for their instrument.
One of our winners this year in the competition was the Abacusynth, which was inspired by a traditional abacus, you know, the mathematical calculation device.
And they really wanted to see how they could translate that into sound.
(Abacusynth sample plays) So there's all kinds of, kind of conceptual reasons that might motivate somebody.
There are also things that they might want to be able to do live in performance that they can't do with a traditional instrument - like controlling pitches beyond those twelve notes in the traditional, kind of, Western equal-tempered scale.
They might want to be able to make all kinds of different sounds and timbres that aren't possible with traditional instruments.
They might want to be able to create new kinds of collaborations where multiple people can play an instrument at the same time, or they might want a dancer to be able to activate a musical instrument through their movements.
Wow.
So there's all kinds of different things that drive people.
And when you say it's an apples to oranges comparison, it's not just about the music that the instruments make or how you engage with them, but it's really about the reasons that people are making these instruments in the first place.
So we're going to play a little game with you.
I want to hear - of course, I imagine everyone asks these kinds of things - like, some of your Hall of Famers that you can think of off the top of your head.
I know you're on the spot.
So first I'm going to ask you, what are one or two that were, like, the most technically innovative?
The most technically innovative?
I think the ROLI Seaboard.
(ROLI Seaboard sample plays) So it looks like a piano keyboard or synthesizer, but it's made with these rubber kind of keys on the top, so they don't move up and down like a traditional piano keyboard does.
But you can push them and you can squish them, almost like Silly Putty or something.
Not quite that much, but they're very flexible as you move around, and they have an incredible amount of sensing.
And of course, it's a digital instrument, so you can take that movement on a key left and right, or the pressure you're pushing up and down or forward and backward and really map that onto anything that you want in the sound to change.
Awesome.
That's dope!
And what has been your favorite sounding instrument?
Probably the Segulharpa, which was our winner in 2021.
This was an incredible instrument.
It's probably not quite the size of me.
I'm a fairly small guy, but it would probably go up to my chest if I were standing next to it.
And it was a hybrid acoustic-electronic instrument.
So, you play it with these sensors, maybe a little bit of a piano technique, but they're much more sensitive than just traditional piano keys.
Then it activates electromagnets inside of the instrument, that then play the strings.
And so, the sound that comes out of this is really, kind of, otherworldly, alien kind of sounds, unlike anything I've ever heard before.
It's incredibly beautiful.
Björk has performed with it on tour.
It's been used widely by lots of different artists that are really looking for a unique sound.
It's the sound that really sticks with me through all the years of the competition.
(Segulharpa sample plays) Oh, that's wonderful.
All right, so the last one just the most fun.
I'm sure there's plenty of novelty and whimsical instruments out there, but what have you seen as the most wacky wonky?
These don't have to be the winners, but yeah, we're curious.
Most wacky wonky instruments?
We had some pretty crazy ones in the early years of the competition.
There was something that was focused on spinning plates.
This must have been more than ten years ago.
The details are a little foggy in my mind, but I very distinctly remember that this person was spinning the plates around, and there were sensors embedded in them, so the speed of the plates spinning, and the height, and all that kind of stuff was somehow mapped onto the music that we were hearing.
That's awesome.
Wow.
Yeah, there's like some showmanship in that too!
And how does marrying music and technology help foster collaboration?
What's special about combining technology and music?
I think technology opens up new pathways for us to collaborate with each other, ways that we can collaborate at a distance with each other, and also ways that we can collaborate with the technology.
Right?
So it can enhance human to human collaboration.
But there's also so many possibilities in kind of the human-computer relationship.
So a machine musician can take on intelligence, it can analyze what someone's playing.
It can become a really incredible partner that can spur new ideas and push you in directions that you might not think of otherwise.
Last question for you is, what advice would you have for anyone who might be interested in inventing their own musical instruments or music technology?
What advice would you leave people with?
Just do it!
It's so easy right now.
We live in a time where you can download free and open source software to your laptop and start programming code to make music.
You can start building your own virtual synthesizers.
You can get a few sensors and start putting something together.
There's so many tutorials and opportunities to learn online or to join communities of people that are doing similar things.
And so I think we're at an incredible point right now where anyone, whether you're an elementary school kid or you're retired, or anywhere in between, has the tools and kind of the ease of use at your disposal to begin experimenting and innovating.
And so just start and look at the examples.
At the Guthman Musical Instrument Competition website, we've got videos of all of our past finalists and winners so you can get some inspiration from them, and then just come up with an idea and start hacking on your own.
Awesome.
Well, thank you so much, Dr.
Jason Freeman.
It was so wonderful to talk to you.
Thanks again.
Thank you!
Awesome.
Thanks.
I really enjoyed it!
All right, that was amazing.
Thank you so much, Dr.
Freeman.
We're going to have to start planning our inventions for next year's competition.
Yeah.
Okay.
Well, then what do you think?
I mean, I guess let's just workshop now.
A bow, something like stringy?
I don't know?
But that's probably not original, is it?
Well, he said you can use anything.
So, what if - OOH.
Okay, hear me out.
We make an instrument that's made out of some sort of food, so then the big showmanship at the end, like, we play something beautiful.
Everyone's crying, everyone's like, that was the most beautiful music you've ever heard.
And then right at the end, when everyone's clapping, we eat it.
And then they're like, oh, my God.
Is this, like, ASMR stuff?
Yeah.
I don't know what the music's going to be.
I haven't figured out the instrument part, but the theatricality of, like, if we make something out of food and then we eat it, like, how dope would that be?
I'm right there with you, we're going to win this.
Yeah.
You're telling me we're not going to win if we eat the instrument?
Come on.
It's going to be cheesesteak based...
Yes, Go Birds!
Yes!
All right.
Well, okay.
Let's stop talking about hypothetical inventions, no matter how good they would be, because next we're going to be learning about some current musical innovations that are taking place right here in Philly.
Next time you hear our voices, we'll be reporting on the ground, outside of the studio.
This isn't awkward at all.
So weird.
We don't have headphones on.
We don't have anything.
And today, we are joined by Dr.
Youngmoo Kim.
Thank you so much for joining us.
Hey, it's my pleasure.
Thanks for visiting!
Absolutely.
We are excited to have this conversation.
First, introduce yourself and tell us everything that you do.
Everything?
Oh man, how much time do you have?
No, I'm Youngmoo Kim, I am the founding director of where we're standing, what's called the ExCITE Center at Drexel University - Expressive and Creative Interaction Technologies.
This is a research institute about the intersections of technology and creative expression.
So technology and the arts, but so much more.
I am also Vice Provost of University and Community Partnerships at Drexel University, and a faculty member of the Electrical and Computer Engineering department here at Drexel.
Could you explain the space that we're standing in?
It looks like a play pen.
Yeah, it is kind of a play pen!
So this is our collaboration space.
We have meetings here, we do work here, we have workshops and events - sometimes musical performances as well!
And a lot of K-12 outreach activities as well.
We host a lot of camps and after school programs here.
But why don't we head over to the piano, since that is sort of the centerpiece of this space.
Yeah yeah, let's do it.
Hell yeah.
I guess, as we go over there, can you explain the mission behind the Music and Entertainment Technology lab?
Yeah, so my lab is one of three labs that are based here at the ExCITE Center.
Our mission is really to explore that future of music and media through technologies.
So this is one of our examples.
This is what we call the Magnetic Resonator Piano.
It's a standard grand piano - right?
(Plays piano) But it's also augmented with electromagnets.
Those electromagnets don't touch the string, they're about a quarter inch away from the string.
But by varying that magnetic field using electricity and electronics, we can vibrate these strings in very unusual ways.
So let me fire it up...
And, Kirsten, can you explain to the listeners what this looks like right now?
Yeah, it looks like a grand piano that has gone through like a science fair.
You know, in movies when somebody has to, like, do the wire thing so that something doesn't explode?
Oh, yeah.
Kind of looks like that as well.
Yeah.
I will say, first, that nothing here is dangerous!
Second, nothing is destructive to the piano.
We can actually lift all of this off and pack it up in cases and install it on other pianos, which we have done.
This system has been used in concerts and recitals on multiple continents.
It is on the soundtrack to a Disney movie, Christopher Robin.
Yeah, the composer for that soundtrack, Jon Brion, loved the sound of this and did a recording with my former student, Andrew McPherson, who did a lot of the development work around this.
So I'm going to invite my friend and colleague, Daniel Belquer, our artist-in-residence here - an amazing musician, composer, and technologist - to demonstrate a few things on the Magnetic Resonator Piano.
(Daniel plays Magnetic Resonator Piano) So, it's a piano that can do vibrato.
(Daniel plays Magnetic Resonator Piano) Oh, my God.
See, hearing that come out of, like, a traditionally acoustic instrument is so insane.
It was amazing!
It had the - I mean, I noticed a little bit of what you were doing, but it had the vibrato on the keys, and you were just moving side to side.
Yeah.
Are you able to talk about that a little bit?
Because I know, like, normally when you strike a piano key, it's just - ding!
Absolutely, it's game over, right?
With the piano, it's just about how hard you press the key.
But with this, because we are using the electromagnet to continuously vibrate the strings, right.
And all of this is acoustic, by the way.
People always ask, is there a speaker in there?
No, this is purely acoustic.
This is string vibration.
But because we can control that continuously - there's a special sensor over the keyboard that doesn't get in the way, and it's controlled right from the keyboard.
It sends information to a computer, which then generates the electromagnetic signals that go through an amplifier here.
So we can control that sound very, very precisely.
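The control chain Dr. Kim describes - key sensor to computer to electromagnet drive signal - can be sketched in miniature. This is a hedged illustration, not the lab's actual software; the base level, depth, and 5 Hz rate are invented values chosen only to show how a vibrato becomes a continuous control signal.

```python
import math

# Illustrative sketch: the drive level sent to one string's
# electromagnet, modulated by a slow sine wave to produce vibrato.
# All parameter values here are made up, not the lab's real ones.

def vibrato_amplitude(t, base=0.8, depth=0.15, rate_hz=5.0):
    """Electromagnet drive level (0..1) at time t seconds."""
    return base + depth * math.sin(2 * math.pi * rate_hz * t)

# Sample one 5 Hz vibrato cycle (0.2 s) at a 100 Hz control rate.
samples = [vibrato_amplitude(t / 100) for t in range(20)]
print(f"drive ranges from {min(samples):.2f} to {max(samples):.2f}")
```

The point is only that the gesture on the key maps to a continuously varying signal, rather than the one-shot strike of a normal piano key.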
But maybe I'll ask Daniel to comment since he has played this instrument multiple times and what that feels like.
Please.
No, it's amazing because it completely opens up other dimensions of the instrument, right?
So, you have to play using the regular piano technique, but also other ideas.
So, as you were saying, like how to sustain a note after it has been hit, which is not feasible on a regular piano.
Yeah, I've taken the basics of piano lessons, but nowhere in the basics did they say, "hey, wiggle your finger at the end of it and you'll get more sound."
It's always just, strike it, that's it.
Well, yeah - and if you go to a classical concert, there are these virtuoso pianists who do that, right?
Who want that ability to kind of continuously affect the sound, like you would on a violin or with a voice or with a wind instrument.
You can't do that on a traditional piano, but on ours you can!
That gesture becomes meaningful.
And how much control would you say you have over that wiggling?
The software is completely configurable, so it can actually change the waveforms that drive the piano.
So you can have things that go really fast and stop, or things that drive the note really slowly.
So if I just - (Daniel holds note on Magnetic Resonator piano) - hold the note slowly, this sound will stay here.
So as you press the note, you start hearing - (harmonics excited) - you see, "bo be be bep" It's just a single note with the harmonics being excited.
After all, we're at the ExCITE Center.
Haha, absolutely!
So harmonics are a different mode of vibration that you, traditionally, you can get on stringed instruments like guitars and violins.
You can have the open note, or if you lightly press your finger on a string without actually pressing it all the way down, you get double the frequency and you can actually get triple, or quadruple, as you go up higher.
That's sort of the basis of vibration and of music.
But normally you can't do that on a piano.
(note played) That's the harmonic series for this note.
Normally, all those notes, all those pitches happen when you press one note.
Right.
But here we can separate them out and you can do it on the whole chord.
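The harmonic series Dr. Kim is describing follows a simple rule: each harmonic is an integer multiple of the string's fundamental frequency. A quick sketch - the 220 Hz fundamental (the A below middle C) is just an illustrative choice, not a value from the demo:

```python
# Each harmonic of a vibrating string is an integer multiple of the
# fundamental frequency. 220 Hz is an illustrative fundamental.

fundamental_hz = 220.0
for n in range(1, 5):
    print(f"harmonic {n}: {n * fundamental_hz:.0f} Hz")
# The 2nd harmonic is an octave up, the 3rd an octave plus a fifth,
# and the 4th two octaves up - the "double, triple, quadruple"
# frequencies mentioned above.
```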
(plays Pachelbel's Canon using harmonics on Magnetic Resonator Piano) This is so cool.
And Youngmoo, is that specific equipment that's on that piano, is that the only one or are there duplicates of it now?
There are a couple now, it's under five.
Oh, okay.
Wow.
So, yeah, very few.
Yes.
And we do get requests from musicians and composers who want to use the system.
Okay.
What room are we stepping into now?
This is the workshop, and this is where the magic really happens.
There's a ton of different projects here.
This is where a lot of our work and our research happens.
I'm going to hand it over to Daniel.
Do you want to tell them?
So the name of this project is Music, Not Impossible.
It started off in 2014 as a way for the deaf to experience music through vibrations on the skin.
And this project has been receiving many awards and celebrity endorsements.
Lady Gaga used this technology at her Dive Bar Tour launch in Nashville.
Oh, can I put this on?
Oh, my gosh.
Let's get you all suited up.
Thank you, I would love one.
I totally wasn't expecting this.
All right, so, yeah, for again, the listeners, this is a vest.
It kind of takes on a laser tag-like structure, and it vibrates.
Okay, so it seems like we are fully suited.
So Bey and I are here in 2023's hottest new fashion.
We are wearing vests that look, like I said, like laser tag, or like VR maybe, but we are about to hear music with them on.
Well, the vests each have a bunch of vibration motors.
So the goal of these is for someone who doesn't have normal hearing, or is deaf, can still experience music not through actual sound, but through vibration through the skin.
So the system is designed to kind of transmit that feeling of music just through the vibration.
And for folks who are differently abled and may not have full hearing - have you had any personal accounts from them?
I can only imagine the experience...
Well, Daniel will tell you.
Well, one, Music, Not Impossible has a great website, and there are some wonderful videos and testimonials there.
But two, literally bringing people to tears.
For someone who hasn't been hearing, experiencing some connection to music is a very moving experience.
Also, even for people who aren't deaf - being in an environment where hearing audiences and non-hearing audiences are dancing together.
Mmmmm...
As we expanded, we saw it was not just for the deaf.
It was appealing for everyone as an augmented experience.
So now you're in a venue, you can't tell who is deaf and who is not.
And this is super special.
This is really cool, too, because this kind of, like, calls back to So Curious! season one, where we were talking about human enhancement and wearable technology.
You said about crying, I'm thinking about - I've seen those videos of people who are colorblind putting on the glasses and seeing color for the first time.
And those always make me cry.
I can't even imagine what this is like.
Okay, we're going to start with a piece created by my friends from LA, at a studio called Beacon Street Studios.
So let's get started.
Let's do it.
(Music starts playing) Oh, yeah.
Oh, man.
So again, we have this vest.
It's wrapped around our torso.
I'm trying to make note of what instrument is playing, but as each instrument plays, it vibrates at different parts of the vest and the torso and the ankles and wrists.
And it's like bringing dynamics into the area.
Every little side piece is, like, being activated right when it should - they weren't all going at once.
And it feels like how on a piano, sometimes you're only using, like, two keys and sometimes you're using, like, six of them.
And I'm feeling it in my back now.
Yeah.
It's really difficult to describe, but the best I can say is, like, the instruments are individualized in different areas of the vest.
Oh, that was crazy!
Yeah.
It can move in waves...
Yeah.
And it travels from one part to the other.
Definitely the neck and shoulder.
I know we're supposed to be describing, because we're professional podcast hosts, but this is insane.
I am doing my best to not dance.
I know.
Which is really awesome because I'm like, the vibrations, really kind of like, get you going.
And we can make them sound too, when we want, because part of the experience for the hearing is actually to produce the sounds on the devices.
So depending on the frequency, we can make them sound.
So as you see, now...
There's intensity, now there's intensity.
So there's light vibrations, and then there's like, wow, that was a strong one!
Oh, my God.
So there's like a strum of a guitar that I feel in my arms.
Yeah, there's no sound now - it's just you guys.
Yeah.
That's just what's on your motors.
So now imagine 100 people or something.
It's just that strong.
Yeah.
Wow.
Yeah.
It feels like an entire band.
How did you program the vibrations to go along with the specific music that you choose?
So we have a few ways of doing that.
The initial goal of the project was to broadcast live music to the audience at a concert.
Right.
So we would get several channels of instruments from the mixer on the stage, like the drums, the guitar, the vocals, the bass.
And then we would just transmit it wirelessly to the audience.
A few years back, we were working with Mandy Harvey - I don't know if you're aware of her, she was one of the finalists on The Voice, and she's been deaf since she was 18.
She's a friend of ours and a very close collaborator.
So she said, "Daniel, I really would like to feel my recorded songs, because I had to produce them and record in the studio, but I cannot sing to a recorded track.
I have to have the band, because I need to touch the piano, I need to feel the vibrations through my bare feet."
So I started working on a system for us to play the vibrations in sync with the song, and now it has evolved into a full composition platform.
So now we can design and we can also combine both.
We can have the live music with prerecorded stuff as well.
So creating all kinds of crazy effects, you can control - Wow...
You have 24 points on your body, and they are individually addressable.
So think about LED lights, right?
If you want to control each LED, and to do a different effect, you can do it.
It's the same thing with vibrations.
We can make movements.
We can create all kinds of textures individually controllable in a composition.
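Daniel's LED analogy can be sketched as code. This is purely illustrative - the 24-motor count comes from the conversation, but the "wave" effect and its parameters are invented for the sake of the example:

```python
# Each vibration motor is individually addressable, like the LEDs on
# a strip: every motor gets its own intensity each time-step, and a
# "wave" is just a pulse of intensity sweeping across the indices.
# Everything except the motor count (24) is invented for illustration.

NUM_MOTORS = 24

def wave_frame(t, width=3):
    """Intensity (0..1) for each motor at time-step t: a pulse of the
    given width travels along the vest, wrapping around the end."""
    frame = [0.0] * NUM_MOTORS
    for offset in range(width):
        frame[(t + offset) % NUM_MOTORS] = 1.0 - offset / width
    return frame

# Advance the wave a few steps; the peak moves one motor per step.
for t in range(3):
    frame = wave_frame(t)
    print(f"t={t}: peak at motor {frame.index(max(frame))}")
```

A composition for the vest would then be a sequence of such frames, possibly mixed with frames derived from live instrument channels.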
One might say, like, having these buzzing pads on us is a little, like, disruptive.
But for me, the experience is it's instigating movement.
It's like telling us where to dance, how to dance.
It feels like the Dance Dance Revolution game.
It feels like it's showing you how to dance.
100%, it's physical sensation.
I mean, the moment that track hit, I started jumping.
It was like we couldn't not!
Yeah yeah, no, it was really cool.
That's amazing.
Wow.
We did one more thing to show you!
Yeah, let's do it!
Okay, so we have one more thing.
So we have to lose the vests, all right.
So I guess the theme was about the future of music, right?
So we showed you kind of the future of music being new instruments, accessibility, but it's obviously also going to be around computing and AI technology as well.
So, this is Charis, Charis Cochran.
She's a PhD student in electrical engineering here at Drexel.
She is using AI to automatically learn the sounds of musical instruments, alright.
So, how do you do that?
Yeah, yeah!
That was my first question!
First of all, how?
How do you train an AI to do anything?
How do you do that?
Yeah, so basically, we've gotten together a bunch of data.
So it's about 5 hours of music where we have labeled the predominant instrument in each of these examples.
So there could be other instruments in the background, but say, for one example, you have flute and then, like, cello and things going along with it.
With that, what I've done is train a diffusion model, which is similar to a lot of the architectures you see in the text-to-image space.
But now we're taking these musical samples, adding a lot of noise until you can't recognize them anymore, and then training this model that will take out all of the noise in steps and produce music.
And then we condition it on these instrument labels so that we can control which instrument we're trying to produce here.
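What Charis describes - corrupting clean samples with noise, then training a model to reverse it - is the standard forward process of a diffusion model. Here is a toy, hedged sketch of just the forward (noising) step, with a sine wave standing in for an instrument recording; none of the names or values come from the lab's actual code.

```python
import math
import random

# Toy forward-diffusion sketch: progressively corrupt a clean signal
# with Gaussian noise, the way training data is prepared before a
# diffusion model learns to reverse the process step by step.

def make_signal(n=64, freq=4.0):
    """A pure sine wave standing in for a clean instrument sample."""
    return [math.sin(2 * math.pi * freq * i / n) for i in range(n)]

def add_noise(signal, alpha_bar, rng):
    # q(x_t | x_0): keep sqrt(alpha_bar) of the clean signal and mix in
    # sqrt(1 - alpha_bar) worth of Gaussian noise.
    keep = math.sqrt(alpha_bar)
    mix = math.sqrt(1.0 - alpha_bar)
    return [keep * x + mix * rng.gauss(0.0, 1.0) for x in signal]

rng = random.Random(0)
x0 = make_signal()
for alpha_bar in (0.99, 0.5, 0.01):
    xt = add_noise(x0, alpha_bar, rng)
    overlap = sum(a * b for a, b in zip(x0, xt))  # shrinks as noise dominates
    print(f"alpha_bar={alpha_bar}: overlap with clean signal = {overlap:.2f}")
```

The trained model runs this in reverse, from pure noise back to signal, and the instrument label is supplied as extra conditioning input at each denoising step.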
There you go.
And that's how you train it.
Oh, my God, it's that easy, Bey.
Why didn't we think of that?
So this is using deep neural networks, which are the technology behind a lot of modern AI.
It's just that it's being applied specifically to music data.
It's not just, hey, here's my entire music library.
It needs to know, okay, this recording is primarily cello.
This recording is primarily piano, right?
So there is a data set that Charis has been using that is labeled that way to train this.
And what we're going to demonstrate is a couple of things that it's learned about what a particular instrument sounds like.
So eventually it would be able to - let's say I'm a pianist, and I'm listening to a song that I love, and it's a full band, and I want to isolate just the piano part to hear it.
Like, it would be able to isolate just certain parts.
Yeah, that definitely is one application of AI.
So these models can be used for source separation.
Right now, a lot of them are being trained just for generating new pieces.
So if you give it noise and don't tell it anything about the context of the music, it'll make up whatever kind of piano it wants to.
Wow.
So we have a couple of examples here.
And this is still early days with training this model, but this is noise turned into flute.
(AI-created flute sample plays) And just when she says noise, what it started with was, "PSHHHHHHH".
Oh, my God.
We say, hey, give us a flute, it turns it into a flute.
Nothing musical was given to it, like no notes or anything, it's just that.
Yep.
It's just that that is quite amazing!
Damn!
And then we have saxophone.
(AI-created saxophone sample plays) Yeah.
And they're short because it takes a lot of processing power to generate these right now.
So we just create, are they three second, five second clips?
Yeah, three second clips here.
Okay.
So here's trumpet.
(AI-created trumpet sample plays) Wow.
So there's a little bit of variation, but yeah, no musical information was given to it.
And there's a lot of like, the conversation around AI is very, very fresh and new and it's being thrown around a lot.
Hot topic.
Yeah.
What's a misconception that people aren't getting or just completely missing?
I mean, the reason things like ChatGPT are so good, or scarily good, is because they've been trained on all the text on the internet, right?
And there's a lot.
Which is enormous.
And there's a lot and that's all written by humans, so it sounds very human.
Like doing it with music data, even though there's tons of music out there, it is very hard to get that kind of specificity.
Right, you can go listen to whatever you want, but if you try to say, I only want, like, saxophone solos, that's actually still a hard thing to find.
So it's about labeled data.
It's also, sadly, a bit about copyright.
In the image space, in the text space, it's just much easier to get lots and lots of data.
On the music side, it's been harder because most of that stuff is copyrighted.
And you can try to train models using public data, but then you're not using necessarily the latest or the most popular types of music.
So that's a bit of a challenge.
So more and more places are trying to create agreements with companies or record labels to get more and more of that data for training.
But that's the frontier right now.
Wow.
So if I'm understanding what you're saying, it's basically that AI can listen to a saxophone all day, but until someone says that's a saxophone, it doesn't know what it is.
Is that accurate?
Yeah, exactly.
Wow.
So this is for both of you then.
Why do you think it's important to push for innovation within the world of entertainment and entertainment technology?
Because there is a lot of pushback.
The entire history of music and entertainment has been a symbiotic relationship between creativity and technology.
You don't get one without the other.
We wouldn't have had advances in filmmaking, in stagecraft, in operatic performance, or certainly in electronic media as well, without technology infusing some very, very creative people with new ideas about how to express themselves.
So it's a symbiotic relationship.
If you take one half of that away, you're going to lose all that.
That technological advancement is absolutely necessary for the creativity of the future.
And we saw a lot of technology here, but what future development in entertainment tech, even if it's just a concept, are you excited about?
Yeah.
What do we have to look forward to?
So I can say a little bit more about the types of models that I'm working on.
I'm looking at models that you could say, "okay, generate a full song" or "generate a full album", and then you could then have tracks for each of these instruments.
And instead of a blank slate to work from, now you've got a little bit of creative freedom and, like, a starting point to go off of.
Wow...
Some people have compared this to, like, discovering fire.
Is that like, how do you feel about that?
We won't know for another couple of decades, or maybe centuries, particularly with AI.
A lot of us do think this is an inflection point, right.
AI is capable of doing things that we thought were impossible when I was in graduate school 25 years ago; it's advanced that rapidly.
It's doing things that are, in some ways, very scary, and I acknowledge that.
And there are plenty of ways that AI could be used for evil, for ill.
I would say that there are also plenty of ways AI can be used for good.
It's the sounds that we haven't even conceived of, right?
The instruments, the sounds, the combinations that you just cannot practically experiment with in real life.
Opening that up digitally will only enable more creative and artistic possibilities in the future, I think.
Hell, yeah.
What a great place to close.
Awesome.
Thank you so much for having us.
Thank you for visiting!
We're used to having people in the studio, but we're here in your actual space.
This is amazing, we should do this more often!
Come back again!
We'll have more stuff!
Oh, yeah, we will!
Yeah, we'll come back in a couple of years, and I'm sure it'll be like something crazy new that Charis came up with!
Thanks again, Dr. Kim, for having us.
That was honestly amazing, I'm mind blown!
There are just so many amazing innovations in this world of, like, music tech.
It's crazy that we're going to get to see how they're all going to be integrated into the music world in the coming years, but I cannot even figure out how they're going to do it.
Yeah, and listeners better be sure to tune into next week's episode because we're going to look at even more innovations, this time, how music can affect our physical performance and health.
And so there's really interesting things where doctors are even considering prescribing arts experiences for general health and wellness.
So listeners, please, please subscribe.
Why have you not already subscribed?
Bey, they haven't subscribed yet?
Yeah, and they have to give us five star reviews, for real.
You have to give us a five star review.
If you haven't subscribed yet, I mean, I appreciate that every time you want to listen, you just have to type us in and search us.
But man, if you subscribe, you're going to get a notification every Tuesday this summer when we release a new episode.
So, go ahead and do it wherever you listen, and we will see you next week!
This podcast is made in partnership with RADIOKISMET, Philadelphia's premier podcast production studio.
This podcast is produced by Amy Carson.
The Franklin Institute's director of digital editorial is Joy Montefusco.
Dr. Jayatri Das is the Franklin Institute's Chief Bioscientist.
And Erin Armstrong runs marketing, communications and digital media.
Head of operations is Christopher Plant.
Our mixing engineer is Justin Berger.
And our audio editor is Lauren DeLuca.
Our graphic designer is Emma Seager.
And I'm the Bul Bey.
And I'm Kirsten Michelle Cills.
Thanks!
Thank you!
See ya.