Generative AI + Creativity Student Lightning Talks Part 2

Hi, this is Wayne again with the topic "Generative AI + Creativity Student Lightning Talks Part 2".
OK. Welcome back, everyone. For our second session, we will once again open with some work that is really right at the cutting edge. You're going to hear from a range of students, and then we're going to go to our second panel, where we have voices from visual arts, from literature, and from human-computer interfaces. So it's quite an exciting second session as well, continuing some of the themes from the first.

To kick us off, please join me in welcoming graduate student Vera van der Sepp. [ MUSIC PLAYING ] [ APPLAUSE ] Hey. Can you hear me? Yes. So this is not me, as you can see. But today I'll be talking about a project I'm working on called Tomorrow's Typography. Basically, all of the work I do involves AI, new technologies, and typography. And maybe not all of you know typography; you're here for AI. But typography is basically what language looks like. Yeah. And I'm very interested in figuring out how typography, and visual communication as a whole, are intertwined with emerging technologies, which is something that has been happening since the invention of the printing press.

But now, let’s just zoom in on the last six years, which is about as long as I’ve been working in this, which in generative AI time is like a dinosaur.. When I started, I had to build my own computer because it was more efficient to basically generate this.. Well, I don’t know if you can actually read any of this, but this is only three years ago.

And it’s just so interesting to see and follow this progress that we’re seeing now. And yeah. You could already do interesting things like back in 2020, with, for instance, exploring the latent space and morphing between typefaces or even the latent space of one letter.. So you could go between one letter –. You could basically make it look differently., But now in the last year — – and I think ChatGPT is one year today – — we’ve seen such an immense technological, advancements.

And yeah, this is great for type, and for visual communication, because you can generate typefaces and basically make texts look different.

And I’m mostly interested in figuring out how generative AI can help type designers and can help creativity.. So, for instance, how can we have generative AI help, automate a lot of type design that normally takes years to make, Or how can we even make animations just with a single prompt And also I didn’t make this.? This is a movie — Minority Report., But I’m also really curious in seeing how we can actually also move towards more intuitive type interfaces outside of laptops and PCs., And we already have the technology.. We just need to get there., So I’m basically inventing tools for typography, but also for creativity as a whole., And I’m really excited to work on this in the coming months.. So if you’re interested in these future sketches it’s on the third floor. And please come by for that. Thank you., [ APPLAUSE, ] [, MUSIC PLAYING ] Hi, everyone., Hi., Hi everyone..

My name is [ INAUDIBLE ], and I am a first-year PhD student in the City Science group at the MIT Media Lab. Today I'm going to talk about how we use generative AI for collective urban planning. In City Science, we design tools like this one for urban prototyping and simulation. We test different scenarios in an area of the city that is usually under development.

In this case, in Kendall Square, we have different stakeholders (citizens, decision-makers, designers) coming around this physical table and editing parts of the city, while we see on the screen the repercussions of these changes in different metrics. And while we have already used AI in these tools, we have mainly been focused on efficiency: we have been optimizing individual aspects of the city, like traffic flow.

Now we are using generative AI to focus on the actual experience of people and places. We have built this tool that you can see here for rapid scenario testing in a much more immersive setting, where we can actually speak our thoughts to trigger new designs. So let's take Kendall Square in Cambridge as our reference again. We can make changes to the physical layout.

We can change features such as the height or the density of the buildings. We can then navigate the model by finding a location on the web user interface, share the vision that we have for that place in a qualitative way, and see the result in different renderings. The idea of this tool is that we can iterate over and over in this process until we reach a consensus. For the past few months, we have been gathering insights from people using this tool in the lab downstairs, and we have asked them:

What do you want your city to look like? These are some of the results that we are getting. So, to wrap up: we are using generative AI in collective urban planning to explore critical ideas and, mainly, to transform individual inputs about how we all imagine the cities and communities of the future into a shared and actionable vision that we can actually implement. With these tools we are prototyping locally, but they allow us to envision global stories. That's all. Thank you for listening.
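
As a rough illustration of the loop described above (speak a qualitative vision for a place, combine it with the current layout, and render what that place could look like), here is a minimal sketch. The model choices, the prompt template, and the render_vision helper are assumptions for illustration, not the City Science group's actual stack.

```python
# Illustrative "speak a vision -> render the place" loop (assumed stack, not the real tool).
import torch
import whisper                                   # openai-whisper for speech-to-text
from diffusers import StableDiffusionPipeline    # off-the-shelf text-to-image model

asr = whisper.load_model("base")
renderer = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")                                     # assumes a GPU is available

def render_vision(audio_path: str, location: str, layout_notes: str):
    """Turn a spoken, qualitative vision for a place into a rendering of that place."""
    spoken_vision = asr.transcribe(audio_path)["text"]
    prompt = (
        f"Street-level view of {location}, {layout_notes}. "
        f"Residents' vision: {spoken_vision}. Photorealistic urban rendering."
    )
    return renderer(prompt).images[0]

# One iteration of the consensus loop: stakeholders speak, the scene is re-rendered.
image = render_vision("vision.wav", "Kendall Square, Cambridge",
                      "mid-rise buildings, higher green density")
image.save("kendall_scenario.png")
```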

[ APPLAUSE ] [ MUSIC PLAYING ] Hello. Hello, everyone. Hi. My name is Pat. I'm a PhD student in the Fluid Interfaces group, and I want to talk about how we can use AI to actually help us cultivate wonder and wisdom. For your interest, I love dinosaurs; that's why I dressed like this today. But these dinosaurs are very important, because they're actually how I got into the Media Lab.

When I was really young, I loved dinosaurs, and my parents told me: oh, you love dinosaurs, so you should pay attention in art class. That way, you'll learn how to draw your own dinosaur and all these wonderful things.

But you should also pay attention in science class, because that's where you get the biology and the wonder of nature, right? And that idea of personalized learning, using the thing that you're interested in to drive learning, has been the central idea I've been focusing on.

Imagine in the future: if we love Einstein, you can actually learn from Einstein. Or if you love Marilyn Monroe, Monroe could be your teacher. Or Harry Potter could teach you, I don't know, quantum physics. Seems magical, right? We published a first paper showing that a personalized AI character can have a profound impact on learning. We have an experiment showing that if you learn from someone you might like or admire, a.k.a. a virtual Elon Musk, you might actually think more, pay more attention in class, and have a deeper interaction with that lesson. But who likes Elon Musk now?

Now, That’s why I think it’s much cooler if you actually have a dinosaur as your virtual character., It’s free motivational., You don’t need to pay for copyright., But that idea also led us to thinking well beyond just learning from this virtual character. You can actually make someone from the past so that you can learn about history in a more interesting way.. We built a system that allows us to do that and have shown that this system, if you complement it with reading textbook, can actually make people be more curious about history and have a higher level of motivation to learn about that..

But this system always hallucinates. The Mona Lisa AI could say she loves MIT because she's at this amazing event. So I suggest that maybe we have a dinosaur Mona Lisa instead, to constantly remind us that this is a recreation of the past, not exactly the past. And I think it's much cuter, too.

The last project I want to mention (my time is running out) is using AI to actually create multiple versions of the future.

So not only can we recreate the past; we can also allow people to imagine themselves in the future. We use generative AI to help people think of their future self when they're much older, like at the age of my advisor, for example. I think that would help me think more about what is important. And you can do multiple versions of you, not just one. The reason behind these multiple versions is that if you have one AI, you might actually follow that AI blindly. Our studies have shown that people tend to just take on the perspective of the AI, so if the AI becomes more extreme in one way or another, you might just follow it blindly. That's why, with the power of AI, you can generate multiple versions of yourself that you can learn from, get multiple perspectives, and finally go from mere intelligence, which is a narrow way of thinking about how to be effective, to being able to have wonder and wisdom. I think that's the most important goal for AI research. Thank you so much. And with a dinosaur, too. Yeah.
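
A minimal sketch of the "multiple future selves" idea, purely as an illustration: several persona prompts answer the same question so that no single AI voice dominates. This assumes an OpenAI-style chat API; the persona texts, model name, and question are made up, not the system described in the talk.

```python
# Illustrative only: querying several "future self" personas to collect multiple perspectives.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

personas = [
    "You are me at age 60, having prioritized family over career.",
    "You are me at age 60, having prioritized research impact above all else.",
    "You are me at age 60, having prioritized community and teaching.",
]
question = "Looking back, what should I spend more time on this year?"

for persona in personas:
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": persona},
            {"role": "user", "content": question},
        ],
    )
    # Each future self answers the same question, giving contrasting perspectives.
    print(persona, "->", reply.choices[0].message.content, "\n")
```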

[ APPLAUSE ] [ MUSIC PLAYING ] Hi, everyone. My name is Hope Schroeder, and here is an image of what I hope to wear as a PhD graduation cap someday. I am a PhD student at the MIT Center for Constructive Communication here at the Media Lab, and today I'm going to be talking about cocreating with AI and how it can spark new ideas. We can take for granted that people are using AI tools for image creation; DALL-E is extremely mainstream, as are Midjourney and Stable Diffusion. So anyone can take their idea, use a text-to-image model, and generate some ideas. Why AI for this use case? Well, it's making creativity more accessible and ideation faster.

In some research, we wanted to understand how cocreating with AI actually affects the design process. Our first question was: do AI-generated images affect what we create in real life, off the screen? So we had people do an activity where they built sculptures after brainstorming with AI text-to-image models. We had each artist look at text-to-image model outputs of their choice, and then we gave them some sculpture-building materials. You can see our happy participants making all kinds of crazy creatures. Afterwards, we found that AI-supported brainstorming does affect physical design in 3D space.

Here we have, on the left, an AI-generated image of a building with a red roof, green grass, and a shiny river. And on the right we have an object that someone built from that image: a building with a red roof and a shiny river made of CDs. This is, of course, an extreme example, and not everyone was so literal. But 75% of people self-reported that seeing these AI-generated images did affect their final design, and overwhelmingly people said they would use these creative tools again in the future, reflecting what we already know: that these tools are here to stay. If you're curious, you can read more in this workshop paper from AAAI earlier this year. Our next question was: why are AI-generated images so useful in brainstorming, like we saw? So we had people have a conversation with each other about cocreating the future.

Two participants had a conversation about what they would like to see in their community in the future, and here's a real example. Someone and their partner wanted to see biophilic vertical gardens making public space more beautiful. We gave that to a text-to-image model (this was before the DALL-E and Stable Diffusion high-fidelity era), and we got this image of a vertical garden in a public space. We then had interviews with people, and some patterns emerged.
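
For context, here is roughly what that "give the idea to a text-to-image model" step looks like with today's off-the-shelf tooling. This is a hedged sketch using the OpenAI Images API with DALL-E 3, not the earlier model actually used in the study, and the prompt wording is illustrative.

```python
# Sketch of turning a participant's idea into a brainstorming image (illustrative only).
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

idea = "biophilic vertical gardens making public space more beautiful"
result = client.images.generate(
    model="dall-e-3",
    prompt=f"A city plaza in the future: {idea}",
    size="1024x1024",
    n=1,
)
print(result.data[0].url)  # URL of the generated image to show back to the participants
```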

First, the images did spark new implementation ideas. Talking to someone who was reflecting on seeing their idea visualized: notice here that there were no pumps and tubes. The insight was that maybe natural features could be used to design more beautiful public spaces. Second, we found that these images sparked associations with unexpected concepts. There was no ocean featured here, which was surprising, but it gave the person a new idea.

You can read more in this paper later. So why do these images matter? Well, our research shows that they introduce new ideas, which is important for the creative process, and that what we visualize in 2D affects what we create in 3D. You've already heard a lot about that from [ Vault ] and Kathy and others in this session. There's a lot of interesting research still to be done, and we hope you'll take a look at our Science perspectives piece from earlier this year outlining some of the ethical questions about labor and creativity in this space. Thanks so much.

[ APPLAUSE ] [ MUSIC PLAYING ] Terrific. Well, thank you. I just have to say, this is my 22nd year as an MIT professor, and the dirty secret, which all faculty know, is that the students are smarter than us, and we just try to keep up. And that was further evidence, more data points. Thank you so much. [ CHUCKLES ] [ APPLAUSE ]