Amelia Winger-Bearskin Interview

 


Speaker 1 [0:00]

Appreciate.

Speaker 1 [0:02]

OK. So I guess just to give a little bit of background on this project: I'm writing with one of my previous professors from the Texas Immersive Institute.

Speaker 1 [0:14]

And we're doing a book chapter about remix and how AI is, you know, affecting the artistic or creative act of remix and the art and creative world in general.

Speaker 1 [0:28]

So could you maybe give a little bit of background on your first exposure to AI applications and using that in your creative process?

Speaker 2 [0:40]

Sure.

Speaker 2 [0:41]

So I am Amelia Winger-Bearskin, and I'm the Banks Preeminence Chair in Artificial Intelligence and the Arts at the University of Florida's Digital Worlds Institute. And I'm the founder and director of the AI Climate Justice Lab.

Speaker 2 [0:56]

I first started working with AI when I was a professor at Vanderbilt, I think around 2006, 2007, with a professor named Dr. Pratim Sengupta, who was working at the Mind, Matter & Media Lab at the Peabody School of Education in the area of learning science. And we were interested in studying how creativity and innovation can be taught to children from a computational standpoint with AI curriculum and education.

Speaker 2 [1:23]

I, you know, as an artist, was thinking about creativity and innovation differently than the learning scientists were. But we wanted to do, you know, qualitative and quantitative research so that we could give good recommendations for the Obama-era STEM classes. At the time, that idea of STEAM was also being introduced, and how arts could be critical for learning STEM. You know, I feel the same way now as I did then, which is that adding arts to STEM is kind of silly rather than seeing all of this as a comprehensive way of teaching someone. Art isn't something you throw in on top of it, and STEM isn't something necessarily separate from the humanities. These are all very integrated. So even at the time we were like, I don't know about the STEAM thing, but we were trying to get at least the type of vernacular that professors and teachers and educators and legislators were using to think about creativity, right?

Speaker 2 [2:20]

And at that time, in the 2006, 2007 era, AI was just coming out of its phase of being considered something more like a philosophy, something that wasn't considered really hard computer science in a lot of institutions, which is silly, but that's how it was back then, you know. And it was starting to move into having some real results with multi-agent modelling. And so we were getting some good traction, and it was some of the first, you know, conferences and papers being written where people were beginning to think that AI was something that could move beyond the theoretical and into practical applications.

And then in 2015, I founded the DBRS Innovation Lab. It was housed in a Wall Street company; DBRS was on Wall Street, and the lab was dedicated to creative applications of machine learning technologies. So combining quants with cutting-edge data visualisation libraries. And I had a different artist come into our lab every month and do a residency with my quants, my machine learning scientists, my UX designers, my, you know, D3.js experts; at the time, that's kind of how we were doing visualisations, and we would do different projects.

So the first project that we did was working with a convolutional neural network and visualising that in virtual reality. We created the very first AI-generated screenplay, which won a short film festival; it was called Sunspring. You may have seen it. Right. So that was in our lab. We did something where we interviewed the Guardian: we took all 6 million of the Guardian's news articles (the Guardian is a publication in the UK) and embedded them into, at that time, it was just GANs, and then tried to interview it as a person, to kind of ask: what would the Guardian, as a singular being, do in this interview? So there was a lot of research and expression in that way. And the reason for it was that a lot of the time series data that Wall Street uses, in the types of products that you might make money off of or be able to introduce in the market, it's still unregulated to use AI there; I think even now it's still unregulated. So we didn't want to use actual time series data and then make suggestions within the market, because that would be illegal. But we wanted to show people the capabilities of what AI could do. And using these creative projects to do so helped people think, oh wow, we didn't know this was possible. And you have to understand, in 2015 there was no ChatGPT or DALL-E or Midjourney or anything like that. So people really were like, what do you mean I can make an image? That seemed ridiculous to people: oh, it can draw a box, I can do that in code.

 

Speaker 2 [5:02]

So we really needed to kind of push the limits to show people the capability of what was possible with AI. And at that time, there were some publicly available models, but really it was just a bunch of researchers at different universities communicating with each other, like, oh, you created this.

Speaker 2 [5:19]

That's great, we can try this here. Oh, Google has open-sourced this one notebook, OK, we can combine it with that. Can we rewrite this, you know, statistical format into JavaScript? Can people input that? Right? This was all very, very collaborative, and the big problems hadn't been solved. Not that all problems have been solved now, but these kinds of large language models and things hadn't been solved in such a way, so we were all experimenting, right? And there wasn't the one way that we were doing even object recognition, right?

Speaker 2 [5:55]

So it's pretty early, even though that's not that long ago, you know, trying to figure out, can we even do this in JavaScript? Right?

Speaker 2 [6:03]

So, you know. Right. And computationally expensive, you know; because we were Wall Street, we were able to spend a lot of money on AWS credits in a way that I think would be a lot more optimised now.

Speaker 2 [6:16]

And that was probably the first time I was able to work with a lot of artists in

my lab. And then, of course, I was using it in my own work, both as a virtual

reality director and XR director and as an artist and filmmaker.

Speaker 2 [6:31]

And I continue to do that. Now, the new thing that I've added to my practice is working with a lot of environmental data. That's why I'm the head of the AI Climate Justice Lab. I have this project called Talk to Me About Water, and it's a group of water scientists and machine learning scientists.

Speaker 2 [6:47]

One of our collective members is a machine learning scientist who also builds flying cars and planes. We have others who work at the USGS in water data publication. We have others who are, you know, musicians and writers and human rights activists and dark-patterns researchers (OK, my dogs) who study sort of AI and user patterns and dark patterns in that way. So we have a very diverse group. You can check this out at talktomeaboutwater.com and see the other members. But we have this collaborative, and our goal is to create conversations around the global water crisis in public forums: museums, libraries, conferences, film festivals, places where we would be able to engage and interact with the public in a way that a lot of environmental scientists don't get to. They don't get to have that kind of impact, where they really, you know, educate and talk to the public in that way.

Speaker 2 [7:44]

And so that's sort of what began the AI Climate Justice Lab: looking at how AI could work in fostering those communications, but also helping to open source a pipeline so that others can explore environmental data through some of these new, creative, and powerful tools. An example of that would be... [speaking to her dogs] OK, come here. What is it? Babe, come on. What's going on? Crying gets a treat.

Speaker 2 [8:15]

An example is, one of our members will create deepfakes of water systems and the way in which they would look based on different variables. Like, maybe this watershed will disappear in five years and this is what it would look like, or maybe this area will be flooded in five years and this is what it would look like, to help people understand how climate change will affect things, through the sounds or the visuals or the animations, right? Really to be able to understand what it means when these changes occur. The National Park Service, a unit of the federal government, has an architect for climate change to help make some decisions around those things. So oftentimes we just need the data to help people understand what is going to happen and how we can plan for it accordingly and make decisions based on that, as well as certain areas where maybe prevention is part of that plan as well.

Speaker 1 [9:10]

Right. Exactly. And I, I know we kind of jumped into it already. But did you

have any questions for me about the project and the interview in general?

Speaker 1 [9:22]

Oh, sorry.

Speaker 2 [9:24]

I'm just getting my dogs their treat.

Speaker 2 [9:26]

After five, they're done. They're always like, OK, we want off Zoom, we want to play. And I gave one of them a jacket, so I have to give the other one a jacket. It's kind of cold right now, and they're wandering around in their little jackets and socks.

Speaker 2 [9:40]

Well, tell me, what are you excited about with this project? What are you excited about doing these interviews for?

Speaker 1 [9:47]

I mean, the whole project is about understanding how AI is impacting the creative fields and how it's impacting the process or the workflow of creatives.

Speaker 1 [10:01]

And it's, you know, it's constantly changing. We have all these new applications coming out, and obviously it really recently exploded with ChatGPT, and now it's becoming very popular.

Speaker 1 [10:11]

And there's, I think, a lot of consequences with regard to how AI is going to impact all industries, including the creative industries.

Speaker 1 [10:23]

So I'm really curious about how artists should engage with these new AI applications: understanding the implications for their job security and, you know, their intellectual property, enhancing their creative workflow, or maybe seeing how AI detracts from it, or how it can create new forms of art and maybe destroy old conceptions of art. So there's really a lot of new things to think about, and it's constantly evolving.

Speaker 1 [11:02]

Absolutely. And I know you did some work with the Wampum Codes podcast. I listened to some of it, but I'm curious to hear more about your ethical framework for development and using data, and how you use co-creation.

Speaker 1 [11:25]

Yeah.

Speaker 2 [11:26]

Yeah. So I'm sending you a link really quickly to an article I wrote, "Is AI Art Real Art?" I talk a bit about, you know, how new art forms will form new art, right? Exactly as you stated.

Speaker 2 [11:39]

And a little bit about how I contextualise AI and art.

Speaker 2 [11:43]

I have been asked to speak a lot about the most recent SAG-AFTRA writers' strike, and AI's role within that, and what I think of it as an AI researcher. So I am happy to answer some questions, specifically if you're interested in going into that, because I do talk a bit about that. I had a panel recently at the imagineNATIVE Film + Media Arts Festival, the largest media festival for Indigenous people globally. And I did a keynote panel there with a member of SAG-AFTRA around, you know, AI and rights and how it's going to affect things. Specifically, we were talking about the film industry, because it's a film festival. But we also brought on an artist who works with AI and makes sound art using AI. So we kind of talked about it from those areas. So that's something I feel confident talking about. But you started with a second question, which was about an ethical framework for software development, with Wampum Codes and the framework that I've outlined, and I can get you a link to that as well.

Speaker 1 [12:45]

And that's separate from Mozilla?

 

Speaker 2 [12:49]

It is.

Speaker 2 [12:50]

Yes. So I was a Mozilla fellow when I developed this framework.

Speaker 2 [12:55]

So, you know, it was part of my individual research while being supported by Mozilla. So let's say this one is the right one.

Speaker 2 [13:07]

Hm. That's one of them. The other one would be OK.

Speaker 1 [13:22]

Yeah, I got this.

Speaker 2 [13:29]

I think this is it.

Speaker 2 [13:36]

Mhm.

Speaker 2 [13:53]

Oh, here it is. OK. This one is where the Mozilla Foundation kind of summarised it. I also did this as a workshop; I've done it many times for different kinds of organisations, start-ups, nonprofits. And essentially we go through it, and it's not about me having an ethical framework that everyone must adhere to. It's actually quite the opposite. It's about individual companies, groups, and developers being able to articulate the shared values that they have and how they embed those into the code. I think there's a lot of incredible work done by AI ethicists in academia and in policy. And I find that where the rubber meets the road, as a developer, you're like: I read that great book, Weapons of Math Destruction or whatever book inspires you to think about it from a policy standpoint, and I understand what's at stake. But how does this affect my code? Like, actually, I'm building this on a team. How do we do this ethically? And so this is really more geared towards that team, the team of people who are building something, saying: OK, what are our assumed values? Let's talk about those. Do we actually agree? We think we agree, but do we really agree? How do we reach consensus and agreement? And then how do we test the limits of those assumptions? How do we flip the script and imagine what somebody on the opposite side might do in order to try to break or exploit our system? Right? So, you know, codes of conduct are only as good as your ability to enforce them. And so think about that kind of thing too: how are you going to enforce it? Right now, when it comes to Creative Commons or open source technology, we have to litigate in order to, you know, decide something, right? If someone breaks your code of conduct, you know, this is open source and you close-sourced my open source thing, you could litigate. It's not the greatest thing if you're just a nonprofit open source group to be litigating against a big corporation. But that is essentially based in trademark and copyright law, nationally or internationally. And this brings a different concept, where you can decide as a group what type of enforcement you want. Maybe you have an enforcement that is just: this is when we meet on Zoom every week; show up and talk and tell us if you think somebody's misusing this, and we can talk about it as a group and reach consensus. Maybe something very informal like that. Or it could be something more formalised: you know, people report you, and your API key is revoked, and you're no longer able to use the updated code, right? There are different ways in which you can think of enforcement. And when I do this workshop, I usually suggest that people do something that's within their ability and interest to enforce, and not set laws or rules that they have no interest in enforcing, or no ability to, because then you might just be stating that something is safer than it is, right? Like, if you can't enforce it, you might as well state that to people: hey, we can't enforce this. And that's OK. But it's good to explain that, right?
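The more formalised enforcement she describes, where members report misuse and past some point an API key is revoked, could be sketched roughly as follows. This is a hypothetical illustration by the editors, not code from any of her projects; the class name and report threshold are invented for the example.

```python
# Hypothetical sketch of community-governed enforcement: members report
# suspected misuse, and past a threshold the offender's API key is revoked.
class CommunityGate:
    def __init__(self, report_threshold=3):
        self.keys = set()            # currently valid API keys
        self.reports = {}            # api_key -> number of misuse reports
        self.report_threshold = report_threshold

    def issue_key(self, api_key):
        # Grant a member access to the shared, updated code.
        self.keys.add(api_key)
        self.reports[api_key] = 0

    def report(self, api_key):
        # A member reports suspected misuse; at the threshold, revoke access.
        if api_key in self.keys:
            self.reports[api_key] += 1
            if self.reports[api_key] >= self.report_threshold:
                self.keys.discard(api_key)

    def can_access(self, api_key):
        return api_key in self.keys


gate = CommunityGate(report_threshold=2)
gate.issue_key("alice")
gate.report("alice")
print(gate.can_access("alice"))   # still has access after one report
gate.report("alice")
print(gate.can_access("alice"))   # key revoked after the second report
```

The informal alternative she mentions (talking it through on a weekly Zoom call) needs no code at all, which is exactly her point: the group chooses an enforcement mechanism that matches its actual ability and interest to enforce.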

Speaker 2 [16:49]

So that's really where this workshop comes from: saying, you know, how do you bring this kind of accountability into your space? And for the start-ups that have embraced it, I usually encourage whoever is the most senior person that you feel would need to sign off on this to be there. So often the founders are there; even if it's a small developer team working on something, the founder is in the room, and they're talking and they're listening. It's very eye-opening for them to say: oh, that's interesting that the developers assumed these are the values that we had, or that they have different values when actually we agree with them. That's interesting; we didn't realise there was this disconnect. And so this workshop can help people align their values and reaffirm developers' agency to raise these questions. Oftentimes, again, it's a matter of communication and alignment. If a developer is saying, this thing is unethical and I've raised flags and no one's paid attention, this is a moment where people can pay attention and say: yeah, we are taking this seriously and we will make decisions based on this, but we didn't even understand where the ethics concern came from. That's where I find it's a way of designing a product or company or start-up from values, versus designing a start-up or a company or code or a program and then having the design determine the values based on how it functions.

Speaker 1 [18:10]

Yeah.

Speaker 1 [18:12]

Yeah, that's fascinating.

Speaker 1 [18:16]

Yeah, so I guess I'm also curious about where you think the responsibility is distributed, because, you know, you have these big tech companies, social media companies, for example, where they implement AI algorithms that optimise for something that may not be in the best interest of the user, which is their attention in the case of social media.

Speaker 1 [18:40]

And so there's kind of this alignment problem with figuring out how to align the values of AI with users and the general population.

Speaker 1 [18:49]

So how do you think that responsibility for dealing with that alignment is best distributed?

Speaker 2 [18:58]

I mean, for the framework that I've developed, I think it's very important to have the highest level of stakeholders in the room. Otherwise...

Speaker 2 [19:05]

It's just a silly exercise, which can be fine. It can be fine even just for a team that's like: we have no power to make change, but we want to align and have a place to talk in solidarity about our issues. That's fine.

Speaker 2 [19:16]

I think if they want to make real alignment and impact in their company, it means having stakeholders who can make decisions, or who can hear them out and then, by hearing out the concerns of the developers, can make a choice to do something that's beneficial. I've been on panels with one of the original founders of YouTube, and, you know, he did similar things around raising ethical issues and flags until he eventually departed Google, because he felt as though he had raised it up to the top, he had outlined and explained and tried and tried to make positive change, and was unable to. And so, you know, sometimes of course there's going to be a point where, no matter how clearly you articulate your fears or how well you try to design a solution, you may choose, like him, to depart and say: I've raised my concerns, I tried to innovate around them, I tried to create a creative solution, and I failed. And whether you consider him personally failing or the company failing to listen to him, you know, I leave that to people who have researched and understood his situation more. But, you know, listening to him say: yeah, I articulated it, I tested solutions, I tried to innovate, and I had to move on, right?

Speaker 2 [20:33]

And other people in similar positions have done the same thing, whether they're whistleblowers or even just people who make a public statement as to why they've departed.

Speaker 2 [20:43]

And that's not really the use case that I created Wampum Codes for; it's not for when you've tried everything. This is more like the first step or the second step: you're still in the state of trying. And you'd be surprised how many companies are like, yeah, we'd like a system to just start trying from the beginning. That may be a key differentiator for us in the marketplace, to state we use ethical AI and this is how and why. That may be interesting for some new and upcoming companies. I've already seen two of my friends start a start-up where they're like: we use conversational AI to write copy in an ethical way, because we don't use it for anything that would put somebody out of a job.

 

Speaker 2 [21:25]

Interesting. I don't know how they define that, but that's already how they're creating a differentiation for themselves in the marketplace: use us instead of ChatGPT because we're ethical. A, we don't use copyrighted material in our models (which, again, question mark), and B, we don't use it for use cases that would put somebody out of a job (again, question mark), because this is a very early start-up, so I don't know exactly how they define all these things. But the interesting thing, I think, about the start-up is that they're already using ethics to differentiate themselves in the market, saying: don't use ChatGPT, use us, because we're more ethical. So I can imagine start-ups saying: OK, if people have concerns over copyright, putting people out of jobs, stealing images and regenerating and using them without people's consent, use us, we do something different.

Speaker 2 [22:15]

So that is an option: people could decide and determine that we don't like companies that have those kinds of practices, and then we as users and we as citizens and we as workers want to have a different way of working.

Speaker 1 [22:29]

Yeah, exactly.

Speaker 1 [22:31]

I guess, could you talk a little bit about your personal creative workflow, how you use creative AI and what AI tools you use in your personal creative process, whether it's for VR, videos, brainstorming, ideating, or anything else? And then maybe just name those tools.

Speaker 2 [22:50]

Yeah.

Speaker 2 [22:51]

I would say I use as many as possible to play with.

Speaker 2 [22:54]

I am aware of the carbon footprint that, you know, playing with AI has as an impact. I'm aware that a lot of these models are using copyrighted material and material that people didn't consent to. So I'm aware of all of these things, and also interested and curious about teaching them to my students and understanding how they work.

Speaker 2 [23:15]

I've tried using them for different projects, to make films, whether it's Runway or DALL-E or Stable Diffusion, you know, anything like that.

Speaker 2 [23:24]

You know, playing with ChatGPT or all the other different large language model APIs or, you know, AI-as-a-service products.

Speaker 2 [23:33]

Just to test them out and see their capabilities, because I want to understand how they work, and what their barriers to entry are, which are very low at this point; anyone can pick them up and use them.

Speaker 2 [23:44]

And then I let my students use them too and see like where are the

limitations?

Speaker 2 [23:48]

What is interesting about them?

Speaker 2 [23:51]

I also just use basic maths, right? A lot of AI is just based on statistical models that you can run in R, Python, or JavaScript, or, you know, wherever; you can put them in Unity if you want to use them for different types of, you know, shaders, however you want to do it, right? There are lots of different pipelines for how you can use these statistical models with high-performance computing devices. We have a supercomputer at our university that we're allowed to use for research as well. And so, yeah, you can play around and see the ways in which you can generate frames of video, or do a style transfer, or motion capture data analysis, or motion capture data analysis with style transfer, right? All those kinds of possibilities, from very specific things where we're kind of coding everything from scratch, and I love using off-the-shelf tools too. I liken it to the moment when George Eastman created the first portable film camera. Up until then, people had been taking photos for a long time, whether it was a daguerreotype or a silver print or a tintype. There were ways, you could even say, that people used photo-reactive red ochre on cave walls to imprint images of their hands, right? Very, very early photography, not even using a machine but just using the light that comes into a cave to make a type of photograph, right? Writing with light, right?

Speaker 2 [25:17]

And so I liken it to that moment that we're in now. It's like George Eastman just created a portable camera, so suddenly everyone could have this thing in their pocket; they could take a photo of anything, and then there were these standardised chemicals that they could either pay their neighbour to process or buy themselves if they were sort of DIY. And suddenly everyone in the world could take a photograph, and there were a lot of terrible photographs taken because of it. Right? A lot of nonsense photographs were taken. A lot of just silly ones, of your kid's wedding or your dog or things like that.

Speaker 2 [25:47]

But then they also started using it for things like criminal justice or medicine. It suddenly became standard practice, when somebody is diagnosing someone or running a clinic, to have photographs of the illness or photographs of the injury. Right?

Speaker 2 [26:00]

So suddenly it changed the way it was used, because it became so standardised and ubiquitous. And I think that's what we're seeing right now. It's interesting to look at something like, you know, Stable Diffusion or Midjourney and see what an eight-year-old does with it. Right? We can see that now, when I couldn't do that in 2016, because it was a little complicated and quite expensive. I couldn't just put it in the hands of a nine-year-old and be like, tell me what you want to do, and they could just type in words and then, boom. Right.

Speaker 2 [26:26]

And now we can. And it's kind of interesting, and I like this moment for that reason; it is this democratised moment. In my most recent talk, someone raised their hand and said, AI art is all kind of garbage and trash, though, right? And I was like, hm, that's interesting. I mean, what do you think AI art is? Right? It's so cool that suddenly people now think they know what AI art is. Whereas in 2016, I would say "AI art" and people were like, what on earth are you talking about? What could you possibly mean when you say AI art? And now it's all been flattened down to, it's only one thing and it's DALL-E. It's only one thing, and it's maturing. That's interesting. It's not going to stay that way for very long; I think we'll expand the definition quite a bit. But it's interesting to be at a state in time where, A, everyone thinks they know what it is and, B, everyone can participate in it. And it's kind of cool that he thinks it's all trash, because that's probably what people thought of most photographs when they started taking them. Right?

Speaker 1 [27:17]

They think, oh, you don't need any skill for it, so then there is no art.

Speaker 2 [27:22]

And everyone said no one will ever paint again, no one will ever make a sculpture again; like, all of Western civilization, art is dead because of the camera. And it was true that everything changed. But it didn't mean we stopped making art. It didn't even mean we stopped painting. It just meant that abstract expressionism happened. It meant that, you know, obviously postmodernist artists were saying: what if we make readymades? What if we have performance art? What if smell is art, or an idea is art? It did change everything. I don't think it changed for the worse. I mean, at that moment when the camera was invented, if you look at Western painting, my God, it was like the decorative arts. Sorry, but not my favourite; I really could do without it. So, I don't know, I think for me art became a lot more exciting, right?

Speaker 1 [28:07]

Are you optimistic about AI, then, and how it's going to affect, I guess, artistic and creative communities?

Speaker 2 [28:18]

I think I'm optimistic about AI in the hands of artists, for sure.

Speaker 2 [28:21]

I think artists always do wacky and weird things with, with any new tool and

giving them new tools will always be the most delightful use of the tool you

could possibly imagine.

Speaker 2 [28:32]

I mean, I would just say that kind of flat out, right? It's really delightful and incredible to watch artists use new tools. It's not as delightful to watch, you know, governments and policing and war using new tools; that's quite disturbing and terrifying. So I'm definitely not an across-the-board AI optimist, because it is a superpower, and it can be used for real negative things and it can be used for real positive things. It's kind of like any incredible new energy force or energy system.

Speaker 2 [29:06]

It has possibilities to make hospitals able to run efficiently, and it also has possibilities of, you know, terrible destruction. So, I don't know, I would be a measured optimist, you know, in the hands of... yeah.

Speaker 1 [29:24]

So you would suggest artists play with it, all the new technologies, engage as much as possible along the line?

Speaker 2 [29:29]

I don't know. I mean, you know, I like the way that artists engage with it, and they don't always engage, and most artists engage with it from a very critical lens. So I would say, yeah, artists should engage with it, but that may not mean using it; that may not mean approving of it. And a lot of artists engage with AI art by heavily critiquing it and pushing against it. That's cool too, right? I think artists should pay attention to it.

S2 Speaker 2 29

53

 

And if they have problems with it, they should absolutely voice those, and demonstrate where, why, and how things might be better or different. That's the coolest thing about artists: we get to imagine the world as it is now and how it might be, without actually having to prove it.

S2 Speaker 2 30

10

 

It's just ideas.

S1 Speaker 1 30

12

 

Yeah, I guess, how do you hope to see the future of art with AI, and how it will affect art and artistic communities?

S2 Speaker 2 30

27

 

Oh, I mean, the desire I have for any type of technological innovation is the hope that it can bring a more just and equitable world: the hope that it can do things that are dangerous for humans, so that we don't have to be in danger, and that it can do things that are tedious and annoying for humans to do, so that we can enhance our lived experience and our shared experience of being together.

S2 Speaker 2 30

54

 

I would love to imagine that AI can do difficult nine-to-five, low-paid jobs, and humans can be creative and write and make poetry and art and the things that we care about as a culture, rather than the opposite, right? I don't want to have to work at McDonald's just so that AI can write poetry and make art, right? And when you're talking about any type of labour replacement, like, OK, the AI works at McDonald's, I don't mean that the people who are currently those labourers now have terrible lives and can't make a living and all that. So the true dream is that technology will make our lives better and easier. But when we look at what the Luddites were fighting against at the turn of the century, they were saying that these things, like electric lights and mechanisation and production lines and the industrialization of, you know, our grain farming and our weaving of fabric, did not make their lives better. That's what they were fighting against. They were saying, you've invented all these new, faster ways of doing things, and because of that we now have worse working conditions, and children are going to work and being exploited. All these issues were suddenly happening, rather than the promise of, oh, you'll only have to work a couple of hours because these machines will be doing it. They lacked regulation.

S1 Speaker 1 32

18

 

So the benefits, the fruits of the increased production, weren't necessarily divided equally, as they saw it. But yeah, I know we're kind of running low on time. I'm not sure.

S1 Speaker 1 32

33

 

Do you know if you have a couple more minutes? Yeah, I have a couple of minutes.

S1 Speaker 1 32

36

 

OK.

S1 Speaker 1 32

40

 

Let's see. I guess, are you familiar with remix and, like, the creative process?

S1 Speaker 1 32

55

 

We actually define it like in music: taking a previous art form and altering it or tweaking it, and basically creating something new.

S1 Speaker 1 33

04

 

Like, I guess typically you think of sampling, sampling music, remixing. Do you use that in your creative workflow or process at all?

S2 Speaker 2 33

13

 

I do, in different ways. Like, I have this one piece called "Say Indian" where I took 60 different westerns, and then I used Python to find every single time any of the characters said the word "Indian," and then made a supercut of just them saying that word. You get to see all the clips from across all these films, and what it kind of says at the end, when they're all mixed together, is that they're never saying it for a good reason, you know. No one ever says it kindly; they're always, you know, with a gun, like, "Indian!", right? And it all just piles on top of you until you're like, oh wow, that's an amalgamation of the way we represent, the way we see, indigenous people in westerns. So that's a fun piece that I consider a remix.
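She doesn't walk through the script itself, but a minimal sketch of this kind of supercut search, assuming each western has an SRT subtitle file available, might look like the following (the `find_word_cues` helper and the sample cue text are illustrative, not from the interview):

```python
import re

def find_word_cues(srt_text, word):
    """Return (start, end) timestamps of subtitle cues that contain `word`."""
    cues = []
    # SRT cues are blank-line-separated blocks: index, timing line, text lines.
    for block in srt_text.strip().split("\n\n"):
        lines = block.splitlines()
        if len(lines) < 3:
            continue  # malformed cue: need index, timing, and at least one text line
        timing = re.match(r"(\S+) --> (\S+)", lines[1])
        if not timing:
            continue
        text = " ".join(lines[2:])
        # Whole-word, case-insensitive match.
        if re.search(rf"\b{re.escape(word)}\b", text, re.IGNORECASE):
            cues.append((timing.group(1), timing.group(2)))
    return cues

# Tiny illustrative subtitle snippet (not from a real western).
sample = """1
00:01:02,000 --> 00:01:04,500
Look out, Indian raiders!

2
00:01:05,000 --> 00:01:06,000
Let's ride."""

print(find_word_cues(sample, "indian"))
# [('00:01:02,000', '00:01:04,500')]
```

The resulting timestamps could then be handed to a video tool such as FFmpeg to trim each clip and concatenate them into the supercut.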

S2 Speaker 2 33

58

 

I write my own musical compositions. I use some samples, but they're mostly samples from nature.

S2 Speaker 2 34

05

 

Oftentimes ones that I've taken myself, and then I remix those sounds of birds and water.

S2 Speaker 2 34

10

 

In my early video art days, I did a lot more video sampling and worked a lot more with found footage. I think nowadays I tend to take my own footage and then remix it with statistical models or code, right?

S2 Speaker 2 34

29

 

But I do think that mixing or remixing with code would fall under that

definition, right.

 

S1 Speaker 1 34

37

 

Yeah, definitely. And even, you know, with ChatGPT and the way the LLMs process data, it's in a lot of ways remixing other people's data into a new form.

S2 Speaker 2 34

47

 

Oh, yeah, absolutely. It's a predictive model, and it's so easy to trick that it's fun to kind of find how it was remixed, because it's very easy to mess it up and watch it scratch and see how it loops, right?

S2 Speaker 2 35

04

 

My students and I do quite a lot with that, like trying to get it to say weird things, or just to, I guess, poke holes in the unsmart parts of it, right? I think it's important for us to hit those limitations and be like, it said this thing with a lot of confidence, let's see if it says it again the second time we ask, and then, you know, kind of Googling and doing research and comparing how it's affecting people's perception of the world and truth.

S2 Speaker 2 35

37

 

Yeah, I think it's important. I created this thing called the Stupid Hackathon, and I think it started in 2014, 2015, and now they kind of have them all over the world, these stupid hackathons where people do a lot of cultural remixing, using code and technology to poke holes in things that are accepted as very smart but are actually quite stupid, right?

S1 Speaker 1 35

58

 

Yeah, I guess when you ask your students to do projects or explore these new technologies, do you suggest they use them for a certain part of the process? Like with ChatGPT, say, how do you implement iteration with these tools? For example, instead of saying, oh, put a prompt into ChatGPT and turn that in, how do you iterate on that? Do you focus on using or collaborating with these tools?

S2 Speaker 2 36

24

 

Yeah, we do it a lot of different ways.

S2 Speaker 2 36

27

 

You know, they build a lot of NPC systems for game worlds that are inspired by science storytelling, so they can train some of their own models on how to create conversations with the user. So they may use it for that reason. But they're also, again, aware of the limitations, because they might say, you know, tell me how many petals this specific flower has, and it's not always correct, right? It's not always going to give you a correct answer. So they're aware that it's kind of fun for asking certain questions but maybe not for other things.

S2 Speaker 2 36

58

 

They use image generators, again, for certain types of assets in their games and in their game design and some of their artwork. A lot of my students are computer scientists, so they make their own models as well. So they're like, I don't like Stable Diffusion, I made my own version and I like mine better, and mine can do more for whatever artistic reason, for whatever video project they may do. One of my students created the first never-ending zoom in Stable Diffusion. He was about 15 when he made that, and it kind of got famous on the internet and everything. It was just this kind of zoom that never stops. And then eventually he remade a sort of competitive model to Stable Diffusion so he could do it even better, right, to create something that was smoother or more beautiful in his mind. So some of them are innovating at a technological level with these models, and others are just, you know, using Midjourney the way many consumers do and then using that imagery.

S2 Speaker 2 37

58

 

Either in some of their ideation or character development boards, or in mood boards around the look and feel of their games before they model it out.

S2 Speaker 2 38

08

 

Or they may use it for specific assets: one of my students made all of the wanted posters in her Western game with Midjourney, right? So maybe the whole game isn't AI-generated, but certain assets are kind of a fusion, right? A lot of them are using it in a hybrid fashion.

S1 Speaker 1 38

22

 

Really cool.

 

S1 Speaker 1 38

24

 

I'm realising that I have a minute left in this Zoom call, as I do not have premium Zoom, but thank you so much for meeting.

S1 Speaker 1 38

33

 

This has been really great, and I'll send you a follow-up email with more information.

S1 Speaker 1 38

40

 

And do you by any chance know Erin Riley? She's the professor that I'm collaborating on this with. I think you guys should connect at some point, because you're very similar and I think you would get along, both being in the remix space.

S2 Speaker 2 38

54

 

I don't know her, but I would love an intro. Let me know. That's super cool. Yeah.

S2 Speaker 2 38

59

 

Amazing. Well, have a great weekend.

S1 Speaker 1 39

04

All right, I'll talk to you later. Bye.