What trends will dominate media and technology by 2030, and how will they impact our lives? Speakers joining this panel include:

Yaël Eisenstat, Policy Advisor, Center for Humane Technology

R. Luke DuBois, Co-Director & Associate Professor, NYU Tandon School of Engineering Integrated Digital Media (IDM)

Desmond Upton Patton, Associate Professor of Social Work and Associate Dean of Curriculum Innovation and Academic Affairs, Columbia University

Justin Hendrix, Executive Director, NYC Media Lab

Tony Parisi, Unity Technologies

NYC Media Lab Summit ’19 Innovation Panel: Media 2030

TRANSCRIPT: (auto-generated, last update 5pm 10.1.19)

Justin Hendrix: 00:05 If you think about the way our media and information system, or ecosystem, is wired at the moment, certainly there’s an enormous investment that we’re going to need to make together, not only to have good lives and livelihoods in this industry here in New York, but also to make sure that ecosystem is future-proof, set for tomorrow, and will get us all where we want to go together. So in order to talk about that a little bit more, and to kind of set up this idea of the Media 2030 discussion, I’ve put together a panel, and I’m going to invite my panelists to come on out. I’m going to grab my notes while we do it, and you can welcome them to the stage.

Justin Hendrix: 00:52 I wanted to have just a little bit of a chat about where we see things going. So I’ve sent these folks some questions in advance, and I hope they’ve had a moment to think about them. But the general frame for this was: let’s think about the decade ahead. Let’s think about the trajectories. Let’s think about, if we’re in this room together in ten years, what are going to be the things that we’re going to look back on that we accomplished? What are going to be the things that are still challenges? What are going to be the research questions in the universities? What are going to be the problems and the opportunities in industry? How are we going to advance this question? So I’ve got a great group of people here to do this. And to my left here, I want to welcome Yaël.

Justin Hendrix: 01:38 Thank you very much. Yaël Eisenstat here. You’ve got bios in your app, of course, but she’s a policy advisor at the Center for Humane Technology, and I think she’s going to tell you maybe some other affiliations that are notable in her past, so she’ll get onto that. I’d recommend you Google her quickly if you want the full context. Just to her left, I’d introduce Dr. Desmond Patton, who’s associate professor of social work at Columbia University and founding director of, I think, one of the most interesting labs in New York City, the SAFE Lab. If you don’t know it, I want you to look it up and acquaint yourself with Desmond’s research; I think it’s very important. And to the left of him you’ve got Tony Parisi, who’s got an incredible title at a company that... how many of you know Unity Technologies? How many of you know Unity?

Justin Hendrix: 02:26 All right, still some work to do to tell everybody what it’s about. But he’s at the game engine Unity, running global ad innovation there. Enormous company. When’s it going to go public, Tony? You can’t tell us that. All right. So here’s the thing: I think these game engines are a sort of early signal of where media, all media, not just AR, VR, and games, are going. We talked about the media-as-data thing earlier, so it’s an incredibly important perspective, I believe. And then Luke DuBois, my partner in crime at the NYU Tandon School of Engineering on a variety of different projects, co-director of the Integrated Digital Media program, and a professor of engineering there, with a music PhD, which maybe he’ll get onto in a little while. So on this panel we’re going to try to cast our minds a little bit forward and think about the decade ahead, and I kind of want to put just a simple question to them at first, going down the line: what do you think is the most important technology trend?

Justin Hendrix: 03:29 How would you frame the most important technology trend that’s going to shape things over the next decade? And I’m going to start with Tony Parisi, ’cause he’s ready to go.

Tony Parisi: 03:38 So as Justin alluded to, I’m into spatial computing, the interface going 3D. That is not the most important technology trend; it’s a very important one. The most important is what’s happening with artificial intelligence and machine learning intervening in our lives on a daily basis, presumably to make our lives better, to help us. But it’s also the same family of technologies that will be used for deep fakes and other things that can be incredibly scary going forward. Imagine when we can do deep fakes in real time, for example. So everything we’re doing with artificial intelligence needs stewardship and a lot of thought at every moment going forward.

Justin Hendrix: Desmond, is it AI?

Desmond Patton: 04:22 What’s interesting for me is that I’m less interested in the technology that we will have, and more in the questions that we will ask of these technologies. As someone who leverages AI to study gun violence, what I’m most concerned about is the surveillance state that may be caused by the use of AI. And so I think that we have to think very critically about the extent to which we bring in diverse voices and opinions around how we create and ask questions of these technologies in the future.

Justin Hendrix: 04:49 We are going to get onto that, but I’ll ask Luke and Yaël on the technology trend. Is it AI as well for you, Luke?

Luke DuBois: 04:54 Yeah, but I would qualify it in the same way that Desmond did. AI concerns me, but what concerns me more is the idea of a future where not everybody has access to engage with the technology on a level playing field, right? And so when I think of Media 2030, I think we have this opportunity to set the tone for a decade of equity, or better, a future where we can really talk about who gets to put content out there, how that content is gatekept, all that kind of stuff. And yeah, AI definitely keeps me up at night. But people, my students, you know, that’s what gets me up in the morning: the idea of their voices getting heard in that conversation.

Justin Hendrix: 05:42 Yeah, it gets you up in the morning and keeps you up at night, so you never sleep. Like having a child, right? So that helps.

Yaël Eisenstat: 05:50 All right, I’m going to bring this home by continuing the somewhat downer of a trend here. For me it’s not about any particular technology. It’s the removal of human critical thinking and human analysis from the future of all of these things that concerns me, and an over-reliance on data without human intelligence.

Justin Hendrix: 06:08 Okay, so let’s take stock of what they’ve said. They’ve gone on about AI, of course, and what that will do to the mechanism not only of collecting data and information and media, which may well look like surveillance, but also of generating it in real time, deep fakes being the example here. And obviously, lots of questions about the health of the media ecosystem; it feels like we’re on a precipice of some kind. Maybe we always feel like that, I don’t know. But if we’re going to make progress, what are the questions we’re going to have to answer over the next decade? What are the hypotheses in your lab, Desmond?

Desmond Patton: 06:46 Well, who gets to be human is the thing that I think about a lot. In my work, I’ve been studying young Black and Latinx people and their experiences on social media. And we messed up: we had imbibed narratives about the experiences of young Black youth, and totally missed how they are seeking help on social media, how they are experiencing trauma and grief and loss in a way for which they’re trying to get support, and how that communication changes over time. Because we were framing it in a way that was punitive and negative, we missed complete narratives and we dehumanized them. And so I think for us, we want to make sure we bring in young people to help us unpack these things and ideas, and we want to make sure that their thoughts and opinions are at the forefront, at the center of how we are doing our work.

Yaël Eisenstat: 07:40 Yeah. Um, what are the questions we’re gonna ask?

Justin Hendrix: 07:45 Yeah, the questions we’re gonna ask. I know you’re concerned in particular about the global implications of social media, and maybe you’ll mention, you know, what your background is with regard to that.

Yaël Eisenstat: 07:54 Sure, with that little prompt: my background. I spent most of my career in the national security and foreign affairs world, in government, and then went to Facebook as their head of Global Elections Integrity Ops and walked out in under six months. So that’s the context for my statements. I would say, to get to a healthier media state in the future, to the point where we can actually both trust media again and start having the kinds of media that we want for a healthier global society, I think three things have to change: being first, being fast, and being free cannot be a sustainable model. If we want a healthy media environment, and unfortunately I know that everybody wants all their information for free now, if you want actual journalism you can trust, you can’t just expect it to be free. And it’s not just the free business model; it’s that we are forcing journalists to be first and fast, and not necessarily accurate, and clickbaity, in order to get the shares they need on the social media platforms. I promise I didn’t pre-think this, and I realized there’s three F’s in there, so that might be my new tagline: first, free, and fast.

Justin Hendrix: 09:14 It’s not sustainable. So: the problem of who gets to be human; they liked the three F’s; and the problem of how do we capitalize the search for facts. Tony, Luke, what are the questions we have to answer in the next decade?

Tony Parisi: 09:32 Well, I’m going to follow on with what Desmond said a little bit. I feel like technology has really depersonalized us in so many ways, and clearly people of color and underprivileged and underserved communities suffer from this far worse than most, but in general, everybody. I feel like we’re living in this Westworld at this point. We are living in this place where we’ve forgotten that we’re human. I think that what’s going on with gun violence in this country, in addition to being bound up in social inequality and racism and everything else, is also a function of the fact that I don’t think we’re thinking of those people as real people, as a society. So I think, going forward, really understanding what it means to be human and reconnecting with that is probably the most important thing, because we’re just so bound up in our media and we’re so, you know, responding to the clicks. I think that’s going to be the key question going forward.

Luke DuBois: 10:23 Yeah, and there’s a lot of blame to go around for how that happened. But one of the things that we research is the use of data visualization in everyday discourse, right? So when it went from a fairly anodyne context, where we’re in the boardroom and I’m showing how profits are going up and that’s all fine, or I’m using it in scientific research, to the front page of a paper, where I’m using a bar graph instead of a photograph, right? That choice is anesthetizing. Somebody made an editorial decision to treat people like numbers, or to represent people as numbers rather than people. And that kind of stuff is really pervasive. A lot of the agita that we have as a society about AI is actually symptomatic of that, right? The cause is really the datafication of us, and the symptom is, you know, sort of the world of AI. I have a prop that I thought I’d show you guys. This is a thing called a crackle box.

Luke DuBois: 11:34 It’s an electronic music instrument, developed by some friends of mine in the Netherlands. And the circuit inside of it is from a polygraph. So the thing that’s kind of fun about it is, this is one of these cases of technology never being neutral, right? Depending on context, I can turn this into a weird electronic children’s toy, but this same signal is the thing that tells me whether I’m lying in a certain machine context. And I think we have a lot of that around these days, and we don’t think enough about it. Fun fact: does anyone know who invented the polygraph? The creator of Wonder Woman, William Moulton Marston. [inaudible] I could go on about this. Also trying to lighten it up a little bit.

Justin Hendrix: 12:22 Well, since we’ve lightened it up, let’s go back to disinformation and elections just for a moment.

Luke DuBois: 12:27 Oh.

Justin Hendrix: 12:31 It’s kind of incredible what’s happening right now. It’s almost like disinformation is eating itself, in a way. We’ve seen the president apparently put himself in an impeachable position because he bought a disinformation story and pursued it right in office. We have a report out today from the Oxford project on computational propaganda. I don’t know if you’ve even seen it yet, but it said there’s evidence that 70 nations are engaged in social media manipulation, either targeted at foreign parties or at their own citizens. These are huge numbers that are kind of spiking up at the moment. But one of the amazing tables I was looking at in that report was the number of governments that are now investing millions and millions of dollars, hiring hundreds of thousands of people, to do basically information warfare. And that strikes me as extraordinarily concerning, and I think you’re concerned about that as well. What impact does that have? And, to the dystopia thing, are we at the beginnings of addressing that, or is it still going to get a lot worse over the next decade?

Yaël Eisenstat: 13:45 Okay, so I’m going to try not to take three hours to rant up here, which normally I would do with this question. First and foremost: propaganda, information warfare, none of it is new. The difference right now is the ease and scale at which it can be spread. So yes, propaganda, information warfare, it’s here; it’s not going anywhere. But if we want to back up and use, of course, Russia’s example in our last election: to me the biggest question wasn’t even, did the Russians hack our elections. The biggest question wasn’t, what did Russia do on Facebook. I mean, those were all important questions. My question was, why were we so easily manipulated and so easily persuaded? Why was it so easy to get us? And there are lots of reasons, but since we are here talking about the future of media, I want to talk about the social media angle of that.

Yaël Eisenstat: 14:42 And listen, I get it. It’s a country of free speech. Everybody has the right to say whatever they want, apparently; I have some thoughts on that. But anyway, it’s a country of free speech. But as they say often at the Center for Humane Technology, freedom of speech should not equal freedom of reach. And it is the fact that you have these platforms who are deciding what they will amplify, deciding how they will curate content, who have these algorithms. I love how they love to say that algorithms are amoral, or algorithms don’t have politics, or AI algorithms are going to figure this all out for us so that human biases aren’t put into news. Well, sometimes I actually want a little bit of human bias, to actually look at something and make an assessment of it. These algorithms are 100% programmed to keep your eyes on their screen.

Yaël Eisenstat: 15:31 It’s all about user engagement, and so the most salacious content wins. The clickbait wins. It is whatever it is that is going to keep your eyes on that screen, and that is where it ends up breaking down our ability to reason. It’s what breaks down our ability to decide what is fact and what is fiction. If I can give one quick little meta example: what brought me into this space to begin with was the breakdown of our civil discourse and exploring why this is happening, because I think the breakdown of civil discourse is one of the biggest existential threats to our democracy, and I’ve written enough about that; if you’re bored you can look that up later. But just a meta example, because good journalism can’t win in this environment: there was an interview with me about a month ago, and when they put out the piece, they put it out with a super salacious, clickbaity title, which, if you know anything about me, you understand is exactly what I talk about as being one of the problems.

Yaël Eisenstat: 16:34 And it was Facebook knows more about you than the CIA. Yes, I’m a former CIA officer cause I didn’t mention that. Hey that’s not what I said. B it’s super click baity and see it has kind of nothing to do with what we were talking about in the article. And yet that piece spread like wildfire cause everyone’s like Ooh. It says Facebook. It says CIA, this is super salacious. I guarantee you most people didn’t read it, but everybody spread it. So to me the biggest issue here is as long as these platforms hold no responsibility or accountability for the way they’re curating your content and what they are deciding, you will or won’t see, this won’t change.

Justin Hendrix: 17:15 Desmond, how does that correspond with your…

Desmond Patton: 17:17 Yeah, I think so. Yes, everything that you’ve said. And then I think what I’m most concerned about is kind of the privileged way in which we get to discuss these topics. The language that we use around disinformation and deep fakes and cheap fakes involves terms that are not widely dispersed or understood publicly. But if you really want to understand disinformation, you can talk to Black and Brown folks. I think there’s a big understanding of the ways in which information has impacted marginalized communities, and yet the language that we use does not allow us to enter into those conversations in a way that actually brings everyone to the table. And so I also think that as we engage these conversations, we need to think about a more publicly accessible language that allows everyone to contribute.

Tony Parisi: 18:04 Oh, as well.

Justin Hendrix: 18:06 Let me ask you guys, and I’m going to point this maybe a little bit towards the idea of change in the decade ahead. You know, the great thing about the Media Lab is I’ve got this kind of funny perch where I get to see what’s going on in the universities, but also in the school system and in the companies, et cetera. And so that, for me, is a great education. But I also see so many different opportunities to change the way those various systems work in order to address these problems. What do you think we have to do? If you had a magic wand and you could address the institutions of New York, the schools of New York, society more broadly, what would you change first?

Justin Hendrix: 18:44 that’d been, what would you change first? I’ll let you take that one on and while the others think about it,

Justin Hendrix: 18:48 gosh, so much. Um, so I will, let’s keep it immediately. Just for the day. I co-directed AI for all summer programs. So we have young people, black and Brown folks from New York city, New Jersey coming. It’s been three weeks for us from the summer. And we initially thought that we were going to be doing the training. We actually learned an immense amount from the lived experience. And so for example, we learned that a lot of young people who may live in NYCHA housing are being impacted by Fisher recognition systems already already. And they have deep concerns about that and want to do something about that. That was not built into our curriculum. And had we not had young people value their voices, we wouldn’t even have thought around thought about these implications. So I think that we have to again, invite a more diverse, younger set of people to be a part of developing the questions, the implementation and deployment of these technologies. Um, so I think that’s where I would like to start.

Justin Hendrix: What about you, Tony?

Tony Parisi: 19:46 I want to try to tie these last two threads together, because I like where Yaël was going, but it’s actually, I think, relevant to how we might change education. I’ve been thinking about this a lot in the wake of recent political events and the attack on journalism, this sort of fall of journalism that Yaël was talking about. I think we’re fundamentally in a place where we’re living in Andy Warhol’s world, actually. We have celebrity; we no longer have leadership and judgment. I feel like we’re rudderless, whether it’s in government or elsewhere. The golden age of news was all about having someone curate that stuff; it was Walter Cronkite and those people with a point of view. We’ve lost all that. And part of that is we do want things to be more open and democratic, and that’s okay.

Tony Parisi: 20:37 You know, we would all love the electoral college to go away, we think, especially if you’re in a blue state now, because, you know, it kind of sucks, right? But at the same time, there was a certain logic to the idea that there needs to be leadership and judgment and, potentially, you know, protections against rule of the mob. I think we’re heading headlong into rule of the mob, and I think we need to bring back some judgment. And from an education standpoint, what we could do in the education systems is encourage that kind of leadership. So I think it’s less about media and education and more about public policy and education in these other areas. I’m sorry, that was a bit of a ramble, but that’s how I’m feeling today.

Luke DuBois: 21:18 Yeah, no, I’d agree with that. I mean, I think media literacy is a big part of it. The thing that concerns me is, with all these new technologies that are coming online, who’s getting left behind in the conversation about them? So when I think about AI: yeah, AI is really scary and it has a lot of bias and whatever, but the way we improve the signal-to-noise ratio on it is to make sure everyone’s informed about what it is and what it isn’t. You could say the same thing about innovations in wireless networking, right? What is 5G going to do? What is better security going to do? All these things you can sort of point at and say, well, what we really need is a much better, more inclusive conversation about how everybody gets to participate in this.

Luke DuBoise: 22:04 And so there’s a lot of, you know, there’s a, the, in this kind of, you know, this reflects upon like my experience is sort of in the arts and when you, and when you think about, um, diversity inclusion issues in the arts, one of the things that’s often pointed out is that the real issue is the gatekeeping, not the artists. There’s lots and lots of artists from all backgrounds who make excellent work. And then the curatorial system for allowing whose voices gets through is incredibly biased, right? So that’s the, that’s the thing. You always sort of whittle away at, right. And so example of arts. Yeah. So arts organizations are now starting to sort of embrace this. IBM’s one of them who like we start saying, you know, we don’t just have artists in residence, we want to have curatorial fellows who are looking at equity, right.

Luke DuBoise: 22:48 Cause it’s one step up. Right? And so when I think about, you know, mixed reality or when I think about, you know, the, the way like the entertainment experiences of the future are going to happen. Or when I think about citizen science or citizen journalism or all that kind of stuff, I always think about like one step up. Like who’s actually empowering that voice to get out there? And how do we change their ethic to be looking for it for a deeper catchment of people. Like I think that’s one more positive angle on how to improve the signal noise ratio and say, you know, it’s not just better curation, it’s better in a very specific sense of the word. Right. And that’s, you know, the thing that we could probably write about in this media 23 or get some people to sign off on and gets them.

Justin Hendrix: 23:33 Yeah. Well, I do want to come to you. Yeah, absolutely, please.

Yaël Eisenstat: 23:37 Two quick thoughts on this one is, this is going back to the last question then quickly answering the question you just asked. You know, I’ve been doing a little experiment Twitter lately. I don’t use Twitter that often. Um, when I write really wonky, like yesterday I tweeted out sort of what does a whistleblower really mean from someone who served in the Intel community. And it was super wonky and there was nothing salacious in it. And I didn’t like throw out conspiracy theories. And then I compare that with like the one time I might’ve said something salacious and I posted them both at the exact same time, both during big media cycles, the more salacious one thousands of cars you can check your engagement. Yesterday’s tweets seen by almost nobody because it was, and so Twitter has decided who is interesting enough to amplify. Apparently someone’s actually served in that world and has really like fact-based ideas about what it means.

Yaël Eisenstat: 24:32 Not cool. The young hot celebrity with some salacious comment, that person gets way more engagement. So that’s one of them, but two and I have lots of policy ideas on that. So for me the biggest thing needs to, that I focus on is how to get our government to step up and do the right thing. But to answer your question, I would like to see a more interdisciplinary approach in our education system, especially a future technologist. I think so many young people are being told they have to learn to code, they have to learn to code, they have to learn to code. And my concern is, well, I mean I want the computers know how to code in the future. How about learning how to negotiate, how to have critical thinking, how to speak to humans. I would like to see, I think I would like to see more, I mean this is part of why I tried to engage with a lot of students and might be teaching next year.

Yaël Eisenstat: 25:22 I would like to see people who are not technologists teach more courses on things like the unintended consequences of tech on things like teach the future engineers, the future data scientists, how to have them. Uh, yeah, I was listening to the podcast once, it might’ve been Scott Galloway who made the comment, someone made the comment, imagine if Mark Zuckerberg hadn’t dropped out of college and maybe took a few courses on what the world looks like. I don’t know if that would have changed much, but like that’s what I’d like to see a little more well rounded edge.

Justin Hendrix: 25:54 Well, I know one of the programs that the New York CV lab runs is a, well, we help to run as a PR class called tech media democracy, about a hundred students from five universities, Cornell tech, Columbia, NYU, CUNY, uh, the new school. Um, and we bring them together from disciplines ranging from media studies and journalism to design to engineering, computer science. Um, and it’s just incredible to see, uh, the frictions that arise between the engineers and the journalists. You know, the engineers who immediately think we’re going to build something and solve this. And the journalists who say, maybe we built too many things already, uh, which was in kind of a group. Maybe we don’t wanna add some stuff on top of that other stuff. Right?

Luke DuBoise: 26:33 Yeah. But I have a friend who teaches ethics at Harvard who has Mark Zuckerberg on the roster for his fall class, but he never showed up. He dropped out that summer and he will occasionally on Facebook and be like, you know, if he just showed up to my class, none of this would ever happen.

Justin Hendrix: 26:49 So, so we’ve got a lot of work to do. We’ve got to address our education system, we’ve got to, uh, we’ve got to make change in regulation in government. We’ve got to think about inclusiveness and who’s at the table. Um, we’ve got to, uh, what, what should industry do? Tony, I’ll put you, put you on the spot on that industry representative at the moment. Doing what regard in terms of changing the trajectory of this, is there a role for business leaders to, to, to make a, a different play? Well, yeah,

Tony Parisi — : 27:17 of course, but I mean it’s going to, every company is going to do what they do and everyone’s under different constraints. If they’re public companies, they have to behave a certain way because they’re effectively 100% focused on the bottom line. If they’re not public yet, they have a little more loud as they’re like Unity’s, not a public company. Um, we have very strong leaderships who’s got very progressive values and tries to give back and do a lot of good in the world. Um, and you know, support young creators and do all the things we do. I don’t know if I could recommend a general plan for the entirety of the, the computer industry. You know, we’ve got politicians coming in wanting to break up the big tech giants. There’s, there’s a lot there. Um, I think it’s mostly about engagement. I mean, the companies that will listen and respond and work with the folks who actually have to think, you know, they think about it on a daily basis to try to make the world better. I think it’s that interface. You know, there’s always gonna be an interplay between profit motive and the other things we need to do as a society. Right? And, and so I think it’s mostly about engagement and listening when it comes to industry.

Justin Hendrix: 28:21 So these problems we have, these are deep, deep issues. They’re big problems and they’re, they’re exposed in a way right now. I think that they haven’t been, uh, in past. Um, but I did ask one last question in my email list, which was, which we’ve only got a couple of minutes for, I’d love to go down the line. What does give you hope that maybe by 2030 against the background of so much challenged climate change, soaring inequality, all the issues. We have a polarization. We know we’re going to face these things, but what gives you hope? What gives you hope? Former CIA officer? What gives you hope?

Yaël Eisenstat: 28:58 It’s funny. Someone wants to introduce me as the most optimistic person they know. And I was like, that is not a way anyone’s ever introduced me. And they said, you wouldn’t keep working on this stuff’s a passion. If you didn’t believe that it could get better. So what gives me hope is the fact that more and more people will become aware of the way we are being manipulated of the effects that these things are having on us. And wow, I’m going to sound like the old lady here, that the next generation is not going to fall except fall for it. And except the idea that a public CEO should only be responsibility is only responsibility as a fiduciary responsibility to his shareholder. That societal issues matter more. My hope is the next generation will change the way. Honestly, it’s the biggest part is the way our incentive structures work for business. Right now.

Desmond Patton: 29:51 There is a rising cadre of black and Brown scholars in critical digital studies who are wrecking shop and are taking us to task on how we develop these technologies. Ruha Benjamin, Safiya Noble, Mutale Nkonde, André Brock, uh, Meredith Broussard are just a few of those folks who are really bringing forth important critical theoretical apparatuses to help us do this work better. So please look out for them, read their books, um, and leverage them in your development.

Tony Parisi — : 30:25 Got it. I’m going to sound old too, but it does come back to the youth, for a couple of reasons. Uh, one is they are not [inaudible] and jaded by, you know — they haven’t been beaten on by life enough, so the ideas are fresh. Um, and so, you know, I see my son, who’s just started college at Emerson, and, you know, the whole world before him, uh, great ideas. You know, there’s going to be a voice for those. Uh, but there’s also young people who don’t want to be shot in their schools. Um, they really have an existential threat in front of them, and that is going to actually engender quite a bit of creativity and industriousness on their part to make sure that we shape the world in a different way going forward. Yeah. Shot in their schools, or strangled by their own atmo...

Luke DuBois: 31:09 ...sphere. Right, right. I mean, that’s the thing, right? So, like, meanwhile, we also broke the planet, right? And, um, there’s that, right? And so, you know, the sort of existential emergency around that is gonna cause a lot of these conversations, in this political moment, and the way political actors have been behaving in this political moment, to feel alarmingly petty, right? And I think there’s a lot to be said for, um, you know — the American political right sort of co-opted this term “values” a few decades ago, right? Like, “values voters,” right? Or whatever. But if you think about values and design, like, one of the things that I think is really positive about, especially, my students today, who I hang out with: they curate their own world based on who is coming at them with information that embraces things like sustainability and critical discourse and inclusion, right? And the more you can kind of amplify those values, right, and say those are the things that actually make something worth listening to or worth paying attention to, then, yeah, you might actually get somewhere. Yeah.

Justin Hendrix: 32:23 Well, you all have helped us set some of the key themes, uh, today, which we’re going to expand on. Uh, Sabrina, she’s over there. Sabrina, wave. She’s going to be helping Media Lab, uh, put together, uh, this Media 2030 project, so look for her outside if you can. Uh, over the next decade we’re going to hope to continue to have this conversation, draw folks together, explore what the right hypotheses and questions are, uh, think about the trends and issues, um, and hopefully come back to you a year from today, uh, you know, maybe not with answers, but with some ideas about things we could do together over the next decade that might lead us to, uh, some conclusions, uh, at least in that time frame. Um, ’cause I know these are big questions, and they don’t necessarily get solved, um, certainly not just in a workshop, um, but maybe in a decade. Um, but I want to thank you all for being with us today. Please give them a round of applause.

Speaker 2: 33:19 Thank you all so much for joining us.