The Social Cost of the Information Age | On Civil Society | June 26, 2019.

Mutale Nkonde: One of the things that potentially
makes me very unpopular in some rooms is that I’ll often argue that technologies are social
tools that are created by engineers and technical processes. But they’re visited upon people in technology,
and there is no market for humanity, there is no market for human rights. Bianca Wylie: Right. [music] Bianca Wylie: My instinct is that we are in
pretty dystopian times already. I’m curious what you both think about that
statement. But I feel like when we talk about technology,
sometimes we’re like, “Well, it’s getting kinda bad, or things are bad, or I’m not sure
how democracy is working, or we’re losing track of truth,” and all of these issues. But I think it’s also really important to
ground the current state: in Canada, for example, we shouldn’t have people who don’t have homes,
we shouldn’t have people who don’t have food, we shouldn’t have the kind of violence that
we have in many different versions. And so I suppose to start, do you think that
when we talk about the concerns of technology in society, that sometimes we’re skipping
over societal problems, period? And it’s a bit of a problem? Like we’re gonna solve problems in AI but
we’re not really looking at the core problem in our democracy or in our society? I’m just curious from either of you, just
as a provocation, what do you say to that? Mutale Nkonde: From my perspective, so I work
in Congress, and one of the things that we’re looking at from the United States’ perspective
is how we’re going to regulate and live alongside artificial intelligence technologies. But in the AI Futures Act, which is really
our national priorities around that, our first priority is creating an enabling business
environment. So even when you have a framework where the
most important thing is economic capture, that sets up a conversation around technology
which is all around gain, and it’s all around markets and it’s not necessarily all around
the people that are going to be using these technologies. And I think one of the things that potentially
makes me very unpopular in some rooms is that I’ll often argue that technologies are social
tools that are created by engineers and technical processes. But they’re visited upon people in technology
and there is no market for humanity, there is no market for human rights. BW: Right. MN: And so therefore, I am certainly somebody
through my work who’s looking to shift the framing of that slightly, but then obviously
balance that I’m working in the United States context. This is a capitalist system and I’m going
to have to negotiate those norms in order to be successful in my work. Taylor Owen: I’m not sure there’s as much
of a disconnect as you’re positioning it. I think socioeconomic inequality, social
injustice, racial intolerance, those are all structural issues. They’re functions of our economy, of our society,
of our democratic systems. And those structural dynamics of our society
are fundamentally being affected and reshaped by technologies. So I think it’s okay to look at them both
as part and parcel of the same challenges we’re facing. One of the structural drivers of our economy
and our social interactions, and our political interactions are the technologies through
which we interact with each other. MN: But where I would argue slightly, just
push back slightly on that is, I would argue, certainly through algorithmic decision-making
that these structural issues are actually being intensified by techno-socio-systems. So, yes they exist, but using an algorithm
to decide whether somebody should keep their children or not, I don’t know that that’s
the application. TO: Yeah, I couldn’t be more… Absolutely. BW: Yeah. And I think, just to pull it out a bit more,
the reason that I think it’s courageous particularly the work you’re doing on what to do about
Facebook for example, with hate speech, I listen to these conversations and I think,
we have structural racism, we have white nationalism rising in Canada, we have school systems
that haven’t adjusted. So we have very structural deep problems with
racism. And I sit and I go, “You’re not gonna solve
that on Facebook.” TO: No, but it might be that the technologies
we use exacerbate some of these pre-existing social dynamics. BW: Absolutely. TO: I think that is clearly the case with
the design of some of these technologies, right? To the AI point. But the very same facial recognition technology
that Amazon is using to track consumer behavior is also being used to track illegal migrants
in the United States. That is the case where a technology developed
inside a certain economic system is being deployed for something entirely different,
exacerbating the social dynamic. MN: And also, facilitating their deportation. So it’s not just tracking the migrants it’s
also then enacting these other political systems that are harming families, child separation,
but that’s not the fault of technology, that’s the application deployment and potential need
for regulation around that. BW: And I think this is what I’m getting at
is, I think is it not true that if some of those base conditions were addressed, and
they’re not technical, they’re political decisions, they’re about funding, they’re about providing
resources to people, they’re not technical, that the negative impacts of technology would
be less or would be smaller. Sometimes, I think we’re just sort of avoiding
the core problem and working on the edge. That’s what I’m driving at here. So, yes or no, you know what I mean? What do you think though? Do you think, where should we… I just worry as a society we keep kicking
the can down the road. Ruha Benjamin called it, finally someone gave the term
to me: a perversion of knowledge. We know what we should be doing to resolve
some of these structural issues. We’re not doing it. And now we’re worrying about Facebook. MN: But that’s not profitable to think about
that. So just going back to the AI Futures Act,
the first priority’s enabling business, the second priority is privacy. And that’s specifically thinking about it in the
context of national security, it’s not even my privacy or your privacy. The third is, oh, god, I’m now forgetting
the third, I’ll get back to the third. But the fourth is bias, and bias of what,
bias of whom, what does that really mean? I’m sorry, the second priority is future of
work, and then we go down. So we’re looking at let’s get as much cash
as we can, then make sure the robots don’t take our jobs. [laughter] MN: The people concern is right at the bottom. And I think… I’m very strangely received in certain rooms,
where they’re like, “Oh my God, here’s the person that’s gonna talk about people.” I also have to point out, “Who do you think
makes the technology, like the robot’s not gonna make the robot, we’re gonna need a person
at some point.” And what’s wrong with humans anyway? They’re fun. BW: Right. [laughter] Right. Right. And so given this human piece, and thinking
about regulatory options, right, like, do you feel, or are you seeing signs in your
work right now that democracy’s up to the task to rein some of these issues in? MN: No. No. No. BW: I really appreciate that direct answer. [laughter] MN: And I think for me, why I’m seeing no
is that part of technology is like spooking people, and people are often like,
“Oh my God, the AI can do this.” And I always tell a story. I was on a flight recently, and I had this
horrible interaction with the customer service person. So I’m tweeting, I’m a big Twitter user, different
in person, live online. And I’m tweeting at the airline, and I’m saying
I’ve had this really, really bad interaction, and I’m basically trying to get a response
from them, and I do. So then when they resolve my issue, partly
my fault, partly their fault, partly just immigration people have got complexes but
that’s a different thing to talk about afterwards. MN: And I tweet back and the bot says, “Please
DM me.” And then I realize, I was not interacting
with a human being. I was interacting with a bot that was picking
up keywords and looking for retweets. So then I call the airline and say, “Of all
the people that you want to use customer service bots, [laughter] I may not be the one and
this is why.” And I use that story because there is this,
when we’re thinking about democracy, we’re often, the first line of communication is
not even often a person. There’s been some type of machine learning
sorting going on. MN: And I thought it was brilliant in the
previous panel when you spoke about the role of Facebook in electing people. One of the things that I’ve been looking at
specifically when we were looking at deepfakes and what it meant to communicate online and
what it meant to be online and saying, “Well, do we need in the States social media blackouts
before elections?” And they all looked at me and they’re like,
“This is why we don’t let foreigners into this country.” Because you know, they’re coming with all
of these EU ideas. And I was saying “You know, we’ve had Trump
now for a minute. How’s that working? Maybe… ” BW: Maybe. MN: “Maybe we should just speak to our neighbours
and think about,” and obviously it wasn’t taken up. I don’t think they’re going for it. I haven’t made news. But it was this idea of the people that we
are online are not the people that we are often offline, the communities that we are
able, we may be a white supremacist online and really get a lot from being part of that
community but we may not be in our real lives. And that skews your vision. I mean, I’m thinking about the Charleston
shooting in the United States with the guy, he goes into the church during the African-American
Bible study. And when they looked at his online behavior,
he had gone down this rabbit hole of like white on white crime and other things that
don’t exist, right? MN: And been terrorized to the point that
he assassinated nine people during worship. And I remember Data & Society, the research
institute I’m coming from, some of my colleagues that actually looked at the algorithms that
drive advertising and video behavior and YouTube and found the extremist content was being
fed to people who had already seen extremist content, kind of making them worse. And this was undermining their ideas of community,
their ideas of society. And when we did the racial literacy and technology
report, we were speaking to engineers in tech companies in Silicon Valley and they were
saying things to us like, “That’s freedom of speech. We have the First Amendment. That’s not something that we can mediate in
any meaningful way. There’s no such thing as racism.” And I was like, “Really? Like no such thing? Like never ever heard of before?” Okay, I see the problem here. And it’s that kind of erasure and depersonalization
of society that undermines what are communal projects like democracy, like voting. BW: Yeah. And I think, what I’m curious about from both
of you, in terms of the government and the big technology companies right now, like I
feel for me, one of the scariest moments last year was when Senator Al Franken was… I wanna say interrogating Mark Zuckerberg
about election interference. And he was basically saying to him, “How did
you not know that the Russian… That Russian money was being used to purchase
these ads?” And I remember watching this and people who
were like, “Yeah, yeah. Come-on, Mark Zuckerberg. How didn’t you know?” And I remember sitting there going, “It’s
the government that’s in charge of the election. Why are we looking to Mark Zuckerberg for
election integrity? What is going on?” And this is what I mean about the dystopia
is here, is that I don’t know if the rules are getting a bit overly mixed up here. If the government’s looking to Facebook to
have integrity in its elections, I think we’re kind of going into the upside down. So what do you think about that? TO: This morning I was on at another event
downtown, and just following Jim Balsillie. And I said after, it’s very rare I get to
be the optimist in a panel conversation about tech and democracy, and after I had
my… I feel like it. And again, I think this might be the case
repeated again. So maybe two times in one day, I’ll be more
optimistic about technology than I usually am. But I think it’s worth framing this conversation
and how we, as a society, have embedded technologies in our lives for the last 30 years and how… The degree to which governments have stepped
back from that process. BW: But who’s the we? When you say we as a society, who’s the we? TO: Well, that’s a very, very good and big
question. What I mean to… Referring to is democratic citizens in democratic
countries. We as a collective have brought and allowed
technology into our lives via an almost entirely unregulated system. And we did so for, I think, some good reasons. We for a long time, and governments in particular
for a long time, saw an alignment between the economic benefits of the digital infrastructure
we are building and the democratic uses of that same infrastructure. For a long time, we saw both as being aligned
with democratic and economic outcomes, positive democratic and economic outcomes. TO: And so we stepped back and allowed the
technologies to develop in this largely free market system. And it’s only in the last few years that we’ve
started to look at the… And started to feel and see and be able to
capture and hear the negative repercussions and some of the social and economic costs
that have come with the adoption of those technologies. And I actually think governments are starting
to pay attention to this. I think, the fact that we’re having the debates
we’re having now, would have seemed impossible two years ago. And governments are slowly starting to turn
attention to this. TO: Now, there’s a… You asked if our democratic system is capable
of engaging in this. And I think because we waited so long and
we allowed the digital infrastructure to mature to the degree it has and to frankly monopolize
in a very small number of very powerful global institutions, it’s gonna be very difficult. But there is some degree of change in the
public and democratic discourse around this that I think we’re trying to get at here. BW: I’m curious to your take, ’cause the one,
I think a lot of this… I’m with you except for the part that there
was complicit consent that people participated, when you said it was citizens. I feel as though you need to know what is
happening in order to be part of that, to have agency in the decisions. And I would say in the last 20-30 years, I
think a lot more has been done to people than they have had that agency to do with technology. I don’t know, what’s your feeling? MN: I largely agree, but I would also say
I feel like there was a real cultural movement around the acceptance of technology. And that was very intentional. I remember Safiya Noble, a Professor at UCLA,
wrote the book, ‘Algorithms of Oppression.’ And it was the first time that I realized
that Google wasn’t some non-profit that was set up to just answer my questions. [laughter] And I was like, “What? I need to go back to the library. That’s why I don’t know anything.” But around that, there were… There was the idea, at least in the States,
that Google was a great place to work. So you would see them showing up in other
news. The people there look really happy. President Obama was coding with some kid on
the Internet. And it really… At least in my world and where I came into
technology through education and as an educator and running around America and telling black
kids that they should code. MN: It was really this like, “Oh my God.” I actually remember saying “You should really
learn to code, ’cause maybe you too could be Mark Zuckerberg, and you too could be rich,”
and really believing this. And so it wasn’t so much that, it wasn’t this
complicity for me. It was a wholesale acceptance. The first day I showed up at Google, I thought
I had made it. I’m Zambian. That’s where I was born. What’s up to everybody in Zambia? I was like, “We have made it. We are in there.” [laughter] MN: We’re gonna get free food. [laughter] MN: And then once there, realizing the power
dynamics that were at play and realizing… We were speaking in the green room and I never
use this language, but realizing the way, as a black woman, I was being leveraged, and
showing up on panels of stuff I didn’t know about. Like, “Do you know about this software?” I’d be like, “Oh, okay. No.” And also realizing that it was a business
and the role of technology is, the role of private companies, public companies, are to
increase value and increase shareholder value, and I don’t think that anybody should criticize
them for that. That’s really not my position. My position is that there should be a counterweight,
there should be a public sector counterweight that centers the public good and the common
good and then it’s up to the government of that country to decide how they define that. BW: And that’s what I’m not… ‘Cause I’m pretty optimistic that things could
be drastically different. I just don’t see government action headed
that way nor investment headed that way. I think that’s… My concern is that it feels like everybody’s
kind of okay with this ’cause most of the harm is being done to people who have other
harms being done to them. I don’t feel that there’s a big urge to really
resolve in a core way, but… MN: Where I would push back in the American
context on that was that Robert Mueller’s report on the election interference really
sent ripples through Washington. And in fact, in terms of audio-visual manipulation
and deepfakes, it was Marco Rubio who was the first person to actually go to the Heritage
Foundation and say, to a very right-wing… I never thought that my interests would align
with them, but I was like, “Yeah, go ahead. We shouldn’t be doing this.” And the idea about it was the national security
argument. So when I went in, and I was saying, “These
are technologies that are being weaponized against black women and girls. This is unacceptable. We have to move on that.” People were like, “Oh yeah, that’s really
sad.” And then I was like, “And they stole the election.” And they were like, “Oh yeah, no. Yeah, yeah. We can’t have that. We don’t want… The Chinese are coming, the Russians are coming. We don’t want those things.” So I would say that there’s an impetus. MN: I would also say that there is the desire
for economic capture too. And if that isn’t… If that’s not included, then any arguments
are ineffective. Plus, on innovation, I actually love technology. I think innovation and ingenuity are things
that should be celebrated. It just shouldn’t be at the cost of humanity. BW: Agreed. TO: There’s been a real pattern in the governance
conversation of these… Mitigating these social and economic costs
on both the platform company side and on the government side to take individual bad things
that usually are identified by journalists or by researchers and get a degree of public
currency, and to address solutions to those specific negative things. MN: Yeah. TO: And it’s this sort of whack-a-mole thing
we’ve been in. And Facebook is brilliant at this, every time
there’s a bad thing that gets identified, they apologize and do some little tweak to
try and reduce the chances of that particular bad thing happening again. And that is a very bad sign, when that pattern
entrenches. Because the problems we’re talking about are
structural and they’re embedded in the design of these technologies and in the economic
model that really just a small number of companies… We need to be really clear on what we’re talking
about. We’re talking about a very particular business
model by a very small number of companies that is the root cause of a lot of what we’re
talking about here. TO: And so I think the sign that governments
are beginning to get serious is when you start to see initiatives that get at… That target those structural problems. And I think we’re beginning to see that. Even just look at the Democratic primaries
right now. Half of the Democratic candidates for president
are talking openly about anti-trust and the tech sector. Two years ago, never would that have happened. The German cartel office is talking about
regulating privacy as a consumer harm in competition law. That is a major thing. GDPR was a total shift, a 10-year process
and it’s already out of date, but that is a major change with global implications to
how we’re all using technology right now. So those things that get at the structure,
the privacy issues, the competition issues, are the signs we’re moving in the right direction
here, and I think there are some of those. BW: Yeah. You’ve, I think both in the course of speaking
to you both have raised the idea of utilities as something to think about. Can you riff a bit on that when you think
about technology companies and public utility or just utilities? MN: I think so, certainly in online search,
I can see a legal argument for treating online search as a public utility, because it’s something
that we all have to use all the time. There’s actually no way around using Google
for online search for example. On the internet there is DuckDuckGo, I love
you, don’t hate me, but they don’t really have the market share. So it’s not just the utility conversation,
it’s married with the anti-trust conversation. And then this idea of promoting small businesses. You would have to build a complete narrative
to get the shift and have a raft of principles. And the reason that I like those is, I don’t
believe in laissez-faire economics. I think the invisible hand crushes poor people,
black people, vulnerable groups, and that’s not the type of society that we want to live
in. MN: So we do need to have a measure of regulation
that protects the most vulnerable, but then frees the more resourced groups in society
to go and pursue their will. And I really love this idea of while the bottom
rises, the top… While the top rises, the bottom rises too. And I am actually really optimistic that at
least some of those may stick just because we are now beginning to see Alexandria Ocasio-Cortez
and other women like her in the United States context, who are social democrats, that’s
what they’re saying and they’re looking to shift other parts of the economy. And so I think that this could be a natural
shift. Obviously, Silicon Valley are not gonna like
these proposals, but I would argue that Silicon Valley are more powerful than the government
in some ways. So we need to… We need a redistribution of power. No what… No, five companies, 10 companies, 20 companies
should have that amount of power. That just seems ludicrous. BW: Yeah. Well, we were talking earlier about how these
companies just aren’t rational anymore, ’cause money doesn’t matter they can just sort of
do whatever… MN: Yeah. BW: And drop it. Try a product out, if it doesn’t work, they
can leave it alone, and… MN: Yeah, they can do anything. BW: Right. And I think that’s the part in the monopoly
discussion that’s not getting addressed, ’cause this is one of the outcomes is once companies
get that big and have that much money, they’re just starting to do things that like you say,
it’s more power than government. And I think that’s… MN: Well, Libra. They… Facebook just got a currency the other day. TO: Yeah. BW: Right. MN: Crazy. MN: Right. And this is the thing, it happens, and everybody
looks around and says, “Oh, so they’re doing banking now.” And then goes back to lunch, right? I don’t… This is… TO: Yeah. The Libra case is a fascinating case of a
company that has been in the lowest… BW: Can you explain what it is, quickly? TO: So… BW: Ish. [chuckle] TO: Facebook and a number of other companies
have proposed a type of currency that will be backed by this fund that… An investment fund that will stabilize the
currency and could be used as a digital… A means of trading digitally. It’s not really a cryptocurrency, it’s something
a little bit different. But they’re basically trying to become… They’re trying to be a means of exchange for
the internet that Facebook can then embed in their products and be a lead on. But I think that tells us something about
how they perceive the regulatory threat right now. So to be a little more pessimistic about this. The company has never had a worse public image. They are facing the most regulatory pressure
and investigative pressure they’ve ever faced in their history, and they think that now
is a good time to launch a global currency that could destabilize the government function
of controlling monetary policy in their countries, right? Globally. MN: Crazy. TO: So I think we do have to put the regulatory
threat in perspective, right? That they think they can still get away with
that, or they can… That it’s still worth taking… I shouldn’t say get away, they think they
can still take the risk of pushing aggressively into a new space that challenges the authority
of democratic states. And that is an interesting statement. MN: But then I think even with… They’re just not living… There are alternative realities and I just
think that we are in a reality that potentially they’re not. Because these are also the same people who
we’ll have discussions with them, and I’ll say things like, “Existing civil rights law
in the US does not allow for racial proxy or accountability for racial proxy because
of intentionality.” And they’ll say things like, “Well, then we’ll
get an ethics board.” [laughter] MN: And I’m like, “What? Who cares about your ethics board? [chuckle] Get one, get 10, you’re still gonna
get regulated.” So there is this feeling that the language,
like the English I’m using is not reaching their ears and part of it is just being insulated. I was saying in the green room how Mark Zuckerberg’s
wife has just bought a school district. Their kid is coming up to school age, so that
they were like, “Oh, let’s buy a school district.” These are not people that share our material
circumstances. The rest of us are just like, “Please, can
I get into the school? Please? I’m gonna make cookies. I’m gonna make friends with the people.” And not these people. And I was really happy that Congresswoman
Maxine Waters, she heads the Financial Services Committee and she’s already moved to stop
Libra going anywhere. But they had a nice launch party. BW: Right. TO: And the… You mentioned their financial capacity, but
they actually are very much determined by their financial model and their need for profits
from the existing financial engine they have. They’re a publicly traded private monopoly, who have shareholders who demand growth, who
have directors who are responsible for growth, right? So that’s a really strong structural incentive
to behave a certain way. And it requires an insane global scale, that
everything they do needs to be applicable… [30:23] ____ Facebook, it’s 500 billion pieces
of content a day, right? That scale is massive. And so everything they do needs to be thought
of in that… From their perspective, and how can this be
rolled up globally? So their latest speech regulation is in this
idea of a court, like a… Whatever they’re calling it, their moderation
court or council or appeals court. TO: It’s going to be 40 people that will over… Be the appeals court for a private entity
mediating speech globally in 4,000 languages, right? MN: 4,000… 40 people, 4,000 languages, how is that… TO: 40 people, 4,000 languages, 150 countries. So that is an economic model that’s determining
their need to do that, but it also leads to the exact situation you’re talking about which
is a total disconnect between national and local context and the cultural and social
dynamics and regulations and policies we put in place over… MN: Right. TO: Super problematic, but worked through
over long periods of time through messy democratic processes, are not necessarily embedded in
the design of the technologies that are being deployed locally. MN: And also, to your point about this international
context, Libra may seem ludicrous in Toronto, right? But in the unbanked global south where they want
to capture that market, it may be seen as a liberatory act. It may be seen locally… BW: Do you hear the sirens? MN: Right, exactly. Yeah, call the police immediately. It may be seen as this act and I was… Google have just opened AI Africa in Accra,
and I’m getting all of these emails like, “Oh, you should come and you should tell us.” And I’m like, “Busy today. Nope.” [chuckle] MN: “Nope.” But they’re really like… BW: Yeah. MN: And I think this is the problem with the
40 people. Which 40? Which nations? Are the global south included in this? Are they aware of the risk? ‘Cause we weren’t three years ago, to your
point. TO: Yeah, absolutely. BW: So we’ve had some discussions about mitigating
these issues, just before we head to questions do you see any regulatory or policy or legal
moves to start building out other structures or other opportunities? Or do you see any like what, from a democratic
perspective, should we be pushing on that’s not just mitigating in defense? How do we start getting into what we want
the internet to be? What we want it to look like? How we want it to function? Is there… Do you see any motion there? And it’s okay if you don’t. MN: I’m loving my brothers and sisters in
San Francisco right now for banning facial recognition technology. I think that the surveillance of people in
public space, particularly on technologies that are untested, because that’s the other
part of this is often when these technologies are sold and rolled out, it’s with the assumption
that it works. But if you have a technology that’s supposed
to see faces but cannot see mine because it’s black, that technology does not work. Right? MN: And then you’re embedding it in automated
vehicles that are supposed to stop because they recognize people and I’m not even recognized
as a person by the technology, I don’t think that everything needs to be improved. I think we should have the political will
and community organizing around what we don’t want so that we can make space for what we
do want. We do want greater efficiency, as long as
that efficiency isn’t at the expense of vulnerable people. I do want to be able to speak to my grandmother
in Zambia and share my life with her in a way that I wouldn’t have been able to 50 years
ago. Those are things that I want, right? MN: And I sometimes get very frustrated when
people… It wasn’t until San Francisco banned facial
recognition that I was seeing op-eds that were like, “Wow, we can stop technology. We’re still in control.” Same with listening technologies. We have in New York a technology called ShotSpotter,
which the NYPD used to… If there’s a shooting incident, they can listen
to the audio files. They have microphones all over certain neighborhoods,
yes, mine, and if a shot is heard, they can then listen in to the conversation around that
so that they can try and catch the people. And there are some people doing an investigation
now, that are finding that sometimes they’ll just pretend there’s a shot because they just
wanna listen to what people are saying. That should not be allowed. BW: And I think we dodged that one. Toronto was gonna be the first Canadian city
that was gonna use ShotSpotter and I believe at this point in time there’s been a hold
put on it. MN: Well, I think your Sidewalk Labs people
are scaring people off and we need that too. TO: I can’t believe we went an hour in a conversation
with Bianca without talking about… That was the first Sidewalk mention. BW: Any questions? TO: I was just gonna transition to Sidewalk
already because I think that is a case where communities and citizens have decided to reframe
a debate about technological innovation and the particular individual cool things that
are possible with technological innovation and reframed it as you have led into a conversation
about civic space and democracy. And that is a really positive sign ’cause
we’re starting to talk about these issues, not as individual pieces of technology that
play useful roles in our lives, but we’re talking about them as social and political
infrastructure. And that is a really positive turn. MN: And it’s a model. TO: Absolutely. MN: I know in my work, I certainly invoke
your work and the work of… I’m so excited to be… Well, I’m excited to be in Toronto ’cause
the Raptors won and I was like… [laughter] MN: “What a great time to come to a city.” And I was like, “I wonder if Drake’s gonna
be here?” [laughter] MN: But also that is a great example of what
potentially could be done by a city and what can be learned from cities as well because
I don’t want… I would hate for people to walk away with
the impression that I hate it and we shouldn’t do it. No, I just want it to be in concert with, like,
when we invented the wheel, we didn’t run over humans, right? We still needed humans. So living with, working with, is kind of where
we’re going. BW: And it’s messy.
