- TGI: Transaction-Generated Information is the fuel for AI control
- Artificial Intelligence: “It’s time to come to terms with the machine”
- Dr. Oscar Gandy, Author of “The Panoptic Sort: A Political Economy Of Personal Information”: Functioning in a completely monitored environment…
The McAlvany Weekly Commentary
with David McAlvany and Kevin Orrick
PERPETUAL SURVEILLANCE: YOUR SMART CITY
KNOWS MORE ABOUT YOU THAN YOUR MOTHER DOES
November 21, 2018
“How do you keep the actors focused on the good of society versus a particular agenda which may, in fact, marginalize a group of people because they have the tools to do just that? They have the power because they have the information and they can, if they choose, marginalize anyone – not just an individual, but an entire subset in society.”
– Oscar Gandy
Kevin: Dave, we are in the world, now, of smart cities. I read an article last year, in 2017, about the 172 million cameras that China now incorporates with artificial intelligence. To test the system, a BBC reporter named John Sudworth was randomly placed in one of these smart cities, and within seven minutes the artificial intelligence had identified him. With nothing more than a picture, they knew exactly where he was.
David: Did the Chinese learn anything from the British on that? In terms of camera surveillance, I’m not sure that prior to the Chinese rollout anyone had them beat in terms of the number of cameras watching the citizenry (laughs).
Kevin: But the Chinese had the information, like Sesame Credit.
David: That’s right, Sesame Credit, this ability to gauge someone’s value in society using multiple-factor input, where you score their credit, their loyalty to the party, whether they qualify for insurance or a loan. A lot of this has to do with feedback from neighbors. There are lots of ways to get your stars.
Kevin: Your Facebook posts, your friends, your neighbors, who you dated – all of that factors into a credit score. If you go online and actually look at some YouTube videos of this facial recognition, Dave, you will see the system identify people as they walk by, displaying their names, and if you click on a name there is more information available on that person. It is not really fair, because the people being identified don’t have that same access back. Google has been criticized for this – in fact, a high-level executive at Google walked away because he felt that the Dragonfly program in China was overstepping the boundaries. It is a censorship program built on Google’s search.
David: I think we have all sat at coffee with someone and had a conversation where you were hoping that they would share something and you would share something in response, and there is this balancing act over who has what information on the other person. We don’t think about it in terms of data; we think about it in terms of intimacy and knowing and relationality, and there is greater relationality when there is this reciprocity of sharing both ways.
And you are right, whether it is Dragonfly, which has zero reciprocity, or what we have developed in terms of a data-gathering apparatus through Google and Amazon and some of the giants on this side of the pond, which, of course, have a global presence, there is less and less informational reciprocity. It goes one way, data collection, and, unfortunately, there is also this control or power relationship that is implicit.
Kevin: Sure. It is very powerful if I have more information on you than you can get on me. Our guest today is an author whose book you read back in 1993.
Kevin: Jeremy Bentham. The things we are talking about today you would think are all current, but his book actually starts with technology that goes back to the late 1700s: the panopticon, a round structure made of bricks to keep prisoners in, where they could be watched but not necessarily know when or where they were being watched.
David: Yes, the architectural structure allowed for greater efficiency in managing a great number of people using just a few guards. So consider the role those guards played, and the role of – well, let’s think of it as second-guessing: do you know that you are being watched? You believe that you may be watched at some point, or all the time, and therefore your behavior changes as if you are being watched. That idea of Bentham’s architecture was expanded on by Michel Foucault, and our guest borrows it in writing his book, The Panoptic Sort – the sorting of individuals in this viewable world.
Oscar Gandy is Professor Emeritus at the Annenberg School for Communication at the University of Pennsylvania. He has written a number of books, and his areas of interest run the gamut from privacy to race, information technology, and media framing. These areas coalesce in political economy, which is something we have long had an interest in.
Kevin: The book really drives home the understanding that he who has the information makes the rules. It reminds me of something your dad used to say.
David: I was raised with the saying, “He who has the gold makes the rules.” That was an insight into geopolitics, and it was an insight into foreign relations and economic dominance. Well, today is a little different. Data is the new gold. So I am curious, Oscar, who makes the rules now?
Oscar: I would say, from my perspective, the particular actors with the gold have changed rather dramatically, with the development of dominant players in the information economy – information platforms like Facebook and Google, as well as platforms like Amazon that dominate the market for goods and services that rely on the Internet. There are, of course, other actors. We call them data brokers: those who collect, process, and make recommendations on the basis of their analysis of what we might refer to as transaction-generated information, or TGI, a favorite term of mine. Government agencies still play an important role, but today, I think, as in the past, they are really responding to the demands of the big players. The transnational monopoly firms have all sorts of gold, including data.
David: The title of your book is The Panoptic Sort. I’d love to try to define terms and give a background for how, conceptually, this is a helpful lens to look through. Let’s start with Bentham, move through Foucault, and then discuss the modern expressions of the panopticon. That’s a word that is not going to be familiar to all of us, so tell us about the original architecture, going back to Bentham, and, of course, define the term as you go. Foucault added an interesting piece: the implicit power relationships in the panopticon. So give us a little historical sweep of what the panopticon is.
Oscar: For those who have no familiarity with this, Bentham offered a model of a prison as an architectural system, one that would allow observers to keep prisoners under surveillance, but in a way that made it impossible for the prisoners to know when they were actually being watched. This prison, which was never actually built, was thought to empower the guards, the owners, the managers, because the prisoners would never know whether they were being watched, and, in the context of this kind of total surveillance, the threats of punishment or discipline would always work to keep the prisoners under control.
Michel Foucault’s work was, and continues to be, important to me, in part because of the way he explored surveillance, which he understood in terms of a variety of forms of observation, both direct and indirect. His analysis of how data in prisons and schools and medical centers facilitated the control of populations or societies is the part of Foucault that still works for me in understanding different forms of what is called governmentality – governance or control that goes beyond the prison to society as a whole. Currently I am focused on cities and the ways in which surveillance in smart cities reflects the kinds of ideas Foucault established about the ability to govern through the management of data.
David: So we start with Jeremy Bentham, who is talking about an actual architectural design. We move through Foucault, who looks at this as a relationship between the observer and the observed. In your work you are working with a metaphor: the panoptic sort represents the technology which surveils, categorizes, and controls data to the benefit of – in your book – both corporate and governmental bureaucracies, and it is based on Bentham’s brick-and-stone prototype. Do people realize that knowledge, and knowledge about them, and power are inseparable?
Oscar: I don’t have a basis for answering that from my own research. I don’t know if people really understand the extent to which knowledge and power are inseparable, but I really don’t believe that they understand the extent to which this TGI that I referred to, this transaction-generated information, is being used to characterize them, and more importantly, to make decisions about how they are treated by a whole host of institutional actors who have the ability to use the power that information and knowledge, and occasional wisdom, provide to them. I don’t think they understand that. I think those of us who do this every day are struggling to understand how the collection and use of information reinforces and establishes different kinds of power, and also reallocates it as the technologies of gathering and making sense of information change.
David: There is also a sense in which selfhood is on the line. Consumers have data about them floating around. Today we have big data, and it is used, in many respects, to redefine selfhood. We could probably talk about that for hours, but maybe you could just give us some idea of how impactful data is in redefining, or defining, the self.
Oscar: I’m not the philosopher that you are. I am not trained in it. I have a new colleague that I have been writing with who does come from philosophy, so it is a struggle for me to talk about selfhood in the same way that she would talk about it. But, really, for me, it is about influence over the opportunities that people have to make decisions and to exercise freedom in our society at this time.
I am talking about, as you suggest, the correction and control of individuals. And I’m not sure that people are engaged in efforts to preserve their freedom in these spaces. I am not sure I responded to your question, with me avoiding discussions of philosophy at this point.
David: (laughs) No problem. You begin the book by saying that the evidence is clear: a discriminatory process that sorts individuals on the basis of their estimated value or worth has become even more important today, and reaches into every aspect of individual lives, in their roles as citizens, employees, and consumers. This is the panoptic sort. And, as you say, the all-seeing eye of the difference machine that guides the global capitalist system – again, that is the panoptic sort.
Toward the end of the book you conclude that the panoptic sort is an anti-democratic system of control that cannot be transformed, because it can serve no purpose other than that for which it was designed: the rationalization and control of human existence. You may reflect on this personally, or at a more collective level, but it sounds like the idea of being measured, moved, and manipulated doesn’t sit well with you.
Oscar: That’s a quote from Jacques Ellul, not my own thought, but I still pretty much agree that this condition, which is certainly possible, and which certainly represents a point along a path that I believe we are on, is not something that we should be looking forward to. Again, even absent a philosophical basis, that is not how I understand what human beings are supposed to be engaged in – being completely controlled by and completely dependent upon a machine, a device, a system, or a structure. We believe in our individual and our collective freedom to design our own futures, and that is not what any of us understand is really down the road. So I don’t think that is a path that we would really choose to be on. It is certainly not one that I choose to be on.
David: I was fascinated that, as you ended the book, there was a personal reflection about how at an earlier stage in your life the idea of social engineering was not anathema to you, but as time has gone on you have come to see it as suboptimal. Maybe that is my interpretation of what you were saying, but you were talking about the power of the individual being overrun by the power of bureaucracy, with data being the tool. We have talked a little already about the use of observation allowing for correction and control, and we are obviously talking about something that is very effective. But do the results, effective as they are, legitimate this idea of social engineering?
Oscar: Part of what we all need to reflect on – and here is a reference to another source in the book who was important then, and is still important to me – is Anthony Giddens. Giddens talks about knowledgeable actors; that is, all of us in this world make rules, are guided by rules, and are participants in the process of making those rules. But he also says, although he doesn’t pay as much attention to it as I think he ought, that we really are not aware of the other actors who are also knowledgeable, and who are also pursuing certain kinds of goals – goals which may include control and influence of us. We really don’t know about that, so we are not really informed participants, or equal participants, in this process of shaping our future, shaping the technology, shaping the rules, and the rules that technology now enables, in the way that his ideal notion of the knowledgeable actor would suggest.
I would say that part of my dissertation reflected my orientation toward technology. I was interested in educational technology and the extent to which it was transformed by the involvement of the military, and by military thinking about control. That control of education is something we are clearly moving toward, and I think, and many teachers think, it is truly problematic in terms of how surveillance in the school and in the classroom, with the use of this panoptic technology, is transforming what schools are supposed to do. Again, I am not, and a great many of the rest of us are not, pleased with the use of technology to create a future where we won’t have any worries. If we don’t have any worries, we won’t have what life is supposed to be about: the experience of new things and new people, new challenges.
David: So that is its impact on education. You also look at the impact of technique on areas like law. In one section of the book you discuss identification, classification, and prediction – the operations of the panoptic sort. I was fascinated by your discussion of what goes into jury selection: how classification of potential jurors aids in predicting the outcome of a trial. G.K. Chesterton, ages ago, said he prized the idea of 12 ordinary people engaged in serious deliberation as part of a jury. What you are describing is that we now have 12 skewed representatives of a group likely to produce a desired outcome. Diversity is out. Would you agree that this legal institution is being compromised via the techniques and mechanisms of the panoptic sort?
Oscar: You have raised a question about another part of me which was certainly being developed in The Panoptic Sort, but has been explored more fully in some of my work since. So let me respond. A jury is part of our system of governance, and there are actors who are advantaged because they have access to a technology – in this case, the study of juries. The process we created was not supposed to be that; it was supposed to be a process by which we would be judged by our peers, and the only challenge was how we, or the justice system, would select a jury of our peers. What we have instead is that it is not the judicial system doing the selecting, but various actors with different degrees of power, able to afford a particular resource that gives them an advantage within the judicial system – an advantage the system is not supposed to provide.
I would extend this and ask you and your listeners and readers to consider the extent to which this technology, as applied in the electoral system, the judicial system, and the legislative system, damages democracy – that is, the ability of actors to use this information, which I focused on primarily with regard to communicative interaction, to influence what we read and, perhaps, the kinds of questions that we ask. It even influences the kinds of political representatives that we want, the kinds of judges that we want, the kind of legislation that we will support or oppose. And that is real damage to democracy as we understand it – an ideal which is to the benefit of all members of society. So yes, I think this is a great problem, of which jury selection is only one example.
David: Pivoting a little to what technology allows us to gather and retain: I would argue that there is a virtue to forgetting. It is something we have had available through time, and some might even describe it as an expression of mercy. I read a book one time about a gentleman who could not forget anything, and his was really a tortured life, never able to fully live in the present or anticipate the future because every past event was imposing itself. Here we are talking about technology, and again, the sorting, processing, and matching up of all of these data points. But what does a society look like when nothing is ever forgotten, and all of these chunks of data are sorted and stored for efficient recall, but are decontextualized? There is no context for them, no narrative associated with them. What does that look like?
Oscar: There are two things here about forgetting, and one is certainly important: people in Europe have said that people have a right to be forgotten – that the things they have said or done, or the kinds of transactions they have been involved in, should have a reasonable life, and that we ought to put a limit on the data used to make a decision about a person, so that it reflects what might be truer about that individual’s life. What people don’t really consider is that we have moved away from data strictly about the individual, toward developing generalizations, correlations, and predictions about types of persons – types of persons that don’t actually exist as real persons, but exist as conceptual persons, the product of a correlative analysis.
The science that I was trained in, the social sciences, where I was trained to try to understand systems and behaviors and the reasons behind them, has been set aside by an orientation toward prediction – toward what a person is likely to do in response to this offer or that threat. Indeed, predictions about what a person might do if I, indeed, offer a threat or a promise.
So, the idea that our past is gone is true. Our past is gone, but we are not going to reclaim it, because this process of data analysis is not concerned with, if you will, its relevance to the past, or its dependence upon the past, but with whether or not that part of the past which has been gathered and entered into an analytical system works – meaning, does it predict reliably? Does it predict more efficiently than the other systems or predictions that I had in mind?
So I would say, David, that there is not, for me anyway, that much concern with forgetting the past. It is really more about how what one did in the past is used in a system which affects what I am able to do in the future. That is what matters – not so much how the data became available for analysis and use in predicting how different kinds of persons – not me, but different kinds of persons – might respond to a particular kind of threat, or challenge, or requirement, or offer.
I do want to say that forgetting is not so important to me, and I would like to suggest that it ought not to be as important to the rest of us, because the models, the algorithms, that are making decisions about people like me, whatever that is, are being changed continually. And that piece of data from 20 years ago might have some role in there, but its role is marginal, at best.
David: I guess I am interested in how I, as an individual, am treated. I remember going to a heart doctor when I was 40 just to get a baseline for health moving forward, and the tests that were done showed that I had less than a tenth of a percent chance of developing heart disease over the next ten years. And then the next revelation was that I had a 50% chance of heart disease from that point forward, from the time I was 54. I asked the doctor, “Now, are we talking about the tests that were done for me, or are these generalized truths about correlations and predictions that deal with types of people – namely, my categories of male, North American, etc.?”
Oscar: (laughs) Good question.
David: And so am I going to be treated differently based on generalizations and typologies, or am I, the individual, going to matter?
Oscar: I hope your doctor told you the truth (laughs).
David: (laughs) Well, the advice that was given was based on generalizable issues and not on anything specific to me.
Oscar: Exactly. The people who are engaged in privacy and surveillance are struggling with this orientation within the law, and within the history of the law, where privacy is about an individual. And then there is this small group of us trying to say, “Now, wait a minute. It is not so much the individual, except when he or she is chosen; it is really about types of people.” So we are trying to move toward a group, or collective, or community privacy concern.
David: And I think there is this toggling back and forth of who is in control. We work on Wall Street; we’re interested in finance and economics, and for a time there has been a periodic shift in dominance between Wall Street and Washington. Every once in a while Wall Street gets its hand smacked firmly and Washington retakes the reins of control, and then the next thing you know Wall Street is back in control.
I wonder, because we have been talking about data, and about a shift in power in light of who has the information, is it possible that the 21st century is more of a tug-of-war for power between Washington and Sand Hill Road – specifically, Silicon Valley, the creators of technology – rather than Wall Street?
Oscar: I think it is a good question – understanding where the power is located. But my response asks us to reconsider whether the power is located in the two places we have identified – that is, in the state, presumably in terms of the federal government, and in the corporation, wherever it is located, including the transnational corporation – or whether it is in a particular segment of the corporate sector, namely those involved in information technology.
I would invite us to consider placing this power with continually changing sets of actors, if you will allow. I am currently exploring the nature of power within the city that wants to identify itself as a smart city, for all of the benefits that can be derived from such an identification. So here you have local city governments struggling for power in relation to county and state governments, as well as the federal government. They are seeking to develop their own, if you will, system of power within the city, but they are doing it with what may be, for them, a new combination of powerful actors, referred to now as public-private partnerships, or P3s.
So these actors, which include my own group – that is, universities are participating in these P3s, these public-private partnerships – are establishing decisions about what kinds of data are important for the operation of the health care delivery system, the transportation system, the criminal justice system, all of the systems that play a role in organizing our lives within this thing we call a city – maybe a smart city, because we are several steps ahead of other cities in incorporating information technology into the production, analysis, and use of data in order to intervene in all of these sectors and lead us to a better, more productive, higher-quality standard of life.
So it is not, as you reasonably suggest, government or the tech sector; it is a combination of all of these sectors, enabled by, and if you will, influenced by the provision of new systems of communication, data generation, data analysis, and prediction. Our challenge is to figure out how we will govern – that is, again, back to Giddens, what kinds of systems, rules, and constraints we will establish to ensure that all of these actors are using the power they may have temporarily in order to realize the kind of good life that we are all entitled to in our concept of a democratic society. It is a very challenging task.
David: Well, it’s a great challenge, and I think you are right at the core of what we are seeing emerge, not only in the United States but in other parts of the world. We’ve talked about privacy and surveillance with the emphasis on the individual. Let’s transition the conversation a little to types of people, and this is where the P3s, public-private partnerships, and the emphasis on the preservation of the value of a group of people, or a set within society, are really critical.
Go with me, if you will, across the pond to China, where we have Sesame Credit. Sesame Credit is an extension of Alibaba, a publicly traded company, but it is also in league with the government to basically create a corporate network that allows for looking at people’s “legitimacy.” Do they qualify for insurance? Should they be given a loan? What is their historical payment record? We have information on dating and shopping habits and mobility, and all of a sudden you are getting three stars if you are a good citizen, five stars if you are a fantastic citizen, one star if you are the dregs of society.
You are really homing in on something important here. How do you keep the actors focused on the good of society versus a particular agenda which may, in fact, marginalize a group of people? Because they have the tools to do just that. They have the power because they have the information and they can, if they choose, marginalize anyone – not just an individual, but an entire subset in society.
Oscar: Well said, David. I really appreciate that you made that leap. Two years ago, I guess, I was invited by a former student of mine to come and teach a seminar about privacy, of all things, in China. And it was an informative process for me, both in trying to learn more about China before I went, and in imagining what my class would be like and what kind of students would choose to take a class about privacy. I wish that I could have spoken to them about the path that I now see their nation taking and the struggle that we are going to see next. We are in one struggle in which the U.S. is trying to develop and define its relationship to Russia, and we are struggling with the ability of the U.S. to define and influence our relationship to China.
But exactly as you have described it, China is going to be a leading actor in developing and involving very active, very productive, very efficient corporate actors in the application of these systems, in order to gather information about members of society and to control those members in ways that were unimaginable until today. And I feel for my former students – all of the Chinese students who came to American universities in order to study and be exposed to our approach to democracy, to understand what democracy was, what it was as a resource, and how it might be developed and applied in a Chinese context. Forgive me in that regard; I am really feeling for my students at this moment.
And yet, they are facing this new move which uses all of this technology and all of these resources to identify and punish segments of the population that aren’t responding in quite the way the state believes they should. Now, I don’t mean my comments to pick just on China. A colleague I started working with in the past three years and I have published a piece dealing more with Europe and with smart cities. We don’t yet have the identification of China with smart cities as their marker. But governments there are able to exert different kinds of influence over the citizen as [unclear] as to the ways in which people ought to behave in order to meet requirements.
So there are still many similarities in state views about the kinds of resources that ought to be used, but Europe, where the leading voices in the debate favor the use of nudges, or influence over citizens, is not as close as China is to the kinds of control that both you and I are concerned about. But the possibility that they will move in that direction is not to be underestimated.
David: This convergence is pretty clear with a company like Google as well. Again, we have had the state apparatus; it has been a potent force in times past. You have the business bureaucracy, and Google is taking its place among the crown princes, if you will, very close to the flow of data. With its censorship program called Dragonfly in China, it is actually aiding and abetting – working in a public-private partnership, if you will, facilitating censorship. So how should we view an organization like Google which, as far as we are concerned on this side of the pond, is benign, but on that side of the pond may be malign?
Oscar: Part of the change taking place is that organizations like Google – and, dare I say, Facebook – are influencing the way in which the state responds to the things that have been defined as privacy and surveillance. Surveillance is not yet a public policy term; it is limited to police and police activity. So it is a struggle, in one sense, to have a conversation about the same kinds of behaviors with the same kinds of consequences.
I am still waiting to see how this discussion is going to develop with regard to Amazon and its decision to move to my neck of the woods. I was born in Amityville, on Long Island – is Long Island City going to be transformed by this tremendous, multi-varied actor? Amazon acts as a commercial vendor, but also as an unmatched source of transaction-generated information about the kinds of things we are attending to. The power associated with this actor is only growing, and it remains to be seen whether there will be an informed and motivated public response to this kind of, still for me, primarily corporate-based power.
States are weaker, in my view, than they have been in the past. I don’t know what kind of struggle we are going to see between the federal, the state, and the local in this regard, but things are going to change in terms of the nature of the actors who are really influential in the use of data at the level of the locality in which people live. I am still struggling with trying to predict, trying to understand, how this is changing, but certainly, the development of smart cities is an important place to look at how this new class of cooperative actors is shaping our future.
Consider the regulatory environment: the way in which it is focused on privacy as a public concern, and the way in which it focused, first, on what was perceived to be the more powerful actor – that is, the state – and the kinds of regulations that were set in order to limit the state from sharing information across different silos. But nowadays, in the context of public-private partnerships, the state no longer has to recognize its own limitations in terms of where it can get data, because now it has new partners, and it is no longer limited, since the data didn’t come from a state agent.
So the nature of the challenge that we are facing – which of these actors hold what share of the power, and how they can be influenced through the public policy process that we still believe is the way democracy is realized – remains. China is not a democracy. They don’t have the same sort of expectations about how policies get changed. But for the moment, we still have something that we understand to be democracy, with certain kinds of expectations – yet it is being bypassed, and it remains a challenge to modify our regulatory response to this change.
David:I will be very interested to see – I know 12 years ago you entered retirement, but I am still very interested to see what you have to say and what you write about. You wrote The Panoptic Sort in 1993.
You were anticipating many of the transitions of the last 25 years. It reads like something written last year, not 25 years ago. You were so right you might even have surprised yourself (laughs). But a lot has matured since that period of time. We have power and suasion being understood, or re-understood, in the context of the digital age. One of the things I am curious about is the relationship between those who have the information and those who don’t – we talked about this a little bit earlier.
Twenty years ago I had a conversation. It was very one-sided – questions being asked of me, with no real back-and-forth dialogue. I said at the time to this person, and I still hold this view, that a healthy relationship requires relational reciprocity, and it seems to me that there is an imbalance in the flow of information and knowledge today. It creates a power dynamic between the knowers and the known.
Talk to us a little bit more about this relationship – again, government and corporate. If it ends up being a public-private partnership, they know more and more about us in a relationship that is structured like a one-way mirror. Ultimately, doesn’t that change the nature of society when you have the knower and the known, and we are constantly at a deficit while they are constantly at a surplus in terms of information, and the informational advantage that may give them?
Oscar:Let me agree with you about the inequality in access to information, although some of my colleagues would argue, “Well, no, we have access to Google, and we can find out whatever we want instantly.” Our access to information is actually quite substantial; it has been expanded quite a lot. But the challenge is still there. I have forgotten what my count of actors is, but we have talked about the state, and we have talked about the corporation. It is now time for us – you, the rest of us – to pay attention, really, to the machine. I haven’t written much about this, but it is clear that I need to come to terms with automation, to come to terms with the machine – with a machine that is able to process the data, to provide the intelligence, to provide the analysis. But more than that, there is the problem of where the machine fits within our democratic system. Here we have a technology which can gather data, and decide which data to gather. It can process data, and decide which of that data makes sense. It can decide what to do about that data in terms of the opportunities and challenges it places before us. So part of the problem we are facing is establishing for the machine what outcomes are to be optimized, what the goals are supposed to be. And it is not clear to me – and I would argue it is not clear to the rest of us – to what extent the machine is deciding what we ought to be deciding. So that is a real challenge for us: to understand how human beings play an active role in deciding what it is the machine does for us.
While I have written about the importance of having reliable, responsible agents working for us, the extent to which the agents are machines changes the nature of our law, changes the nature of our identification with privacy and our concerns about privacy, and changes our ability to exercise influence over this agent, or this actor – in shaping how it acts, and in evaluating its performance, whether it is doing a good job or not (laughs). We are really challenged in our ability to do assessments when even the best of us can’t understand how a decision was made – forget the rest of us understanding how the decision was made. The extent to which we will set the challenge for the machine, establish the goals for the machine, and be the actors with the power to make those decisions – in my lifetime, I’m not at all clear about that. I think we have work to do.
David:It’s a huge project, and part of it hinges on the trust and the faith that we put in the machine. I would probably say the machine actor rather than agent, just because machine agency is a big concept. We had a conversation with a former president of the Federal Home Loan Bank not long ago, and I asked him about mathematical modeling. A lot of what goes into an algorithm is also used on Wall Street to predict risk and outcomes, so there is a predictive capacity there that is very helpful. He said, “You have it, you have to use it, but you must always distrust it.”
And there is this question of how we engage with the artificial intelligence, the automation, the machine which is doing the grinding out, the gathering, the processing, the prioritizing – and yet maintain a healthy disconnection where we are not depending on it, where we can still distrust and yet use. There is no sense in joining the class of Luddites who don’t embrace technology, but on the other hand it is worth looking critically at the impact of technology and not assuming that it is always good. These are careful processes that I think still require a human element – that of ultimate criticism – to determine what is going to serve us well individually and as a society. I don’t think that outcome, or the answer to that question, can be machine-generated.
Oscar:I don’t know, David, about the extent to which we are prepared, or are making the investment in preparing human beings, to deal with this. Universities now have departments and divisions focused on data science. I am not sure that data scientists are paying as much attention as they need to to the nature of error, and the consequences that flow from error. So if we put this on Gittins’ table, and he has to think about what we don’t know about what the machine knows or doesn’t know about its goals, we have a long way to go in all of the areas that matter in terms of how a democratic system responds to its opportunities and constraints.
So you are quite correct in terms of trust, but again, trust is a knowledge-based assessment. We have to evaluate our experience – maybe we have to engage in tests – in order to evaluate whether our trust is well placed or not. But my sense is that our movement along that path is nowhere near as devoted as it needs to be, and it is not as well resourced as it ought to be, given the kinds of self-defense that corporate actors, and to some degree state actors, have against any requirement to reveal the nature of their latest technology – how well it works, and what kind of biases are inherent in the data they have gathered and are using to make decisions about me and people like me.
I don’t think we are moving well enough and quickly enough to figure out how we are going to do the work that a technology evaluation bureau did at one point in our past. We no longer have such a thing in government with the responsibility and the resources to evaluate how well technology is working, especially in this regard, with regard to information and information-based decisions. We need to commit. What we are seeing, with China as a prime example, is a conversation not about whether there are problems, but about how to do more, more quickly, with greater efficiency and effectiveness.
David:Again, when we talk about data scientists and the role they play moving forward, it reminds me of the importance in philosophy or science of understanding your first order questions and your second order questions, and of really assessing the assumptions that go into the analysis and the usefulness of data. Data is not neutral, because scientists – and data scientists are no exception – will always have a perspective. So before we move beyond the knowledge-based assessment, I want to make sure that the data scientists aren’t the new Wizards of Oz.
David:L. Frank Baum. You end your book this way: “If we’re going to place faith in the intelligent processing of information, who is the guy behind the screen who is making the decisions about the data that is being used or is selectively being revealed to us?” There are these very critical first order and second order distinctions in terms of the questions and the processes that are being put in place.
Oscar:I couldn’t agree with you more. As someone trained in philosophy, and therefore concerned about morals and ethics and our reasons for the actions that we take, it is clearly a challenge that we face with our data scientists and the money that we are paying universities to train them – and unfortunately, when I say “we” here, it is corporations and government agencies that are providing the money. I’m not sure about the extent to which the republic is actually thinking about, and trying to influence, the requirement of moral and ethical consideration in the kinds of systems that we are building.
Sure, we are having a debate about the extent to which we should allow automated military armament, and decisions about life made by machine. But the extent to which we are involved in that debate is far behind what it needs to be. So you are quite right, and I agree with you: it is a first order consideration – what matters, why it matters, and how we are going to determine that it matters – and that consideration should be inherent in the process of building the systems that we depend upon. I agree with you.
David:I guess part of the vulnerability I feel, as we move toward greater sophistication in data processing, is that perhaps there is not careful enough work done on first order versus second order distinctions – where the data scientists are maybe even required to take some philosophy classes (laughs) as they design the machines that create the next reality, small “r” reality. But I think about Cardinal Richelieu’s comment from the 17th century, where he said, “If one would give me six lines written by the hand of the most honest man, I would find something in them to have him hanged.”
From a 30,000-foot perspective, whether it is Sesame Credit or the public-private partnerships that we are seeing develop in China, it is very possible that we have already moved well beyond six lines. And all of us have that sort of existential vulnerability, because data can be chunked and recreated to tell whatever story needs to be told. It really comes down to the processing and the interpretation, and we are not necessarily in control of either of those things. That leaves me with a few concerns that relate to your panopticon.
Oscar:Sure. David, I wouldn’t be talking with you if I had given up, right?
Oscar:If I thought that it was too late and that there was no possibility that individuals and collectives would develop in response to this. I wouldn’t be talking with you if I didn’t know that people are writing books about the challenges that we face. I don’t know exactly how it is that they will mobilize individuals and groups and systems in order to do the things that we ought to be doing as the kind of human beings we idealize. That’s why we are talking with each other.
David:Given your emphasis on cultural critique and cultural understanding – your path from the early days of sociology to media studies and communication – I wonder if we could look back 100 years and dive into literature. Upton Sinclair was writing The Jungle, he was writing The Brass Check, he was writing Oil!, which was made into the movie There Will Be Blood, with Daniel Day-Lewis. He was offering social critique, from his perspective, through fiction and nonfiction. What would he write today, and how would he approach the panopticon?
Oscar:If we take an author – and their role is very important, and it will be interesting to see the extent to which algorithms have influenced the kinds of decisions that writers make – I’ve lost the name of the author who has written a book, really, I think, about Facebook, or about Google, about privacy within the corporation. But I know that I am not alone in thinking about these problems, and that there are, and will be, writers trying their best to have us think about, to have us reflect on, our individual and collective responsibility.
With regard to the role of government as a public resource – what that role is, and how government is supposed to respond to our needs and interests – I expect that somebody is writing, and will write, and I have signs of people writing some of this, about the need of the state to protect those of us who have limited resources, who are at the bottom of the distribution of resources and capabilities. I have to believe, and I think that you also believe, that we depend upon writers of novels, not only writers of things like The Panoptic Sort, who are working to raise our awareness of the danger that we are in collectively. I have to believe that, and I believe you do it well.
David:As a final thought, Jacques Ellul was not enthralled by the idea of a perfectly integrated technological world, and he said this, and in a somewhat critical tone: “It will not be a universal concentration camp for it will be guilty of no atrocity. It will not seem insane, for everything will be well ordered. And the stains of human passion will be lost amid the chromium gleam. We shall have nothing more to lose, and nothing to win. Our deepest instincts and our most secret passions will be analyzed, published, exploited. We shall be rewarded with everything our hearts ever desired.” Oscar, he imagined the modern technological panopticon as you have, too. This is the train we are on. Is it too late to get off?
Oscar:Well, I agreed with Ellul early in my career when, as you noted earlier in our discussion, I was oriented toward technology as the path to the future. I came to realize the limitations, and I also recognize that we may have behaved – as we have in response to Google and Facebook – in ways which, upon reflection, for some of us, do seem insane. And it may be that the corporate leaders and government leaders who were shaping policy did not think about the outcomes we are now facing: the motivation of a desire to engage in sharing, the development of a sharing economy, and what that means for the nature of work and the kinds of things we will require in order to survive – the kind of life that we might have imagined for ourselves and our children, which is not going to fare very well today.
I think that voices will be raised now about the path we are on – that the path will seem insane, not well ordered – and that we are going to respond and challenge the systems of power. That is what I believe, and I may be wrong, but we will respond. It is never too late to respond to dangers as we perceive them, even though we share responsibility for their development. We have to.
David:Oscar, it is because we care about the future, and not just the future of prices, that we have a conversation that goes beyond finance and economics – because ultimately prices are impacted by the world in which we live. Prices can be illusory, they can be somewhat real, they can help people, they can hurt people. We are interested in reality, and we are interested in the things that people say yes to without understanding the unintended consequences. It is very easy in the marketplace to say yes to something that looks appealing, that has that chromium gleam, but that may actually have a lot more entailed with it. So we want to say thank you for helping us turn a lens on the marketplace and look at it a little more critically.
Oscar:My pleasure, David. Thank you for this conversation. Keep up your good work.
David:Thank you so much for your time.
Oscar:All right, take care. Bye-bye then.
Kevin:That was a fascinating interview, Dave. I am sure some of our listeners are wondering, what does this have to do with the financial markets, but actually, we have been talking about passive investing, especially this last ten years, where you have this social group that just goes into the same thing, all of them. Doesn’t that lead to some of this control?
David:What you have is a concentration of people in a few products.
Kevin:Being moved a certain direction.
David:Exactly. So there is greater visibility into who owns what, and greater segmentation in terms of who those people are, what their motivations may be, what their age bracket is. The appeals you can make to them allow for greater control of the system – and a smoothing of the business cycle, I would suggest, is part of that process.
Kevin:When you ask them what they invest in, a lot of people will just say, “Oh, I’m in Vanguard.” (laughs) Well, what does that mean?
David:(laughs) What does that mean?
I think one of the things that is important is, with the Commentary we are not interested in merely looking at prices in the financial markets, but we are interested in what drives people to make the decisions that they make, which means that we are inherently curious about psychology and sociology. Today’s exploration was really about power structures and ultimately how people can be manipulated and controlled, and if we don’t understand this in the big picture, we get lost in price and forget the meaning of those prices as they change day to day.