On May 11, 2020, the Information and Communications Technology Council (ICTC) spoke with Professor Ronald Niezen, the Katharine A. Pearson Chair in Civil Society and Public Policy in the Faculty of Law and Department of Anthropology at McGill University. Professor Niezen was appointed as the William Lyon Mackenzie King Chair for Canadian Studies at Harvard University for 2018–2019. An anthropologist with wide-ranging experience, Professor Niezen researches and teaches in the areas of political and legal anthropology, Indigenous peoples, and human rights. Prior to McGill, he taught for nine years at Harvard University and has held visiting positions in the Department of History at the University of Winnipeg and the Institute for Human Rights at Åbo Akademi University in Finland. You can find out more about his research, work, and new book here.
As a legal anthropologist, what is your perspective on the efforts by legal institutions to mitigate negative social impacts of technology (e.g. Europe’s GDPR)? In particular, in light of tracking during COVID, do you think laws are going to be effective tools to mitigate harmful uses of technology later?
This is a hard question. I haven’t written about it as much, but I can answer in another way. There’s a great video of San Francisco in 1906 in which a trolley car travels down a major street full of mayhem. Horse-drawn carriages, cars, and pedestrians all weave in and out because there are no traffic signals anywhere: traffic lights hadn’t been invented yet. It was only in the 1920s, after cars became more powerful, that traffic signals were installed to regulate and direct the flow of traffic.
I believe we are at an analogous stage with new technologies today. To date, there has been no will to regulate these new technologies. Instead, there has been sharp resistance to regulating them. But one key difference between 1906 and now is that back then, the creators of the technology — the automobile manufacturers — had no problem with traffic regulations, because regulation didn’t mean they were going to sell fewer cars. In fact, regulation was good for them: it kept customers safe. Regulation now, in contrast, is a scary thing for big tech corporations because it means an erosion of their access to data, which is their driving force. That data gives them a window into our private lives and has enabled the enormous revenues they’ve generated, mostly through targeted advertising. They are putting up a fight, but I think the only way to move forward and create protections against the worst kinds of abuses is to regulate our data and what people are able to do with it. We are seeing the beginnings of this now.