This conversation took place in July 2017 with R “Ray” Wang, author, Principal Analyst, and Founder of Constellation Research in Silicon Valley. www.constellationr.com

Transcript

The Star Trek Economy

R “Ray” Wang: “What we’re all trying to grasp is how to get from today’s world to the ‘Star Trek economy’, where nobody works for money. It’s a very interesting kind of situation. We are in the middle of that transition. How fast does automation occur? Will we be able to learn or reskill quicker than the automation that’s occurring? Will the machines actually do more work more quickly than we can? I think one of the things that we do have to understand is that humans are very adaptable. I saw ‘The Matrix’ on my way to an AI conference, and in the middle of the movie I realized something very interesting. In ‘The Matrix’, and I know I’m digressing, but this is what’s interesting: the computers are really good at rules. The humans are trying to follow rules, but what people don’t realize is that the humans created the rules, and it’s the humans who know how to break the rules. That’s the one thing humanity has going for it: the ability to see between the lines, the ability to connect things that people might not have seen, that level of intuition you can’t yet replicate with cognition in a computer, and won’t for some time. That’s the good news. Now, the bad news is that there are things that we do without thinking, and a lot of that will go away to automation. How will we support that? A lot of people are talking about population reduction, a lot of people are talking about how we look at resources, a lot of people are trying to figure out whether to tax computers and robots for their work. I don’t know if those are really good models, but I think we do have to think about this issue very seriously.”

Silicon Valley and the Future

“Is Silicon Valley thinking about these issues?”

R “Ray” Wang: “I think Silicon Valley is scared. They’re so scared… In one of our futures papers, we said that we think that at some point there will be a terrorist attack in Silicon Valley, because it’s the valley that’s creating all these things that are potentially destroying jobs, taking away privacy, and changing the way people work. On the other hand, the valley has created a lot of innovations which give people hope and prospects. Think about what Tesla has done, and what Elon Musk is doing in terms of space exploration or energy. There are both sides of the equation, but I think in the valley there needs to be some realization and humility as to what has occurred.”

The Last Bastion of Free Market Capitalism

“Do you think tech entrepreneurs are pushing it a little bit too far at the moment?”

R “Ray” Wang: “I don’t think so yet. I don’t think we’ve pushed the limits yet. I think you’re going to see a little bit more before we actually get to some balance. I think what’s going to happen, and this is what I don’t want for the valley, is what happened in the last era of the industrial technology revolution: governments got involved and tried to regulate the amount of work, tried to regulate everything, to the point where there was no innovation. Tech is the last bastion of free market capitalism. It is the last bastion of innovation, and when you take that away, you will lose that innovation. People will say, ‘Well, it’s for social good,’ or people will say, ‘Hey, we’ve actually created more human tech.’ The reality is [that] over-regulation hampers innovation, and if you want innovation, the tech community has to learn to police itself. It has to learn to actually understand the implications of that technology in broader terms and in more inclusive terms.”

Governments and Innovation

“What’s the role of governments in all of this?”

R “Ray” Wang: “You’re asking a very tough question about how governments can help with technology. The challenge with governments, and no offense to the legislators that are there, but most of them are not qualified to even understand what’s going on. Take a country like China, where 98 percent of the legislators have engineering and science degrees. They understand the technology implications. In the US, maybe three percent have a doctorate, a medical degree, or a science degree. They don’t have the chops to understand what’s going on. They all rely on lobbyists, and the lobbyists are relying on where the nearest buck is to be made. That’s not a very good model. I don’t have a lot of hope for technology and policy to catch up with each other. I just hope that we don’t make really bad policy decisions. In that vein, there are certain things that are important; there are fundamental human rights that should be ensured, like the right to privacy. That means ensuring that citizens have control of their own data. Being disconnected should not be an alarm that you’re doing something nefarious. Being disconnected means I choose not to be tracked, and that should be okay. When we talk about things like cryptocurrencies, if I cannot transact anonymously, you’ve taken away a right, and that is something governments can protect. Protect how that information is provided, to make sure that individuals have a choice as to what information they want to reveal. More importantly, they do have to create a level playing field. What I mean by a level playing field is one that ensures competition, not one that takes away competition. A level playing field is one that encourages investment in innovation, not one that hampers that for lobbyists, or for unions, or for regulators, or for a tax break, or for a treaty. It doesn’t matter where we go. Why am I saying this? Because the fallacy of Western democracy, or the failure of Western democracy, is that we’ve run out of votes to be bought. ‘Let me give you free health care, let me give you a tax break, let me give you housing, let me give you this trade deal…’ Those were all forms of buying votes, and what we’ve come to is a system in Western democracy where money actually influences the legislation to an unfair balance. What we can at least try to do is save the international human right, which is privacy.”

Innovation Policy

“Do you think some countries have a leg up on innovation policy?”

R “Ray” Wang: “I think the challenge right now with governments guiding technology policies is that we also don’t know what the implications may be. You can make certain bets on technology policies and hope you do well. Sometimes you do well: you bet big, you win. That’s what venture capitalists do all the time. However, when it comes to humans, I think where we are at this point is that the technology policies don’t necessarily reflect what the potential outcomes may be. Let’s take an example like Japan and semiconductors, or South Korea and semiconductors. We flood the market, we glut the market, and suddenly there are only two vendors left in the market because no one can afford the capital to jump in. Think about something like biotech and biosciences, where we have rules because of our religious beliefs and our ethical beliefs, which are wonderful, some may say. Other people would say, ‘This is horrible, you’re harming the future of stem cell research and the ability of pioneering innovation.’ I think what’s important to realize is that different countries will adopt their values. We need to respect their values and their point of view, but there are also some greater human values that will change over time. We will all say, ‘Hey, that’s really a bad thing to do, we should all stop doing that,’ so we’ll get there over time, but when we try to force that onto people, either by political parties or by oppression, we don’t get a lot of good results. If we influence people or shame people, we usually get better results over time.”

Legacy and Social Good

“What do you think is a surprise that’s on the horizon?”

R “Ray” Wang: “I think the big surprise is that people are going to realize that those who have been very successful in Silicon Valley may not care as much about the money, but more about a legacy. We’re actually going to see a shift in value towards more social good, and I think that’s a very important thing: the ability to actually work towards bigger ideas. There’s one thing that’s very interesting. Think about what Vint Cerf and Mei Lin Fung are doing with the People-Centered Internet. They’re trying to democratize access to the Internet across all areas, trying to shape what it might look like. We often ask people, ‘What rights does a baby have on day one when they’re born in the year 2100?’ People always answer, ‘It depends on what country they’re in.’ Okay, it depends on the country, but what would be a universal set of rights that they have, a set of privileges that they must earn, and a set of responsibilities that they should be working towards? When you start thinking in those terms, you start understanding how people are going to look at their future and their legacies.”

AI Ethics and Fear

“Are there also some things that scare you about technology and the future?”

R “Ray” Wang: “I’m scared that we don’t have a good framework for AI ethics, but I’m more scared when we try to apply a universal framework to everybody on AI ethics. That is going to be the balance that we get to. People might say, ‘That is completely evil.’ I’ll give you a fun example. I don’t know if this was used before, but [with] autonomous vehicles, you get two cars that get into an accident. The accident occurs in China, and someone hits someone, and someone gets injured. In China, the algorithm might be, ‘Run that person over three more times, because it’s cheaper than keeping them alive.’ That’s the financial cost incentive. In Canada, it might be, ‘Oh, I’m so sorry, I’m so sorry, how can I help you,’ because people are very polite. I’m taking stereotypes to the extreme, but you get what I’m saying. In Russia, it might be, ‘No, none of us did it,’ everyone has plausible deniability, and here’s the videocam footage that’s been doctored. This is my point. These are all interesting examples that may occur, so I think what we’re trying to figure out is what the universal human code is. We can’t answer that question. In the meantime, I think we’re going to push the limits, and when we get to that crucial point, I hope the populace is educated enough when they make their decision. That’s the thing I think we need to continue doing: educate people on the implications of technology, what it means to your life, what it means to your business, what it means to society. We need to actually take a much more holistic view.”

A Holistic View

“What would be a holistic view?”

R “Ray” Wang: “I think we should understand those implications. What happens if we all went to cryptocurrencies and someone hacked the cryptocurrency? What happens if we all went to cryptocurrencies and you decided to buy something, and the government said, ‘By the way, since you have free healthcare from us, you really should lay off the wine’? People don’t think about these things. I think people should at least understand those implications in free thought. I think academia at this moment has been polluted by grants, including corporate grants. People are being paid to be spokespersons but aren’t really providing real research. I hope we get back to a point where we still value the truth, value the pursuit of the truth. And also, this is a concept in the U.S., but we have this notion of the pursuit of happiness, where we don’t invade other people’s rights, but we pursue what we’re interested in. I think we need to get back to that.”

Overestimating Technology

“Do you think we’re perhaps overestimating what technology can do for us?”

R “Ray” Wang: “I thought we saw a little bit of that in the early 2000s, where we believed that everything could be digital. At that point it was called ‘cyber’, which is kind of fun. [We thought] everything can be digitized, and no one really has to do any work. There’s a lot to do from here to the Star Trek economy. I wish Gene Roddenberry were back. I mean, if we could go back in time, that’s the one guy I’d like to interview: ‘How did you get here?’ How do you get there, and how do folks get to the point where they work on the things that they’re interested in, and there’s enough energy, there’s enough food, there’s enough to sustain what people want to do? No society is perfect, but that utopian world sounds very interesting. Unfortunately, we have all these dystopian movies, which actually help make us think about things a little bit more carefully, around topics such as privacy, or technology adoption, or things that we don’t normally think about.”

Being More Informed

R “Ray” Wang: “What the public can do around being more informed is, there need to be foundations, institutions that help. I love the EFF (Electronic Frontier Foundation); sometimes they’re a little crazy, but they’re protecting privacy rights. You know that they’re a little bit crazy, but at the end of the day, they mean well. That’s what they’re trying to do. I think the other piece is really explaining the policy implications of these things. Instead of creating clickbait in the media, about ‘this technology… Uber meets… meets Taylor Swift’… Alright, okay, you got clickbait, but at least do your job in the media and educate folks on the implications, the policies. People might not want to hear [that]. It might not be the most [read] article, but it’s something for people to understand. The second piece behind this is, as we’re building technology policy, or people are thinking about technology policy, I really hope that our politicians spend a little bit more time to understand how these technologies impact the regular individual, and what people are trying to get out of those technologies. In some cases that will direct how you build basic research grants that will help you with policies, that will help you rethink issues around jobs. I know everybody is really busy trying to get their votes, but you do have an obligation to serve your country and to try to think of what is best in the mid term and the long term, which is very hard to do in today’s environment.”

Better Use of Technology

“How can users make the most of today’s technologies?”

R “Ray” Wang: “The challenge with most technology is that users get excited about a shiny object. They’re like, ‘Oh, this is great, it’s the best thing I ever had!’ However, you need to go back to the basics. The first principles are: What are the business objectives, what are you trying to do? You need to get better at asking the questions of what you want to achieve. When we think about the art of the possible, we always start by asking: what do you want to accomplish? Where do you want to go? Why do you want to do something different? You have to have a deep introspection to say, ‘Okay, here’s where I want to go.’ Then you can come back and say, ‘Hey, this is some interesting technology.’ Form always follows function, but people forget that all the time. We try to reverse-engineer something. Take a house. Do you really need that much home automation? The trouble to get to that home automation may not even be worth what you’re trying to accomplish. So you go back and say, ‘What did you really want to do?’ – ‘I want to be able to walk in and have the lights turn on.’ Okay, a switch would do, or at least a sensor switch would do. By the time you program your whole house… It might be cool, that might be what you’re trying to achieve, but that’s different if you’re trying to push the envelope. We also have to think about outcomes. I think that’s what we always forget.”
