Brian Cooley 

Hello, everybody. Whoops, that's a long step down. Okay, I'm gonna stay right here now, with you. Once bitten, but now I'm okay. Welcome to CNET presents The Next Big Thing. Here we are. Yeah.

 
Lindsey Turrentine 

We are here to talk about anticipation. I'm Lindsey Turrentine, Senior Vice President of the CBSi tech group; I manage CNET.

 
Brian Cooley 

And she manages me. And I'm Brian Cooley, editor at large at cnet.com. How many of you are veterans of The Next Big Thing presentation? Wow, we've got a lot of new blood here. This is great. Okay, so what we do here is try to set the weathervane for things that are not total futurism, but

 
Lindsey Turrentine 

Tangible. That's right. We're talking about five years out. We want all of you to be thinking about, and getting excited about, what could happen in a few years. Yes.

 
Brian Cooley 

And our topic this year, as we dive in, is going to be this idea of where we're going with the anticipatory future.

 
Lindsey Turrentine 

We are going to be talking about some things that are going to get better in the future. We're also going to be talking about some things that might be a little bit unnerving in the future, more than a few, in fact. And we're really excited about all the panelists that we have today.

 
Brian Cooley 

Yes, what a great panel. And notice, as we get into this discussion, and we've got this panel to really tease these things apart, that some things will get better and easier than the way they're done today. We're trying to always move the ball forward, not just do some kind of lateral where we take things we're doing today and make them techier for tech's sake. You're going to see how we try to tease apart the difference between tech for its own sake and tech for things that are better because they anticipate our needs and really move the ball forward.

 
Lindsey Turrentine 

An example of the automation we might be talking about today is in your smart home: your heater turning on based on your calendar settings, relevant data, the stuff you bought at the store on your way home, and other members of your family coming home at the same time. That combination of data might be something that, in the future, can make better decisions about when your heat comes on and how warm it gets.

 
Brian Cooley 

Yeah. Whereas today, it might be simple. It might be: I've got my car set to a geofence, and whenever I'm in a certain place, that garage door goes up. It doesn't really mean you're heading home; it means you're in a certain geographical area. That's automation. What Lindsey's talking about, with a lot more inputs, perhaps more dynamically gathered, that's anticipation. So that's where we're trying to go on that one. And we're going to be very cognizant, we talked about this a lot as we were putting this together, of teasing apart that difference, because they can look very similar at a glance. From a distance I'd say, that's just automation; no, it's a lot more than that.
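To make that distinction concrete, here is a minimal sketch of rule-based automation versus multi-signal anticipation. Everything in it (the signal names, the threshold, the functions) is invented for illustration; it is not any vendor's actual API.

```python
# Illustrative only (no vendor's API): rule-based automation versus
# multi-signal anticipation for the "open the garage door" decision.

from dataclasses import dataclass

@dataclass
class Signals:
    inside_geofence: bool     # car crossed a radius around home
    calendar_clear: bool      # no more meetings on today's calendar
    route_ends_at_home: bool  # navigation destination is home
    groceries_bought: bool    # purchase made on the usual route home

def automation_open_garage(s: Signals) -> bool:
    # Automation: one static trigger. Being nearby is treated as coming home.
    return s.inside_geofence

def anticipation_open_garage(s: Signals) -> bool:
    # Anticipation: several dynamically gathered inputs must agree before
    # the home acts on our behalf.
    evidence = [s.inside_geofence, s.calendar_clear,
                s.route_ends_at_home, s.groceries_bought]
    return sum(evidence) >= 3   # act only when most signals point the same way

nearby_only = Signals(True, False, False, False)
print(automation_open_garage(nearby_only))    # True: merely nearby opens the door
print(anticipation_open_garage(nearby_only))  # False: not enough evidence yet
```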

 
Lindsey Turrentine 

Yep. In a few years, things will happen for you before you ask for them to happen. That's really what we're talking about here.

 
Brian Cooley 

With that, let's bring on our panelists and then we're going to dive into one of our videos to start seeding the clouds.

 
Lindsey Turrentine 

Okay, to explore the future of anticipation, we have an impressive panel, starting, of course, with Brian Cooley. Next, Rana el Kaliouby, CEO of Affectiva.

 
Brian Cooley 

Please give a big hand for our next panelist. Joining us on the stage is Michele Turner. She is Senior Director of the Smart Home Ecosystem for Google.

 
Lindsey Turrentine 

Next we have Doug Clinton. He's the Managing Partner at Loup Ventures.

 
Brian Cooley 

He's scanning your brain right now; you'll see what we mean. And a very important part of this conversation is bringing on the Electronic Frontier Foundation. Joining us is its Executive Director, Cindy Cohn.

 
Lindsey Turrentine 

Before we start our conversation, we're going to watch a little video on what this is all about. Every time you leave your house, you do something a little quaint: turn off your lights, adjust the thermostat, lock your door, set a security system. Maybe you use an app, but that itself takes a couple of extra steps. Wouldn't it be great if your home anticipated what you were going to do next, and simply took care of it for you? Cameras in your home can use facial recognition to know if residents or strangers are in your house, and eventually they'll make decisions based on what they see. The signals are all there: image-processing cameras, your location, calendar, interests, and all of those verbal utterances ("I'm booked solid until a little after five today"), but we've yet to combine them into a sort of household graph that can be used to really anticipate your needs. The anticipatory home will itself be a smart thing, not just a big box with lots of other smart things in it.

 
Brian Cooley 

Look for strong echoes of that in the car as well, the home's soulmate. A number of major automakers have begun to build tentative bridges between smart homes and connected cars, and it makes sense, because so many modes of our life are a handoff between the two.

 
Lindsey Turrentine 

And while anticipatory homes and cars would make our lives easier, the same technology in health and wellness can make our lives fundamentally better and longer. The term normally used is predictive analytics, and it's attracting some of the biggest money and initiatives in tech. But there's a piece missing here: 80 to 90% of what determines our health is actually social factors, not test results, doctor's visits, or medicines. These factors are often left out of the picture, which makes it harder to get in front of health problems before they are problems. Traditional clinical signals like heart rhythm, blood pressure, sleep cycles, and blood glucose are on the cusp of being monitored regularly, thanks to products like the ones here at CES. The constant data patterns they detect will allow various forms of AI to get ahead of our health trends before they turn negative.

 
Brian Cooley 

Then there's the world of brands and commerce. The game has always been to truly anticipate what consumers want, when and where, as opposed to the relatively crude estimates we make today. But now we're on the verge of a real breakthrough. Imagine a fast food chain with technology that anticipates what you'd like to see on the menu board and dynamically deletes items you would never order, items that dilute or convolute your perception of its brand. Smart voice technology is rapidly growing up, moving beyond clumsy canned skills to being able to listen to your sneeze and know if it's from dust or a virus.

 
Brian Cooley 

Alexa, what's for dinner?

 
Alexa 

Would you like a recipe for chicken soup?

 
Brian Cooley 

No thanks.

 
Alexa 

Okay, I can find you something else. By the way, would you like to order cough drops with one-hour delivery?

 
Brian Cooley 

That'd be great. Thanks for asking.

 
Alexa 

I'll email you an order confirmation. Feel better.

 
Brian Cooley 

Our faces can be mined for vast but formerly inscrutable clues about our interests, without us even knowing it. Once Netflix can watch you watch it, you may never get a lame movie recommendation again. Want some holy grails? Keep an eye on things like facial recognition and tracking technology; a little further out, the spread of gaze detection. And then there's brainwave monitoring, not just for medical and therapeutic uses, but to really get into the consumer's mind. The last couple of decades of consumer electronics have been amazing at giving us levers to pull, like search, follow, commands, skills. But now it's time for levers that pull themselves, and frankly, do so better than we ever could. There's a lot to chew on, to wonder about, and perhaps to feel unnerved about. What I'd like to do to start is make sure everyone knows what the POV is here on stage. If we can start with Rana: give us a real quick thumbnail, each one of you. What does your organization do? What's the POV you bring to the stage? Tell us very quickly about Affectiva.

 
Rana Kaliouby 

Hi, everybody. I'm Rana, Affectiva's co-founder and CEO. We are on a mission to humanize technology. We build technologies that can understand people's emotions by analyzing facial expressions as well as vocal intonations. There are lots of applications, and a lot of them were covered in this video, from understanding how people emotionally engage with brands, all the way to automotive, and a lot of applications in the health and wellbeing space as well.

 
Brian Cooley 

Okay, Michele.

 
Michele Turner 

Hi, I'm Michele Turner. I run the Smart Home ecosystem for Google. The very beginning of the video is pretty much what we're all about: trying to figure out how we take today's collection of individual connected devices and bring them together to create whole-home solutions that are practically helpful for consumers, to help them get things done in their home every day, give them better entertainment, give them a safer and more secure home, and to figure out how to do that privately and securely.

 
Brian Cooley 

Doug, you guys aren't a consumer-facing brand; you're a venture capital house. What's your involvement in this?

 
Doug Clinton 

So I'm Doug Clinton. I'm one of the founders of Loup Ventures. We are a multi-stage, focused venture fund, so we invest in a lot of the companies that are building the technology for this anticipatory future. And in particular, we do a lot of work on the brain segment that Brian teased in the video.

 
Brian Cooley 

And Cindy, how many gallons of cold water did you bring with you?

 
Cindy Cohn

I come from a very different place. I want to make sure that when you engage with all of these technologies, they're actually serving you; that they're not two-faced technologies that basically pretend they're giving you something but are really collecting data on you. We care a lot about privacy, about security, about where that data goes and who has access to it, including law enforcement. We know that we live in a society that is not always fair. When you take some of these technologies, and you gather all this information, and you layer it on top of a society that struggles with fairness, you can make the fairness much, much worse. And this has a disproportionate effect on people who are not at the top of the food chain in our society. So I guess my role here is to try to make sure that, as we say at EFF, when you go online, your rights should go with you. Now I'm trying to say, well, when you go into your house, your rights should go with you, too.

 
Lindsey Turrentine 

Michele, you're really spending a lot of your time thinking about what this anticipation could do in a best-case scenario, using Google devices, but also using a broad network of different devices. Tell us a little bit about that. Give us a good use case, a few years out from now, for how anticipating could make things better.

 
Michele Turner 

Yeah. So I think what we need to see is the convergence between these smart devices and a lot of the services that are out there today. When we look at a few years out, we're seeing a home that has a lot of sensors in it, and that sensor data can come together and help drive that predictive future, right? Where it really gets interesting is when you start marrying this sensor data with the services and the other things that you use on a daily basis, to start to create that truly predictive and proactive environment that can help you do things every day. So I like to think about it this way: when I'm getting ready to go home. I have kids, right? I live in the Bay Area, so there's always horrendous traffic; it's going to take me a half hour, 45 minutes to get home. The kids are going to be hungry, right? If I go into Google Maps and I map that I'm going home, Google Maps probably knows that I'm headed home and I'll be home in 45 minutes. Can it turn on my smart oven? Set it to 350? We're making this fundamental shift from really individualized technology on our phones to smart homes that are communal spaces, where we have smart speakers and displays that have context around multiple people in the family. So how do we start creating that environment in the future where, if I know this child is home and they're going to be doing this in the house at this time, and I need to have dinner on at this point because they may have band practice or soccer practice later, all of those services, from Google Maps to my calendar to the devices in my house, my smart oven, my smart thermostat, Spotify, and everything, come together to streamline how my world comes together in a very busy part of my day? As we start looking at this, we talk about it in terms of ambient computing, but it's really the convergence of all of this data plus services to start solving real problems for people on a daily basis. And it's not just Google services; it can be things like Instacart, right? I might have a set of groceries that need to be delivered. My smart refrigerator knows what's in my fridge or not, knows that I need to get dinner on that night, and it can get the groceries delivered for me by the time I come home, so I can get dinner on much more quickly. We look at it in terms of being able to do those helpful types of services. Safety and security are really big ones as well; obviously, that's a key touch point for us in the future, where we're able to do a lot more to secure your home, let you know what's happening in your home, and be much more proactive about it. And then there's a whole future of energy and sustainability: how do we start doing things that save us energy, anticipating when your home might start to have problems, getting sensor feedback, whether it's from your boiler or your oven or your air conditioner, that something's going to go wrong, and then proactively addressing that before your boiler goes out on Christmas Eve and you've got the plumber with overtime at your house, right?
All those types of sensor diagnostics are going to start feeding into these intelligence databases, and they'll help us be much more proactive in the future and just solve those everyday problems that cause us all a lot of headaches.
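A rough sketch of the commute scenario Michele describes: a maps service streams ETA updates, and the home starts preheating the oven only when the ETA drops to the oven's preheat lead time and nobody is home to do it themselves. The function names, timings, and ETA feed are hypothetical, not Google's APIs.

```python
# Hypothetical sketch of the commute scenario: begin preheating the smart
# oven so it reaches temperature as the family arrives. Function names,
# lead times, and the ETA feed are invented for illustration.

PREHEAT_MINUTES = 15   # assumed time for the oven to reach temperature
TARGET_TEMP_F = 350

def should_preheat_now(eta_minutes: float, someone_already_home: bool) -> bool:
    # If someone is home, they can start dinner themselves; otherwise start
    # preheating once the commute ETA drops to the preheat lead time.
    return not someone_already_home and eta_minutes <= PREHEAT_MINUTES

def on_eta_update(eta_minutes: float, someone_already_home: bool) -> None:
    if should_preheat_now(eta_minutes, someone_already_home):
        print(f"Preheating oven to {TARGET_TEMP_F} F (ETA {eta_minutes:.0f} min)")
    else:
        print(f"Holding (ETA {eta_minutes:.0f} min)")

# Simulated ETA updates from a maps service during the drive home.
for eta in (45, 30, 14):
    on_eta_update(eta, someone_already_home=False)
```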

 
Brian Cooley 

Now, Doug, I think the technology you guys invest so much in, BCI, brain-computer interface, a term I learned fairly recently listening to a neurotech podcast, is the one that needs the most explanation. What Rana does is pretty out there for a lot of us; what you do is way out there, okay? What do your companies do? What is it you're picking up? You're not literally reading people's minds. Not your companies?

 
Doug Clinton 

No. Yeah, not literally.

 
Brian Cooley 

Good. Yeah, it's good to hear that and not be all nervous.

 
Doug Clinton 

Yeah. The way that I would think of brain-computer interface is: if you think about the way humans have interacted with computers over the last 50 years, 50 years ago you went to a room this size, probably, and there was a mainframe, and maybe you had a punch card that you put in to interact with the device. Then we had the graphical user interface and the personal computer in your home. And then we had the touchscreen, so we're getting pretty close to physical connection with the computer; you use your finger to literally tell the computer what you want to do. And now we have this proliferation of voice assistants. BCI, to me, is sort of the natural evolution of that progression, where instead of having to use your voice or a touch command to tell a device what you want to do, you literally just use your thoughts, and the computer will interpret your thoughts and say, I understand that you're trying to press this button or change the temperature on the thermostat. And you don't have to have any physical interaction. So it brings you a layer closer to the computer and removes this sort of abstraction of the human senses from interacting with computers.
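As a toy illustration of the pipeline Doug is describing, here is the rough shape of a BCI decoder: sample a signal, extract a feature per time window, and map the feature to a device command. Real systems use EEG hardware and trained classifiers; the feature, thresholds, and command names below are stand-ins.

```python
# A toy shape of a BCI decoder: sample a signal, extract a feature per time
# window, map the feature to a device command. Real systems use EEG hardware
# and trained classifiers; these thresholds and commands are stand-ins.

from statistics import mean

def window_feature(samples: list[float]) -> float:
    # Crude stand-in for band power: mean absolute amplitude of the window.
    return mean(abs(s) for s in samples)

def decode_intent(feature: float) -> str:
    # A trained model would live here; we fake it with fixed thresholds.
    if feature > 0.8:
        return "raise_thermostat"
    if feature > 0.4:
        return "press_button"
    return "no_op"

window = [0.2, -0.5, 0.9, -0.7, 0.6]          # one imaginary window of signal
print(decode_intent(window_feature(window)))  # -> "press_button"
```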

 
Lindsey Turrentine 

So this is very short-term anticipation, if a computer can anticipate your literal next move.

 
Doug Clinton 

It can be. And the way that we think about anticipation, too, there are really sort of two dynamics to it. There's a Kahneman-type construct where you have sort of the present and future self. You have the present self, which is your sensing self; your brain is your present self, and you are feeling things in the world. When you're on social media and you get the endorphin rush from continuing to scroll through, your brain says this is a good thing. But 30 minutes later, when you realize you wasted all that time, your reflecting self, or your future self, says, you know what, this was a bad decision. And again, that emotion is happening in your brain. So it can be for your present self or your future self, and it really depends on how the software is tailored and what you want to figure out from the signals.

 
Cindy Cohn

Yeah. I,

 
Brian Cooley 

Cindy, where are you right now in all this?

 

Cindy Cohn

I guess I think that there's a lot of sunny talk going on here, and not enough thinking about how this can go wrong. Right? We know this. We know, historically, that people rent cars that have remote disabling systems in them, and then they don't pay for the car, and it stops on the freeway, right? Imagine that's your house, right? Something goes wrong, or you don't pay the bill, or you're not who they think you are, and suddenly your house isn't working: the heat gets turned off, the oven goes up to 300. And that's just if the technology doesn't work very well; that's not talking about malice. I hope people saw the video this last week of the Amazon Ring camera that got taken over by hackers who harassed a little girl in her bedroom. Right? I'm sure that Amazon didn't sit here at CES and talk about the fact that that could happen. But it did. And we all have to start thinking much more defensively about how we build these systems, recognizing that they're going to get misused and that they're not going to work very well. You know, the idea that you're going to have the kind of control of your brain to make sure that it doesn't do the dumb thing you thought instead of the smart thing you thought, right? We're going to have to be trained like monkeys in this world to do the kinds of thinking that they're anticipating. We have to start thinking much more defensively about these technologies. There are good things that could come out of these kinds of things, but Rosie isn't the only scenario here. And it gets especially problematic, and I'm going to bring Ring up again, because they have gone out and made deals with chiefs of police all across the country to promote these devices in order to have surveilled neighborhoods. So inside the house and outside the house, this information is going to be available to law enforcement, it's going to be available to the bad guys, it's going to be available to all sorts of people who aren't just looking out for you. Even if you think that all the people trying to sell you advertising are looking out for you; and if you think that, I've got a bridge to sell you. So we need to start thinking defensively about these technologies, and we need to have them built from the beginning to actually work for us. And we're nowhere near that with the technologies being rolled out on the floor right here. And when you get into the anticipatory, then there's a whole other level that we need to begin to think through. We don't have the legal structure, we don't have the policy structure, and we don't have educated consumers. That's an extremely dangerous situation. And by the way, not all the people who are going to have this stuff are upper-middle-class people who are commuting in the Bay Area. There are people who work in our homes. There are people who pass by our homes. There are people who have entirely different cultures, who are in our neighborhoods, in our homes. And if you have technologies that are only developed as if we're all rich, upper-middle-class white people living in the suburbs, you're going to really hurt those other people if you don't anticipate how your home is going to be changed, how things are going to be changed, when we implement these things.

 
Brian Cooley 

Rana, counterpoint here.

 
Lindsey Turrentine 

Yeah, Rana, you're thinking about a lot of this. You mentioned a couple of these aspects backstage, because you're at the beginning

 
Rana el Kaliouby

of it. Absolutely, I'll talk about it. I'll cover data and algorithmic bias to start with. In our world, we basically use machine learning, deep learning, and computer vision to train these algorithms to detect various human activities and human emotions. And absolutely, to your point: if our data set is just composed of older white guys, and then we deploy it around the world, which we do, our technology is deployed in 87 countries, it's not going to work on people who look like me. So we very purposely talk about and implement best practices within the company around mitigating data and algorithmic bias, starting with how we acquire the data: ensuring that it's collected from different genders, different age groups, different ethnic groups, different contexts. Maybe some of the data is you at home watching content; some of it is you driving around during your daily commute. So the diversity of the data is really important. And then how we look at accuracy becomes really critical, too. As a field, as a machine learning community and thought leaders, we're at the very beginning of implementing these best practices to ensure that we are handling and mitigating bias. That's one aspect of it. The other element that is really important, and we prioritize this: how we think about our technologies has to be human-centric. It has to start with understanding who the user is and why we are building these technologies. We're all about building artificial emotional intelligence and enabling these devices to be humane in how they interact. So I loved your example where you're coughing and you're not feeling well, and your Alexa can detect that, because it has enough data about you. You're interacting with this device every single day, so there's enough data for it to build a baseline around who you are and how you're feeling, and if you deviate from that for any number of reasons, it can flag that to you, and it can respond empathetically, right? Like it said: yeah, get well, Brian. So I think this is really critical.
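One concrete form of the bias check Rana describes is disaggregated evaluation: measuring accuracy separately for each demographic group rather than as one aggregate number, so a model that only works well on one group cannot hide behind a good average. A minimal sketch with made-up data:

```python
# Disaggregated evaluation with made-up data: accuracy is reported per
# demographic group, so a gap between groups is visible instead of being
# averaged away.

from collections import defaultdict

# (group, prediction, label) triples from a hypothetical validation set.
results = [
    ("group_a", "smile", "smile"), ("group_a", "smile", "smile"),
    ("group_b", "smile", "neutral"), ("group_b", "neutral", "neutral"),
]

totals: dict[str, int] = defaultdict(int)
correct: dict[str, int] = defaultdict(int)
for group, pred, label in results:
    totals[group] += 1
    correct[group] += int(pred == label)

for group in sorted(totals):
    print(f"{group}: accuracy {correct[group] / totals[group]:.0%}")
# A large gap between groups is the red flag that triggers rebalancing or
# re-collecting training data before deployment.
```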

 
Brian Cooley 

I think what we can do, because we're kind of looking at the opportunity and the promise: I don't really see an either/or, I see a yes-and. We've clearly got so much potential here. This is not something we're going to close the door on. We're not going to say the anticipatory future doesn't work. This is such a rich mine, a future interface and relationship not to tech itself, but to the things we want to get through the day. And it absolutely could outstrip almost any trend we've seen before.

 
Lindsey Turrentine  

I feel that we're all waiting for the example of the thing that will change society for the better. This is something that we're all thinking about a lot. And yes, there's a lot of optimism in this conversation, and I think that shows like CES tend to be collections of optimists, because it's in our best interest as a group.

 
Brian Cooley 

That's certainly the environment we're in at this moment, in this place. It's absolutely an optimistic moment, here this week in January.

 
Lindsey Turrentine 

And I would just love to hear from anybody on the panel about your most optimistic version of this, because we're going to keep circling back to the negative aspects; that's going to come naturally.

 
Brian Cooley 

You're about to turn to the negative aspects in a moment.

 
Rana el Kaliouby

Can I make a case for health and wellbeing? That was the very first use case we explored with this technology. These technologies can act as facial and vocal biomarkers for depression. The very first application of this technology was for autistic kids, helping kids on the autism spectrum read and understand nonverbal communication. We have people using this to look at Parkinson's, to flag signs of suicidal intent. So I think there's a lot of potential in the health and wellbeing space.

 
Brian Cooley 

The deliverables are amazing. Or in Cindy's words, very sunny.

 
Cindy Cohn

I want to just say: if we lived in a world in which that information doesn't go to your insurer, where you're not going to get kicked off of your insurance, where you have care that will recognize it, that would be one thing. But we have a lot of ways that systems interact in our society where it does mean the cops are going to show up at your door, because they've mistakenly understood something, or actually understood something that's going on. We all have somebody in our family who has had, I suspect, struggles with mental illness, breaks from reality. There's another example, and I keep digging on Ring, but there are others as well: a woman who saw a guy acting erratically in her front yard made the footage available to the cops, and the guy was shot later that night. Right? He was just having a mental breakdown; that definitely happens. But that is not the same thing as somebody who deserves to be killed. We have systems in our society that are not fair, that are not working very well, and when you add the technology on top of it, all too often we just kind of wish those problems away, but we make them worse. Right? We know this. We've been through a period with algorithmic decision-making where we realized what happens if you take biased data and you put it in. We know that cops arrest people of color at a much higher rate than white people, despite there not being differences in the amount of crime that happens. We know that if we run that through a machine learning algorithm, we're going to end up with 80 to 90% people of color being identified as potential lawbreakers. Again, a consumer electronics context is different, but the same problems happen, right? So in China right now, there's a whole lot of social scoring going on. There's a whole lot of tracking of people going on. They had a problem with their facial recognition systems, because they did not recognize Black people very well, so they went to the government of Zimbabwe and bought driver's license photos from people. More data might not be the right answer to this surveillance state. So I'm sorry, because I know this is a sunny place where people are talking about all the good things technology can give. Technology can bring us all sorts of good things, but we have to have the law to protect us, we have to have the policy to protect us, and we have to have society ready for those kinds of things.

 
Brian Cooley 

And with this thing we're talking about here, the stakes are so high around anticipation when and if it goes wrong, and it can go wrong, for sure, that this kind of counterpoint is more essential here than in many of the other topics we've ever brought to this stage. Let's take that turn right now and go whole hog into the challenges and the difficulties. Let's take a look at some thoughts we've got on that, and then we'll go fully into that area. Anticipation will be powered by vast amounts of historical data, trends, and predictions, combined with real-time sensing, to make assumptions about us. That's touchy territory at a time of historic consumer privacy pushback. The anticipation era probably could not have arrived at a more difficult moment.

 
Lindsey Turrentine 

Consumers are comfortable with the idea of cameras and microphones in their own phones, but they recoil from the idea of the same sensors in the devices within their homes. Everybody wants their home to be a place of security and privacy, which creates a major hurdle for the growth of cameras in the home, especially those with face tracking and image recognition. And smart speaker platforms clearly want to listen more of the time, and often without a wake word. We used to call that eavesdropping; consumers will need to accept it as assistance. But nothing inspires more suspicion than the sharing of personal health data, especially when the companies doing it are not part of the established, regulated, and trusted health care system. And it's no surprise that many consumers still want to talk to a real live doctor when their health or their life is on the line. And while brainwave interfaces hold enormous promise in health and wellness, they seem comically invasive to the average consumer, not to mention hard to imagine being accepted as wearables, unless they can be built into other products we already wear, like headphones or earbuds.

 
Brian Cooley 

And as with the existing history of the internet and its algorithms, anticipation has a potent risk of giving us more of what we've already consumed or looked for, an insidious bias it would be nice to avoid. Just as concerning would be anticipation tech that limits in another way, denying people opportunities because of biased assumptions it may make about them. And if the AI that begins to pull levers for us is a black box, it can't be readily examined for errors and bias. And there's an urgency to solving all these problems, because much of the technology in consumers' hands that's causing these issues is already out there; it's on the show floor all around you. From Cindy to the right: all three of you are surveilling to some degree. That's not the word you'd like to use, but at its core, at an essential data-gathering level, that is what you are doing. What do you possibly say to consumers, and I imagine many in this room, who are going, wow, this is a deep, deep invasion of areas I didn't think technology would ever interface with? What's your backstop? We got a pretty good dose of that from you, Rana. On the smart home side, Michele, what's your backstop? Because your stuff is the most in-market and the most tangible right now.

 
Michele Turner 

Right. And you know, there are massive benefits to the things that we have today. I was going to mention the smoke detector. I've worked at Nest, been there, at Google Nest, for about four and a half years now. I can't tell you the number of people whose lives we've saved through things like the smoke detector, and the break-ins where we've caught the bad guys so many times with Nest cams, right? So there's that trade-off, and I get it. But every single day we deal with the privacy issue. Many of you may be familiar with the Works with Nest program that we deprecated at last year's I/O. We did that because we were exposing occupancy data to developers, who had all been vetted, but we didn't feel confident providing that level of data anymore. And we made a lot of people angry at us; that was me. But we did it for the right reasons. And I know at Google, and I think in smart home in general, there is a shift happening right now toward really understanding that we hold, inside of people's homes, very sensitive data, some of the most sensitive data, and we have to protect it; it's on us. So we're taking a lot of measures to protect that data. And as we bring more and more of that sensor data together to create this predictive, anticipatory home, which I think one day is going to be really, really cool when we can bring it all together, we have to figure out that intelligence layer: how does it stay protected? How does it stay private? How do we let consumers know what we have access to, and how do we allow them to very easily opt out? Because there are people today who have Nest Cam IQ cameras and say, I don't want this to do facial recognition, and they can opt out from the start, right? So we have to give consumers control. We also have to educate consumers on how to take that control. We were talking about this before the session: consumers don't know, right? We can't just expect that consumers are going to know how to do this, or that they're going to understand what part of their data is being exposed. So we're working really hard to let them know, to have them opt into these things, and to make it very easy for them to opt out when they need to. But I do think that's where we're at on this. We can't deliver these anticipatory technologies unless we have access to a lot of this very sensitive data, but we always have to give users control, and they always have to be able to opt in or opt out. Understand, there's a benefit there, and for every single consumer, the benefit is going to be different. Some of you will decide to opt out, and some of you are going to choose to opt in and provide more data. And that needs to be completely up to the consumer.
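A minimal sketch of the opt-in model Michele describes: each data-hungry feature is gated on an explicit per-user consent flag and degrades gracefully, rather than silently collecting, when consent is absent. The flag names and behavior are illustrative, not Google's implementation.

```python
# Sketch of consent-gated features (illustrative flag names, not Google's
# implementation): each data-hungry capability checks an explicit per-user
# opt-in and falls back to a less invasive behavior instead of collecting.

CONSENTS = {
    "face_recognition": False,   # user has not opted in
    "occupancy_sensing": True,   # user has opted in
}

def identify_person_at_door(consents: dict[str, bool]) -> str:
    if not consents.get("face_recognition", False):
        # No opt-in: answer without identifying anyone.
        return "Someone is at the door."
    return "Alice is at the door."  # a recognizer would be called here

def anyone_home(consents: dict[str, bool]) -> str:
    if not consents.get("occupancy_sensing", False):
        return "Occupancy sensing is off."
    return "Two people are home."   # sensor fusion would be called here

print(identify_person_at_door(CONSENTS))  # -> "Someone is at the door."
print(anyone_home(CONSENTS))              # -> "Two people are home."
```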

 
Lindsey Turrentine 

Doug, as you look to invest in companies that are on the very forefront of this, is that consideration for privacy and security a key part of what you're looking for from founders?

 
Doug Clinton 

It is. We wrote a piece probably a year and a half ago, our fund's three years old, so this was kind of halfway through our life, about how we think about the ethics of the technologies that we invest in. And I would say there are two really fundamental tenets to the ethics piece that we wrote. The first is that when you approach these foundational technologies, there are known issues, which I think Cindy has talked a lot about, and we always want to invest in founders who are aware of those issues and try to mitigate them. And then there are unknown issues. The unknown issue comes from the fact that any mass-scale technology deployment is a complex adaptive system, where you may know and understand all of the individual components that make up the system, but you don't know how the system will actually react in the real world. So I think any new technology has that issue, and even if we mitigate all the issues that are known, there will be an unknown we never foresaw that we will still run into and have to deal with in the future. So technology is neither good nor evil, but the way that humans use technology and adapt it for their purposes could be viewed as good or evil, and I think we always try to keep that in mind. To that end, the other big piece of our ethical viewpoint on investing in these companies is that it's always really about the character of the founders. The people who are creating these companies are the best backstop that we can build around those unknown issues. If we invest in people who we think are thoughtful, good people who want to try to do right by the world, then they will address those issues responsibly when they come up, as they inevitably will.

 
Rana el Kaliouby

Can I just underscore why this is so important, to have investors who are committed to ethics, to ethics in how we build and deploy these technologies? Because if a small startup prioritizes fairness or accountability or explainability, that's a key competitive advantage for that startup. But that's only true if the investor community recognizes it, because as a company, it's sometimes really hard to prioritize these things versus feature development or shipping new products, unless it's top of mind for the company and supported by the investors. So I just really applaud you; I think that's really critical.

 
Brian Cooley 

I want to take up a definitional issue in this framework we're talking about: is the definition of the term invasive determined by how deep you go and how much data you're gathering? Or is it defined by the degree of consent you have to go deep and grab data? What is invasive? We use the term very frequently in this era of all kinds of data gathering, but what does it really mean? What's the benchmark that says, that's invasive, and that's not? If I consent to a vast amount of data gathering, is that invasive? I don't think so. Or if I refused it, then...

 
Cindy Cohn

I don't think it's just consent. It's: what value are you getting? Right? Are you, the person who's sharing this data, getting some value in return for it? Or is there some other entity that's in control of this data and leveraging that data on you?

 
Michele Turner 

This isn't a new problem, right? Since the internet, there's been a lot of data sharing; we all know this. And I agree, I don't think it's so much about invasiveness as that every one of us has a different threshold for what is okay in terms of how much we're willing to share. Okay? And like you said, there's a value trade-off there. Some of us get more value because we're sharing more data, and that is the benefit to me; I'm willing to hand over that amount of data because the value I receive in return is higher. And other people say, nope, I don't want to share that; I don't want any of these entities to have this data on me; I want to keep it protected. And that is an individual choice. So I don't think it's something that can be really prescriptive.

 
Lindsey Turrentine 

I wonder, I want to ask you, and I think you're about to answer this question, but I want to set it up a little bit. I would think that in order to make an informed choice about what data you want to share, you have to have a very active imagination about what could happen with that data. So that's one concern: we don't even know what could possibly happen with that data. And I want to come back to poor Ring again. Jamie Siminoff, the founder, said to us that he cried when he saw that video, he was so upset by it, but he also sticks by it: I think we're doing more good than bad by sharing this information with the police. So there are legitimate disagreements among parties about what is good and what is bad. How do we legislate or set up those guardrails so that a normal range of human opinions can do the right thing?

 
Cindy Cohn

Well, I do think that there is a level of, you know, a lot of foxes convincing themselves that the hens really are down with this, right? But that's the fox talking. And I appreciate that he's got a business to run and he wants to sell things, but I just don't know that it's his decision, right? Sitting there as the person who's making all the money off of the thing, going, well, I think on balance it's better; they're not the right people to make those decisions. We as a society have to make some of these decisions. And I do think that while there is a place for individual choice, we have taken the idea of consent and turned it into absolutely nothing like what I learned in law school about a meeting of the minds between two people; it's become a click to get the thing you want. We need to reinvigorate the idea of consent; we need to reinvigorate the idea of knowledge. And I do think we need a floor. There are some decisions that are best left to people to evaluate the trade-offs, but if people cannot evaluate the trade-offs, then it's not really fair as a society to do this. Nobody sells you a car and then sends you out to try to figure out what brakes you ought to buy, right? We require people selling these devices to make sure they are safe for us, and we don't let them sell them if they don't. So we're beginning to see some experiments in what the standard is for collecting data. We've got the General Data Protection Regulation, the GDPR, in Europe. California is just launching its new California Consumer Privacy Act, and there's probably going to be a second one. We're beginning to experiment with what public policy and law ought to look like, what accountability ought to look like. What does it mean if it goes wrong, and who's responsible for it? That's a question where right now there's a lot of shrugging going on, and we need to actually create that. We have mechanisms to create accountability in other parts of our world, and we need to begin to bring them into some of this space around data. So I think it's fair to give consumers control. When you ask what's invasive: oftentimes privacy gets married to the idea of secrecy, and I don't think privacy and secrecy are the same thing. I think privacy is about control over who has access to information, not about what information is shared.

 
Brian Cooley 

What's interesting that you brought up, Cindy, is this idea of awareness and comprehension of what we're even talking about, for the average consumer. This room gets it; you're not normal. But the average consumer can't even pronounce half this stuff. No disrespect to them; they're not stupid, they're busy, they have lives to live. They're not going to sit down and get a Consumer Electronics 101 degree. It's never going to happen. So how do we make this digestible? And then my mind goes to: that's what we have regulators and bodies and certain structures in place to do, so I don't have to become that expert. The whole world can't become experts; that's a lot of homework. And you were mentioning we don't really have anything in place, regulatory-wise. Doug, your industry: you're going through a lot of FDA clearance with your companies; that's adult supervision. Rana, I feel like you're kind of in the middle. And smart home is kind of a Wild West, to be honest, Michele. So there are different levels of scout leaders here in terms of who's overseeing each of you, and it varies.

 
Rana el Kaliouby

So we, along with the EFF and Google, are part of the Partnership on AI consortium, which was started by the tech giants and stakeholders like the ACLU and Amnesty International. And I love it. We have a lot of work to do; we were just talking about that backstage. But what I love about it is that it brings together all the stakeholders who typically do not talk with each other. We do not typically engage with the EFF, but through the Partnership on AI, we are in these conversations on how to advance fairness, accountability, and transparency, so that we're not all consenting to things we don't know: who has the data, what it's being used for. So I think that's a great way to advance this.

 
Brian Cooley 

And most important to me: I didn't even know that consortium existed. So that's news to me, that you are at least in the pool with them.

 
Cindy Cohn

A little bit. I think that there is a big question there, right? I mean, there's a long history of business-for-social-responsibility-type places ending up being just whitewashing of what businesses want. I think the Partnership on AI was started with very good intentions, and we will see if it can actually deliver. Well, you know, look, this isn't my first rodeo, right? We have seen technology serve humanity, and we have seen technology not, and we are in the middle of this experiment now. And if you are not paying attention to Hong Kong, and China, and other places around the world, and you're just here listening to the sunny presentations, then I think we're all going to be in for a bit of a reckoning at some point. These technologies are powerful; they're awesome. They can serve us, they can destroy us, and we have to get a handle on both sides of that conversation, as much as I feel like, you know, kind of the skunk that got invited to the garden party.

 
Brian Cooley 

We love the contrast because

 
Cindy Cohn

Well, it is important, and I think people in this room... I totally support the idea of the individual founder; I think that's great. It's tremendously important. It cannot be the only thing standing between us and the apocalypse, right? We need some law.

 

Brian Cooley 

That's what I was suggesting. And I know you're

 
Cindy Cohn

not saying that, but I think there are many, and I hear this, who say that if we just had better founders, it might be better. And that is awesome; it's tremendously important. At the rate that technology is moving, if we don't have good people with ethics at the top of it, regulators and law will always trail, right? And so in that delta, especially, it's tremendously important. But we do need some law, we do need some accountability. We need the ability for a consumer who's been hurt by the misuse of their data to hold the company that did it accountable, and accountable in a real way. And, you know, the FTC, for instance, is a regulator that has responsibility for this. It is less than half the size of what it was during the Reagan administration, whereas its brief and the scope of what it's supposed to be regulating have grown exponentially. We have withered some of the governmental entities that had the brief to watch out over us.

 
Brian Cooley 

The ones that would do our homework for us.

 
Cindy Cohn

That would do some of our homework for us. We need to build them back up again. Regulators are not going to be the total answer: we need ethics, we need regulation, and as I said, we need law, we need policy, and we need consumer education. The answer has to be all of the above.

 
Brian Cooley 

Yeah. We've been asking the questions; now it's time for the smartest minds in the room, all of you, to ask some. We have two microphones: there's one in the back, and there's one here in the front. Let's get people up to ask questions for some Q&A here. I know you come from a lot of different technology and industry backgrounds, so let's hear what you have to ask. If you've got questions, go ahead and start lining up at the microphones. We've got about six minutes left, so as we start to wrap up here, get your questions together, and we'll take probably two or three of them at the front or rear microphones.

 
Lindsey Turrentine 

I have a question for Michele. Michele, you were talking about communicating choice to your buyers and to the people who are using Google technologies in the home. Short of having regulatory bodies coming in and saying, this is how you need to communicate privacy choices to your audience, how do you think about that? I'm just curious: what are the conversations you have internally about how to structure that so it's understandable?

 
Michele Turner 

We talk about it a lot. And we do have a lot of lawyers internally who are helping us try to figure out how we best communicate this to consumers, right? One thing we did at I/O: we presented our privacy principles there, and they're published online, just to get the philosophy about it out. But I can tell you, every new product that we come out with, every new feature that we have, we're vetting for privacy, and then figuring out how to communicate it to consumers. One of the things in general that we're facing right now in the smart home industry is consumer education. Just looking at voice technologies: people don't know how to use these things, right? They don't really know how to ask the right questions, and things like that. So consumer education on smart technologies, voice technologies, all of that is, I think, an industry-wide thing we're facing right now. I was talking to some of my partners this morning; we're all facing the same issue. So: better ways of educating users on how to use the products, how to opt in, how to opt out. Google's got a pretty good page on it; if you go to your Google services page, it gives you a very clear listing of how to opt in and out of all the Google services you have. It's very top-level, but for every new feature we have, we discuss internally: how do we expose this to consumers, how do we explain what data they're exposing, and how do we give them the choice to expose it or not? One of the challenges we have in smart home with that, though, is that if we don't get certain data, certain things don't work. That really cool predictive home: if we don't have the data, that predictive home won't work. And then consumers don't know: is that me? Is something broken? How come that doesn't work? Going forward, and we don't have the answer to this today, it's something I deal with literally every day: how are we going to explain to consumers that if they don't provide this level of information, this functionality just doesn't work? That's the trade-off they have. So we're figuring it out.

 
Cindy Cohn

I just wanted to mention one other kind of great governor of this kind of stuff, which is the ability to leave if a technology isn't serving you well. One of the things that EFF has been talking about a lot is something we call adversarial interoperability, which is a lot of syllables, but basically means that somebody needs to be able to make a tool that serves you better if the tool you have isn't serving you very well, and you need to be able to leave. I know in smart homes they're beginning to have conversations about this, but certainly in the context of social networks and other things, you really have to kind of choose your prince, and then you're just beholden to that prince, right? In terms of whether you're a Facebook person, an Apple person, or a Google person; and then everything stops working if you try to leave, right? You can't take your data and go, or you can't decide that you want to participate with one piece of the business and not the other piece. That's something I think is a problem with Google a lot; it's very hard to just use one Google service. They really push you, push you, push you to do all of them in the online environment. So we need to shore up some of our laws and policies and technologies around creating competition, creating the ability for somebody to build an alternative. If you don't like the way your smart home is tracking you and how that information is being used, you should be able to leave.

 
Brian Cooley 

That's the adversarial part; the interoperability part is what's so intriguing. That's the ability to leave, not the impetus to leave.

 
Cindy Cohn

Well, the ability to leave, or the ability to leave a little bit, right? The ability to say, I want this one thing, but I don't want this other thing of it, as opposed to now, where you either sign on and get tracked all the time, or you don't have the technology at all; you've got to leave. There are lots of spaces in between. And so while I've talked about regulation, I've talked about law, I've talked about policy, I think we also need competition, to help make sure that consumers are empowered, and that people who want to sell you a more secure strategy for your home can do so, and that you can use the technology you already bought. You can flash that sucker, put new software in it, and that software is going to protect you better than the software you had before.

 
Brian Cooley 

Okay, it looks like this topic terrified you all so much that no one is up. You might be tracked, recorded, somehow anticipated. So let's finish with a lightning round, with challenges and promise all together within this context of anticipation. Let's start with you, Rana, and come down here toward the end. Give us a quick one: what's on your to-do list for 2020, within what you do, to move the ball forward as a member of the anticipation industry? What are some things that are going to be key? Continuing the thumbnails.

 
Rana el Kaliouby

Yeah: continuing to invest in this idea of building technologies that understand humans, but doing it with a high focus on ethics, diversity, and inclusion in how we design and think about these technologies.

 
Michele Turner 

Yeah, I think the big thing we're working on is: how do we move from having a smart home that really isn't that smart, that's just a collection of devices, to an integrated, truly helpful home with an intelligence layer that is rooted in security and privacy?

 
Doug Clinton 

I think for us, it's always finding really great companies and great founders to invest in. In particular, if I think about the time scale that we're trying to invest on related to brain-computer interface, one area that we're very interested in is around the ear, and devices around the ear. The ability to collect data from that very data-rich part of the body is one theme that I'm paying a lot of attention to for this year.

 
Brian Cooley 

So, Cindy, you get the last word?

 
Cindy Cohn

Well, I think that our job is to try to make sure that as we move into this brave new world, this world is serving you; that your technologies are not two-faced but single-faced, only facing you, not serving somebody else's interests, whether that's gathering advertising information, or helping the cops in their endeavors, or any of the other things. So again, consumers are very quickly getting educated about the downsides of some of these technologies, and we want to help them figure out how to make their voices heard, not just in what they buy, but in the policies and laws that get developed.

 
Brian Cooley 

Okay, my last question is for Doug: what am I thinking? Yeah.

 
Doug Clinton 

You need one of our devices.

 
Brian Cooley 

We'll talk about that. Okay, we hope you got some great insight from this remarkable panel. How about a hand for Cindy, Doug, Michele, and Rana? Thank you guys. Thank you. Okay, a little housekeeping: there's a lot of great programming coming up on the CNET stage. If you haven't been with us in the last couple of years, we're over at Sands, Tech West.

 
Lindsey Turrentine 

That's right. We're over at Sands, at Tech West, for the rest of the week, and we are at ces.cnet.com, where you can find pretty much anything you need to know about the biggest stories of the show, and at our stage all the way through Friday.

 
Brian Cooley 

Yeah. And Lindsey and her team put us on a really great mission this year: filter, de-hype, focus. It's really looking so good. Our app, and I would say this even if I didn't work there, is the greatest guide to CES, so use it to get around. Let's see: we've got a Best of CES daily wrap-up that I'll be hosting with some of our cohorts, today at five and tomorrow at five, and we have an interesting torture test. Just when you thought you'd seen every torture test on YouTube, we're taking some products out to have the Las Vegas Golden Knights slap-shot them on the ice. You haven't seen that before, and I can't often say that. Thanks, everybody.

 
Lindsey Turrentine 

All right. Thanks so much for being here.
