Cortana

Cortana has been in the news ever since reports and rumors that a virtual assistant would be coming to Windows Phone with Windows Phone 8.1. Now, in an interview with Bloomberg TV, Microsoft Research head Peter Lee discusses Microsoft's use of artificial intelligence and a "virtual personal assistant". The interview may well shed some light on what we imagine as Cortana for Windows Phone.

The interview highlights that this virtual personal assistant will not only be able to answer its owner's questions, like many of its counterparts on other platforms, but will also be intelligent enough to understand the intent behind those questions. Peter also mentions that the virtual assistant may take a male or female avatar, depending on the user's choice.

Below is a scenario mentioned by Peter that shows what this virtual assistant may be capable of doing:

Somehow the virtual assistant understands that Eric is not in the office right now.

We think he will be back in three minutes.

Sit still, and I am sure you can meet with him.

That kind of ability to predict people's intentions and what might happen next, in the same way that people do, is something we think will be built very deeply into the environment all around us.

If your environment knows, for example, that it is lunchtime, that you had spoken about having lunch with a colleague on the second floor, and it notices that you seem to be leaving your office to go to the elevator, the elevator can be smart enough to take you there without your needing to operate anything.

You can watch the interview video by clicking here. Below is the full text from the interview video. The part about the virtual assistant is highlighted in bold.

You are in charge of more than 1,000 researchers in labs around the world.

What kinds of things are you focusing on?

We are really covering a huge array of things.

Almost anything you could imagine in computing research and other parts of math, physics, computer science.

A big focus right now, really on point for this segment, is artificial intelligence.

We have been very focused.

It is our largest investment area right now.

One of the things you’re working on is a virtual assistant.

What could he or she do?

One way to think about it is that they try to answer questions and figure things out.

One type of question is the "what" question.

What did that person say? Or if you see a picture, what is in that picture or video?

Another kind of question that is challenging to answer is the "why" question.

Why does this person want to see a certain person for an appointment?

Why did this person ask these questions?

Understanding the intentions of people.

That is the basic subject of this personal assistant project.

We are looking at the footage of what you have been working on.

In the case of this virtual assistant, I noticed that you have a male figure on screen.

From an interface standpoint, how do you choose what the assistant should look like?

In fact, in the system, as we have it set up, you can choose what kind of avatar you want.

Some people choose males and some choose females.

It really is something that is part of the research.

We’re trying to understand how people relate to these kinds of systems.

What kind of symbiosis or connection we can make with them.

Did you ever consider hiring Scarlett Johansson?

[Laughter] I'm going to give that to you.

That is a gift.

Do with it what you want.

Peter, what about the other stuff you’re working on?

I have always wanted my dog as my assistant.

Excellent idea.

To each his own.

It is all about customization.

Personalization.

Speaking of personalization, you are also working on a smart elevator.

Is this an elevator that would know what floor you want to go to?

How does this work?

Right.

This is pretty cool.

We set up a bunch of sensors in front of elevators.

Without any programming, we just had an AI system that watched what people did for about three months.

Over the three months, the system started to learn that this is how people behave when they want to enter an elevator.

This is the type of person that wants to go to the third floor as opposed to the fourth floor.

After that training period, we switched off the learning and said, go ahead and control the elevators.

Without any programming at all, the system was able to understand people’s intentions and act on their behalf.

There is a general theme that we have in the labs.

Today, people talk about operating a computer.

You sit down at a desk in front of the computer and operate it.

You take the computer out of your pocket and operate it.

In the future, you will not operate computers, but they will work on your behalf.

This elevator project is one expression of that idea.

I want to settle on this for one second.

The idea is that the elevator system would learn my habits.

Perhaps at one time I am going out for lunch, and at another time I'm going to meet with somebody.

How does that break down?

You can imagine a connected world.

In the future, the sensors around you, on your body, in your environment, start to understand your physicality, what you're saying, and what your intentions are.

Even what your plans are.

We have seen that with the virtual assistant.

Somehow the virtual assistant understands that Eric is not in the office right now.

We think he will be back in three minutes.

Sit still, and I am sure you can meet with him.

That kind of ability to predict people's intentions and what might happen next, in the same way that people do, is something we think will be built very deeply into the environment all around us.

If your environment knows, for example, that it is lunchtime, that you had spoken about having lunch with a colleague on the second floor, and it notices that you seem to be leaving your office to go to the elevator, the elevator can be smart enough to take you there without your needing to operate anything.

This is a question that I spoke about at NYU.

Do you ever feel like the technology exceeds our comfort level?

There is a tension there as to how quickly you could implement some of these things?

You might be able to, but it would freak us out.

When we are having a few laughs in the hallway, sometimes we make a joke about Skynet.

In fact, the same kind of intelligence ultimately will have to be brought to bear to understand what the boundaries are.

What is proper and improper?

That will help to protect people’s privacy, if that is a concern.

Also, to understand exactly what kinds of things may or may not be appropriate to share.

To give you an example, in the elevator project, the system actually is incapable of seeing people’s faces.

It is just looking at the motions of people in the hallway.

It does not understand their identities.

It is still able to learn exactly what people’s intentions are.

We are seeing these patterns emerge over and over again.

Right now, it is incredibly surprising.

We are able to look at English words that people type into things.

We can look at the letters, three letters at a time, and we are able to learn that the word "hot" is the same as "warm".

Both of them are related to burn, and so on.

Fascinating stuff.

It will be interesting to see