Introducing Dr Arash Eshghi, Head of Linguistics

Photograph of Arash Eshghi


How would you describe your role at Alana?

Alana’s overall goal is the design and implementation of natural, human-like conversational AI. My research on understanding human conversation feeds directly into that process.

In normal conversations, we are not thinking about the structure of our communication. There is a lot of sophistication in even the simplest of conversations. We take turns in orderly ways, most of the time. There is very little silence between when my speech ends and your speech begins. People are able to predict very accurately when the other person is going to stop speaking. Conversation is not random.

As head of linguistics, my job is to understand and model the subtleties of human language. I’ve spent hours upon hours studying people’s everyday conversations to analyse them and extract systematic rules.

Once we have built a system and deployed it, I look at how Alana is interacting with users and debug those conversations. We’ve all experienced speaking with an automated call centre – the conversations can be very frustrating. The AI interrupts us, it suddenly goes silent, and there are long pauses. A non-expert in the field would hear a conversation and know that it doesn’t flow. But I can say exactly what’s gone wrong and then propose ways of fixing it. In real terms, I make interaction with Alana seem more natural, responsive, coherent, understanding and even empathetic. 

What do you love about working at Alana?

Alana is the perfect path to channel my research, which looks at the intricacies of how people interact, carry a conversation forward and understand each other. I sometimes used to complain silently to myself that I did this really interesting work but didn’t see its direct impact on real people’s lives. Now, even what we’ve done so far has the potential to transform how people work with machines.

What are some of your goals for the development of Alana?

I want to design algorithms for detecting misunderstanding. In real conversation, we correct ourselves and each other. For example, I might ask, ‘what did you mean by this?’ Current systems don’t have that capability, so when there is a misunderstanding, conversation tends to break down. Capturing misunderstandings and miscommunication has been ignored by some researchers and within the industry, because it’s really tough.

Also in line with my research is the design and implementation of algorithms for more real-time conversation. In real conversation, you don’t wait for me to finish speaking before you start understanding what I’m saying. That means you’re able to respond very quickly, in real time, or even continue what I was saying mid-sentence. Much of the unresponsiveness one feels in many current conversational systems stems from this lack of real-time language understanding.

Is it fair to say that your role is to humanise the technology?

We don’t want to create a human being. We are enabling machines, which are essentially tools, just very complex ones. They will be very intertwined with the everyday way that people get things done, which is through conversation. You have to capture that in order to make it work.

What are some of your expectations for the development of conversational AI in the next 18 months?

Current algorithms in machine learning and in conversational AI sort of start from a blank slate. I think less data will be needed because more of what we already understand of conversation is going to be incorporated, so systems won’t need to start from scratch. I also see there being more automation of the process of creating conversational AI and less need for experts to be involved.

And finally, what about fictional AI – do you take any inspiration from AI in the movies?

I loved HAL from 2001: A Space Odyssey. HAL’s conversational capabilities emerge from being an all-round very intelligent being. If you’re as good as a human being or a child is at conversation, then haven’t you got a human being? If I were to feel like I was talking to a human, that would make me a little bit scared. It opens a whole Pandora’s box of questions about ethics and about the status of that piece of machinery. Perhaps thankfully, the industry is quite a way from that.
