A Guide to AI Chatbots

Season 1 Episode 13 | 34 minutes


How does AI even work? What is NLP? LUIS? Dialogflow? Watson? Lex? What is machine learning? What about IoT? Once you get beyond the simple rule-based chatbot, things get interesting. And talented guys like Andy Dharmani of ako.ai get busy. In this episode of The Conversologist Podcast, Jam Mayer talks to Conversologie’s tech partner Andy about chatbots, AI, NLP, intent, entities, digital humans and more.


This conversation was lifted from a February 2019 webinar on AI chatbots. 99% of the conversation is self-explanatory, but you may also wish to view the webinar slides.

Hosts & Guests



Jam Mayer

Andy Dharmani

Episode Conversation

Episode Transcript

Introduction

[00:00:00.233] – Jam

NLP, NLG, Intent, Entities … this is the stuff that makes Chatbot tech exciting. We’ve talked a lot about the conversation design and chatbot personality in this podcast series. Today we’re looking behind the UI, at the AI. Buckle up!

 

[00:00:18.793] – Jam

Welcome to the Conversologist podcast, where we talk about the art and science of conversation in the digital space. We know that technology can be a powerful enabler in the customer journey, from marketing to customer service, but communication and emotional connection still need to be at the core. I’m your host Jam Mayer, and I invite you to converse with us.

 

[00:00:41.613] – Jam

This is Episode 13, A Beginner's Guide to AI Chatbots. Today, I'm sharing with you highlights from a webinar I did earlier this year with Andy Dharmani, founder of New Zealand-based AI solutions company ako.ai … or simply Ako.

 

[00:00:59.643] – Jam

We're digging, this time, into the next level of chatbot development, and asking some questions around AI, voicebots, and even digital humans. Andy is our tech partner in developing enterprise-level chatbots. We've worked on a couple of projects already where we did the conversation writing and personality of their clients' chatbots. I tell you, this guy is awesome. He comes from a background as a solutions architect. His specialty was infrastructure, with qualifications in Computer Science and Technology, as well as Business. Very smart guy, don't you think? He's passionate, and highly regarded for his vision and his ability to deliver practical and efficient solutions. OK, that was quite the spiel, so let's get into it.

 

[00:01:48.103] – Jam

Hi Andy, thank you for joining us.

 

[00:01:49.473] – Andy

Hey, Jam. Thanks for the lovely introduction. I definitely learned a few things about myself today. Thank you for that. I am really excited to work alongside Jam. As you can guess from the name Conversologie, Jam does a lot of work in chatbots and conversation design, and I'm passionate about technology. So together we work on delivering some high-performing chatbots. So we will get straight into it then.

What is an AI Chatbot?

[00:02:16.653] – Jam

Yes, definitely. So we have a few questions, and Andy, of course, is going to answer those questions. OK, so, well, Andy, let’s start off with sort of the basics. What is actually an AI chatbot?

 

[00:02:30.333] – Andy

So if you look at all the different channels users interact with, we are all used to things like making a phone call, sending emails, and obviously chat on a website, or on the likes of Facebook Messenger and WhatsApp. We use different interfaces. In the last few years, we have also seen a lot of voice-enabled devices – the likes of Alexa, Microsoft Cortana, Google Home – and more coming up, like Samsung Bixby. So for peer-to-peer messaging, we are obviously used to these channels quite a lot, and with chatbots, what we are trying to do is provide a chatting experience. So a user is on Facebook Messenger, and they want to interact with, let's say, a telecommunications company, or a bank, or an insurance company. Chatbots can provide a consistent response, and they can also engage and convert visitors into customers and leads. I guess chatbots are basically trying to use the technologies available, like artificial intelligence, to understand what the customer is saying, and provide an awesome response back to them.

 

[00:03:42.963] – Andy

We cannot forget that now, with the technologies and use cases of IoT – the Internet of Things – coming up, we can start interacting with the likes of fridges in our house. I can tell my AC to switch on and off. I can switch on the lights using voice commands. So voice is coming back too, in quite a big way, specifically because a lot of these things we are trying to talk to, like a washing machine, don't have screens to send a message or type on. Yeah, voice is growing.

NLP, NLG, LUIS? What are these acronyms and jargon?

[00:04:14.583] – Jam

So here's another question for you. There are a lot of acronyms going around; there's NLP, NLG, LUIS – L U I S – I don't even know if I'm pronouncing it correctly. Could you help us understand a little bit more about all these things?

 

[00:04:33.243] – Andy

I will try to explain it simply, because actually it is simple. One aspect of the chatbot is providing the interface, which could be, again, Facebook Messenger, or a chat running on your website. Or with voice, it could be the likes of Alexa, where you're talking to an Alexa or a Cortana. So you interact with the chatbot using one of these channels. One important aspect is conversation design, and channels. But the core of the chatbot, where all these acronyms come from, is usually the NLP platform, which stands for Natural Language Processing. A chatbot is obviously trying to imitate what a human is capable of doing: when two people have a conversation and somebody says, "what is the time today?", the other person needs to understand the context in their brain and respond back. What NLP does, which is quite a big part of it, is try to extract meaning out of an utterance. So if somebody says "can I have espresso?", NLP can understand that, and figure out what's happening within it.

 

[00:05:44.823] – Andy

So what it usually does is, if it is "can I have espresso?", an NLP engine will understand it and score it. It will say, I'm 90% confident this person is trying to ask for a coffee. It might also say, I am 40% confident this person is asking for a tea. So it does this scoring, and then the highest-scoring intent – in this case, "this person is asking for coffee" – is picked up, and the important information is extracted out of that phrase. In this case, espresso is important: I know they're asking for a coffee, and I also know which coffee they want. So the entity will be extracted, and then I can go ask them the next question: "Do you need any sugars?" Whereas if the person says, "can I have a coffee?", I know this person is asking for a coffee, but they didn't tell me which coffee they want. So the next question the bot will ask is, "which coffee can I get you?"
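
To make that espresso example concrete, here is a minimal, hypothetical sketch in Python of the kind of result an NLP engine hands back – scored intents plus any extracted entities – and how a bot might branch on it. The class, field and intent names are illustrative assumptions, not any particular vendor's API.

```python
# Hypothetical sketch only: the shape of an NLU result for "can I have espresso?"
from dataclasses import dataclass, field

@dataclass
class NluResult:
    utterance: str
    intents: dict                           # intent name -> confidence score
    entities: dict = field(default_factory=dict)

    @property
    def top_intent(self):
        # Pick the highest-scoring intent, e.g. "order_coffee" at 0.90
        return max(self.intents, key=self.intents.get)

result = NluResult(
    utterance="can I have espresso?",
    intents={"order_coffee": 0.90, "order_tea": 0.40},
    entities={"coffee_type": "espresso"},
)

if result.top_intent == "order_coffee":
    if "coffee_type" in result.entities:
        print("Do you need any sugars?")      # entity supplied, move on
    else:
        print("Which coffee can I get you?")  # entity missing, ask for it
```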

 

[00:06:45.663] – Andy

So the two key elements are: understanding the natural language, and the second big part of it is extracting the entities, or the important information, out of that particular phrase. It's usually not one question at a time – that's where context management comes in. In a chatbot, I know I'm having this conversation in the context of taking a coffee order. So I need to, as a chatbot, understand the context so that I can ask the right questions. And when the person says "one sugar," I know they're asking for one sugar for this coffee. So that's the NLP, which has these different components. Then the other part, to do anything in a business sense, is that there is always integration with business systems. So if, as a chatbot, I've taken a coffee order, obviously I need to pass it to the barista. So I might have to integrate with the point-of-sale system at the cafe, let's say. That's called fulfillment.
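
As a rough illustration of context management and fulfillment, here is a small sketch that tracks the coffee order across turns and hands it off once every slot is filled. The slot names and the send_to_pos() function are assumptions standing in for a real point-of-sale integration.

```python
# Toy context management: the `order` dict is the conversation context, and
# each user turn fills in another slot until the order can be fulfilled.

def next_prompt(order):
    """Return the question for the first missing slot, or None when complete."""
    if "coffee_type" not in order:
        return "Which coffee can I get you?"
    if "sugars" not in order:
        return "Do you need any sugars?"
    return None

def send_to_pos(order):
    # Placeholder fulfillment step, e.g. posting the order to the cafe's POS.
    print(f"Passing order to the barista: {order}")

order = {}
order["coffee_type"] = "espresso"   # extracted from "can I have espresso?"
print(next_prompt(order))           # -> "Do you need any sugars?"
order["sugars"] = 1                 # "one sugar" makes sense because of context
if next_prompt(order) is None:
    send_to_pos(order)
```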

 

[00:07:39.903] – Andy

These acronyms are mostly around Natural Language Processing and Natural Language Generation. It is also called Natural Language Understanding. LUIS, particularly, is a Microsoft product, which stands for Language Understanding Intelligent Service. This is one of the many services available where you pass in a phrase, "can I have espresso?", and it will return, "I'm 90 percent confident this is a coffee order." Now, with the kind of technology development in this area, it has become so much easier to implement these chatbots, because you don't have to build your own NLP. So, yeah, you need to know what it is, but there are NLP engines available as services from the likes of Microsoft, IBM and many others. Hope I didn't confuse you?

 

[00:08:24.933] – Jam

I couldn't have explained that better. I just wanted to move on to more around the user interface. Can you explain a little bit more about how the UI elements, or User Interface elements, as you've shown here – you know, the responses, the buttons – how do those work as well?

 

[00:08:48.243] – Andy

So, look, this is probably the simplest aspect of the chatbot, and it's also the one which, as a normal user, we see every day when interacting with chatbots. Just like when building a website you need to figure out where to put an image, where to put a video, and how to section your website, it's very similar to that. In a chatbot there is only a limited set of UI elements you can use, like buttons and multiple-choice options; you can have links to external webpages, and you can add images and videos. So this is the aspect that has more to do with conversation design, where you need to figure out "what elements do I need to put where, so that I can provide the best possible experience?" So this is like the front end, and when a user types something, the front-end interface will send a message to the likes of LUIS, and then try to figure out what to say next.
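
For a sense of what those UI elements look like under the hood, here is a hypothetical, channel-neutral reply payload sketched in Python. Real channels such as Messenger or a web chat widget each define their own schema; the keys and URLs below are made up for illustration.

```python
# Illustrative only: a channel-neutral rich reply with the limited UI elements
# a chatbot typically has -- text, quick replies, buttons, an image.
reply = {
    "text": "Which coffee can I get you?",
    "quick_replies": ["Espresso", "Flat white", "Cappuccino"],
    "buttons": [
        {"title": "See the full menu", "type": "open_url",
         "value": "https://example.com/menu"},   # hypothetical link
    ],
    "image": "https://example.com/images/coffee.png",
}

# The front end renders this payload; whatever the user taps or types goes back
# to the NLU engine (the likes of LUIS) to decide the next reply.
```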

 

[00:09:40.923] – Jam

Got it. And that’s what we help with, with Andy’s team and stuff, in terms of the front end …

 

[00:09:46.533] – Andy

This is – sorry, Jam – I just want to add: I said that from a technical aspect this is simpler, but obviously I cannot do it. So this is the most important aspect, too. And this is where, you know, building a chatbot personality and the actual conversation comes in. So I don't know how to do it, but this is important.

 

[00:10:04.173] – Jam

Yes, definitely. No, totally agree. Before we move on to the next question, Andy. So is LUIS … you did mention it is from Microsoft. Question is, what are some comparable services from other companies?

 

[00:10:16.833] – Andy

Sure. Microsoft has their solution called LUIS. Actually, most of these companies did some acquisitions, so Microsoft acquired LUIS. Google has a platform called Dialogflow, which is competitive in that space. IBM has a solution called IBM Watson. AWS has a solution called Amazon Lex – L E X – which was also acquired. And then there are many other companies who provide these solutions as well, specifically to address a particular use case. Just one example is a company called SoundHound – S O U N D H O U N D. SoundHound is building NLP engines specifically to be used in cars, so they are working with the likes of BMW and Mercedes. So there are these niche companies in particular areas. But in general, LUIS, Google Dialogflow, IBM Watson and Amazon Lex are the ones you will come across most of the time.

What is machine learning?

[00:11:12.413] – Jam

Awesome. OK, next question. What is machine learning? What does that actually mean? I’ve heard that you’re supposed to train a chatbot. So a chatbot needs a gym, like us human beings. So …

 

[00:11:26.393] – Andy

Yeah.

 

[00:11:26.963] – Jam

Yep. If you can explain please, Andy?

 

[00:11:28.943] – Andy

Sure. Look, this is the aspect which I don't want to say is easy, because this is what we do and I tell everybody that what I do is very complex, but… In very, very simple terms: if some of you are familiar with a normal kind of programming language, the way the logic is written is in the context of "if-then-else." So I say, if the distance is five kilometers, do Y. If the distance is 10 kilometers, do Z. And so on and so forth. So most of the technology, or things we interact with today, is driven by that kind of "if-then-else" logic. With machine learning coming in now – and it's probably not just for chatbots, it's having an impact on a number of other things, like medicine and infrastructure, it's impacting a lot of areas – the core difference is that you don't have to write things like "if this happens, do the next thing; if this happens, do something else." You build a model and then you train that model with some training data.

How do you train a chatbot?

[00:12:33.083] – Andy

So if I want to train a chatbot to answer car rental questions, I will train it so that if it asks when you want to pick up the car and somebody responds back "tomorrow" – they can say "tomorrow", they can say a date – I need to train it with only some training data, and then the model is supposed to learn by itself, and interpret everything else around that as well. That's where the intent scoring I mentioned comes in. If I train a bot with five phrases, and then somebody says the same thing in a different way, the machine-learning model – which is Natural Language Understanding in this case – is supposed to pick that up. So that's how machine learning works. There are various models, but especially in natural language, you need to feed it with training data before you go live with the bot. You should try to put in at least ten or twenty phrases for each intent, and then it will pick up the rest. And then, obviously, you will see that when a bot is live and a lot of people are interacting with it, it will never be perfect from day one. You will always find things that it didn't understand, and then you need to continue training – say, next time, if somebody says this phrase, they mean this intent. You continuously train it, and over a period of time, depending upon how many people are interacting with it, it will get better and better. So that's what we call chatbot training.
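
To picture what that training data looks like in practice, here is a simplified sketch: a handful of example phrases per intent for a hypothetical car-rental bot, plus the continuous-training step of feeding misunderstood live phrases back in. The intent names and phrases are invented for illustration; a real NLU platform takes this data in its own format.

```python
# Made-up training data for a car-rental bot: each intent gets example phrases
# (ten to twenty per intent is a reasonable starting point, as Andy suggests).
training_data = {
    "provide_pickup_date": [
        "tomorrow",
        "I'd like to pick it up tomorrow",
        "next Friday",
        "on the 21st of June",
        "as soon as possible",
    ],
    "cancel_booking": [
        "I want to cancel my booking",
        "please cancel the reservation",
        "I don't need the car anymore",
    ],
}

def add_training_phrase(intent, phrase):
    """Continuous training: when the live bot misunderstands a phrase, a human
    labels it with the right intent and adds it for the next model version."""
    training_data.setdefault(intent, []).append(phrase)

# e.g. the live bot scored this one poorly, so it gets labelled and fed back in
add_training_phrase("provide_pickup_date", "whenever suits, maybe the weekend")
```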

 

[00:13:57.263] – Jam

Awesome. So, making a chatbot more intelligent! Are we supposed to start getting scared now? Is this smart? Is this a Terminator type of chatbot? Can you just – what are your thoughts on this one: Identification, Extraction and of course, Validation?

 

[00:14:16.163] – Andy

Yes, so chatbots can do a lot. It depends on how much you train them to do. And they can self-learn as well, but it's still within the framework you train them on. So hopefully, if you train it with all the good things, it will do the same. But to make a chatbot intelligent – say a bot asks, "how can I help you?" and the reply is "I would like to have a coffee," right? What happens is, people can respond back with things like "I need a coffee," or "I need a cappuccino," or "I need a trim cappuccino." Somebody can also say "I need a cappuccino with one sugar." So what you need to do is train the bot to understand the intent, and extract the right entities out of it. In this example you have entities like coffee type, which could be cappuccino, flat white, espresso. You can have a milk type: regular, trim. You can have a count of sugars – how many sugars that person wants. And then the bot needs to be intelligent enough to extract those entities from the phrase, and then validate them. You know, if somebody says, "I want Y sugars," you need to validate that and say, "sorry, tell me how many sugars you want?" – Y is not a number. Something like that. So basically you need to train the bot to pick up all these things and interact properly. That's what is meant by Identification, Extraction and Validation.
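
Here is a minimal sketch of the validation step Andy describes: after an entity such as the sugar count has been extracted, the bot still checks that it makes sense before accepting it, and re-prompts otherwise. The value range and prompt wording are assumptions for illustration.

```python
# Toy validation of an extracted "sugars" entity: accept a small whole number,
# otherwise re-prompt, as in the "I want Y sugars" example.
def validate_sugars(raw_value):
    """Return the sugar count as an int, or None if it isn't usable."""
    try:
        count = int(raw_value)
    except ValueError:
        return None
    return count if 0 <= count <= 5 else None   # assumed sensible upper bound

for extracted in ["2", "Y", "100"]:
    sugars = validate_sugars(extracted)
    if sugars is None:
        print("Sorry, tell me how many sugars you want?")   # re-prompt
    else:
        print(f"Great, {sugars} sugar(s) coming up.")
```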

 

[00:15:38.573] – Jam

It makes perfect sense. I guess – and this is the same conversation I have with our clients as well on the conversation design and the personality of a chatbot. They think, “oh OK, just build a chatbot and that’s it.” But actually it isn’t. It continuously needs to be trained, as you mentioned earlier.

 

[00:15:58.973] – Andy

So basically, once a bot is live and users are interacting with it, you need to have three stages, always ongoing. One is how the bot is doing: how many people are interacting with it? Are people following the path that they are supposed to? Then you can analyse how many people are dropping off, and at which particular point in the conversation. And then the third important bit is, based on all that data, you need to continuously optimise and train the bot. One point: you cannot leave a bot alone. You have to continuously train and optimise it. It doesn't matter if it's been live in production for one year or two years, you always need to be at its side. You need to have proper KPIs for the bot, you need to monitor them, and you need to make sure it achieves those KPIs. Because if it is left alone, it will actually damage your brand, rather than giving a better experience.

 

[00:17:01.373] – Jam

Exactly. And I couldn't agree more. And again, because you are interacting with human beings – we are such complicated creatures, with different personalities, etc. – it is a machine and we need to train it. Poor bots if we just leave them alone. Anyway. OK, before we move on to the next topic – I know there are technologies and processes around it, and training data. How can you acquire good training data, especially if your use case is niche?

 

[00:17:36.233] – Andy

So there is no straight answer to that. If your use case is niche, then you need to come up with the training data. What we usually do is, if we are implementing a chatbot – either text messaging or a voice bot – in, let's say, a telecommunications service provider, we will look at their logs from the last six months, pick up all those conversations, and then use that as the training data for the bot. So the data will come from within the organisation you are trying to implement it in, or at least from within the industry vertical. But definitely this is something you would need to get from the historic conversations with your customers. In specific cases, let's say hotels, you can reuse that data across hotels, because most people ask questions in a similar way: "What is the check-out time?" "Do you have a cot if I'm bringing my baby?" But when it comes to niche, you have to spend time building your own data, or, again, getting it from historic conversations.

 

[00:18:35.513] – Jam

Yeah. So it’s basically sort of starting from scratch, in a way.

 

[00:18:39.593] – Andy

It's similar. If I have to put it in the context of a human: if a new employee joins your organisation, and your organisation is in some niche field, then obviously you need to spend a month training that employee before they can be productive. So if you're bringing a chatbot in, you need to spend a month, maybe more, to train that bot properly before it can actually start interacting.

 

[00:19:00.893] – Jam

OK, kind of similar or related – please correct me if I'm wrong, of course, Andy – are the available technologies and the process?

 

[00:19:10.103] – Andy

Yeah, so I think one question asked earlier was about the kinds of technologies available. If you look at what you need to build a chatbot, obviously it needs to be built for a particular channel, which could be Messenger, or Alexa, or Skype, or your website. So there are lots and lots of platforms available, and I think that probably adds a bit of confusion as well: which is the right technology for you? That's why we have mentioned that you need to spend time planning and picking the right technology. Because with bots, once you have started your journey with one, it's difficult to switch to something else. So it's better to spend time planning right, then obviously building right, and continuing to optimise. Chatbase is an awesome tool for chatbot analytics. It provides you so much insight into what your customers are talking to you about. You should consider at least some of these components when building a chatbot.

 

[00:20:05.933] – Jam

So is Chatbase just for enterprise, though? Because I know there are a lot of platforms out there for the small and medium-sized end?

 

[00:20:14.543] – Andy

Um, yeah. With Chatbase, you need to integrate it into your chatbot and spend some time configuring it, so it's more for enterprise, I would say. But if you're building a small chatbot, with like a hundred interactions a week, for example, I would suggest just sticking with the likes of BotEngine or Flow XO – they give you enough information.

Applications of AI Chatbots in Business

[00:20:35.183] – Jam

Awesome, that’s very helpful. All right. Moving on. Now that we kind of know a little bit about the abbreviations, what are the applications in business for AI chatbots? Do you have any examples?

 

[00:20:48.723] – Andy

I guess if you look around, where conversational AI is mostly being used today is service desk automation. If you look within a large organisation: I want to apply for leave, or I want to check how I do my expense claim, or how do I reset my password, how do I get access to this particular application? A lot of those things have been automated.

 

Actually, a lot of bigger enterprises are starting with their internal service desk bots first, before they go out to their users, because this is a good learning curve for them, and they can experience how this whole thing works. Call centre automation is a big, big thing. I personally think this is the area where it will have the biggest impact.

 

When a call comes in, a bot can say, "how can I help you?" and have a good conversation with the customer. So I think this is the area, because nobody likes being in call centre queues for a long time. On the other hand, enterprises can't afford to keep scaling the manpower they have. So I think this is where it's going to have the biggest impact. Besides that, there are the likes of conversational commerce and lead capture – that's more around selling. I think there is still a way to go there, because chatbots are not as engaging as a human can be.

 

So that is a growing area, but there's probably still a way to go. And then another one is the digital concierge. If you look at most of the, let's say, hotel chains, they are all experimenting with digital humans who can help guests check in, answer questions like "what's around?" – basically help them around the hotel. So these areas are where we see most of the activity happening.

 

[00:22:28.953] – Jam

Awesome. And I do agree. Having spent probably half of my career around call centre operations, I do understand. I think that's been my progression over the years, and it does make sense that I'm in this business as well, around conversations and stuff. I totally agree that it will help that industry, and that space. Please carry on, Andy, sorry.

 

[00:22:57.783] – Andy

Sure. On call centres specifically, if you look at the New Zealand context, Vodafone New Zealand started a program last year, actually, and they are targeting for 60% of interactions to be handled by virtual agents by, I think, 2021. So that's the kind of numbers we are talking about. Similarly, Spark, another telecommunications service provider in New Zealand, made a statement last year that in October 2018 their chatbot handled, I think, the same workload as 43 employees in a month. So it is huge, and I think this is how it is going to happen, specifically in this area.

 

[00:23:42.873] – Jam

Yep. Totally agree. So that's a lot of applications for sure. I mean, right now there's a lot of chatter around marketing and lead generation, which you've already mentioned, et cetera. But this is pretty much the bulk of the interactions where chatbots can be very beneficial, especially on the enterprise side of things. So I'll move on to the next question: what are your thoughts on the current state of voice bots? Let's move a little bit to the voice side of things.

 

[00:24:15.033] – Andy

Sure.

 

[00:24:15.843] – Andy

The base of voice devices itself is growing a lot. You can walk into an electronics store and buy an Alexa or a Google Home, so these devices are growing a lot. There are three things. One is the devices like Alexa and Google Home, which are stand-alone devices. Then Microsoft's Cortana is something you can use from within Alexa, or on your laptop.

 

So you can enable Cortana in your Windows device and interact with it, ask “when is my next appointment?” and all of that. And then on the phone, there are personal assistants available like Siri, Bixby from Samsung, Google Assistant.

 

And then, as I mentioned with IoT, you can give commands to literally any device around you. So with that happening, I guess voice is going to be huge, especially in terms of bots, because that's where the development is happening. You know, with all these devices – Siri, Alexa – you're talking to a bot, if you like.

 

So if you build an Alexa skill, this is using the Lex I mentioned – Amazon Lex. All of these devices are using Natural Language Understanding and a chatbot kind of conversation design, to answer your question. So this is going to grow, probably even faster than chatbots, because you can see in the last two years already people are used to using these devices, and probably we don't even realise that we are talking to a bot.

What are digital humans?

[00:25:36.753] – Jam

Yep, I'm excited about that. I mean, on the conversation design – again, on the front-end side of things – it's now considering the humor; not even just the humor, but the type of words, the language, the tone, etc., so it doesn't just sound like a robot. Some of the groups that I've been part of are also starting to find out how they can put in some specific cultural nuances, et cetera, because it is totally different if you're trying to make a joke, for example, right? American humor is different from the UK's, for example. OK, now let's move on a little bit towards the highest end – and again, correct me if I'm wrong – it's called Digital Humans. Now this is, as some would say, close to Terminator. I've mentioned that earlier as well, right? And that's just some of the chatter. That's why some people are a little bit… mmmm, not so sure about chatbots, because they're a little scary. So what are digital humans? How are they different from the other chatbots? And, of course, any examples you can provide would be great.

 

[00:26:44.673] – Andy

Sure. So I think if you look at how we interact: we touched on messaging chatbots, we touched on voice. But the one element which is missing is actually looking at a face – looking at the expressions, and looking at how a person is feeling and responding to you. I think that's the stage three of the chatbot, where you are actually interacting with a person: where you can see their expressions, you can actually feel how they're feeling. So that's where a lot of advancement is happening.

Actually, there is a digital human – the first digital human to get citizenship, in Saudi Arabia. I think she's called Sophia. I saw Sophia talking, and actually participating in an interview, and she was awesome. She could answer questions, with very nice expressions. She answered all the questions from the reporters and the public – you can look at the YouTube video. And then she ended that conversation by asking for investment. It was an investment forum, and she said, hey, look, you have seen how I am today, and what my capabilities are, so I'm looking for investment to get better. This is stage three.

And I think, luckily, New Zealand is a country where a lot is happening in this space. We have two startups which are growing quickly and have gained a lot of investment. One is called Soul Machines, and another is called FaceMe. These two New Zealand startups are working just on, basically, the 3D faces, adding the emotional intelligence and the visual and auditory systems. And look, these are actually quite separate from Natural Language. This whole subject is about creating the expressions, the face, the image, and then in the background they also use the Natural Language engines we spoke about – the likes of LUIS or Dialogflow – or you can build your own. But the main thing is the expression.

And if you look at the use cases around, I'm sure all the countries look the same. All the major banks in New Zealand and Australia are coming up with their own digital humans, and now telecommunications service providers have started this journey too – like Vodafone New Zealand, who are implementing one. It's the same trend across the globe. So the way I see it, two years from now, or maybe a little more, they will come into the mainstream. When you go to a branch, you will be talking to this assistant. Or if you go to, let's say, the ASB website, these digital humans will be there talking to you. So I guess this is the next evolution from chatbots. And again, a lot and a lot of research is going into this area.

 

[00:29:26.073] – Jam

Yup, and I personally had the chance to see Soul Machines' demo, and how it works, at a few conferences and events – and, of course, FaceMe as well. Soul Machines' is a little creepy, but really mind-blowing, so to speak – the technology behind it. I'm also excited about the digital humans and how they interact and stuff. And of course, we can't talk about it right now because we don't have enough time, but correct me if I'm wrong: I think it's Josie, from ASB, right? I read a few articles, and of course it's mixed feedback – some hate it, some love it. It's technology. … Just a few more before we end. There is actually another question – since we're talking about emotions and digital humans, I think this is the best time to ask it. It says, "what potential is there for recognising, or even responding appropriately to, the emotional state of a user, to drive some degree of emotional matching?" So this is about matching now. Or is this what the digital humans are actually doing right now?

 

[00:30:33.903] – Andy

No, that's an awesome question, actually. So, look, this is a field that is evolving a lot, but I can tell you how this is generally available today. Microsoft LUIS, when you pass it a phrase, actually tells you, "I'm 90% confident this person is asking for a coffee, and I am also scoring this person out of a hundred on where they are emotionally." So a hundred points means they are extremely happy, and zero means they are angry, or sad.

 

So basically, today, if you use LUIS, you just have to tick one checkbox, and it will tell you how this person is doing emotionally. And then you can respond appropriately. So if you see that the score is 10 out of a hundred, you know that person is angry, and then the bot can take a different tone, or take them down a different pathway, or maybe transfer them to a real person.
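
As a rough sketch of how a bot might act on an emotion score like the one Andy describes (zero meaning angry or sad, one hundred meaning extremely happy), here is a small routing function. The thresholds and action names are assumptions for illustration, not a feature of any specific platform.

```python
# Hypothetical routing on an emotion score from 0 (angry/sad) to 100 (happy).
HANDOVER_THRESHOLD = 20   # very unhappy: hand over to a real person
SOFTEN_THRESHOLD = 50     # somewhat unhappy: change tone / pathway

def route_on_sentiment(score):
    if score < HANDOVER_THRESHOLD:
        return "transfer_to_human_agent"
    if score < SOFTEN_THRESHOLD:
        return "respond_with_empathetic_tone"
    return "continue_normal_flow"

print(route_on_sentiment(10))   # -> transfer_to_human_agent
print(route_on_sentiment(80))   # -> continue_normal_flow
```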

 

So this is already happening. And when it comes to digital humans, digital humans also look at facial expressions. They can look at your facial expressions and bring that into the picture as well, and score you even better. So the scoring is based on what you have written, the tone of your voice, and then your face. If you combine these three things – what you have said, the tone in which you have said it, and how your face is looking – you can very accurately figure out the emotional state of that person. So, yeah, I think we are getting there very fast.

What’s the state of New Zealand’s AI chatbot industry?

[00:32:01.843] – Jam

Awesome. So, exciting stuff. OK, I have one last question. So what is the state, Andy, right now, here in New Zealand, when it comes to AI chatbots, the industry, in terms of developing chatbots? Any thoughts about that?

 

[00:32:18.753] – Andy

Yeah, look, I can talk about the general state of AI in New Zealand. It's being invested in a lot, from all sectors. As I said, the likes of Soul Machines and FaceMe are leading – in the world, actually – when it comes to digital humans, and that's happening right here in New Zealand. There are also messaging chatbot platforms being developed: one is Ambit, another one is Journey, and of course ako.ai, which I represent – we also build a chatbot platform. So there is a lot happening in New Zealand, especially when it comes to the enterprise space. I still feel that we don't have enough companies putting effort into the, kind of, small-to-medium customer range, but there is definitely a lot of activity in enterprise. Even, Jam, looking at your organisation Conversologie, which is focusing on just designing a good conversation, giving a good personality to the bot, selecting the right tone – a lot of importance is given to all these elements – there are organisations working in technology, conversation design, navigation flows, platforms, research. So yeah, I think we are up there, and we will continue to grow.

 

[00:33:30.533] – Jam

And it’s very, very exciting to be part of it, isn’t it, Andy?

 

[00:33:35.093] – Andy

It is, absolutely.

Closing

[00:33:36.353] – Jam

Definitely. I mean, the team and I are very, very excited about this. Thank you so much, Andy, for joining us today. Very, very valuable information for sure.

 

[00:33:46.253] – Andy

Yeah, I really enjoyed it.

 

[00:33:48.383] – Jam

That's it for this episode. Maybe it raised more questions than it answered. Well, we may very well get Andy back to guest again, so please – comments, questions, feedback – I'll keep an eye on them. If you're on anchor.fm you can leave us a voice message. Don't be shy. And if you found us on social or your usual podcast app, drop a comment and join the conversation.

 

Or you can visit our page at Conversologist.show and tell us what you think, there.

 

The music bed, of course, was composed by Carlo Vergara, and this podcast was edited together from the original one-hour webinar by my podcast stooge, first in his own Google search, Rew Shearer.

 

Till the next episode, thanks for listening and remember to keep the conversation going.
