The sudden death of the website

You may not know me or even my company, LivePerson, but you’ve certainly used my invention. In 1995, I came up with the technology for those chat windows that pop up on websites. Today, more than 18,000 companies around the world, including big-name brands like T-Mobile, American Express, Citibank and Nike, use our software to communicate with their millions of customers. Unlike most startup founders who saw the birth of the internet in the mid-1990s, I am still CEO of my company.

My longevity in this position gives me a unique perspective on the changes that have happened over the past two decades, and I see one happening right now that will radically transform the internet as we know it.

When we started building websites in the mid-’90s, we had great dreams for e-commerce. We fundamentally thought all brick-and-mortar stores would disappear and everything dot-com would dominate. But e-commerce has failed us miserably. Today, less than 15 percent of commerce occurs through a website or app, and only a handful of brands (think: Amazon, eBay and Netflix) have found success with e-commerce at any real scale. There are two giant structural issues that make websites not work: HTML and Google.

The web was intended to bring humanity’s vast trove of content, previously cataloged in our libraries, to mass audiences through a digital user experience — i.e. the website. In the early years, we were speaking in library terms about “browsing” and “indexing,” and in many ways the core technology of a website, called HTML (Hypertext Markup Language), was designed to display static content — much like library books.

But retail stores aren’t libraries, and the library format can’t be applied to online stores either. Consumers need a way to dynamically answer the questions that enable them to make purchases. In the current model, we’re forced to find and read a series of static pages to get answers — when we tend to buy more if we can build trust over a series of questions and answers instead.

The second problem with the web is Google. When we started to build websites in the ’90s, everyone was trying to design their virtual stores differently. On one hand, this made them interesting and unique; on the other, the lack of industry standards made them hard to navigate — and really hard to “index” into a universal card catalog.

Then Google stepped in around 1998. As Google made it easier to find the world’s information, it also started to dictate the rules through the PageRank algorithm, which forced companies to design their websites in a certain way to be indexed at the top of Google’s search results. But its one-size-fits-all structure ultimately makes it flawed for e-commerce.

Now, almost every website looks the same — and performs poorly. Offline, brands try to make their store experiences unique to differentiate themselves. Online, every website — from Gucci to the Gap — offers the same experience: a top nav, descriptive text, some pictures and a handful of other elements arranged similarly. Google’s rules have sucked the life out of unique online experiences. Of course, as e-commerce has suffered, Google has become more powerful, and it continues to disintermediate the consumer from the brand by imposing a terrible e-commerce experience.

There also is a hidden knock-on effect of bad website design. As much as 90 percent of calls placed to a company’s contact center originate from its website. The journey looks like this: Consumers visit a website to get answers, become confused and have to call. This has become an epidemic, as contact centers field 268 billion calls per year at a cost of $1.6 trillion.

To put that in perspective, global advertising spend is $500 billion, meaning the cost of customer care — these billions of phone calls — is three times more than a company’s marketing expenses. More importantly, they create another bad consumer experience. How many times have we been put on hold by a company when it can’t handle the volume of incoming queries? Websites and apps have, in fact, created more phone calls — at increased cost — and upended digital’s promise to make our lives easier.

There is something innate in our psychology: getting our questions answered through a conversation instills the confidence to spend money. This is why there is so much chatter about bots and AI right now. They tap into an inner understanding about the way things get done in the real world: through conversations. The media are putting too much focus on bots and AI destroying jobs. Instead, we should explore how they will make our lives easier in the wake of the web’s massive shortfalls.

Discovering the truth about e-commerce has, in some ways, left me with a sense of failure about the hopes and dreams I had when I started in this industry. But I now have a lot of hope that what I call “conversational commerce” (interactions via messaging, voice assistants such as Alexa, and bots) will finally deliver on the promise of powering digital commerce at the scale we all dreamt about.

I am going to make a bold prediction based on my work with 18,000 companies and bringing conversational commerce to life: In 2018, we will see the first major brand shut down its website. The brand will shift how it connects with consumers — to conversations, with a combination of bots and humans, through a messaging front end like SMS or Facebook. We are already working with several large brands to make this a reality.

When the first website ends, the dominoes will fall fast. This will have a positive impact on most companies in transforming how they conduct e-commerce and provide customer care. For Google, however, this will be devastating.

Choosing the best language to build your AI chatbot

No, this is not about whether you want your virtual agent to understand English slang, the subjunctive mood in Spanish or even the dozens of ways to say “I” in Japanese. In fact, the programming language you build your bot with is as important as the human language it understands.

But how do you choose between them? Facebook, Slack and Telegram all support the most popular programming languages, while API platforms such as Dialogflow, LUIS and wit.ai offer SDKs for the majority, so platform support alone won’t settle the question.

Of course, the standard advice is to lean toward the language you are most comfortable with, but for those dipping a toe into the programming pond for the first time, a clear winner starts to emerge: Python is the language of choice.

Why Python and not the others: natural language processing

Python is essentially the Swiss Army knife of coding thanks to its versatility. It is also one of the easier languages for a beginner to pick up, with consistent syntax and code that reads close to plain English.

That versatility meant that, from early on, Python was applied to a more diverse set of problems than languages such as Ruby, which became closely tied to web development. Python, meanwhile, spread into scientific computing, which encouraged the creation of a wide range of open-source libraries that have benefited from years of R&D.

When it comes to natural language processing (NLP), the grandfather of NLP libraries was written in Python. The Natural Language Toolkit (NLTK) saw its initial release in 2001, five years ahead of its Java-based competitor, the Stanford NLP library, and it remains a wide-ranging resource for bringing core NLP functions to your chatbot.
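
To give a sense of what those building blocks look like, here is a minimal sketch of NLTK in action, tokenizing a sentence and tagging parts of speech. The sentence and the downloaded resources are purely illustrative.

    import nltk

    # Download the tokenizer and tagger models on first run (no-op afterwards).
    nltk.download("punkt", quiet=True)
    nltk.download("averaged_perceptron_tagger", quiet=True)

    sentence = "My card is not working, can you help?"

    tokens = nltk.word_tokenize(sentence)   # ['My', 'card', 'is', 'not', ...]
    tagged = nltk.pos_tag(tokens)           # [('My', 'PRP$'), ('card', 'NN'), ...]

    print(tagged)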

Stanford NLP and Apache OpenNLP offer an interesting alternative for Java users, as both can adequately support chatbot development, whether through their own tooling or by being called explicitly via APIs. But NLTK is superior thanks to its additional support for other languages, its multiple versions and interfaces for other NLP tools, and even the ability to install some Stanford NLP packages and third-party Java projects.

Critics argue that NLTK’s inefficiency and steep learning curve make it more of an academic’s theme park than a practical chatbot solution. TextBlob addresses this by using NLTK as a springboard, providing a more intuitive interface and a gentler learning curve for users.
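
To illustrate that gentler learning curve, here is a minimal TextBlob sketch; it wraps NLTK under the hood, so part-of-speech tags and a polarity score are one attribute access away. The example text is illustrative, and the first run may require downloading the bundled corpora.

    from textblob import TextBlob

    # First run may require: python -m textblob.download_corpora
    blob = TextBlob("The new app is great, but checkout keeps failing.")

    print(blob.tags)                 # part-of-speech tags, NLTK-style
    print(blob.noun_phrases)         # noun phrases worth tracking
    print(blob.sentiment.polarity)   # -1.0 (negative) to 1.0 (positive)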

An interesting rival to NLTK and TextBlob has emerged in Python (and Cython) in the form of spaCy. It has some real advantages: it ships a single stemming approach rather than the nine stemming options on offer with NLTK, which spares you the problem of deciding which one is most effective for your chatbot. spaCy is also lightning fast at tokenizing and parsing compared to systems in other languages. Its main weaknesses are its smaller support community and the fact that it is focused on English. However, if your chatbot is for a smaller company that does not require multiple languages, it is a compelling choice.
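
A minimal sketch of that tokenizing and parsing with spaCy, assuming the small English model has been installed; the sentence is illustrative.

    import spacy

    # Assumes the small English model is installed:
    #   python -m spacy download en_core_web_sm
    nlp = spacy.load("en_core_web_sm")

    doc = nlp("Brilliant, my card is not working.")

    for token in doc:
        # Surface form, lemma, part of speech and dependency relation.
        print(token.text, token.lemma_, token.pos_, token.dep_)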

NLTK is a good bet not only for fairly simple chatbots, but also if you are looking for something more advanced. From here, a whole world of other Python libraries opens up to you, including many that specialize in machine learning.

Machine learning

On the subject of machine learning, what better approach than to look at some hard data to see which language the experts prefer? In a recent survey of more than 2,000 data scientists and machine learning developers, more than 57 percent of them used Python, while 33 percent prioritized it for development.

Why is this? As with NLP, Python boasts a wide array of open-source machine learning libraries that are useful for chatbots, including scikit-learn and TensorFlow. Scikit-learn is one of the most mature, with implementations of most standard machine learning algorithms, while TensorFlow is more low-level: the LEGO blocks of machine learning, if you like. This versatility is why Python shines.
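
As a small illustration of how those blocks snap together, here is a hedged sketch of a scikit-learn pipeline that classifies user messages into intents. The example messages, labels and the choice of Naive Bayes are made up for demonstration, not a prescription.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.naive_bayes import MultinomialNB
    from sklearn.pipeline import make_pipeline

    # Tiny, made-up training set mapping user messages to intents.
    messages = ["where is my order", "my card is not working",
                "I want a refund", "what are your opening hours"]
    intents = ["order_status", "payment_issue", "refund", "store_info"]

    # Vectorize the text, then fit a simple probabilistic classifier.
    model = make_pipeline(TfidfVectorizer(), MultinomialNB())
    model.fit(messages, intents)

    print(model.predict(["my payment keeps failing"]))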

Many of the other languages that allow chatbot building pale in comparison. PHP, for one, has little to offer in terms of machine learning and, in any case, is a server-side scripting language more suited to website development. C++ is one of the fastest languages out there and is supported by such libraries as TensorFlow and Torch, but still lacks the resources of Python.

Java and JavaScript both have certain capabilities when it comes to machine learning. JavaScript offers a number of libraries, while Java lovers can rely on ML packages such as Weka. Where Weka struggles compared to its Python-based rivals is in its smaller support base and its status as more of a plug-and-play machine learning solution. That is fine for small data sets and simpler analyses, but Python’s libraries are much more practical.

Where does Python struggle?

Python’s biggest failing lies in its documentation, which pales in comparison to that of other established languages such as PHP, Java and C++. Searching for answers within Python is akin to finding a specific passage in a book you have never read. In addition, the language is severely lacking in useful and simple examples. Clarity is also an issue, and clarity is incredibly important when building a chatbot, as even the slightest ambiguity in one of the steps could cause it to fail.

If speed is your main concern with chatbot building, you will also find Python wanting in comparison to Java and C++. The question, however, is when code execution time actually matters. The end-user experience matters more, and picking a faster but more limited language for chatbot building, such as C++, is self-defeating. Sacrificing development time and scope for a bot that might respond a few milliseconds faster does not make sense.

Natural language processing implemented with Python

Let’s take a look at one aspect of NLP to see how useful Python can be when it comes to making your chatbot smart.

Sentiment analysis in its most basic form involves working out whether the user is having a good experience or not. If a chatbot is able to recognize this, it will know when to offer to pass the conversation over to a human agent, which products users are more excited about or which opening line works best.

We could use sentiment analysis to determine if an interaction is negative or positive. Have a look at this sentence for example:

“Brilliant, my card is not working.”

Of course, the sentiment here is negative, but that might be difficult for a bot to detect given the word “brilliant” is used. How do we equip our bot with robust sentiment analysis? Note: Examples of the actual functions that have been described below can be found here and here.
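
As a quick, hedged illustration of the difficulty, a purely lexicon-based scorer such as TextBlob’s default sentiment analyzer may well rate this sentence as positive on the strength of the word “brilliant” alone:

    from textblob import TextBlob

    sentence = "Brilliant, my card is not working."

    # A purely lexicon-based polarity score can be fooled by sarcasm, which
    # is why the trained approach described below is worth the extra effort.
    print(TextBlob(sentence).sentiment.polarity)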

While it is arguably much simpler to use spaCy or TextBlob, understanding how NLTK works provides a solid grounding in the concept of sentiment analysis. Using NLTK, we can train a bot to recognize sentiment by first examining a set of manually annotated data. We create this from three lists: one of positive comments, another of negative comments and a test list that contains a mixture. The more examples on each list, the more reliable the sentiment analysis will be. The test portion of the manually annotated data is what we use to measure the accuracy of our classifier.

Following this, we need to extract the most relevant words in each of the sentences (in the example above, “brilliant,” “not” and “working”) and rank them by how frequently they appear in the data. As a simple filter, we can discard any words with fewer than three letters. Once that is done, we use a feature extractor to build a dictionary of the remaining relevant words, which forms our finished training set and is passed to the classifier.

The classifier is a Naive Bayes classifier, which looks at the feature set of a comment and, using the prior probability of each label and the frequency of words, calculates how likely each sentiment is.
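
Putting those steps together, here is a minimal sketch of that pipeline using NLTK’s NaiveBayesClassifier. The comment lists are tiny and purely illustrative; a real training set would need far more examples.

    import nltk

    nltk.download("punkt", quiet=True)

    # Tiny, illustrative lists; in practice you want hundreds of examples each.
    positive = ["great service, thank you so much",
                "the new feature works brilliantly"]
    negative = ["my card is not working",
                "this app is useless and keeps crashing"]

    def extract_features(comment):
        # Keep words of three or more letters and record their presence.
        words = [w.lower() for w in nltk.word_tokenize(comment) if len(w) >= 3]
        return {word: True for word in words}

    training_set = ([(extract_features(c), "pos") for c in positive] +
                    [(extract_features(c), "neg") for c in negative])

    classifier = nltk.NaiveBayesClassifier.train(training_set)

    test = "Brilliant, my card is not working."
    print(classifier.classify(extract_features(test)))
    # prob_classify exposes the per-label probabilities behind the decision.
    print(classifier.prob_classify(extract_features(test)).prob("neg"))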

While it is fair to argue that “language is just a tool” for equipping your chatbot with AI, Python’s wider variety of libraries and off-the-shelf algorithms makes it a much more straightforward option than other languages.

Like choosing the best tires on your racing car, the language you choose for your chatbot depends on a number of conditions. What kind of bot are you hoping to create? With which language are you most comfortable? Which is robust enough to handle your specific project as it continues to grow?

But if you are starting out fresh and are wondering which language is worth investigating first to give your chatbot a voice, following the data science crowd and looking at Python is a good start.

Twitter launches a new enterprise API to power customer service and chatbots


Twitter’s big news this week is its announcement that it’s now enforcing its new policies around hateful content and abuse, but today the company is rolling out something new for developers, as well: an enterprise-level API providing access to real-time activities like tweets, retweets, likes and follows.

The addition of the API is part of Twitter’s broader plan to revamp and expand its API platform, announced this April. Similar to how Twitter is now trying to make things right with its user base, who have too often been the victims of harassment, abuse, hate and threats of violence, the company has been trying to reset relations with its developer community, too.

Over the years, Twitter has pulled the rug out from underneath developers’ feet too often. For example, it used to encourage third-party apps, then it began restricting them. It hosted developer conferences, then it killed them. And it once offered a suite of developer tools, only to turn around and sell them.

This year, Twitter aimed to turn things around by streamlining its API platform, while taking full advantage of its investment in Gnip. This included the launch of new APIs and endpoints for developers, as well as a published roadmap in an effort to boost transparency around its developer-focused efforts.

Today’s news of the new enterprise Account Activity API is a part of those promised changes.

Specifically, the API is designed to help developers build apps that can power customer service, chatbots and brand engagement on Twitter, the company says – an area Twitter has been increasingly invested in this year.

The existing Account Activity API lets developers pull the full set of activities related to an account, in real-time. The new enterprise version of this API is designed for those who need data for a larger number of accounts, plus multiple webhook URLs, reliability features like retries, and managed support.
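
For developers wiring this up, the sketch below shows roughly what a webhook receiver looks like, written with Flask. The CRC handshake and the direct_message_events key follow Twitter’s public webhook documentation of the time, but treat the exact field names as assumptions and check the current API reference; handle_direct_message is a hypothetical handler.

    import base64, hashlib, hmac, os
    from flask import Flask, request, jsonify

    app = Flask(__name__)
    CONSUMER_SECRET = os.environ["TWITTER_CONSUMER_SECRET"]

    @app.route("/webhook/twitter", methods=["GET"])
    def crc_check():
        # Twitter periodically sends a CRC token; respond with an HMAC-SHA256
        # signature so Twitter knows this endpoint is really ours.
        crc = request.args["crc_token"]
        digest = hmac.new(CONSUMER_SECRET.encode(), crc.encode(),
                          hashlib.sha256).digest()
        return jsonify({"response_token": "sha256=" + base64.b64encode(digest).decode()})

    @app.route("/webhook/twitter", methods=["POST"])
    def receive_events():
        payload = request.get_json(force=True)
        # Route Direct Messages to the bot; other activity types (likes,
        # follows, tweets) arrive in the same payload and can be handled similarly.
        for dm in payload.get("direct_message_events", []):
            handle_direct_message(dm)   # hypothetical handler
        return "", 200

    def handle_direct_message(event):
        print("received DM event:", event.get("type"))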

In addition, Twitter says it’s expanding the beta for the standard API that delivers activities for up to 35 accounts. And on January 15th, typing indicators and read receipts for Direct Messages will be included as activities in the API, so developers can build more natural conversational experiences. (Meaning, customers will feel like they’re talking to a person, not a bot).

Alongside this launch, Twitter is taking its suite of developer tools for Direct Messages out of beta.

These features include Quick Replies, Welcome Messages, Buttons on messages, Custom Profiles, and Customer Feedback Cards. All have been previously announced and launched, and put to use by brands like Samsung, MTV, TBS, Wendy’s, and Patrón, which used the tools with their chatbots. Others, like Tesco and Evernote, are using them for customer service.

The exit from the Direct Message beta will see some features removed, Twitter notes. This includes Location quick replies and location cards, text input quick replies, support indicators, response hours, and the prominent message button on profiles.

These latter features, like support hours and the prominent message button, seem to have been launched in response to Facebook Messenger and its own advances as a platform for customer service. But there were some concerns among consumers that the message button on a customer service account’s profile served as a way to keep inquiries and complaints out of public view in order to protect the brand’s image.

Twitter’s new APIs and DM toolset are available now.


Microsoft makes Azure Bot Service generally available for developers


Microsoft introduced its Bot Framework over two years ago, and companies have been building chatbots for a variety of scenarios ever since. Today, the company made the Microsoft Azure Bot Service and the Language Understanding service in Microsoft Cognitive Services (known as LUIS) generally available.

“Making these two services generally available on Azure simultaneously extends the capabilities of developers to build custom models that can naturally interpret the intentions of people conversing with bots,” Lili Cheng, corporate vice president at the Microsoft AI and Research division wrote in a company blog post on the announcement.

Conversational AI allows humans to have a conversation with a bot, such as an online chat or in a chat tool like Facebook Messenger or Kik. The customer asks a question, and if it’s routine, the bot can answer it in a natural conversational manner that feels like you’re talking to a human — at least that’s the theory.

Microsoft has created a whole set of tools for developers to create their bots including the Bot Framework and Cognitive Services. Cheng says Microsoft has designed the bot framework to be as flexible as possible. You don’t even need to host it on Azure if you don’t want to. The Bot service is actually part of a broader set of Azure services Microsoft has created to help developers build applications with artificial intelligence underpinnings.


“You can build a bot and auto provision on Azure and you can publish on Facebook Messenger, Slack and most of the Microsoft channels [such as] Cortana, Skype and Skype for teams,” Cheng explained. You can also embed the bot in a web page or in an app and customize the UI as you see fit.

When you combine the bot building tools with the LUIS language understanding tool, you get what should be a powerful combination. The latter helps the bot understand and parse the query to deliver the correct answer (and understand related queries).
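
A hedged sketch of what that looks like from a developer’s perspective: a single REST call to LUIS returns the top-scoring intent for an utterance. The v2.0 endpoint shape and response fields here reflect the documentation of the time and should be treated as assumptions; the region, app ID and key are placeholders.

    import requests

    LUIS_ENDPOINT = "https://westus.api.cognitive.microsoft.com/luis/v2.0/apps/{app_id}"

    def get_intent(app_id, subscription_key, utterance):
        url = LUIS_ENDPOINT.format(app_id=app_id)
        resp = requests.get(url, params={"subscription-key": subscription_key,
                                         "q": utterance})
        resp.raise_for_status()
        result = resp.json()
        # topScoringIntent carries the best match and its confidence score.
        return result["topScoringIntent"]["intent"], result["topScoringIntent"]["score"]

    # Example: route the conversation based on the detected intent.
    # intent, score = get_intent("<app-id>", "<key>", "I want to check my order status")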

Cheng said over 200,000 developers have signed up for the Bot service, and they currently have 33,000 active bots in areas like retail, healthcare, financial services and insurance. Companies building bots with the Microsoft tools include Molson Coors, UPS and Sabre.


Google’s chatbot analytics platform Chatbase launches to public


At Google I/O this year, Google quietly introduced a new chatbot analytics platform called Chatbase, a project developed within the company’s internal R&D incubator, Area 120. Today, that platform is being publicly launched to all, after testing with hundreds of early adopters including Ticketmaster, HBO, Keller Williams, Viber, and others.

The idea behind Chatbase’s cloud service is to offer tools to more easily analyze and optimize chatbots. This includes giving bot builders the ability to understand what works to increase customer conversions, improve the bot’s accuracy, and create a better user experience.

This data is available through an analytics dashboard, where developers can track specific metrics like active users, sessions, and user retention. These insights give an overall picture of the bot’s health and reveal general trends.

The dashboard also lets bot creators compare the bot’s metrics across platforms, to see whether some platforms need additional optimization.

The system today integrates with any messaging platform, Google says, including Allo, Kik, Line, Messenger, Skype, Twitter, WeChat, and more. It also works with any type of bot, including voice or text.

And though it has had many high-profile testers in its early days, it’s not necessarily meant to be used only by larger companies. As a free service, Chatbase supports bot builders of any size – whether they have one or hundreds of bots in operation.

Google notes, for example, that an early customer, BLiP (a bot platform for brands), has been using Chatbase to track over 2 million messages to date across over 50 bots. Ingenious.AI, meanwhile, uses Chatbase with a bot built for a large Australian health insurer to help customers of its eyeglass stores. And Keller Williams uses Chatbase with a bot that lets its 170K associates ask questions, manage appointments, connect with other associates, and track their goals.

Other testers on Chatbase’s platform have included bots for external-facing customer support, entertainment, advice and e-commerce, as well as internal-facing bots for productivity and information discovery, Google says.

The Chatbase website’s customer list includes: HBO, Keller Williams, Ticketmaster, Poncho, Swelly, Botnation AI, Viber, inGenius.AI, Starbutter AI, Foxsy, Crystal, FitWell, push, mia, and Unicef.

Beyond bot analytics, the tool leverages Google’s machine learning capabilities to figure out what sort of problems could be affecting the bot.

Typically, developers would have to scour log files to find patterns in user messages. Chatbase’s system instead clusters user messages that aren’t being handled (a feature still available only to Early Access testers), finds opportunities to answer more requests and then suggests ways to optimize the bot accordingly.
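
In practice, developers feed Chatbase by reporting each message their bot receives; the sketch below flags unhandled requests so they can surface in that clustering. The endpoint and field names are assumptions based on Chatbase’s generic message API as documented at the time, so verify them against the current reference.

    import time
    import requests

    def log_user_message(api_key, user_id, text, intent=None, handled=True):
        payload = {
            "api_key": api_key,
            "type": "user",                      # message sent by the user, not the bot
            "user_id": user_id,
            "time_stamp": int(time.time() * 1000),
            "platform": "Messenger",
            "message": text,
            "intent": intent,
            "not_handled": not handled,          # surfaces the message among the "misses"
        }
        response = requests.post("https://chatbase.com/api/message", json=payload)
        response.raise_for_status()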

“One example would be for finding and fixing ‘misses,’ or alternate phrasing of supported actions that weren’t originally anticipated by the developer,” explains a Google spokesperson. “Like in so many other areas, machine learning and natural language processing are opening up powerful new opportunities in bot analytics. Putting some of Google’s machine learning capabilities to work for our users is a clear differentiator, and our users are really excited about that,” they added.

Rakuten-owned Viber, which has over 900 million users in 193 countries, detailed Chatbase’s success with a stickers bot it runs.

“We increased query volume by 35% for a popular stickers bot by optimizing queries with high exit rates,” the company said, in a statement shared by Google. “Chatbase has been immensely helpful in improving our bot. Instead of combing through logs, we rely on its machine-learning capability to help prioritize required optimizations — saving precious time that we need to focus on building new features,” Viber added.

Another notable capability in Chatbase is the auto-generated data visualization of conversation flows across sessions. This lets bot developers see what common paths users take, and where they often exit the application. A Funnel report highlights these steps and shows the success rate per step.

The company announced the general availability of Chatbase via a blog post today, adding that it’s free to use.

When asked how the company plans to monetize the platform, Google said that’s something it’s thinking about for the future, but didn’t offer details on those plans.

Google also noted users of Dialogflow (formerly API.AI), Google’s end-to-end platform for building cross-platform conversational experiences, will automatically get access to basic Chatbase features within Dialogflow.

The public availability of Chatbase comes at a time when chatbots themselves have faced criticism for not being as useful as promised, and often suffering from usability issues. But the market is still in its early days, and chatbots aren’t exiting the scene. Some have even gotten better as developers figure out what works and what doesn’t.

Chatbase isn’t the only solution for chatbot analytics, but the machine learning angle could give it an edge. Plus, Google’s ability to offer it for free could help it achieve market share that bot analytics companies can’t necessarily compete with. However, as an Area 120 project, it’s unclear to what extent Google will back the project long-term. To date, most Area 120 projects have been more experimental. Chatbase seems like the kind of thing that should graduate to a Google product in the future.

Google launches a paid enterprise edition of its Dialogflow chatbot builder


Google today announced the beta launch of its enterprise edition of Dialogflow, its tool for building chatbots and other conversational applications.

In addition, Dialogflow (both in its free and enterprise version) is now getting built-in support for speech recognition, something that developers previously had to source through the Google Cloud Speech API or similar services. Unsurprisingly, this also speeds things up (by up to 30 percent, Google tells me), because apps only have to make a single API call.

Dialogflow now also features a number of basic analytics and monitoring capabilities, courtesy of Google’s Chatbase service.

You may still remember Dialogflow as API.AI, which was its name when Google acquired it last year, but the company has since renamed it. The main idea behind API.AI/Dialogflow was always to give companies the building blocks they need to build their conversational agents and other text- and voice-driven interactions and to make them easy to use.
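
For a sense of those building blocks, here is a minimal sketch of sending a user’s text to a Dialogflow agent with the Python client library. The class and field names follow the V2 client as documented around this time, so treat them as assumptions and check the current library reference; the project and session IDs are placeholders.

    import dialogflow  # pip install dialogflow

    def detect_intent(project_id, session_id, text, language_code="en"):
        session_client = dialogflow.SessionsClient()
        session = session_client.session_path(project_id, session_id)
        text_input = dialogflow.types.TextInput(text=text, language_code=language_code)
        query_input = dialogflow.types.QueryInput(text=text_input)
        response = session_client.detect_intent(session=session, query_input=query_input)
        result = response.query_result
        # The matched intent and the agent's canned reply.
        return result.intent.display_name, result.fulfillment_text

    # print(detect_intent("my-gcp-project", "user-123", "I want to book a table"))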

To gain users quickly, the service was always available for free (with some rate restrictions), but that’s not what big enterprises want. They are happy to pay a fee in return for getting 24/7 support, SLAs and enterprise-level terms of service that promise data protection, among other things.

With the Dialogflow Enterprise Edition, they can now get all of this. Dan Aharon, Google’s product manager for Google Cloud AI, also noted that this version of Dialogflow is now part of Google Cloud. That may sound like a minor thing, but it means that enterprises that want to adopt it can do so under the same terms they already have in place for Google Cloud. “Say you are Spotify, you can now add Dialogflow pretty easily because it already answers all of the requirements of being a Google Cloud product,” he told me. This also means that users who want to sign up for the enterprise edition have to do so through the Google Cloud Platform Console.

Google is charging enterprises $0.002 per text interaction request and $0.0065 per voice interaction request.

Aharon also stressed that the free version of Dialogflow isn’t going anywhere. Indeed, free users will also get access to the new speech recognition integration, though with a limit of 1,000 interactions per day (or 15,000 per month). Both versions also continue to offer support for 14 languages and integrations with virtually any major chat and voice assistant platform, including from Google competitors like Microsoft and Amazon.

When Google acquired API.AI, it was already one of the most popular tools for building chatbots, and Google argues that this momentum has only continued. Google PR told Aharon not to say that it’s the most popular tool of its kind on the market, but chances are it actually is. He told me that the service has now signed up “hundreds of thousands” of developers, definitely far more than the 150,000 the company reported at its Cloud Next event earlier this year.

“What we hear from customers time and time again is that the quality of the natural language understanding is head and shoulders above anything they have tried,” he said. “You don’t want to deploy something in production if it’s not very, very good” (though some companies obviously do…).

Besides the natural language understanding, though, it’s also Dialogflow’s flexibility that sets it apart from some of its competitors: developers can go beyond basic decision trees, and features like a deep integration with Cloud Functions let them write basic serverless scripts right in its interface. Dialogflow also makes it easy to connect to other applications, no matter where they are hosted. That’s something you need if you want to integrate your conversational app with your ordering and shipping systems, for example.

Aharon tells me that it took about a year to port all of the API.AI features to the Google Cloud. Now that this is done, the service’s users can profit from all of Google’s investments in AI and machine learning. And given that Google is doing its best to attract more enterprises to its platform, it doesn’t come as a surprise that Dialogflow is joining this parade now, too.
