AI: Why Now?

Matt Swanson

Historically, AI has been surrounded by a lot of hype, but hasn’t been able to deliver results that meaningfully impact business. All of that is starting to change.

Technological trends are converging to bring the fulfillment of AI’s promises closer than ever before, and several of them place customer communications at the forefront of that fulfillment. These include: the digitization of communication, the cloudification of data, access to computational resources, and the emergence of deep learning frameworks.

The Digitization of Communication

Perhaps ironically, the key to unlocking AI’s full potential starts with a shift in the behavior of humans. Consumer preferences for how to communicate with corporations and brands are changing, and an important byproduct of that change is better data. Specifically, we’re seeing a generational shift in which younger consumers increasingly prefer to use asynchronous channels like messaging and live chat, instead of linear channels such as voice calls or in-person conversation.

This sea change has made it possible to process communication in ways we simply couldn’t in the past. Voice communication is very noisy, making it difficult to extract an accurate text representation of what was said. Even though speech recognition has improved, it’s still not reliable in unconstrained environments (unknown speakers, multiple speakers, background noise, etc.). Written digital communication, by contrast, leaves behind a clean text transcript.

Messaging and chat also yield short-form conversation, in contrast to older digital communication platforms such as email. These exchanges tend to focus on one issue at a time, reducing the effort it takes to detect and isolate key concepts. The combination of these features—fidelity, brevity, and clarity—provides small pieces of very clean data that are powerful inputs for machine learning (ML) and natural language processing (NLP) algorithms.
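To make this concrete, here is a minimal sketch of how short, focused chat messages can feed a standard text-classification pipeline. It assumes scikit-learn is installed, and the example messages and intent labels are invented purely for illustration:

```python
# Minimal sketch: short, clean chat messages as inputs to an intent classifier.
# Assumes scikit-learn; the messages and labels below are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Each exchange is short and focused on a single issue, so one message
# maps cleanly to one label.
messages = [
    "I need to reset my password",
    "How do I reset my login password?",
    "Where is my order? It hasn't arrived",
    "My package still hasn't shown up",
    "Cancel my subscription please",
    "I want to cancel my plan",
]
intents = [
    "password_reset", "password_reset",
    "order_status", "order_status",
    "cancel_account", "cancel_account",
]

# A standard bag-of-words pipeline: clean, short text needs little preprocessing.
model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(messages, intents)

# Likely "password_reset", given the overlap on the word "password".
print(model.predict(["how can I change my password"]))
```

The point isn’t the particular model; it’s that a single short message carries one clean, isolated concept, which is exactly the kind of input these algorithms handle well.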

The Cloudification of Data

The last ten years have seen an explosion of cloud data services, and a parallel explosion of effort inside the enterprise to move data into the cloud. This shift has liberated data that previously languished in far-flung silos, giving us access to information in a way that would not have been possible a decade ago. That unprecedented level of access to enterprise data means we can greatly improve the accuracy of our models.

Smaller is better when it comes to individual pieces of communication data, but we want to have as many of those pieces as possible. The more data we have, the more reliably we can use algorithms to map out a given industry environment, understand where the edge cases are, and build intelligent systems that work well inside that environment.

Access to Computational Resources

All of those pieces of communication and other enterprise data only help if we’re able to process them. Now that it’s relatively easy to access massively distributed computational resources, we can not only make use of modern algorithms and data modeling techniques, but we can also start to use algorithms that were previously deemed infeasible due to limited resources.

One popular exercise in graduate programs at notable research institutions is to revisit research papers from the 1980s and apply modern computing resources to resuscitate the ideas and algorithms they contain, demonstrating how they can be used today. Not only is the current AI hype real, but we’re finally living up to the aspirations of AI’s early era by putting to use technologies that were once deemed unusable.

The Emergence of Deep Learning Frameworks

Though we would not be able to do any of this without the trends that preceded it, most of the current buzz surrounds the concept of deep learning. Newer deep learning approaches are in fact critical to the inflection point we’re now hitting. As recently as 2010, computer science graduate programs were still advising against deep learning techniques, so we’re seeing a shift in industry mindset as well as a shift in how we approach algorithmic learning.

Part of what is reopening people’s minds to these possibilities is the standardization of algorithms through frameworks. Just as web development exploded once Ruby on Rails made it possible to create custom websites without rebuilding every component from scratch, we now have tools that standardize the repeatable pieces of model building. Standardization makes it easier to deploy and diagnose these new deep learning models, reducing overhead and susceptibility to errors.
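As a rough illustration of that standardization, here is what assembling a small model looks like with TensorFlow/Keras. The layer sizes, vocabulary size, and number of output classes are placeholder choices for illustration, not a recommended architecture:

```python
# Minimal sketch of how a deep learning framework standardizes model building.
# Assumes TensorFlow/Keras; all sizes below are placeholder values.
import tensorflow as tf

# Layers, optimizers, and training loops are reusable, pre-built components,
# much as Rails provided reusable pieces for web development.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,)),                                # variable-length token sequences
    tf.keras.layers.Embedding(input_dim=10_000, output_dim=64),   # token embeddings
    tf.keras.layers.GlobalAveragePooling1D(),                     # pool over the whole message
    tf.keras.layers.Dense(32, activation="relu"),
    tf.keras.layers.Dense(3, activation="softmax"),                # e.g. three intents
])

# Compiling wires in a standard optimizer, loss, and metrics without
# hand-writing gradient updates or evaluation code.
model.compile(
    optimizer="adam",
    loss="sparse_categorical_crossentropy",
    metrics=["accuracy"],
)
model.summary()
```

None of this required implementing backpropagation, numerical optimization, or GPU kernels by hand; the framework supplies those repeatable pieces, which is where the reduction in overhead and errors comes from.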

The Current Hype is Real

AI wasn’t a good bet to make in the past, but it is a much better bet today. The combination of better customer communication data for ML and NLP, the availability of enterprise data through the cloud, access to computational resources that enable powerful algorithms of the past and present, and deep learning frameworks that reduce overhead and errors means that investing in AI to augment your business is a sure way to see results.
