Artificial intelligence, or AI, has become an integral part of modern society. According to Forbes, 51% of enterprises had already implemented AI by 2017, with the industry itself valued at $16 billion USD. That figure is projected to grow exponentially, reaching as much as $190 billion USD by 2025. Two of the most influential branches of AI today are machine learning and deep learning. But what exactly are these two? And how do we define artificial intelligence in relation to them?
What is Artificial Intelligence?
As of 2019, the Oxford Dictionary defines artificial intelligence as “the theory and development of computer systems able to perform tasks normally requiring human intelligence.”
In other words, AI interprets information in a manner similar to you and me, whether it is needed for image detection, speech recognition, or some other automated decision-making system. One very simple example of AI is a video game computer opponent. It uses data from the game, as well as input from the player, to create a sequence of decisions and actions that engage the player. AI was introduced as early as the 1950s, but it only truly took off during the ’70s and ’80s, when personal computers and game consoles made simple AI a basic requirement of their software.
Other examples of AI in our daily lives might include:
- Smart devices
- Stock exchange bots
- Data recognition (speech, voice, face, etc.)
Today, AI systems appear in almost any application that handles data, such as management software, recommendation algorithms, media analysis, and voice assistants. Even simple tracking apps now use AI. As a rule of thumb, if a relatively complex task needs to be executed regularly without direct human intervention, AI is most likely involved.
What is Machine Learning?
Machine learning is a type of artificial intelligence (and therefore a subset of it) that specializes in parsing and analyzing data in order to learn from it and make appropriately intelligent decisions. Put simply, this kind of AI is built to observe a great deal of information and then take one or more courses of action based on what it has seen.
Typical machine learning tasks today include:
- Link recommendations
- Content moderation
- Search results display
- Curating timelines (in social media)
A machine learning system is capable of analyzing an enormous amount of data in a short time and drawing solutions or conclusions from it. It optimizes its algorithm to give accurate interpretations, far faster than humans could manage under the same time constraints. For example, suppose we want to automatically determine whether a certain email is spam. A machine learning system will sift through thousands upon thousands of emails to find patterns that help it identify spam. It would then make a rough classification of each message as either spam or regular email, and use that result in turn to find even more patterns and refine its analysis further.
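To make the spam example concrete, here is a minimal sketch in Python of the underlying idea: count how often words appear in known spam versus regular email, then score new messages against those counts. The tiny training lists and the scoring rule are illustrative assumptions, not a production filter.

```python
from collections import Counter

# Toy training data (hypothetical examples, not a real corpus).
spam = ["win money now", "free prize click now", "claim your free money"]
ham = ["meeting at noon", "project update attached", "lunch tomorrow"]

def word_counts(messages):
    # Tally how often each word appears across a set of messages.
    counts = Counter()
    for msg in messages:
        counts.update(msg.lower().split())
    return counts

spam_counts = word_counts(spam)
ham_counts = word_counts(ham)

def spam_score(message):
    # Compare how often each word appears in spam vs. regular mail,
    # with add-one smoothing so unseen words contribute nothing extreme.
    score = 0.0
    for word in message.lower().split():
        s = spam_counts[word] + 1
        h = ham_counts[word] + 1
        score += (s - h) / (s + h)
    return score

def is_spam(message):
    # Positive score: the message's words lean toward the spam pile.
    return spam_score(message) > 0

print(is_spam("claim your free prize now"))   # True
print(is_spam("project meeting tomorrow"))    # False
```

Feeding the classifier more labeled email would sharpen the word counts, which is exactly the pattern-refinement loop described above.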
When given new sets of data, a machine learning system can adapt and update its algorithm to get even better at its task, or at the very least minimize the likelihood of mistakes. This is what makes machine learning so important in our data-driven era.
What is Deep Learning?
Deep learning is, in turn, a subset of machine learning. The basic design of a deep learning system is modeled on an organic brain: whereas we form new memories using a complex web of neural patterns, this kind of system weaves its own complex web of decisions using an artificial neural network composed of many algorithmic layers.
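The layered structure described above can be sketched in a few lines of Python. This toy network (random weights, sigmoid activations, layer sizes chosen arbitrarily for illustration) only shows how data flows forward through stacked layers; a real system would also train those weights.

```python
import math
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def dense_layer(inputs, weights, biases):
    # Each output neuron takes a weighted sum of all inputs plus a bias,
    # passed through a nonlinearity (here, the logistic sigmoid).
    outputs = []
    for w_row, b in zip(weights, biases):
        total = sum(w * x for w, x in zip(w_row, inputs)) + b
        outputs.append(1.0 / (1.0 + math.exp(-total)))
    return outputs

# A 3-input network with one hidden layer of 4 neurons and 2 outputs.
w1 = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(4)]
b1 = [0.0] * 4
w2 = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(2)]
b2 = [0.0] * 2

x = [0.5, -0.2, 0.8]
hidden = dense_layer(x, w1, b1)    # first layer of the "web"
output = dense_layer(hidden, w2, b2)  # second layer builds on the first
print(output)  # two values between 0 and 1
```

Deep networks simply stack many such layers, with each layer's output becoming the next layer's input.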
A few fairly notable deep learning systems are:
- Watson (defeated contestants at Jeopardy!)
- AlphaGo (defeated professional Go player Lee Sedol in March 2016)
- Deepfake (generating eerily realistic but artificial representations of actual people)
- OpenAI Five (a gaming deep learning project that defeated pro Dota 2 player Dendi in 2017)
Unlike standard machine learning systems, which can perform quite well even with relatively basic sets of data, a new deep learning system starts from scratch. It is characterized by a ‘limping period’: the first few generations of the AI begin producing actual results only after an adaptation period of countless failed generations.
When it does reach a fairly complex level of efficiency, a deep learning system simply starts to overwhelm everything that came before it. DeepMind’s AlphaGo, for example, started with an initial set of 160,000 amateur Go matches before working its way up to beating professional Go players by playing millions of games against itself.
Deep learning systems, unlike earlier machine learning designs, rely heavily on matrix multiplications to process data. As such, commercial GPUs are usually the best hardware for these systems, since they deliver the high level of parallel processing the workload requires.
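To see why this workload parallelizes so well, consider a plain matrix multiplication: every cell of the result is an independent dot product, so a GPU can compute thousands of them simultaneously. A minimal pure-Python version (for illustration only; real systems use optimized GPU kernels):

```python
def matmul(a, b):
    # Multiply matrix a (rows x inner) by matrix b (inner x cols).
    # Each result cell is an independent dot product of one row of a
    # with one column of b -- there are no dependencies between cells,
    # which is exactly what makes the operation GPU-friendly.
    rows, inner, cols = len(a), len(b), len(b[0])
    return [[sum(a[i][k] * b[k][j] for k in range(inner))
             for j in range(cols)]
            for i in range(rows)]

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(matmul(a, b))  # [[19, 22], [43, 50]]
```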
Standard AI and Machine Learning
Although artificial intelligence and machine learning can be used interchangeably for many common applications, it is important to note that machine learning has one very distinct characteristic: adaptation. This means it learns. Unlike pre-built AI, it may make a lot of initial mistakes, but it is designed to learn from them, build on them, and eventually excel at whatever task it is designed to optimize.
From a design standpoint, machine learning also has the advantage of not requiring an overcomplicated initial build. A typical AI may need specialized code or specific instructions for every single situation the developer can foresee. A machine learning system, however, can start with a decision tree, a learning criterion or two, and the necessary processing capability, and then work its way up to performing its task better and better.
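The contrast can be illustrated with a toy example. A hand-coded decision tree for a hypothetical file-triage task hard-wires every rule in advance, exactly the kind of per-situation instruction a pre-built AI needs; a machine learning system would instead learn thresholds like these from data. The feature names and cutoffs below are invented for illustration.

```python
def triage(size_mb, days_since_access):
    # A hand-built decision tree: each branch is a rule the developer
    # wrote explicitly, rather than a threshold learned from data.
    if days_since_access > 365:
        if size_mb > 100:
            return "archive"   # big and stale: move to cold storage
        return "review"        # small but stale: flag for a human
    if size_mb > 1000:
        return "compress"      # huge but recent: save space
    return "keep"              # small and recent: leave alone

print(triage(500, 400))   # "archive"
print(triage(2000, 10))   # "compress"
print(triage(10, 5))      # "keep"
```

A machine learning version would start from the same tree shape but tune the cutoffs (365 days, 100 MB, and so on) automatically as it observed outcomes.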
The distinction matters because we still use standard, non-learning AI for many less demanding tasks today. After all, you won’t necessarily need a machine learning system for, say, automating simple file-management decisions. By the same token, it would be misleading to describe something as sophisticated as a speech recognition system as ‘merely AI’, so we classify it properly.
Machine Learning and Deep Learning
Perhaps the more important distinction to learn is the difference between machine learning and deep learning. First, as mentioned earlier, deep learning IS machine learning: technically one type, or subset, of it. Machine learning, however, is not always deep learning. The distinction largely comes down to how each is built.
Machine learning has been developed in the same computing environment as most of our software over the last few decades. As such, it is in a way linear: even if it keeps pace with Moore’s Law, it is still limited by its decision trees and algorithms. Deep learning, on the other hand, meshes all of its algorithms within a neural network. It is designed for high-level parallel computing, and can be considered the next generation of machine learning.
One fairly reliable way to determine whether a deep learning system is being employed is to assess the complexity of the AI task. Usually, the more non-numerical, arbitrary variables there are to consider, the more likely it is a deep learning system. For instance, Netflix recommendations are not as complex as language translation, even though they learn from data pooled from the entire user base. The same distinction can apply to two similar tasks, such as two separate self-driving systems: the one that relies mostly on crunching sensor data is likely the general machine learning system, while the deep learning one relies more on visual environmental cues, similar to what Tesla is currently developing.
Regardless of whether the differentiation is always clear, it is certain that deep learning is the future. For our purposes, however, separating deep learning AI from standard machine learning AI is essential to understanding just how different it truly is, and just how advanced it could become. Even in its early developmental stages, it is already almost incomparable to everything that came before it.
A fake Barack Obama delivering a few short sentences against an equally fake background may seem uncanny to the average viewer, but to those who understand the distinction, it is just one of the vast possibilities of this game-changing technology.
The Vietnam AI Grand Challenge
Want to learn more about artificial intelligence? Kambria is spearheading the Vietnam AI Grand Challenge 2019, a hackathon series whose mission is to train young AI developers. In collaboration with the Vietnamese government, McKinsey & Company, and VietAI, the Grand Challenge will bring together the country’s best AI talent to support corporations in Vietnam and globally in designing the ultimate AI virtual assistant.
How to Participate:
1. Register on the Kambria platform: https://bounty.kambria.io/
2. Follow the Grand Challenge Facebook page for all upcoming event information: https://www.facebook.com/
On Saturday, June 1, 2019, Kambria will host a workshop in Da Nang called “Create Your Own Virtual Assistant From Scratch” to provide training and education to participants of the Vietnam AI Grand Challenge. Click here for more information about the workshop. Space is limited to 40 participants so be sure to sign up soon!
Also published on Medium.