AI Series #2: Into the weeds we go
If you’re new here, welcome! You might benefit from reading the first piece in my series on explaining AI in a simple and accessible way. If you already read the first piece, welcome back for Part Two!
In our first piece of this series we defined what artificial intelligence is, took a brief tour of the technology, and discovered why we’re hearing so much about it in the news lately. In this piece we’ll dive into how our lives have changed technologically in the last thirty years and look at examples of AI at work in your daily life.
The Internet of Things
The Internet of Things (or IoT) describes what is likely the biggest change in technology in the last 15-20 years. I grew up in the 1990s and remember a childhood before the internet. A major change in the mid-to-late 1990s was the arrival of internet access on the personal computers in our homes. As the internet went wireless and got faster, and as smart devices (devices connected to the internet and to one another) became more readily available, we accumulated more and more connected devices in our homes. It was no longer just a computer and a landline connected by a physical cable.
This network of interconnected devices that can send data to and receive data from one another is referred to as the Internet of Things, or IoT. Some examples of interconnected devices that might be in your home or daily life include:
Smart speakers
Wearable devices (Apple Watch, for example)
Internet-enabled security cameras or doorbells
Smart thermostats
Bluetooth-enabled devices, such as smart outlets, appliances, etc.
The number of connected devices is increasing rapidly. In 2018 there were 7 billion connected devices, but by 2020, there were 31 billion. That’s more than a 4x increase in two years.
Machine learning, pattern recognition, and algorithms (Oh my!)
In the first piece of this series, we covered some examples of AI you've likely already encountered personally. A few from that list include:
Netflix offers you suggestions for movies or shows to watch based on your viewing habits.
Your smartphone unlocks with facial recognition or your fingerprint in lieu of a passcode.
Your email account flags certain messages as spam.
Your smart home thermostat adjusts heating and cooling based on local weather trends.
Your smartphone or word processor auto-corrects spelling errors.
Most of these examples involve pattern recognition or predictive analytics, both of which are forms of machine learning.
Let's take a quick pause and define those three terms:
Pattern recognition: These are instances of a computer system analyzing huge amounts of data, identifying patterns, and learning from those patterns in order to make predictions.
Predictive analytics: This type of analytics is able to forecast what will happen in the future based on historical data and trends.
Machine learning: This refers to developing algorithms that help machines to learn and change in response to new data, without the help of a human being.
An example of pattern recognition:
Your email provider has recognized that many email messages containing the phrase "to repay your kindness, I will send 1,000,000 USD to your account" (a classic "Nigerian prince" email scam) were flagged as spam. Based on that pattern, it predicts that future emails arriving in your inbox with that phrase are just as likely to be spam.
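To make this concrete, here is a toy sketch of phrase-based spam flagging. This is just an illustration of the idea, not how any real email provider's filter actually works (real filters weigh many signals, not one phrase):

```python
# Phrases that past data showed were almost always spam.
# (Illustrative list only - not a real provider's rules.)
SPAM_PHRASES = [
    "to repay your kindness, i will send 1,000,000 usd to your account",
]

def looks_like_spam(email_body: str) -> bool:
    """Return True if the email contains a known spam phrase."""
    body = email_body.lower()
    return any(phrase in body for phrase in SPAM_PHRASES)

print(looks_like_spam(
    "Dear friend, to repay your kindness, I will send "
    "1,000,000 USD to your account."
))  # True
print(looks_like_spam("See you at lunch tomorrow?"))  # False
```

Once a pattern like this is learned from enough flagged messages, every new email can be checked against it automatically.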
Another example of pattern recognition:
Netflix knows that many individuals who watched Love is Blind also watched Love Island. Therefore, when you watch one of those shows on your account, Netflix will recommend the other to you.
An example of machine learning:
These tools self-learn as new data comes in. Suppose a romance reality television show similar to Love is Blind, called The Ultimatum, is added to Netflix. No human being has to go into the back end of Netflix's system and manually program every viewer of Love is Blind to receive a suggestion for The Ultimatum - the tool recognizes the pattern between the two shows autonomously and decides which users should see the recommendation.
“Okay, I get the Netflix thing, but how does the streaming platform know that those two shows are similar?”
Let's take it back to the 1990s, all right? Remember Blockbuster? The movies weren't just scattered randomly around the store - they were sorted by genre, recent releases, and so on. It's the same idea as going to Barnes & Noble or the public library today: new releases might be grouped on one table, romance in another section, and cookbooks in another. When the content is online and digital rather than sitting on a table or shelf in a brick-and-mortar store, it's sorted by labels or tags instead.
A romance reality television show might have tags or labels like "reality television," "dating show," "millennials," "generation Z," and "trending now." For data to be as useful as possible for AI to analyze, it should be labeled or tagged.
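Here is a toy sketch of how shared tags could drive a recommendation. The show titles and tags below are invented for illustration - this is the general idea of tag overlap, not Netflix's actual data or algorithm:

```python
# Invented example tags - not real catalog data.
SHOW_TAGS = {
    "Love is Blind": {"reality television", "dating show", "millennials"},
    "Love Island":   {"reality television", "dating show", "trending now"},
    "Planet Earth":  {"documentary", "nature"},
}

def similar_shows(title: str) -> list[str]:
    """Rank other shows by how many tags they share with `title`."""
    tags = SHOW_TAGS[title]
    overlaps = {
        other: len(tags & other_tags)  # count of shared tags
        for other, other_tags in SHOW_TAGS.items()
        if other != title
    }
    # Most shared tags first; drop shows with nothing in common.
    ranked = sorted(overlaps.items(), key=lambda kv: -kv[1])
    return [show for show, count in ranked if count > 0]

print(similar_shows("Love is Blind"))  # ['Love Island']
```

Because the comparison runs over tags rather than hand-picked pairs, a newly added show with overlapping tags would start appearing in recommendations with no manual programming - which is the self-learning behavior described above.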
We're going to talk about the generative AI tool ChatGPT in our next piece. Understanding this concept of tagging or labeling datasets will help you understand how that tool works.
“That makes sense. But how does Microsoft Word detect spelling errors or know to auto-correct my text?”
In that example, Microsoft Word is using an algorithm. Algorithms are a set of rules that machines can follow to complete a task. Let’s break down the algorithm behind spell-check in Microsoft Word:
The program scans all of the text you’ve entered (every single character and word).
The program then compares all of your words and characters against every known word in the dictionary.
The program then separates out the non-words - the entries that did not appear in the dictionary - and runs them through an algorithm to detect the most likely correction in the case of a typographical error. For example:
Perhaps you added one extra letter inadvertently with an errant keystroke.
Perhaps you eliminated a letter inadvertently.
Perhaps you transposed two letters accidentally.
Perhaps you got close to the word you were aiming for but misspelled it and were off by a character or two.
The algorithm then recommends (or perhaps even autocorrects to) the correct spelling. It's not ALWAYS correct - sometimes you have a uniquely spelled word such as a proper noun (the name of something) - but it often catches the most common errors.
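The steps above can be sketched in a few lines of code. This simplified version uses a tiny eight-word dictionary and Python's built-in difflib module for the fuzzy matching - it is not Microsoft Word's actual algorithm, just the same basic recipe:

```python
import difflib

# A tiny stand-in for a real dictionary of known words.
DICTIONARY = {"the", "quick", "brown", "fox", "jumps", "over", "lazy", "dog"}

def suggest_corrections(text: str) -> dict[str, list[str]]:
    """Map each non-word in `text` to its closest dictionary match."""
    suggestions = {}
    for word in text.lower().split():      # step 1: scan every word
        if word not in DICTIONARY:          # steps 2-3: find non-words
            # step 3 continued: find the most likely correction,
            # which handles added, dropped, and transposed letters
            suggestions[word] = difflib.get_close_matches(
                word, DICTIONARY, n=1
            )
    return suggestions

print(suggest_corrections("the quikc brown fxo"))
```

Running this flags "quikc" (transposed letters) and "fxo" (transposed letters) while leaving the correctly spelled words alone, then suggests "quick" and "fox" as the likely intended words.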
Algorithms are one example of how AI can perform useful functions.
Takeaways:
This is our second piece in a series about AI. In the last 20 years, we have seen more devices developed that are internet-enabled and connected. The Internet of Things (IoT) is common in households across the U.S., and we encounter AI on a daily basis through our connected devices. AI uses algorithms, pattern recognition, and machine learning to perform useful functions.