An article posted on MachineLearningTimes.com discusses 4 common fallacies, or myths, regarding artificial intelligence (AI). These misconceptions lead to considerable misunderstanding and fear* regarding AI.
Wikipedia defines AI as “intelligence demonstrated by machines, unlike the natural intelligence displayed by humans and animals, which involves consciousness and emotionality.”
I like Investopedia’s definition better*: “the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions.”
In the post, Melanie Mitchell, Davis Professor of Complexity at the Santa Fe Institute and author of Artificial Intelligence: A Guide for Thinking Humans, lists the 4 most common fallacies, which I would summarize as follows:
- Narrow intelligence (being really good at one task) leads to general intelligence (being good at many things, the way humans are). In other words, computers will become super-smart and take over the world.
- Easy tasks are hard to automate, and hard tasks are easy to automate.
- AI works like the human mind. This comes from using “human-y” terms like learn, understand, read, and think, which leads some to believe AI can achieve humanness.
- Intelligence is all in the AI brain. In other words, “the right algorithms and data…can create AI that lives in servers and matches human intelligence.”
The link to the article is below. I used the link to PredictiveAnalytics/MachineLearningTimes (MLT), which then links to the original post elsewhere. I started with the MLT link because 1) that’s where I saw the post, and 2) one of the brains behind MLT is Eric Siegel, an AI/machine learning guru. I have spent at least 50 hours reading his articles and watching his tech videos, and I would highly recommend him if you’re interested in AI and machine learning.
While this is not an easy read, it’s not complicated either. It just takes a little more thought, so slow down and think it through. It will be worth the 5-8 minutes.
Here’s the link.
*I will explore my thinking on this in a later post.