Basics of artificial intelligence and machine learning

In recent years, the terms artificial intelligence and machine learning have started to pop up frequently in tech news and websites. The two are often used as synonyms, but many experts argue that there are subtle but real differences between them.

And of course, sometimes experts disagree with each other on what those differences are.

In general, however, two things seem clear: first, the term artificial intelligence (AI) is older than the term machine learning (ML), and second, most people consider machine learning to be a subset of artificial intelligence.

Artificial intelligence versus machine learning

Although AI is defined in several ways, the most widely accepted definition is "the field of computing dedicated to solving cognitive problems commonly associated with human intelligence, such as learning, problem solving, and pattern recognition." In essence, it is the idea that machines can possess intelligence.

At the heart of an AI-based system is its model. A model is simply a program that improves its knowledge through a learning process, making observations about its environment. Models that learn from labeled observations fall under supervised learning; models that find structure in unlabeled data fall under unsupervised learning.
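To make the idea concrete, here is a minimal sketch of a supervised model: a program whose single parameter is not hard-coded but learned from labeled observations. The threshold rule and the data are invented for illustration.

```python
# A toy supervised model: it "observes" labeled examples and learns
# one parameter (a decision threshold) instead of having it hard-coded.
def train_threshold(examples):
    """examples: list of (value, label) pairs, where label is 0 or 1.
    Learns the midpoint between the two class means."""
    zeros = [v for v, y in examples if y == 0]
    ones = [v for v, y in examples if y == 1]
    return (sum(zeros) / len(zeros) + sum(ones) / len(ones)) / 2

def predict(threshold, value):
    return 1 if value >= threshold else 0

data = [(1.0, 0), (2.0, 0), (8.0, 1), (9.0, 1)]  # labeled observations
t = train_threshold(data)          # knowledge gained from the data
print(t)                           # → 5.0
print(predict(t, 1.5), predict(t, 8.5))  # → 0 1
```

An unsupervised model, by contrast, would receive only the values without the 0/1 labels and would have to discover the two groups on its own.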

The term “machine learning” also dates back to the middle of the last century. In 1959, Arthur Samuel defined ML as “the ability to learn without being explicitly programmed.” He went on to create a computer checkers program that was one of the first programs able to learn from its own mistakes and improve its performance over time.

Like AI research, ML fell out of vogue for a long time, but it became popular again when the concept of data mining started to take off in the 1990s. Data mining uses algorithms to find patterns in a given set of information. ML does the same thing, but goes one step further: it changes its program’s behavior based on what it learns.

One ML application that has become very popular recently is image recognition. These apps must be trained first – in other words, humans have to look at a bunch of images and tell the system what is in each one. After thousands and thousands of repetitions, the software learns which pixel patterns are commonly associated with horses, dogs, cats, flowers, trees, houses, and so on, and it can make a fairly good guess at the content of new images.
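The training loop described above can be sketched in miniature. In this illustration the “images” are just four-pixel brightness lists, the labels are supplied by humans, and a new image is classified by its nearest labeled neighbor; real systems use far richer features, but the principle is the same.

```python
# Toy image recognition: classify a new "image" (a list of pixel
# brightness values) by the closest human-labeled training example.
def distance(a, b):
    """Squared Euclidean distance between two pixel vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify(training, image):
    """Return the label of the nearest training example."""
    nearest = min(training, key=lambda ex: distance(ex[0], image))
    return nearest[1]

training = [
    ([0.9, 0.8, 0.9, 0.7], "cat"),  # labels provided by humans
    ([0.1, 0.2, 0.1, 0.3], "dog"),
]
print(classify(training, [0.8, 0.9, 0.8, 0.8]))  # → cat
```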

Many web-based businesses also use ML to fuel their recommendation engines. For example, when Facebook decides what to show in your news feed, when Amazon highlights products you might want to buy, and when Netflix suggests movies you might want to watch, all of these recommendations are based on predictions derived from patterns in their existing data.

Frontiers of Artificial Intelligence and Machine Learning: Deep Learning, Neural Networks and Cognitive Computing

Of course, “ML” and “AI” are not the only terms associated with this area of computing. IBM frequently uses the term “cognitive computing,” which is more or less synonymous with AI.

However, some of the other terms have very specific meanings. For example, an artificial neural network (or simply a neural network) is a system designed to process information in a manner inspired by the biological brain. Things can get confusing because neural networks tend to be particularly good at machine learning, so the two terms are sometimes conflated.

In addition, neural networks form the basis of deep learning, which is a special type of machine learning. Deep learning uses machine learning algorithms organized in multiple layers. This is made possible, in part, by systems that use GPUs to process large amounts of data at once.
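The layered idea can be shown in a few lines: each layer transforms its input and passes the result to the next layer. In this sketch the weights are fixed, invented numbers purely for illustration; in a real network they would be learned during training.

```python
import math

# Minimal two-layer neural network forward pass. Each layer computes
# weighted sums of its inputs, adds a bias, and squashes the result
# through a sigmoid before feeding the next layer.
def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def layer(inputs, weights, biases):
    return [sigmoid(sum(w * i for w, i in zip(row, inputs)) + b)
            for row, b in zip(weights, biases)]

x = [1.0, 0.5]                                             # input pixels/features
hidden = layer(x, [[0.4, -0.6], [0.3, 0.8]], [0.1, -0.2])  # layer 1
output = layer(hidden, [[1.2, -0.7]], [0.05])              # layer 2
print(round(output[0], 3))
```

Stacking more such layers (with learned weights) is what the “deep” in deep learning refers to, and GPUs make it practical to run these weighted sums over huge batches of data in parallel.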

If you are confused by all of these different terms, you are not alone. Computer scientists continue to debate their exact definitions and likely will for some time to come. And as companies continue to invest money in research into artificial intelligence and machine learning, it’s likely that a few more terms will crop up to add even more complexity to the issues.