Machine learning is useful only if you have the right data and questions it might be able to answer. Machine learning algorithms find patterns in data; they do not produce useful results by magic.
One of the most common techniques in traditional machine learning is supervised learning. The goal is to examine data, learn the patterns and relationships within it, and predict results for new examples of the same format.
Regression involves drawing a line through a set of data points that most closely fits the overall shape of the data. Regression can be used for applications such as finding trends between marketing initiatives and sales, or estimating rent, as in the next example.
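A minimal sketch of this idea, fitting a straight line with ordinary least squares; the apartment sizes and rents here are made-up illustrative data, not figures from the linked example:

```python
# Hypothetical data: apartment size in square metres vs. monthly rent
sizes = [30, 45, 60, 75]
rents = [600, 900, 1200, 1500]

def fit_line(xs, ys):
    """Ordinary least squares for a single feature: y = slope * x + intercept."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    # Slope is the covariance of x and y divided by the variance of x
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    intercept = mean_y - slope * mean_x
    return slope, intercept

slope, intercept = fit_line(sizes, rents)
predicted_rent = slope * 50 + intercept  # predict rent for an unseen 50 m² flat
```

Once the line is fitted, prediction is just plugging a new size into the equation; with this toy data the fit is exact, while real data would scatter around the line.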
A simple example of supervised learning: https://rhurbans.com/machine-learning-intuition/
Classification aims to predict the category of an example based on its features. For instance, can we determine whether something is a car or a truck based on its number of wheels, weight, and top speed?
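One simple way to sketch this is a nearest-neighbor classifier: label a new vehicle the same as the most similar labeled example. The training data below is hypothetical, and for brevity the features are compared without scaling (so weight dominates the distance); a real classifier would normalize features first.

```python
import math

# Hypothetical labeled examples: (wheels, weight_kg, top_speed_kmh) -> label
training_data = [
    ((4, 1500, 200), "car"),
    ((4, 1200, 180), "car"),
    ((6, 9000, 120), "truck"),
    ((8, 12000, 100), "truck"),
]

def classify(features):
    """Return the label of the single nearest labeled example."""
    nearest = min(training_data,
                  key=lambda ex: math.dist(ex[0], features))  # Euclidean distance
    return nearest[1]

label = classify((4, 1400, 190))  # a light, fast, four-wheeled vehicle
```

Here the unseen vehicle sits much closer to the car examples in feature space, so it gets labeled accordingly.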
Unsupervised learning involves finding underlying patterns in data that may be difficult to spot by inspecting the data manually. It is useful for clustering data points that have similar features and for uncovering which features matter in those relationships.
On an e-commerce site, for example, products might be grouped based on customer purchase behavior. If many customers purchase soap, sponges, and towels together, it is likely that more customers would want that combination of products.
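A common clustering technique for this kind of grouping is k-means, which alternates between assigning points to their nearest centroid and moving each centroid to the mean of its points. The sketch below uses made-up 2-D data (imagine two purchase-behavior features per customer) and fixed starting centroids to keep the run deterministic:

```python
# Hypothetical points: two purchase-behavior features per customer
points = [(1, 1), (1.5, 2), (2, 1), (8, 8), (9, 9), (8.5, 9.5)]

def kmeans(points, centroids, iterations=10):
    for _ in range(iterations):
        # Assignment step: attach each point to its nearest centroid
        clusters = [[] for _ in centroids]
        for p in points:
            best = min(range(len(centroids)),
                       key=lambda i: (p[0] - centroids[i][0]) ** 2
                                     + (p[1] - centroids[i][1]) ** 2)
            clusters[best].append(p)
        # Update step: move each centroid to the mean of its cluster
        centroids = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans(points, centroids=[(0, 0), (10, 10)])
```

After a few iterations the algorithm separates the low-valued customers from the high-valued ones without ever being told what the groups mean; interpreting the clusters is still up to you.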
Reinforcement learning is inspired by behavioral psychology and operates by rewarding or punishing an algorithm based on its actions in an environment. It has similarities to supervised learning and unsupervised learning, as well as many differences.
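The reward-and-punishment loop can be sketched with tabular Q-learning, one of the simplest reinforcement learning algorithms. The environment here is an invented five-state corridor where the agent earns a reward only for reaching the rightmost state; the hyperparameters are arbitrary illustrative choices.

```python
import random

random.seed(0)
N = 5                         # corridor states 0..4; reward for reaching state 4
actions = ["left", "right"]
Q = {(s, a): 0.0 for s in range(N) for a in actions}  # learned action values
alpha, gamma, eps = 0.5, 0.9, 0.2  # learning rate, discount, exploration rate

def step(state, action):
    """Move one cell along the corridor; reward 1 only at the goal."""
    nxt = max(0, state - 1) if action == "left" else min(N - 1, state + 1)
    return nxt, (1.0 if nxt == N - 1 else 0.0)

for episode in range(200):
    s = 0
    while s != N - 1:
        # Epsilon-greedy: mostly exploit the best known action, sometimes explore
        if random.random() < eps:
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda b: Q[(s, b)])
        s2, r = step(s, a)
        # Q-learning update: nudge the estimate toward reward + discounted future value
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in actions) - Q[(s, a)])
        s = s2

# Greedy policy read off the learned values
policy = {s: max(actions, key=lambda b: Q[(s, b)]) for s in range(N - 1)}
```

After training, the greedy policy heads right from every state, which is why it resembles supervised learning (there is a signal to learn from) yet differs (the signal is a delayed reward, not a labeled answer).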
If you're interested in more details about ML algorithms, see Grokking Artificial Intelligence Algorithms from Manning Publications: http://bit.ly/gaia-book, consider following me - @RishalHurbans, or join my mailing list for infrequent knowledge drops: https://rhurbans.com/subscribe.