Josias Aurel
Introduction to machine learning - Definition and common terms

Machine Learning is not a fancy new thing. You have probably heard about it several times. This concept has helped develop many software solutions and solve many problems.

What is Machine Learning?

Computer programmers traditionally solve problems by giving the computer instructions on what to do and how to do it. This method is prone to human error and bias. In this traditional approach, programmers give instructions or rules to the computer along with some information as input; the rules process that information and produce some output.

Machine learning, by contrast, takes some information together with the expected output in order to build a model (a set of rules). The computer accomplishes this by using mathematical operations and algorithms to derive rules for solving a problem.
Machine learning is therefore a problem-solving method where the computer learns to come up with the rules for solving a problem.
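
To make the contrast concrete, here is a minimal sketch in Python (assuming scikit-learn is installed; the temperature data is made up for illustration). In traditional programming we write the rule ourselves; in machine learning we hand the computer example inputs and outputs and let an algorithm work the rule out:

```python
# Traditional programming: we write the rule (Celsius to Fahrenheit) ourselves.
def celsius_to_fahrenheit(c):
    return c * 9 / 5 + 32

# Machine learning: we give examples of inputs and expected outputs,
# and an algorithm builds the rule (the model) from them.
from sklearn.linear_model import LinearRegression

celsius = [[0], [10], [20], [30], [40]]   # inputs
fahrenheit = [32, 50, 68, 86, 104]        # expected outputs

model = LinearRegression()
model.fit(celsius, fahrenheit)            # learn the rule from the examples

print(celsius_to_fahrenheit(25))          # 77.0, from our hand-written rule
print(model.predict([[25]]))              # roughly 77, from the learned model
```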

In this blog post, we are going to go through some basic commonly used machine learning terms.

Model

A model, or machine learning model, represents the patterns that have been learned by a machine learning algorithm.

Machine Learning Algorithm

A machine learning algorithm, sometimes known as a predictor, refers to the steps or procedure used to come up with a model.
These algorithms run on some data to find patterns.

Dataset

A dataset is the sample data used to train a machine learning model.
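
For example, a tiny (made-up) dataset pairing hours of study with exam scores might look like this in Python with NumPy:

```python
import numpy as np

# A tiny, made-up dataset: each row pairs an input with its expected output.
features = np.array([[1.0], [2.0], [3.0], [4.0]])  # inputs (hours studied), often called X
labels = np.array([52.0, 57.0, 61.0, 68.0])        # expected outputs (exam scores), often called y
```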

Weights

A weight, in the context of machine learning, refers to an internal variable of some equation that is fine-tuned during the training of a model to solve the problem statement.
Machine learning is based on mathematics, and under the hood some equations are modified (fine-tuned) to fit the particular situation.

Bias

Biases are constants added to the weighted inputs. They help handle situations where the patterns are non-linear.
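
As a small illustration (the numbers are made up), a prediction from a single weighted input plus a bias looks like this; the weight and the bias are exactly the internal values that training fine-tunes:

```python
# prediction = weight * input + bias
weight = 1.8   # fine-tuned during training
bias = 32.0    # the constant added to the weighted input

def predict(x):
    return weight * x + bias

print(predict(20))  # 68.0
```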

Training

The training of a machine learning model is a process whereby the computer continually loops over some sample data for a period of time, using machine learning algorithms to find patterns and build a model.
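
Here is a minimal sketch of such a loop, assuming plain gradient descent on a toy linear problem with NumPy (the data, learning rate, and number of loops are made-up example values):

```python
import numpy as np

# Toy dataset: the model should learn something close to y = 2x + 1.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 3.0, 5.0, 7.0, 9.0])

weight, bias = 0.0, 0.0   # the internal values the training loop fine-tunes
learning_rate = 0.01

for epoch in range(2000):                 # loop over the sample data many times
    predictions = weight * x + bias       # current guesses of the model
    error = predictions - y
    # Nudge weight and bias in the direction that reduces the error.
    weight -= learning_rate * 2 * (error * x).mean()
    bias -= learning_rate * 2 * error.mean()

print(weight, bias)  # close to 2 and 1
```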

Cost function

A cost function, sometimes called a loss function, is a function used to calculate the error rate or error margin between the prediction of a model and the real value. An example of such a function is mean squared error.
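
Mean squared error can be written in a few lines (a sketch, with made-up predictions and real values):

```python
import numpy as np

def mean_squared_error(predictions, targets):
    # Average of the squared differences between predicted and real values.
    return ((predictions - targets) ** 2).mean()

predictions = np.array([2.5, 0.0, 2.0])
real_values = np.array([3.0, -0.5, 2.0])
print(mean_squared_error(predictions, real_values))  # about 0.17
```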

Loss

Loss is the difference between the value a model has predicted and the actual value in the training data of the model.
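
For a single training example (hypothetical numbers), one simple way to measure that difference is:

```python
predicted = 68.0   # what the model said
actual = 71.0      # the real value in the training data
loss = abs(predicted - actual)
print(loss)        # 3.0 — the model is off by 3 for this example
```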

Optimizer

An optimizer is a function that modifies the internal values of a model (weights and biases) so that its predictions get closer to the correct output.
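
The simplest optimizer, gradient descent, does exactly the update used in the training-loop sketch above: move each internal value a small step in the direction that lowers the loss. A minimal sketch (the learning rate and gradient are assumed example values):

```python
learning_rate = 0.01

def optimize(value, gradient):
    # Move the weight or bias a small step opposite to the gradient of the loss.
    return value - learning_rate * gradient

weight = 0.5
weight = optimize(weight, gradient=2.0)  # nudged down, because the gradient is positive
print(weight)  # 0.48
```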

Neural network

When a problem is too complex to solve, basic machine learning algorithms will not be good enough. In this case, there are special machine learning algorithms that can model complex patterns in data.
Neural networks are therefore special machine learning algorithms that find complex patterns in data by passing it through many small functions with coefficients that have to be learned.
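
A minimal sketch of that idea with NumPy, using random, untrained weights just to show the chain of small functions (a real network would learn these values during training):

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(3, 2)), np.zeros(3)  # layer 1: 2 inputs -> 3 neurons
W2, b2 = rng.normal(size=(1, 3)), np.zeros(1)  # layer 2: 3 neurons -> 1 output

def forward(x):
    hidden = relu(W1 @ x + b1)   # first layer of small functions
    return W2 @ hidden + b2      # second layer combines their outputs

print(forward(np.array([0.5, -1.0])))
```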

Neuron

These are the units in a neural network's layers. Each one takes some weighted inputs and applies an activation function to produce a single output.
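
A single neuron can be sketched like this (using the sigmoid activation function and made-up weights):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def neuron(inputs, weights, bias):
    # Weighted sum of the inputs plus a bias, passed through an
    # activation function to produce a single output.
    return sigmoid(np.dot(inputs, weights) + bias)

print(neuron(np.array([0.5, 1.0]), np.array([0.8, -0.2]), bias=0.1))  # about 0.57
```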

Activation Function

An activation function, also known as a *transfer function*, is a function that determines the output of a neuron in a neural network. Its output usually falls within a range such as -1 to 1.
Activation functions can be linear or non-linear.
When they are linear, their output ranges between -infinity and infinity.
When they are non-linear, their output is confined to values within a certain range.

Examples of activation functions include ReLU and Sigmoid.

Activation functions determine what the input to the next neuron will look like.
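
The two examples mentioned above, ReLU and Sigmoid, can be written directly in NumPy (a sketch; ReLU's output ranges from 0 upwards, while Sigmoid squashes everything into 0 to 1):

```python
import numpy as np

def relu(z):
    # 0 for negative inputs, the input itself otherwise.
    return np.maximum(0.0, z)

def sigmoid(z):
    # Squashes any input into the range 0 to 1.
    return 1.0 / (1.0 + np.exp(-z))

values = np.array([-2.0, 0.0, 3.0])
print(relu(values))     # [0. 0. 3.]
print(sigmoid(values))  # [0.12 0.5  0.95]
```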

You have reached the end of this article.
Make sure to share it.
Suggestions are welcome 👏
