Victor Isaac Oshimua

Introduction To Tensors: The Building Blocks Of Artificial Intelligence

Ever wondered how artificial intelligence (AI) algorithms process different kinds of unstructured data? What actually happens when you feed an algorithm audio, images, or text? Well, it isn't rocket science: the algorithm simply processes that data as tensors.

If you have done some college math or physics, you may already be familiar with tensors, but that is not a requirement for following this article. That is what this article is for: here you will learn what tensors are and how they are used to build AI systems.

Prerequisites

To get the most out of this article, you should:

  • Have knowledge of coding with the Python programming language.
  • Be familiar with the PyTorch framework for deep learning. However, this is not a strict requirement, as the PyTorch code in this article will be thoroughly explained, and a basic understanding of Python should suffice.
  • Not be intimidated by the math behind AI algorithms (no mathematical fears here!).

Now let's get to it!

What are tensors?

Just like arrays and matrices in NumPy, tensors are a fundamental data structure in artificial intelligence, providing a means of storing both the input and output data of a model.

Tensor meme

From a technical standpoint, tensors can vary in dimensionality depending on the data they represent. For instance, a single scalar value would be considered a zero-dimensional tensor, while an array of values would be represented as a one-dimensional tensor. More complex data, such as images or video frames, are stored as higher-dimensional tensors, enabling efficient handling of large datasets.

Tensors
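
To make this concrete, here is a minimal sketch (using PyTorch, which is introduced later in this article) showing how a scalar, a vector, a matrix, and an image map to tensors of increasing dimensionality. You can check the number of dimensions with a tensor's ndim attribute:

import torch

# A single value (scalar) is a zero-dimensional tensor
scalar = torch.tensor(7)
print(scalar.ndim)  # 0

# An array of values (vector) is a one-dimensional tensor
vector = torch.tensor([1, 2, 3])
print(vector.ndim)  # 1

# A grid of values (matrix) is a two-dimensional tensor
matrix = torch.tensor([[1, 2], [3, 4]])
print(matrix.ndim)  # 2

# An RGB image could be stored as a three-dimensional tensor
# (colour channels x height x width -- the sizes here are illustrative)
image = torch.rand(3, 224, 224)
print(image.ndim)   # 3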

Tensors represent a structured format for organising information essential for neural network operations. Whether it's representing input data, weights, or biases within a neural network, tensors provide a versatile framework for handling various forms of data.

Anatomy of neural networks

Understanding the anatomy of neural networks is crucial for grasping the practical application of tensors in AI. Neural networks are AI algorithms that take data as input, perform computational tasks, and return an output. This input might consist of images of cats and dogs or text messages. The tasks could include classifying the images as dogs or cats or categorising the text messages as spam or not spam.

Layers of a Neural Network

  1. Input Layer: The input layer receives data, typically represented as tensors, and passes it to the subsequent layers for processing.

  2. Hidden Layers: Hidden layers, positioned between the input and output layers, perform complex transformations on the input data through weighted connections and activation functions.

  3. Output Layer: The output layer produces the final prediction or classification based on the processed input data.

Neural network

Role of Tensors in Neural Networks

Before a computer can make sense of input data, the data has to be represented as numbers; in other words, numerical encoding is essential. Tensors are the means of encoding this data: they store the inputs, weights, biases, and intermediate computations within a neural network. As the network processes data through its layers, tensors facilitate efficient manipulation and propagation of information, enabling the network to learn and make predictions.

Anatomy of neural networks
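
To tie this together, here is a minimal sketch of a tiny network in PyTorch (the layer sizes are chosen purely for illustration). The input, the weights and biases of each layer, and the output are all tensors:

import torch
from torch import nn

# A tiny network: input layer -> hidden layer -> output layer
model = nn.Sequential(
    nn.Linear(in_features=3, out_features=4),  # input -> hidden
    nn.ReLU(),                                 # activation function
    nn.Linear(in_features=4, out_features=2),  # hidden -> output
)

# The input is a tensor: one sample with three features
x = torch.rand(1, 3)

# The learnable weights and biases are tensors too
print(model[0].weight.shape)  # torch.Size([4, 3])
print(model[0].bias.shape)    # torch.Size([4])

# Passing the input through the layers produces an output tensor
output = model(x)
print(output.shape)           # torch.Size([1, 2])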

Working with tensors

Welcome to the coding section of the article! Now that you understand what tensors are and the role they play in AI, let's work with them.

Understanding tensor operations is essential for working with deep learning frameworks like PyTorch or TensorFlow. Tensor operations are fundamental mathematical manipulations performed on multi-dimensional arrays. Mastery of these operations enables efficient implementation of neural networks, data preprocessing, and model training.

Proficiency in tensor operations is crucial for debugging, optimising performance, and grasping the mathematical principles behind machine learning algorithms.

Matrix meme
We will use PyTorch to implement tensor operations. In case you're not familiar, PyTorch is a deep learning framework that seamlessly transitions AI from research to deployment. It is straightforward to understand, especially if you have a basic understanding of Python.

For this tutorial, I'll be using Google Colab. The reason is that Google Colab comes with PyTorch and the other necessary frameworks pre-installed, saving us the hassle of installing them ourselves. If you prefer to use PyTorch on your local machine, you can refer to this guide from the PyTorch documentation.

Creating tensors

Creating tensors in PyTorch is similar to handling arrays in NumPy. However, in PyTorch, we use the torch.tensor() function.



import torch

# Create a tensor from a Python list
tensor1 = torch.tensor([[1, 2, 3], [4, 5, 6]])

# Create a tensor of zeros with shape (2, 3)
zeros_tensor = torch.zeros(2, 3)

# Create a tensor of random values with shape (2, 3)
random_tensor = torch.rand(2, 3)

# Print tensors
print("Tensor 1:", tensor1)

print("Zeros Tensor:", zeros_tensor)

print("Random Tensor:", random_tensor)




Result:

Creating tensors
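
As a small aside, because the tensor API mirrors NumPy so closely, you can also move between NumPy arrays and PyTorch tensors. A minimal sketch (note that torch.from_numpy shares memory with the original array on the CPU):

import numpy as np
import torch

# Create a tensor from an existing NumPy array
array = np.array([[1, 2, 3], [4, 5, 6]])
tensor_from_array = torch.from_numpy(array)

# Convert a tensor back to a NumPy array
back_to_array = tensor_from_array.numpy()

print("Tensor from array:", tensor_from_array)
print("Back to array:", back_to_array)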

Basic operations with tensors

With PyTorch, you can perform various arithmetic operations on tensors. Here are some basic ones:



# Element-wise addition
tensor2 = torch.tensor([[7, 8, 9], [10, 11, 12]])
sum_tensor = tensor1 + tensor2

# Element-wise multiplication by a scalar
mul_tensor = tensor1 * 2

print("Sum Tensor:", sum_tensor)
print("Multiplication Tensor:", mul_tensor)



Result:

Tensor operations

Reshaping tensors

You can change the shape of a tensor, as long as the new shape holds the same total number of elements. In this example, I will reshape the original tensor, which has two rows and three columns, into a tensor with three rows and two columns.



# Reshape tensor1 to shape (3, 2)
reshaped_tensor = tensor1.view(3, 2)

print("Original Tensor:", tensor1)
print("Reshaped Tensor:", reshaped_tensor)




Result:

Reshaping tensor

Matrix multiplication

You can perform matrix multiplication using the torch.matmul() function in PyTorch. This function takes two tensors as input and returns their matrix product; the inner dimensions must match, so multiplying the (3, 2) tensor below by a (2, 2) tensor yields a (3, 2) result.



# Matrix multiplication
tensor3 = torch.tensor([[1, 2], [3, 4], [5, 6]])
tensor4 = torch.tensor([[7, 8], [9, 10]])

# Perform matrix multiplication
matmul_result = torch.matmul(tensor3, tensor4)

print("Matrix Multiplication Result:")
print(matmul_result)



Result:

Matrix multiplication

Indexing and Slicing

Similar to NumPy arrays and Python lists, tensors in PyTorch support indexing and slicing operations. This means you can access specific elements or subsets of elements within a tensor. Let me show you how:



# Indexing
element = tensor3[0, 1]  # Access element at row 0, column 1

# Slicing
slice_tensor = tensor3[:, 1]  # Slice all rows, only the second column

print("Indexed Element:", element)
print("Sliced Tensor:")
print(slice_tensor)




Result:

Indexing

These fundamental operations lay the groundwork for more advanced tensor manipulations as you delve into various AI algorithms. Understanding these basics is crucial for effectively working with tensors and building upon them to tackle more complex tasks in artificial intelligence.

Closing remarks

Tensors are at the core of AI models, serving as containers for the information that models need in order to learn and make decisions. Whether it's recognising images, understanding language, or predicting trends, tensors are indispensable in every aspect of AI.

By mastering tensor operations, you've taken the first step towards becoming proficient in AI. But remember, there's always more to explore and learn in this field. So keep experimenting, keep building, and keep pushing the boundaries of what AI can do. Who knows? The next breakthrough could be just around the corner!

So, as you explore AI, stay curious, stay creative, and above all, enjoy the process. Happy coding!

Top comments (4)

GitLMB12

I liked the topic very much; it provides a lot of benefit. But I have a question: how many hours a day should I spend to become an artificial intelligence engineer?

Victor Isaac Oshimua

Hi, what matters is consistency. You have to show up every day to learn new things.

Dhirajkumar Saindane

very clear and concise with the topic.

Victor Isaac Oshimua

Thanks Dhirajkumar.
