Hugo Estrada S.

Start your Journey with TensorFlow


First things first, you can find the notebook with all the content of this lecture in my GitHub repo:

https://github.com/hugoestradas/DataScience-101


Let's find out how to start using TensorFlow from the very beginning.

For this lecture, use any TensorFlow 2.x version.
I'm going to use the Azure Databricks platform, but feel free to use your preferred notebook solution (Jupyter, Kaggle, Colab, etc.).

Part i: What on earth is a Tensor?

According to Wikipedia, a tensor "is an algebraic object that describes a (multilinear) relationship between sets of algebraic objects related to a vector space."

Using the previous definition, we can say for our needs that tensors are TensorFlow's multi-dimensional arrays with uniform type. They're very similar to NumPy arrays and they're immutable, meaning once they're created they cannot be modified or altered. You can only create a new copy with the edits.

Let's go to our environment to see how Tensors behave using a simple coding example.
Before actually making our first tensor, we need to (of course) import the TensorFlow library, and for now I'm importing NumPy as well.
Quick note here: in my Azure Databricks cluster, I need to manually install TensorFlow before importing it:
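
The original screenshot isn't reproduced here, but installing from a notebook cell typically looks like this (newer Databricks runtimes support the "%pip" magic; in Jupyter or Colab, "!pip" works the same way):

```python
# Install TensorFlow into the notebook environment before importing it.
%pip install tensorflow
```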

I checked, and the library was installed without problems.


Now, opening a fresh notebook, I should be able to import TensorFlow without any issues:

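The screenshot isn't shown here; the imports are the usual ones (the aliases below are the conventional choices, not necessarily the exact cell from the original post):

```python
import tensorflow as tf
import numpy as np

print(tf.__version__)  # should print a 2.x version
print(np.__version__)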

Part ii: Creating Tensors

So far I've installed TensorFlow along with NumPy and imported both libraries into my notebook; time to work with tensor objects.

There are many ways to create a "tf.Tensor" object.

Here are a few examples.

First, you can create a tensor object with several TensorFlow functions.

Let's create a tensor object using "tf.constant":

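The values below are just an example (the original screenshot isn't reproduced); the point is that "tf.constant" takes a (possibly nested) list and returns a tensor:

```python
# A rank-2 tensor of shape (1, 5) built from a nested list.
constant_tensor = tf.constant([[1, 2, 3, 4, 5]])
```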

It is possible to create tensor objects only consisting of "1s" with the "tf.ones" function:

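A minimal sketch, with the shape chosen to match the (1, 5) example above:

```python
# Every element is 1.0; only the shape needs to be passed in.
ones_tensor = tf.ones((1, 5))
```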

Also, you can create tensor objects only consisting of "0s" using the "tf.zeros" function:

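Same idea, assuming the same (1, 5) shape:

```python
# Every element is 0.0.
zeros_tensor = tf.zeros((1, 5))
```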

And finally, there's the "tf.range()" function to create tensor objects:

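A sketch with assumed arguments (the original may have used different start/limit values):

```python
# Produces a rank-1 tensor of shape (5,): [1 2 3 4 5]
range_tensor = tf.range(start=1, limit=6, delta=1)
```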

And here's the output of each method I used:

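With the example tensors sketched above, printing them gives something like this (the exact values in the original output will differ if different numbers were used):

```python
print(constant_tensor)  # tf.Tensor([[1 2 3 4 5]], shape=(1, 5), dtype=int32)
print(ones_tensor)      # tf.Tensor([[1. 1. 1. 1. 1.]], shape=(1, 5), dtype=float32)
print(zeros_tensor)     # tf.Tensor([[0. 0. 0. 0. 0.]], shape=(1, 5), dtype=float32)
print(range_tensor)     # tf.Tensor([1 2 3 4 5], shape=(5,), dtype=int32)
```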

These are the easiest and most common ways of creating tensor objects in TensorFlow.

If you were observant, we created tensor objects with the shape (1, 5) with three different functions, and a fourth tensor object with shape (5,) using the "tf.range()" function.

The "tf.ones" and "tf.zeros" accept the shape as the required argument, since their element avlues are pre-determined.

Part iii: Classifying Tensors

We use "tf.Tensor" in order to create TensorFlow tensor objects, and they have many characteristic features, the main 3 are:

The first one that is important to understand, is that they have a rank based on the number of dimensions they have.

The second mort important characteristic of the tensor objects in TensorFlow, is to know that they do have a shape. Which for our understanding means that is a list that consists of the lengths of all their dimensions. All tensors have a size, which represents the total number of elements within a tensor.

And thirdly, their elements are all recorded in a uniform Dtype (data type).

Let's dig deeper into these features.

Tensors can be categorized based on the number of dimensions they have:

(Diagram: tensors from a 0-D scalar up to a 3-D cube of values.)

0-D (Scalar) Tensor: it's a tensor containing a single value and no axes.

1-D (Rank-1) Tensor: it's a tensor containing a list of values along a single axis.

2-D (Rank-2) Tensor: it's a tensor containing two axes.

3-D and higher (Rank-n) Tensor: it's a tensor containing n axes.

Taking the above image as our source of abstraction for the dimensions (axes) of the tensor objects, we can create a 3-D tensor by passing a three-level nested list object to the "tf.constant" function, splitting the numbers into a 3-level nested list with three elements at each level:

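A sketch of that idea, using the numbers 1 through 27 as placeholder values (the original screenshot may use different ones):

```python
# A three-level nested list with three elements at each level -> shape (3, 3, 3).
tensor_3d = tf.constant([[[ 1,  2,  3], [ 4,  5,  6], [ 7,  8,  9]],
                         [[10, 11, 12], [13, 14, 15], [16, 17, 18]],
                         [[19, 20, 21], [22, 23, 24], [25, 26, 27]]])
print(tensor_3d)
```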

The shape feature is another attribute that every tensor has. It represents the size of each dimension in the form of a list to make it easier to understand.
We can view the shape of the "tensor_3d" object we created with
the ".shape" attribute:

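With the example object above:

```python
print(tensor_3d.shape)   # (3, 3, 3)
```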

Size is another feature tensor objects have, and it represents the total number of elements a tensor has.
It's not possible to measure the size with an attribute of the tensor object; instead, the "tf.size()" function is used, and then it's necessary to convert the output to NumPy with the ".numpy()" instance method to get a human-readable result:

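For the (3, 3, 3) example above:

```python
print(tf.size(tensor_3d).numpy())   # 27 -> 3 * 3 * 3 elements in total
```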

Last, but not least, we have the dtypes, which cover data types such as ints and floats, but may also contain many other data types such as complex numbers and strings. Each tensor object, however, must store all its elements in a single uniform data type. Therefore, we can also view the type of data selected for a particular tensor object with the ".dtype" attribute:

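Since the example above was built from Python ints:

```python
print(tensor_3d.dtype)   # <dtype: 'int32'>
```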

Part iv: Operating with Tensors

Now that we understand the basic properties and features of tensor objects in TensorFlow, we can play a little bit with them.

Let's start with indexing. As you already know if you're reading this, an index represents an item's position in a sequence, and this sequence can refer to many things: a string of characters, a list, a sequence of values, etc.

Luckily, TensorFlow follows Python's standard for indexing, similar to list indexing with NumPy. The rules are:

  1. Indexes start at zero (0).
  2. Negative index values ("-n") mean counting backwards from the end.
  3. Colons (":") are used for slicing: "start:stop:step".
  4. Commas (",") are used to reach deeper levels.

Following the rules above, let's create a 1-D tensor object:

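The values below are placeholders (the original screenshot isn't reproduced); any rank-1 tensor will do for the indexing examples that follow:

```python
tensor_1d = tf.constant([0, 10, 20, 30, 40, 50])
```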

Now I'll apply the 4 rules of indexing:

Rule #1 - Indexes start @ 0:

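Using the hypothetical "tensor_1d" from above:

```python
print(tensor_1d[0].numpy())    # 0 -> the first element
```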

Rule #2 - Negative values mean backward counting:

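Again with the same example tensor:

```python
print(tensor_1d[-1].numpy())   # 50 -> the last element, counted from the end
```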

Rule #3 - Colons slice:

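Slicing the same example tensor:

```python
print(tensor_1d[1:4].numpy())   # [10 20 30] -> start:stop
print(tensor_1d[::2].numpy())   # [ 0 20 40] -> every second element (step)
```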

Rule #4 - Commas reach deeper levels:
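
Commas only make sense once there's more than one axis, so this sketch uses a small rank-2 tensor (the values are assumptions):

```python
tensor_2d = tf.constant([[1, 2, 3],
                         [4, 5, 6]])
print(tensor_2d[1, 2].numpy())   # 6 -> row 1, column 2
```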

Okay, now that I've shown you the basic indexing techniques with tensor objects, let's do some operations with them.

First, let's create two tensor objects to interact with later on:

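Two small 2 x 2 tensors are enough for the examples below (the values are mine, not necessarily the ones in the original screenshots):

```python
a = tf.constant([[1, 2],
                 [3, 4]])
b = tf.constant([[5, 6],
                 [7, 8]])
```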

Let's start by adding one tensor to another; for this we can use the "tf.add()" function and pass the tensors as arguments:

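With the example tensors "a" and "b" defined above:

```python
print(tf.add(a, b).numpy())
# [[ 6  8]
#  [10 12]]
```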

Following up, now I'll use element-wise multiplication; for this I'll use the "tf.multiply()" function and pass the tensors as arguments again:

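Element-wise, each position is multiplied with its counterpart:

```python
print(tf.multiply(a, b).numpy())
# [[ 5 12]
#  [21 32]]
```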

We can even do matrix multiplication with the "tf.matmul()" function... yes! passing the tensors as arguments:

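Note that this is proper matrix multiplication, not element-wise:

```python
print(tf.matmul(a, b).numpy())
# [[19 22]
#  [43 50]]
```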

Let's say we'd like to know the maximum or minimum value within a tensor; for that there are the "tf.reduce_max()" and "tf.reduce_min()" functions:

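Still using the example tensors from above:

```python
print(tf.reduce_max(b).numpy())   # 8
print(tf.reduce_min(a).numpy())   # 1
```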

Similarly, finding the index of the maximum element is possible using the "tf.argmax()" function:

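For instance, on the 1-D example tensor from the indexing section:

```python
print(tf.argmax(tensor_1d).numpy())   # 5 -> the index of the largest value (50)
```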

We can make operations and play with the shapes of the tensors.
If you're familiar with Pandas DataFrames or NumPy Arrays, then you'll understand the concept of "Reshaping a Tensor".

The "tf.reshape" operations are very fast, since the data does not need to be duplicated.
Let's see how it works:

Firstly, create a new tensor object, then a second one, and then a third tensor:
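
The screenshots aren't reproduced here; a plausible version of those steps, assuming the second and third tensors are reshapes of the first (names and values are mine):

```python
original   = tf.constant([[1, 2, 3, 4, 5, 6]])    # shape (1, 6)
reshaped_a = tf.reshape(original, shape=(2, 3))   # shape (2, 3)
reshaped_b = tf.reshape(original, shape=(3, 2))   # shape (3, 2)
```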

Now, if we pass -1 as the "shape" argument of "tf.reshape", we flatten our tensor object:

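Continuing the sketch above:

```python
flattened = tf.reshape(original, shape=[-1])
print(flattened.numpy())   # [1 2 3 4 5 6] -> shape (6,)
```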

Reshaping tensor objects with TensorFlow is ridiculously easy. But it's important to keep in mind that when doing reshape operations you must be reasonable (the new shape has to hold the same total number of elements), otherwise the tensor object might get messed up or TensorFlow can even raise errors.

Part v: Special Types of Tensors

So far we've talked about tensors as rectangular shapes that store only numerical values.
But they are more powerful than that: tensors support irregular or even specialized data within them. These are:

  1. Ragged Tensors
  2. String Tensors
  3. Sparse Tensors


Starting with Ragged Tensors: these are tensors with varying numbers of elements along some axis:

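A minimal sketch (the row contents are placeholders):

```python
# Rows can have different lengths; a regular tf.constant would reject this.
ragged_tensor = tf.ragged.constant([[1, 2, 3],
                                    [4, 5],
                                    [6]])
print(ragged_tensor)
```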

Moving forward, String Tensors are tensors that store string objects within them. They're created like a normal tensor object; the only difference is that you store strings in them:

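A minimal sketch with placeholder strings:

```python
# The dtype becomes tf.string; otherwise it's created like any other tensor.
string_tensor = tf.constant(["Start", "your", "Journey", "with", "TensorFlow"])
print(string_tensor)
```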

And finally, we have the Sparse Tensors, which are rectangular tensors for sparse data.
These are useful when you have holes, null values, or other gaps in your data: instead of storing every cell, a sparse tensor only keeps the non-empty values together with their indices.

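A minimal sketch (indices, values, and shape are all assumptions):

```python
# Only the non-empty cells are stored: their indices, their values, and the overall shape.
sparse_tensor = tf.sparse.SparseTensor(indices=[[0, 0], [2, 3]],
                                       values=[10, 20],
                                       dense_shape=[3, 4])
print(tf.sparse.to_dense(sparse_tensor))   # fills the remaining cells with zeros
```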
