## DEV Community

Terra

Posted on • Originally published at pourterra.com

# Mathematics for Machine Learning - Day 11

So, it turns out the answer I searched for was correct... it was just on the next page. Well, lesson learned: from now on I'll finish a section when possible instead of using a time-measured learning system.

## The answer to my previous post (Day 10)

### let's say I have these vectors

$x_1 = \begin{pmatrix} 1 \\ 2 \\ -3 \\ 4 \end{pmatrix}; x_2 = \begin{pmatrix} 1 \\ 1 \\ 0 \\ 2 \end{pmatrix}; x_3 = \begin{pmatrix} -1 \\ -2 \\ 1 \\ 1 \end{pmatrix}$

### To know if they're linearly dependent:

$\lambda_1 x_1 + \lambda_2 x_2 + \lambda_3 x_3 = 0$

#### Equals:

$\lambda_1 \begin{pmatrix} 1 \\ 2 \\ -3 \\ 4 \end{pmatrix} + \lambda_2 \begin{pmatrix} 1 \\ 1 \\ 0 \\ 2 \end{pmatrix} + \lambda_3 \begin{pmatrix} -1 \\ -2 \\ 1 \\ 1 \end{pmatrix} = 0$

#### Sooo

If we reduce the matrix of column vectors to row echelon form, it'll be:

$\left[\begin{array}{ccc} 1 & 1 & -1 \\ 2 & 1 & -2 \\ -3 & 0 & 1 \\ 4 & 2 & 1 \end{array}\right] \to \dots \to \left[\begin{array}{ccc} 1 & 1 & -1 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \\ 0 & 0 & 0 \end{array}\right]$

And since every column is a pivot column, the only solution is the trivial one, which means:

$\lambda_1 = \lambda_2 = \lambda_3 = 0$

and these vectors are linearly independent from one another.
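If you want to double-check a computation like this numerically, counting pivot columns is the same as computing the matrix rank. A small sketch using NumPy (my own addition, not from the book):

```python
import numpy as np

# Columns are x1, x2, x3 from the example above
A = np.array([
    [ 1,  1, -1],
    [ 2,  1, -2],
    [-3,  0,  1],
    [ 4,  2,  1],
])

# The columns are linearly independent iff the rank equals the number
# of columns, i.e. every column becomes a pivot column in echelon form
rank = np.linalg.matrix_rank(A)
independent = rank == A.shape[1]
print(rank, independent)  # 3 True: only the trivial solution exists
```

The same check works for any set of column vectors: rank less than the number of columns means at least one column is a linear combination of the others.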

## Generating Set and Basis

Finally! For a long time I didn't know what a basis was, and it's going to be discussed today!

### Consider the vector space:

$V = (\nu, +, \bullet)$

#### and a set of vectors

$\mathscr{A} = \{x_1, \dots, x_k\} \subseteq \nu$

If every vector of the vector space, meaning every

$x \in \nu$

can be expressed as a linear combination of the vectors in $\mathscr{A}$ (yes, that absolutely horrendous letter is an A, but using \mathscr), then $\mathscr{A}$ is a generating set of $V$.

The set of all linear combinations of the vectors in $\mathscr{A}$ is called the span of $\mathscr{A}$.

$\text{If } \mathscr{A} \text{ spans the vector space } V, \\ \text{then } V = \text{span} [\mathscr{A}] = \text{span} [x_1, \dots, x_k]$
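To make the span concrete, here's a small sketch of my own (the vectors are made up for illustration, this isn't from the book): a vector belongs to the span of a set exactly when some linear combination of the set reproduces it, which we can test with least squares.

```python
import numpy as np

# A hypothetical generating set of R^2, stored as columns: (1,0), (0,1), (1,1)
gens = np.array([
    [1.0, 0.0, 1.0],
    [0.0, 1.0, 1.0],
])

# x is in span[gens] iff some coefficients lam satisfy gens @ lam = x
x = np.array([3.0, -2.0])
lam, *_ = np.linalg.lstsq(gens, x, rcond=None)

# If the reconstruction matches x exactly, x is a linear combination of gens
in_span = np.allclose(gens @ lam, x)
print(in_span)  # True: x lies in the span of the generating set
```

Here the set has three vectors but the space is only two-dimensional, so it's a generating set without being minimal.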

#### Consider a vector space:

$V = (\nu, +, \bullet) \text{ and } \mathscr{A} \subseteq \nu$

If there's no smaller set that still spans $V$, then the generating set is called minimal, and a linearly independent minimal generating set is called a basis.

$\text{There exists no } \tilde{\mathscr{A}} \subsetneq \mathscr{A} \subseteq \nu \text{ that spans } V \\ \text{Then } \mathscr{A} \text{ is minimal, i.e. a basis}$
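A small NumPy sketch of the idea (my own toy example, not the book's): a linearly dependent set can still span the space, and dropping the redundant vector leaves a minimal generating set, i.e. a basis.

```python
import numpy as np

# Columns (1,0), (0,1), (1,1): a generating set of R^2 that is linearly dependent
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

spans_r2 = np.linalg.matrix_rank(A) == 2               # True: it reaches all of R^2
independent = np.linalg.matrix_rank(A) == A.shape[1]   # False: 3 vectors, rank only 2

# Dropping the redundant third column leaves a minimal generating set: a basis
B = A[:, :2]
is_basis = np.linalg.matrix_rank(B) == B.shape[1] == 2  # independent AND still spans
print(spans_r2, independent, is_basis)  # True False True
```

Removing yet another column would drop the rank below 2, so the two-vector set really is minimal.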

### One more example

#### Let

$V = (\nu, +, \bullet)$

#### and

$\mathscr{B} \subseteq \nu, \mathscr{B} \not = \emptyset$

#### Then,

1. B is a basis of V
2. B is a minimal generating set
3. B is a maximal linearly independent set of vectors in V, i.e. adding any other vectors to this set will make it linearly dependent
4. Every vector
$x \in V \text{ is a linear combination of vectors from } \mathscr{B}$

and every linear combination is unique, i.e. with:

$x = \sum_{i = 1}^k \lambda_i b_i = \sum_{i = 1}^k \psi_i b_i$

with

$\lambda_i, \psi_i \in \mathbb{R}, b_i \in \mathscr{B} \\ \text{it follows that } \lambda_i = \psi_i,\ i = 1, \dots, k$
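That uniqueness can be sketched numerically (my own example, assuming NumPy): with the basis vectors as the columns of an invertible matrix, the coordinates of any vector are the one and only solution of a linear system.

```python
import numpy as np

# A hypothetical basis of R^2, stored as the columns of an invertible matrix
B = np.array([[1.0, 1.0],
              [0.0, 2.0]])
x = np.array([3.0, 4.0])

# Since the columns form a basis, B @ lam = x has exactly one solution:
# the coordinates (lambda_1, lambda_2) of x with respect to that basis
lam = np.linalg.solve(B, x)
print(lam)  # [1. 2.]

# The solution really reconstructs x, and no other coefficient vector can
assert np.allclose(B @ lam, x)
```

If the columns were linearly dependent, `np.linalg.solve` would raise an error instead: without independence there is no unique coordinate vector.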

## Thoughts

Honestly, today's section is great to read. It focuses on answering the previous chapter's questions and confusion, and I finally learned what the hell a basis is. It almost feels like eating dessert after a buffet: not too difficult, but enough, since it fills a few gaps from the previous sections.

### A little TL;DR:

1. A generating set is a subset of the vector space whose linear combinations can recreate the whole vector space.
2. A generating set can be linearly dependent: it doesn't have to be the smallest possible chunk of the vector space and may contain redundant vectors, but it can still recreate the space.
3. A basis, on the other hand, is a linearly independent generating set: it's a minimal generating set for the vector space, so removing even a single vector from it removes the ability to recreate a portion of the vector space.

## Acknowledgement

I can't overstate this: I'm truly grateful that this book was open-sourced for everyone. Many people will be able to learn and understand machine learning on a fundamental level. Whether changing careers, demystifying AI, or just learning in general, this book offers immense value, even for a fledgling learner such as myself. So, Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong, thank you for this book.

Source:
Deisenroth, M. P., Faisal, A. A., & Ong, C. S. (2020). Mathematics for Machine Learning. Cambridge: Cambridge University Press.
https://mml-book.com