What an absolute clown. When it's just separate examples, everything works fine, but once they're combined I'm suddenly running around confused like a headless chicken. I need to stop finding shortcuts only to re-learn everything the right way later.

Well, another lesson learned. No use dwelling on yesterday's idiocy.

## Basis Change.

It combines a few sections we've discussed before into one. So far, the topics I know to be directly related are:

- Transformation Matrix
- Linear mapping

Which combined will become... Transformation mapping. I know, so original.

### So what is it?

It's about how the transformation matrix of a linear mapping changes when the basis is changed.

So, to start with:

#### Consider two ordered bases of V and W.
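The actual definitions seem to have been lost in formatting; reconstructing from the book's conventions (so treat this as my best guess), the ordered bases are:

```latex
B = (\mathbf{b}_1, \dots, \mathbf{b}_n), \quad
\tilde{B} = (\tilde{\mathbf{b}}_1, \dots, \tilde{\mathbf{b}}_n)
\quad \text{ordered bases of } V
\\[4pt]
C = (\mathbf{c}_1, \dots, \mathbf{c}_m), \quad
\tilde{C} = (\tilde{\mathbf{c}}_1, \dots, \tilde{\mathbf{c}}_m)
\quad \text{ordered bases of } W
```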

Note:

If you forgot, here's what the above bases are for.

Both the basis change and the linear mapping can be inverted. So anything can become anything; you just gotta believe in yourself.

#### Oh yeah, also

These are the transformation matrices for the basis change.
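The equations here didn't survive the formatting either; in the book's notation (again my reconstruction, so verify against the text), the basis change theorem reads:

```latex
\tilde{A}_\Phi = T^{-1} A_\Phi S
```

where $S \in \mathbb{R}^{n \times n}$ is the transformation matrix that expresses the new basis $\tilde{B}$ of $V$ in terms of $B$, and $T \in \mathbb{R}^{m \times m}$ does the same for $\tilde{C}$ and $C$ in $W$.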

### So what?

So, consider this transformation matrix:

with respect to the standard basis. If we define a new basis,

we obtain the diagonal transformation matrix

which is easier to work with than A.
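The actual matrices didn't survive the formatting, so here's a standard stand-in example (my own numbers, not necessarily the original ones):

```latex
A = \begin{bmatrix} 2 & 1 \\ 1 & 2 \end{bmatrix}, \qquad
\mathbf{b}_1 = \begin{bmatrix} 1 \\ 1 \end{bmatrix}, \;
\mathbf{b}_2 = \begin{bmatrix} 1 \\ -1 \end{bmatrix}
\quad\Longrightarrow\quad
\tilde{A} = B^{-1} A B = \begin{bmatrix} 3 & 0 \\ 0 & 1 \end{bmatrix}
```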

## Huh... How?

I hope that's what you're thinking right now, because I spent well over an hour figuring out how the hell this happened.

If you already know how... please review it to see whether it's really correct or not :D I'm new at this.

### Step 1: Transform the new basis into a matrix

### Step 2: Find Inverse B
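Since the matrices in these steps got eaten by the formatting, here's a numpy sketch of Steps 1 and 2 using stand-in basis vectors (b1 = (1, 1) and b2 = (1, -1) are my assumption, not necessarily the original numbers):

```python
import numpy as np

# Step 1: stack the new basis vectors as columns to form B
b1 = np.array([1.0, 1.0])
b2 = np.array([1.0, -1.0])
B = np.column_stack([b1, b2])   # [[1, 1], [1, -1]]

# Step 2: find Inverse B
B_inv = np.linalg.inv(B)
print(B_inv)                    # [[0.5, 0.5], [0.5, -0.5]]
```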

### Step 3: Calculate (Inverse B) AB

#### Calculate AB

#### Calculate Inverse B to AB
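And a numpy sketch of Step 3 with the same stand-in matrices (A = [[2, 1], [1, 2]] is my placeholder, not necessarily the author's original):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])      # transformation matrix w.r.t. the standard basis
B = np.array([[1.0, 1.0],
              [1.0, -1.0]])     # new basis vectors as columns

# First calculate AB, then apply Inverse B from the left
AB = A @ B                      # [[3, 1], [3, -1]]
A_tilde = np.linalg.inv(B) @ AB
print(A_tilde)                  # diagonal: [[3, 0], [0, 1]]
```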

## For some of you, this is enough.

You might already understand the steps and what each step means, to which I say: congratulations, the rest of the text below isn't needed for you, so feel free to leave.

**But honestly, I still don't get it :D**.

How did I read through a bunch of sections and only now does the inverse get used again? What's happening here? So I asked myself a second time.

### Huh... How?

It all stems back to linear mapping, the bastard that's laughing maniacally, feeling superior, as I use an ineffective method on my journey to learning this.

Consider this:
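The equation seems to be missing here; from the surrounding text it should be the coordinate-change relation (my reconstruction):

```latex
\mathbf{x} = P \tilde{\mathbf{x}}
```

where $\mathbf{x}$ holds the coordinates with respect to the standard basis and $\tilde{\mathbf{x}}$ the coordinates with respect to the new basis.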

P is what we defined as B beforehand.

#### That's easy, then just invert it to get the answer, right?

That's right. P is the B that we used in the example.

#### But what about the linear transformation T, what's that?

And since T acts on vectors expressed in the standard basis, we need to map any vector from the new basis into the standard basis as well.

#### How?

Notice that it looks like the T formula for changing basis, right? That's because it is the same.

#### Add it to the formula

#### Combine with the new basis linear transformation.

Boom. That gives us the formula.
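The formula itself got lost in formatting; chaining the pieces together (my reconstruction of the derivation): map the new-basis coordinates into the standard basis with P, apply A there, then map back with P⁻¹:

```latex
\tilde{A}\tilde{\mathbf{x}} = P^{-1}\!\left(A\left(P\tilde{\mathbf{x}}\right)\right)
\quad\Longrightarrow\quad
\tilde{A} = P^{-1} A P
```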

### Sidenote:

I have absolutely no clue why some symbols render as subscripts but some don't. I think there's something wrong with the notation, or maybe a parameter is messing it up. Either way, just in case, here's a clarification.

## Acknowledgement

I can't overstate this: I'm truly grateful that this book was open-sourced for everyone. Many people will be able to learn and understand machine learning on a fundamental level. Whether changing careers, demystifying AI, or just learning in general, this book offers immense value even for a *fledgling composer* such as myself. So, Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong, thank you for this book.

Sources:

- Axler, S. (2015). *Linear Algebra Done Right*. Springer.
- Deisenroth, M. P., Faisal, A. A., & Ong, C. S. (2020). *Mathematics for Machine Learning*. Cambridge University Press. https://mml-book.com
