
AWS SageMaker and the Amazon stack

SageMaker

  • Before I get ahead of myself and start to promise that SageMaker will solve all of your life's problems, let's just be very clear about what SageMaker is.
  • At its core, SageMaker is a fully managed service that provides the tools to build, train and deploy machine learning models.
  • That's a very concise mission statement for what SageMaker helps you achieve.
  • It has components such as managed notebooks and tools that help you label data and train models, but at its core, SageMaker should be thought of as where you go when you need to build, train and deploy models (a minimal sketch of that loop follows this list).
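To make that concrete, here is a minimal sketch of the build/train/deploy loop using the SageMaker Python SDK. Everything specific in it is an assumption for illustration: the IAM role ARN, the S3 bucket, the train.py script and the instance types are placeholders you would swap for your own.

```python
# Minimal sketch of SageMaker's build / train / deploy loop (Python SDK).
# All names below are placeholders: the role ARN, the S3 bucket, the
# train.py entry point and the instance types are illustrative only.
import sagemaker
from sagemaker.sklearn.estimator import SKLearn

session = sagemaker.Session()
role = "arn:aws:iam::123456789012:role/MySageMakerRole"  # hypothetical role

# Build: point SageMaker at a training script and pick the hardware.
estimator = SKLearn(
    entry_point="train.py",          # hypothetical training script
    role=role,
    instance_count=1,
    instance_type="ml.m5.large",
    framework_version="1.2-1",
    sagemaker_session=session,
)

# Train: SageMaker provisions the instance, runs the script,
# and stores the resulting model artifact in S3.
estimator.fit({"train": "s3://my-bucket/train/"})  # hypothetical bucket

# Deploy: stand up a managed HTTPS endpoint that serves the trained model.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.m5.large")
```

Those three calls (the estimator constructor, fit and deploy) map directly onto the build, train and deploy mission statement above; most of the rest of SageMaker is tooling around them.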

Amazon stack

The first layer

Now, to look at how it fits into the larger Amazon stack, we have this diagram:

[Diagram: the AWS machine learning stack, with application services at the top, platform services in the middle, and frameworks and hardware at the bottom]

  • At the top of this diagram or at the top of the stack, if you wanna be a little more formal, you have some of the application services.

[Diagram: the application services layer of the stack]

  • These are tools such as Rekognition, Transcribe, Lex, Translate and Comprehend, which are really full-service options for solving a very focused machine learning problem.
  • They basically provide a heavily pre-trained model that you can then sometimes tweak and fit with some custom identifiers.

  • These application services are aimed more at what we call a level one machine learning user.

  • So at the top level, just to reiterate, you have some of the straightforward pre-trained machine learning models that maybe allow some tweaking and adjusting; calling one is a single API request, as in the sketch below.
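To give a feel for what "full service" means at this layer, here is a small sketch of calling one of those pre-trained services, Amazon Comprehend, through boto3. There is no training step at all; the region and the example sentence are just placeholders.

```python
# Sketch of the application-services layer: a pre-trained model behind an API.
# No training or model management involved; the region and text are placeholders.
import boto3

comprehend = boto3.client("comprehend", region_name="us-east-1")

response = comprehend.detect_sentiment(
    Text="SageMaker made deploying our model much simpler.",
    LanguageCode="en",
)
print(response["Sentiment"], response["SentimentScore"])
```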

The second layer

  • The next layer below application services is platform services.

[Diagram: the platform services layer of the stack]

  • Whereas application services are like going to a bakery and buying a cookie, platform services are more like buying a cake mix with a recipe on the side.
  • It gives you the tools to make your machine learning product, but you need to bring your own expertise in how to assemble and fine-tune it to get the best result.
  • This layer is where SageMaker lives, along with some other very popular services such as Spark and EMR; perhaps you've also heard of Databricks or DataRobot. Those all live in the platform services range.
  • Although sometimes, with some of their more advanced features, they can drift towards application services.
  • Platform services are really what everyone from level two upward would use. Once you start to dabble in making your own models and prepping your own data, you really start to need a platform to get going.

The third layer

  • Now below both applications and platforms are frameworks and hardware.

[Diagram: the frameworks and hardware layer of the stack]

  • Many of the platforms allow you to use these frameworks to improve your machine learning product without having to start coding from the ground up. Things such as TensorFlow, MXNet and PyTorch come with the code for the models in them, and you feed them hyperparameters and data. So it's not like you're starting from the mathematical thesis; you're able to import libraries (see the sketch below).
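As a rough sketch of what "the code for the models is already there" means, here is a tiny PyTorch example. The layer sizes, learning rate and random data are illustrative only; the point is that the model building blocks, optimizer and loss function all come from the framework, and you supply the hyperparameters and the data.

```python
# Sketch of the framework layer: PyTorch supplies the model building blocks,
# optimizer and loss; you supply hyperparameters and data.
# Layer sizes, learning rate and the random data below are illustrative only.
import torch
from torch import nn

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 1))
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)   # hyperparameter
loss_fn = nn.MSELoss()

X, y = torch.randn(64, 10), torch.randn(64, 1)             # stand-in for real data

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(model(X), y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch}: loss {loss.item():.4f}")
```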

SageMaker and Amazon in general do a really good job of tying together their entire ecosystem. So if you wanna pull in things such as graphics processing units (that's P3 instances, if you wanna talk about specific servers) or attach extra CPU or memory, Amazon allows you to add hardware to accelerate your solution and change its performance characteristics as you become a more advanced user. A quick sketch of what that looks like follows below.
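In practice, attaching that hardware is mostly a matter of asking for a different instance type on the estimator. The P3 family is a real GPU instance family, but the script, role ARN and version strings below are placeholders.

```python
# Sketch: accelerating training by requesting GPU hardware (a P3 instance).
# The entry point, role ARN and version strings are placeholders.
from sagemaker.pytorch import PyTorch

gpu_estimator = PyTorch(
    entry_point="train.py",                                  # hypothetical script
    role="arn:aws:iam::123456789012:role/MySageMakerRole",   # hypothetical role
    instance_count=1,
    instance_type="ml.p3.2xlarge",   # GPU-backed instance (NVIDIA V100)
    framework_version="2.1",         # illustrative framework version
    py_version="py310",
)
```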

So this is really how you should think about the machine learning stack on Amazon. The layering actually applies to all of the clouds, but the products slotted in here are AWS specific.


