In the two previous posts, I introduced you to our new high-level services, and to the new capabilities we added to Amazon SageMaker. Thought I was done, didn’t ya? Nope, there’s more. There’s ALWAYS more.
In this final post, let’s talk about frameworks and infrastructure.
As always, happy to answer questions here or on Twitter.
Not sure where this one fits! AWS DeepComposer is the world’s first musical keyboard combined with a generative AI service.
Inf1 instances are powered by AWS Inferentia chips, and are designed to deliver high-throughput, low-latency inference.
Using the Neuron SDK, which is pre-integrated into popular machine learning frameworks like TensorFlow, MXNet, and PyTorch, you can compile and load your models on Inf1 instances.
Neuron SDK: https://github.com/aws/aws-neuron-sdk
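To give you a feel for the workflow, here’s a minimal, hypothetical sketch of compiling a PyTorch model for Inferentia with the Neuron SDK. The model path, input shape, and file names are placeholders I made up for illustration; see the Neuron SDK repository above for the real tutorials.

```python
# Hedged sketch: compiling a PyTorch model for Inf1 with the Neuron SDK.
# Assumes the torch-neuron package is installed; all paths and the input
# shape below are assumptions, not values from the post.

def compile_for_inferentia(model_path="model.pt", out_path="model_neuron.pt"):
    import torch
    import torch_neuron  # importing this registers the torch.neuron namespace

    model = torch.jit.load(model_path).eval()
    example = torch.rand(1, 3, 224, 224)  # assumed example input shape
    # Ahead-of-time compilation: operators supported by Inferentia are
    # compiled for the chip, unsupported ones fall back to CPU.
    neuron_model = torch.neuron.trace(model, example_inputs=[example])
    neuron_model.save(out_path)
    return out_path
```

As I understand it, the saved artifact is a regular TorchScript model, so on an Inf1 instance you can load it with `torch.jit.load()` (with `torch_neuron` imported) and run inference as usual.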
TensorFlow 2.0 is also available in the Deep Learning AMIs. I hear that it’s coming soon to Deep Learning Containers and SageMaker as well, but shhhh.
By the way, did you know that 85% of all TensorFlow workloads in the cloud run on AWS? Check out the report for more details (PDF).
The Deep Graph Library (DGL) is an open source library that makes it easy to build and train Graph Neural Networks.
Documentation and examples: https://www.dgl.ai/
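Here’s a minimal, hypothetical sketch of what working with DGL looks like, assuming DGL with a PyTorch backend is installed; the graph structure and feature sizes are made up for illustration.

```python
# Hedged sketch: building a tiny graph with DGL (PyTorch backend assumed).
def build_tiny_graph():
    import dgl
    import torch

    # A 3-node directed graph with edges 0->1 and 1->2 (made-up example).
    g = dgl.graph((torch.tensor([0, 1]), torch.tensor([1, 2])))
    # Attach 4-dimensional random node features (sizes are assumptions).
    g.ndata["feat"] = torch.randn(3, 4)
    return g
```

From there, DGL’s message-passing primitives let you propagate and aggregate those node features across edges, which is the core operation behind Graph Neural Networks.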
The Deep Java Library is an open source library to develop Deep Learning models in Java. Does this mean that Java developers don’t have to learn Python anymore? ;)
Documentation and examples: https://www.djl.ai/