3 Resources to Jumpstart Your Generative AI Journey:
- Generative AI for Beginners Curriculum
- Build Generative AI Apps Code-First With Azure AI
- Responsible AI Resources For Developers
Welcome to the sixth post in my This Week in AI News series! Today, I want to talk about Generative AI For Beginners - from prompt engineering to fine-tuning. Let's dive in!
I was out on vacation, so this post and the next are being published together today instead of on their regular cadence. They are related though, so I hope you find them useful.
Skilling Up on Generative AI
By now you've probably asked ChatGPT to answer a question, or asked Bing Create to generate an image, or asked GitHub Copilot to explain some code to you. These are all examples of generative AI experiences driven by large language models (LLMs).
But what exactly does an LLM do? And how does a generative AI solution differ from traditional AI/ML applications that we've built before? What are the fundamental concepts to know, and how can you apply these concepts to build real-world applications?
These are some of the questions we tried to answer in the Generative AI For Beginners curriculum that was released as an open-source series of 12 lessons (v1) last October. Each lesson covers a core topic and provides an assignment to help reinforce the ideas you learned.
Since the launch, ~25K users have starred the repo, with many also contributing issues, translations, fixes, and suggestions to help us improve it. It is now the #1 "generative-ai" repo on GitHub.
Today, I'll focus on the lesson I contributed to this curriculum - the chapter on Prompt Engineering Fundamentals - that sets the stage for the rest of the course.
What Is Prompt Engineering?
Generative AI applications differ from traditional AI experiences in many ways - but a key aspect is their use of large language models (LLMs) to understand natural language. This allows users to engage with these applications using text inputs (or "prompts") in a conversational manner ("chat") that feels natural.
Understanding how prompts work is critical to building reliable applications that deliver quality responses to the user. The visual below shows how these concepts are connected - but what exactly does prompt engineering mean, and why do we need it?
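To ground the term before we get there, here is a minimal sketch of what sending a prompt to an LLM looks like in code, using the openai Python package. The model name and messages are illustrative (the lesson's own notebooks may differ), and it assumes an OPENAI_API_KEY environment variable is set.

```python
# Minimal sketch: sending a text prompt ("user" message) to an LLM and printing
# the response. Model name and messages are illustrative; assumes the
# OPENAI_API_KEY environment variable is set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any chat-capable model works here
    messages=[
        {"role": "system", "content": "You are a helpful teaching assistant."},
        {"role": "user", "content": "Explain what a large language model is in two sentences."},
    ],
)
print(response.choices[0].message.content)
```

Everything the application knows about your request travels in that `messages` list - which is exactly why how you word the prompt matters so much.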
Review: Illustrated Guide
The lesson is structured in the following sections, taking you from core concepts to best practices for prompt engineering. Check out the illustrated guide below for more details on what each section covers:
| Topic | Description |
| --- | --- |
| Core Concepts | explains key terms, including tokenization, that are critical to prompt usage (see the tokenization sketch after the table) |
| Core Challenges | tells you why prompt design is critical to getting better quality responses |
| GitHub Copilot | provides a real-world case study that illustrates prompt engineering examples |
| Prompt Construction | describes the components of a prompt, and how complex prompts are designed |
| Primary Content | highlights a number of prompt design patterns that improve the quality of responses |
| Advanced Techniques | shows how to improve quality by having the model critique or explain itself |
| Best Practices | wraps up the lesson by looking at the end-to-end application lifecycle and workflow |
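To make the tokenization point in the Core Concepts row concrete, here is a small sketch (not taken from the lesson) using the tiktoken library. Understanding how a prompt breaks into tokens matters because context windows and pricing are both measured in tokens.

```python
# Sketch: inspecting how a prompt is split into tokens using the tiktoken library.
# Illustrative only - the lesson's own notebooks may use different tooling.
import tiktoken

encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")
prompt = "Prompt engineering is the art of crafting effective inputs for LLMs."

tokens = encoding.encode(prompt)
print(f"{len(tokens)} tokens: {tokens}")
print([encoding.decode([t]) for t in tokens])  # the text pieces behind each token id
```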
Watch: Lesson Walkthrough
Want to get a better sense of how the lesson maps to applied usage of the concepts? Check out the slides in the presentation below, which walk you through the lesson with practical examples and hands-on exercises you can try out on Azure OpenAI or OpenAI playgrounds.
The lesson also comes with a recording of the slides that serves as an instructor-guided walkthrough of the lesson.
Finally, the lesson comes with an interactive assignment using the Jupyter Notebooks runtime. You will need an Azure subscription (with access to Azure OpenAI) or an OpenAI account (with a valid API key) to work through the exercises. The repository is set up for use with GitHub Codespaces - so you get a pre-built dev environment with no effort needed on your part.
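If you go the Azure OpenAI route, wiring up a client from a notebook looks roughly like the sketch below. The environment variable names, API version, and deployment name here are illustrative, so follow the repo's own setup instructions for the exact values it expects.

```python
# Sketch: connecting to Azure OpenAI from a notebook. The environment variable
# names, API version, and deployment name below are illustrative - check the
# repo's setup instructions for the exact values it expects.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="my-gpt-35-deployment",  # your Azure *deployment* name, not the base model name
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response.choices[0].message.content)
```

Keeping keys and endpoints in environment variables (rather than hard-coding them in the notebook) also makes the Codespaces setup work without any code changes.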
Summary & Next Steps
Once you complete the Prompt Engineering Fundamentals lesson (chapter 4), continue on to the Advanced Prompting lesson (chapter 5) that walks you through techniques like chain of thought prompting. Then continue on to see how you can build different generative AI experiences - from chat applications to search, text and image generation. You can also explore low code options and learn how to integrate external applications with function calling.
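As a taste of what that next lesson covers, the chain-of-thought idea boils down to asking the model to reason step by step before committing to an answer. The toy sketch below is illustrative only and not taken from the lesson.

```python
# Toy illustration of chain-of-thought prompting: nudging the model to show its
# reasoning before the final answer. Prompt wording is illustrative only.
from openai import OpenAI

client = OpenAI()

question = "A train travels 60 km in 45 minutes. What is its average speed in km/h?"

response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[
        {
            "role": "user",
            "content": f"{question}\n\nThink through the problem step by step, "
                       "then give the final answer on its own line.",
        },
    ],
)
print(response.choices[0].message.content)
```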
The v1 edition of the curriculum covered chapters 1-12. In my next post (linked below), I'll explore the v2 edition (chapters 13-18) that was just released in February - and talk specifically about fine-tuning, an approach that can help you improve quality or reduce cost in your generative AI application when prompt engineering alone is not effective enough.
Fine-Tuning Fundamentals - Generative AI For Beginners (v2)
Nitya Narasimhan, Ph.D for Microsoft Azure ・ Mar 1
Until then, fork the repo to your profile and get started exploring the lessons! Happy learning!