About
I recently stumbled upon this news:
The two key points that caught my attention were:
- the model was released under the Apache 2.0 license, and
- Mistral 7B is a further refinement of other "small" large language models like Llama 2, offering similar capabilities (according to some standard benchmarks) at a considerably smaller compute cost.
Then, a few weeks later, came the release of the "GenAI Stack":
So I started to look around... and see what benefits I could get out of these tools.
Then & now
Up to that point, my only two options were:
- Use my OpenAI account with gpt-3.5-turbo flavours or gpt-4 to build custom projects (and share my data with OpenAI); a minimal sketch of that workflow follows this list
- Use Kaggle's GPU resources and play around for free... still needing to share my resources (at least temporarily) with Kaggle
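To make that trade-off concrete, here is a rough sketch of the first option, assuming the openai Python package (1.x) and an OPENAI_API_KEY environment variable; the prompt is just a placeholder. The point is that every prompt sent this way leaves my machine and is processed on OpenAI's servers.

```python
# Sketch of the hosted-API workflow (assumes openai>=1.0 and OPENAI_API_KEY set).
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

# Placeholder prompt: whatever I ask here is processed remotely, not locally.
response = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize why running LLMs locally matters."}],
)
print(response.choices[0].message.content)
```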
What we'll achieve
With these two announcements, I wondered what I could achieve locally on my own hardware:
This blog post is dedicated to unboxing this stack and the amazing possibilities it opens up... and why Open Source matters.
Demo
Load any custom model into ollama
Below is a short and easy trick for installing a fine-tuned custom model into ollama: jackalope-7b (a fine-tuned version of Mistral-7B).
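As a rough illustration of the idea (not the exact steps of the demo), the sketch below assumes the jackalope-7b GGUF weights have already been downloaded (e.g. from Hugging Face) under a hypothetical file name, and that the ollama CLI is installed with its server running. It writes a minimal Modelfile pointing at the local weights, registers the model with `ollama create`, and then queries it.

```python
# Minimal sketch: register a local GGUF file as a custom ollama model and query it.
# The file name and model name below are placeholders, not the demo's exact values.
import subprocess
from pathlib import Path

GGUF_FILE = "jackalope-7b.Q4_K_M.gguf"  # hypothetical local file name
MODEL_NAME = "jackalope"                # name the model will have inside ollama

# 1. Write a minimal Modelfile that points at the local GGUF weights.
Path("Modelfile").write_text(f"FROM ./{GGUF_FILE}\n")

# 2. Register the model: `ollama create` reads the Modelfile and imports the weights.
subprocess.run(["ollama", "create", MODEL_NAME, "-f", "Modelfile"], check=True)

# 3. Ask the freshly installed model a question.
subprocess.run(["ollama", "run", MODEL_NAME, "Why does Open Source matter?"], check=True)
```

Once registered this way, the model also shows up in `ollama list` and can be served to other tools (like the GenAI Stack) through ollama's local API.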
Bookmarks
More about GenAI stack