On-device AI in the browser is here - kinda.
It is currently in Chrome Canary, which means it will be here soon(ish).
In this article I will show...
The Opera Developer browser has a much better setup for this than Chrome. You can literally use over 100 AI models locally, with no internet, through Aria, Opera's built-in AI.
GPT4ALL.io and Ollama are great for running models from Hugging Face locally on Linux, Windows, or macOS.
Nice article!
That is cool to know, I will have to check that out! 💗
😁 Let me know what you think!
There is an API for conversation/chat as well.
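For context, the conversational flow in early Canary builds looks roughly like this. This is a hedged sketch, not documented API: `createTextSession` and `prompt` mirror what those builds exposed under `window.ai`, and the names may well change.

```javascript
// Sketch of the experimental chat/session flow. The `ai` object stands in for
// window.ai in Chrome Canary; its shape here is an assumption, since the API
// is undocumented and may change.
async function chat(ai, question) {
  // Creating a session is async — the model may still be loading.
  const session = await ai.createTextSession();
  // prompt() also returns a promise that resolves with the model's reply.
  return session.prompt(question);
}

// In the browser you would call something like:
//   const reply = await chat(window.ai, "Summarise this page in one sentence");
```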
It doesn't look like there's any documentation for the JS API yet, though. I'm not sure this is open source, so we might not even be able to reference the C++ code.
Based on the announcement from Google, they want us to use their API wrapper, which has a built-in adapter for the JS API and can fall back to their hosted inference.
I'd love a reply if you can find the docs or code?
I had a good look, but didn't find anything when I was writing this.
I imagine documentation will come once it all starts to filter down towards production.
If I do find anything I will let you know! 💗
Does anyone know if it's certain that this will become a standard feature in Chrome browsers? Is it safe to start building with this API for the future?
I can't imagine all Chrome users downloading the large Gemini model. How does Chrome plan to handle this?
I certainly wouldn't recommend building anything substantial with an API that has not been announced or documented in any meaningful way. It is likely to change and evolve, features will get deprecated, the model will change, etc. etc. 💗
Time to use it to summarise this article haha
Already done that, so that tells me you didn't read it all! hahahahah. 💗
Is it any good?
No
Is it fun to play with?
Yes
laugh out loud on this, the story of my life 🤣🤣
hehe, well I can't go writing articles on anything that is actually useful, it would ruin my reputation! 🤷🏼‍♂️🤣💗
i like fun things 💛💛
Hi, I'm on chrome://components/ but "Optimization Guide On Device Model" doesn't appear. Do you know why this could be happening?
Check that your Chrome Canary is up to date; other than that I am not sure, I am afraid. 💗
Hi, did you find a solution?
I have this problem too
This is pretty neat. This could open a lot of doors for accessibility and language translation.
Is this still working in version 128? I tried everything but can't make it work.
Yes still working.
In the console in 128, I had to do something like this instead:
Then I could do this:
This is exactly the same, just in a more complicated way.
My guess is you missed an `await` the first time when creating the session constant. 💗

Linux is not supported. haha
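The difference a missing `await` makes can be sketched with a stand-in async factory (hypothetical, standing in for `window.ai.createTextSession`; the names are illustrative, not the real API surface):

```javascript
// Hypothetical stand-in for the async session factory (what
// window.ai.createTextSession does in Chrome Canary).
async function createTextSession() {
  return { prompt: async (text) => "echo: " + text };
}

(async () => {
  // Without await, `session` is still a pending Promise — so .prompt is
  // undefined and calling it throws:
  const notAwaited = createTextSession();
  console.log(typeof notAwaited.prompt); // "undefined"

  // With await, you get the resolved session object back:
  const session = await createTextSession();
  console.log(typeof session.prompt); // "function"
})();
```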
Oh Really? That is a shame. 💗
Is this not supported on Ubuntu Linux?
I am afraid I have no idea, sorry. If you have run the code above on Chrome Canary and it does not function, then for some reason it would seem not.
Aren't you guys concerned about setting up AI in the browser? I mean, I'm OK with this in a new browser which I don't use often, but I don't want a black box running on my machine.
At the moment, no. It is heavily sandboxed.
Am I concerned about what security problems this may pose in the future? Yes, I imagine security holes will be introduced as they give AI more freedom. 💗
The disaster of Internet Explorer and ActiveX evidently taught us nothing. If this tool were to succeed, I can already imagine an entire platoon of websites and web apps that can only be used with Google Chrome.
Goodbye, browser interoperability.
I was thinking the same. That was my first thought when my coworker told me about `window.ai`. Imagine your boss telling you that the "feature" that "you've developed" doesn't work correctly on Firefox or Safari, and you try to tell him that Firefox/Safari don't use the same AI model that Chrome uses... good luck fixing that. The only way `window.ai` can have a future is if every browser uses the SAME model, because otherwise it is just impossible to have the same expectation of results.
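One way to hedge against that interoperability problem is progressive enhancement: feature-detect the experimental API and fall back to a server when it's missing. A minimal sketch, where the `window.ai` shape and the `/api/generate` endpoint are both assumptions:

```javascript
// Hedged sketch: prefer the on-device model when the experimental API exists,
// otherwise fall back to a hypothetical hosted inference endpoint.
async function generateText(promptText, win = globalThis) {
  if (typeof win.ai?.createTextSession === "function") {
    // Chrome Canary path — API surface is an assumption and may change.
    const session = await win.ai.createTextSession();
    return session.prompt(promptText);
  }
  // Fallback path for every other browser: a hypothetical server endpoint.
  const res = await fetch("/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ prompt: promptText }),
  });
  const { text } = await res.json();
  return text;
}
```

The catch, as noted above, is that the two paths run different models, so the user experience still won't be identical — this only keeps the feature from breaking outright.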