In these early phases of LLM competition, it is becoming increasingly hard to decide which solution is right for a particular AI project: the ChatGPT API or the PaLM API (the API behind Bard). I've been spending a lot of time recently designing prompts for various uses and analyzing both, so let me share some of my findings on the subject.
The PaLM API is Google's API for the model family that powers Bard. It is currently free and in beta testing, which is all the more reason to use it in your projects, provided you don't expect long-term reliability.
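To give a sense of what working with it looks like, here is a minimal sketch of a PaLM text request using only the standard library. The endpoint path and body shape follow the v1beta2 REST docs as they stood during the beta; the environment variable name is my own choice, and you'd need a real beta key for the request to actually go through.

```python
import json
import os
import urllib.request

# Assumed env var name; the PaLM API requires a beta key from Google.
API_KEY = os.environ.get("PALM_API_KEY", "")

# v1beta2 text endpoint for the text-bison model, as documented during the beta.
ENDPOINT = (
    "https://generativelanguage.googleapis.com/v1beta2/"
    f"models/text-bison-001:generateText?key={API_KEY}"
)

def build_request(prompt: str, temperature: float = 0.7) -> dict:
    """Build the JSON body the generateText endpoint expects."""
    return {
        "prompt": {"text": prompt},
        "temperature": temperature,
    }

body = build_request("List three uses of the PaLM API.")

if API_KEY:  # only send the request if a key is actually configured
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["candidates"][0]["output"])
```

Note that, unlike the ChatGPT API, authentication is a simple API key passed in the query string rather than a bearer token header.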
The PaLM API offers much higher throughput than the ChatGPT API, at no charge. This enables applications that require a large volume of prompts. ChatGPT becomes quite expensive once you get into millions of prompts, which makes some applications impractical or outright impossible.
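A quick back-of-the-envelope calculation shows why volume matters. The price per 1K tokens below is an assumption based on gpt-3.5-turbo pricing at the time of writing; check OpenAI's current price list before relying on it.

```python
# Rough cost estimate for a prompt-heavy ChatGPT application.
PRICE_PER_1K_TOKENS = 0.002  # USD, assumed gpt-3.5-turbo rate; verify current pricing

def chatgpt_cost(num_prompts: int, avg_tokens_per_prompt: int) -> float:
    """Approximate total cost in USD for num_prompts requests."""
    total_tokens = num_prompts * avg_tokens_per_prompt
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS

# 5 million prompts averaging 500 tokens (prompt + completion) each:
print(chatgpt_cost(5_000_000, 500))  # 5000.0 (USD)
```

At that scale the same workload on the free PaLM beta costs nothing, which is exactly the gap the paragraph above is describing.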
Another point is speed: these days ChatGPT is struggling to keep up with demand. In my experience it can become very slow, it has outages, and it has lower rate limits than Bard. The PaLM API seems to have far more computing resources behind it and returns responses faster. This is another area where Bard looks like the better solution for applications with lots of prompts.
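If you do stay on the ChatGPT API despite the slowdowns and outages, you'll want retries around every call. Here is a generic exponential-backoff sketch (my own helper, not part of either API's SDK), demonstrated with a flaky stand-in function instead of a live API call:

```python
import random
import time

def call_with_backoff(fn, max_retries=5, base_delay=1.0):
    """Retry fn() with exponential backoff plus jitter.

    Useful when an LLM API is slow, overloaded, or rate-limited.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the last error
            delay = base_delay * (2 ** attempt) + random.uniform(0, 0.1)
            time.sleep(delay)

# Example with a flaky stand-in for an API call that fails twice, then succeeds:
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("rate limited")
    return "ok"

print(call_with_backoff(flaky, base_delay=0.01))  # ok
```

In a real application you'd wrap the HTTP request itself in `fn` and ideally retry only on rate-limit and server-error responses rather than on every exception.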
On the other hand, Bard clearly seems to deliver worse answers on many subjects. In taxonomy, for example, its output contains far more garbage than ChatGPT's, and I'm sure there are other weak areas. It's up to you to decide which is better for your particular use case.
Another thing to consider is that there's a notable difference in responses between API access and web user access. I've been getting different responses to the same prompts on the web and via the API, even with the same parameters. This means you need a tool for designing prompts that communicates directly with the APIs instead of going through the LLMs' web GUIs. Such a tool should also be accessible to non-developers, since they will likely want to work directly with the APIs too.
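When comparing web output against API output, it helps to pin the sampling parameters explicitly so you're comparing like with like. A sketch of the two request bodies with sampling pinned, using the parameter names each API documents (note the naming-convention mismatch between the two):

```python
def openai_chat_body(prompt: str) -> dict:
    """OpenAI Chat Completions body with sampling pinned for comparability."""
    return {
        "model": "gpt-3.5-turbo",
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.0,  # near-deterministic: always favor the top token
        "top_p": 1.0,        # snake_case in the OpenAI API
    }

def palm_text_body(prompt: str) -> dict:
    """PaLM generateText body with the equivalent settings."""
    return {
        "prompt": {"text": prompt},
        "temperature": 0.0,
        "topP": 1.0,          # camelCase in the PaLM REST API
        "candidateCount": 1,  # only one completion per request
    }

prompt = "Summarize the Krebs cycle in one sentence."
print(openai_chat_body(prompt)["temperature"], palm_text_body(prompt)["temperature"])
```

Even with temperature 0 the outputs are not guaranteed identical across runs, but pinning the parameters removes one obvious source of web-versus-API divergence.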
I've developed a free tool to do exactly that. It fetches responses from both APIs and displays them side by side, while letting you set the parameters in detail. It's called APIScout.AI, and you can check it out here: https://apiscout.ai . Let me know what you think. Everything is free except bulk processing of multiple prompts, since that consumes server resources.
Long term, if you need reliability, the ChatGPT API is currently the way to go for new applications. The PaLM API is in beta testing, so you don't really have any long-term forecast on whether it will be available, in what form, and at what pricing. You don't know whether it will start to have hiccups due to overload, as ChatGPT does now, or maybe worse. And you have no information on future commercial rate limits.
The state of the PaLM API's documentation tells us a lot about the state of the whole project: it seems incomplete. It's very hard to figure out what to do, and some request parameters are next to impossible to understand from the docs alone. I've exposed many of them in my APIScout.AI project so you can experiment with the parameters without writing any code.
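For reference, here is an annotated sketch of the generateText sampling parameters I found least obvious, with the names as they appear in the v1beta2 REST API. The comments reflect my own reading of the beta docs and of how these knobs conventionally behave, so treat them as a starting point rather than an official glossary:

```python
# Annotated PaLM generateText parameters (v1beta2 REST naming).
params = {
    "temperature": 0.7,      # randomness: 0 = near-deterministic, 1 = most varied
    "topK": 40,              # sample only from the 40 most likely next tokens
    "topP": 0.95,            # ...further restricted to the smallest token set
                             # whose cumulative probability reaches 0.95
    "candidateCount": 1,     # number of alternative completions to return
    "maxOutputTokens": 256,  # hard cap on the length of each completion
    "stopSequences": ["\n\n"],  # generation halts when any of these appears
}
print(sorted(params))
```

These are the same parameters you can tweak interactively in APIScout.AI without touching any code.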
Last but not least: consider that ChatGPT is available globally, while the PaLM API is still under heavy geo-restrictions. You can't, for example, take a key from a US-based Google account and use it from a server in Canada; the server has to be in the US too. I applied for a PaLM API beta key from Europe a month ago and still haven't received it, while applications from the US got keys the following day. For now, though, the application process seems to be closed; I hope you got your key in time. Please let me know in the comments if you have any news on the availability of beta keys, as that will be interesting to all readers.
My project, APIScout.AI, will only work for you if you provide a US-based key, since the server it runs on is in the US. Let me know if you are interested in support for other countries.
I hope you can use this information to build better AI-based applications. Please let me know what you think, or if you hear of any updates to the current state of the LLM APIs.