DEV Community

Sanchita Paul

Integrating OpenAI Chat Completion API into Laravel Console Command

Here we will be integrating the OpenAI Chat Completion API utilizing the GPT-3.5-Turbo model. The integration will be achieved through a Laravel console command that will output the results.

First, create a command file named `ChatGPTCommand.php`:
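If you prefer, you can scaffold the class with Artisan, which generates the boilerplate shown below:

```shell
php artisan make:command ChatGPTCommand
```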


<?php

namespace App\Console\Commands;

use App\Services\ChatGPTSerVice;
use Illuminate\Console\Command;

class ChatGPTCommand extends Command
{
    /**
     * The name and signature of the console command.
     *
     * @var string
     */
    protected $signature = 'chat:start';

    /**
     * The console command description.
     *
     * @var string
     */
    protected $description = 'Start an interactive chat with the OpenAI Chat Completion API';

    /**
     * Execute the console command.
     *
     * @return int
     */
    public function handle()
    {

        $this->info("Welcome to Hood ChatGPT");
        $chatGptService = new ChatGPTSerVice();

        $this->startChat($chatGptService);
        return Command::SUCCESS;
    }


    public function startChat(ChatGPTSerVice $chatGptService)
    {
        $userInput =  $this->takeInput();

        if (strtolower($userInput) === 'stop') {
            $this->info("Thank you for using Hood Chat GPT");
            return;
        }

        $chatGptService->chat($userInput, $this);

        $this->startChat($chatGptService);
    }

    public function takeInput()
    {
        return $this->ask("\nUser:");
    }
}

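Depending on your Laravel version, the command may need to be registered before Artisan can find it. Recent versions auto-discover everything in `app/Console/Commands`; in older versions you would list it in `app/Console/Kernel.php`. A minimal sketch:

```php
<?php

namespace App\Console;

use App\Console\Commands\ChatGPTCommand;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    /**
     * The Artisan commands provided by your application.
     *
     * @var array
     */
    protected $commands = [
        ChatGPTCommand::class, // makes `php artisan chat:start` available
    ];
}
```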

Now create a `Services` folder inside `app` and add a service class named `ChatGPTSerVice.php`:


<?php


namespace App\Services;
use GuzzleHttp\Client;
use GuzzleHttp\Exception\GuzzleException;
use GuzzleHttp\Psr7\Utils;
use Illuminate\Console\Command;

class ChatGPTSerVice
{
    /**
     * @throws GuzzleException
     */
    public function chat($userInput, Command $command)
    {
        $client = new Client([
            'base_uri' => 'https://api.openai.com/v1/',
        ]);

        $headers = [
            'Content-Type' => 'application/json',
            'Authorization' => 'Bearer ' . env('OPENAI_API_KEY'), // set OPENAI_API_KEY in your .env file
        ];

        $messages = [
            ['role' => 'user', 'content' => 'Hi'],
            ['role' => 'assistant', 'content' => "Hi i will give you some description about lorem. if i question you then you have to answer from that description and if you do not find answer then reply only Not Found

Lorem Ipsum is simply dummy text of the printing and typesetting industry. Lorem Ipsum has been the industry's standard dummy text ever since the 1500s, when an unknown printer took a galley of type and scrambled it to make a type specimen book. It has survived not only five centuries, but also the leap into electronic typesetting, remaining essentially unchanged. It was popularised in the 1960s with the release of Letraset sheets containing Lorem Ipsum passages, and more recently with desktop publishing software like Aldus PageMaker including versions of Lorem Ipsum.


rules to follow,
only find answer from above information.


"],
            ['role' => 'user', 'content' => $userInput],
        ];

        $body = json_encode([
            'model' => 'gpt-3.5-turbo',
            'messages' => $messages,
            'stream' => true
        ]);

        $response = $client->post('chat/completions', [
            'headers' => $headers,
            'body' => $body,
            'stream' => true
        ]);

        $stream = $response->getBody();

        $command->info("HoodGPT");
        while (!$stream->eof()) {
            $data = Utils::readLine($stream);
            $val = str_replace('data: ', '', $data);
            $val = str_replace(["\n", "\r"], '', $val); // double quotes, so real newline/carriage-return characters are stripped
            $json = json_decode($val, true);
            $str = $json ? ($json['choices'][0]['delta']['content'] ?? null) : null;
            if ($str) {
                $command->getOutput()->write($str);
            }
        }
        return '';
    }

}

The `chat` method in this service uses a Guzzle client to call the OpenAI API. The priming message plays the key role: the accuracy and relevance of the assistant's responses depend on how specific and clear the instructions and context are. Setting the `stream` parameter to `true` makes the API return the response gradually, token by token, much like the conversational typing effect in ChatGPT itself. The `readLine` helper reads the streamed body line by line, so each chunk can be printed as soon as it arrives.
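To make the parsing step concrete, here is a small, framework-free sketch of how one streamed line is turned into a printable token. The sample payload below is illustrative, shaped like the streamed `chat/completions` chunks, not a captured API response:

```php
<?php

// Parse one line of an OpenAI streaming response (server-sent events format).
// Each data line looks like: data: {"choices":[{"delta":{"content":"Hello"}}]}
function extractDeltaContent(string $line): ?string
{
    $payload = trim(str_replace('data: ', '', $line));

    // The stream is terminated with a literal "[DONE]" sentinel.
    if ($payload === '' || $payload === '[DONE]') {
        return null;
    }

    $json = json_decode($payload, true);

    return $json['choices'][0]['delta']['content'] ?? null;
}

// Illustrative chunk:
$sample = 'data: {"choices":[{"delta":{"content":"Hello"}}]}';

var_dump(extractDeltaContent($sample));       // string(5) "Hello"
var_dump(extractDeltaContent('data: [DONE]')); // NULL
```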

It is worth noting that the OpenAI API accepts additional request parameters beyond the ones used here; the full list is in the official API reference. The current implementation only sends `model`, `messages`, and `stream` in the request body, but the other options allow finer control over the assistant's behaviour, such as how deterministic or how long its replies are.
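For example, the request body could be extended with a few of the commonly used Chat Completion parameters; the values below are only illustrative defaults:

```php
$body = json_encode([
    'model' => 'gpt-3.5-turbo',
    'messages' => $messages,
    'stream' => true,
    'temperature' => 0.2, // lower = more deterministic, focused answers
    'max_tokens' => 256,  // cap the length of the reply
    'top_p' => 1,         // nucleus sampling; usually tune this or temperature, not both
    'n' => 1,             // number of completions to generate
]);
```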

Now run the command in your terminal:

php artisan chat:start

In the example below, I primed the assistant with company information rather than the lorem ipsum text.

[Screenshot: sample chat session in the terminal]

In this scenario, the user provides the virtual assistant with specific company information and the corresponding activities that the company is involved in. The assistant is then instructed to respond to a question based on this context. If the question pertains to the provided context, the assistant will respond with relevant and specific information. However, if the question falls outside of this context, the assistant will respond in a more general manner, drawing upon its broader knowledge base to provide a suitable response. This approach allows the assistant to provide more accurate and targeted information while still being able to provide helpful responses to more general inquiries.
