Sending a large file (over 2 GB) in an HTTP request with PHP, for example to an external API, is not easy because you can run into the memory_limit.
I struggled with Guzzle and the Laravel HTTP Client for days. Even after setting memory_limit = 5G, I kept getting memory or write errors like:
fwrite(): Write of xxxx bytes failed with errno=0 No error
I finally managed it with the PHP curl extension:
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, 'http://example.com');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);

$headers = [];
$headers[] = 'Content-Type: multipart/form-data';
$headers[] = 'Cookie: something...';
curl_setopt($ch, CURLOPT_HTTPHEADER, $headers);

curl_setopt($ch, CURLOPT_POST, 1);

$path = '/full/path/to/file';
$file = curl_file_create(realpath($path));
$post = [
    'file' => $file,
    'other_field' => 'value',
];
curl_setopt($ch, CURLOPT_POSTFIELDS, $post);

$result = curl_exec($ch);
$httpcode = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);

var_dump($result, $httpcode);
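The reason this works is that curl_file_create() (the CURLFile class) only stores the file path and metadata; libcurl streams the file from disk during the transfer instead of loading it into PHP memory. Here is a minimal sketch of that idea, using a hypothetical helper name (buildUploadPayload is my own, not part of any library) that builds the POST payload without sending anything:

```php
<?php
// Hypothetical helper: builds the multipart POST payload for
// CURLOPT_POSTFIELDS. The file is referenced by path only, so
// PHP's memory usage stays flat regardless of the file size.
function buildUploadPayload(string $path, array $extraFields = []): array
{
    // curl_file_create() returns a CURLFile object holding the path,
    // MIME type, and posted filename; libcurl reads the bytes from
    // disk only when the request is actually executed.
    $file = curl_file_create(
        realpath($path),
        mime_content_type($path),
        basename($path)
    );

    return ['file' => $file] + $extraFields;
}

// Usage: pass the result straight to curl_setopt($ch, CURLOPT_POSTFIELDS, ...).
$payload = buildUploadPayload(__FILE__, ['other_field' => 'value']);
var_dump($payload['file'] instanceof CURLFile); // bool(true)
```

By contrast, building the multipart body yourself (or letting a client read the whole file into a string) is what blows past memory_limit with multi-gigabyte files.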
I hope this helps you!
In a future article, I will explain how to chunk-upload large files to your Laravel app without being limited by your memory limit.