Ruairi Shanahan

Suggestions for image optimisation strategies?

Hi there,

I've recently created my first MVP: shuzzle.me. It's a somewhat challenging tile-sliding game and a new way to have fun with your Instagram pictures.

You can try it out here -- >>

Note: you will need to log in with your Instagram credentials in order to access the app.

Anyhow, the reason for my post is that I'm struggling to find a way to optimise the delivery of images to end users, in particular mobile users on first load over a 3G connection. I need an approach or strategy that will scale if my application sees an increase in users and that drastically reduces the size of the images being pushed to the end user's device.

The Problem
My problem is that the images returned via the Instagram Basic Display & Graph API are super large in file size and dimension.

For example, upon first load of a profile view, 28 images are pulled from the API. The file size per picture pulled into the shuzzle user's profile is between 500KB and 750KB. That could be an initial load of 14MB to 21MB, which on a 3G mobile connection is dire and renders a poor overall UX.

In addition to that, the IG Graph API does not provide a way to call an image resource of a smaller size.

Possible Approaches
I've tried lazy loading the images in the browser and various PHP image optimisation libraries / frameworks, but alas with no increase in performance or reduction in per-picture file size or dimensions.

I suspect I need to save or cache the images, then process / resize them, and then deliver them to the end user.

Constraints
As this is a side hobby, I have a very limited budget, so I'm hoping to achieve this with a solution or approach that is very wallet friendly / cost effective; I therefore wish to avoid using a third party image optimising service or something of that nature.

Any suggestions or thoughts on the best approach to my problem would be greatly appreciated.

Thanks
R

Top comments (18)

Ruairi Shanahan

Hey everyone (@inhuofficial, @izio38, @ravavyr)

Happy new year! Hope you all had a great festive season! So, with all of your advice, I refactored my approach to outputting resized IG photos back to the user using the Intervention Image library.

My approach was to loop through the first set of pictures, detect the original width & height, and from that work out the original aspect ratio of each picture.

Then I apply a resize reduction factor of either a third, a quarter, or 22.5%, depending on whether the image width was less than 800px, between 800 and 1200px, or greater than 1200px. After this reduction is applied, my approach re-applies the original aspect ratio to either the width or the height, depending on which one is bigger. I then convert the resized image into a base64 data URL and echo the output along with HTML and CSS for styling.

// Preserve the original aspect ratio when adjusting the target dimensions
if ($width / $height > $ratio_orig) {
    $width = $height * $ratio_orig;
} else {
    $height = $width / $ratio_orig;
}

$width = intval(round($width));
$height = intval(round($height));

// Resize with Intervention Image and encode the result as a base64 data URL
$img_base64 = $manager->make($image_link)->resize($width, $height)->encode('data-url');

Note: my image size went from circa 250KB per image down to 20-30KB per image.

All in all, this did produce a marked overall improvement. Before this change, the download size of the images alone was 11MB, the overall page download was 18-20MB, and it took 58.15s, or nearly a minute, to render on screen.

With the new approach, the overall page download went from 18-20MB down to 4.8MB, the first pull of images went from 11MB down to 4.1MB, and the total render time dropped to 28.76s.

However, due to how I've implemented this approach, the image conversion and output of the new images block the page from rendering in full until the foreach loop within the script has completed (i.e. the script does all this work in the background and then spits out all 28 converted images to the front end at once).

Is there an approach I can take with my PHP script that outputs each image as it is processed?

Ideally, I would like to dynamically inject a resized image into the output HTML as soon as the image is processed.

Perhaps I could wrap the base64 image URL in a JSON payload and push each processed image to the front end in that manner?

Or is it possible to implement web sockets with PHP?
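
For context, the kind of thing I'm picturing is flushing each image's markup to the browser as soon as it is encoded, something like this (a rough sketch; resizeTile() is just a stand-in for the ratio / resize / encode('data-url') logic above):

<?php
// Rough sketch: emit each tile as soon as it is ready instead of buffering
// the whole foreach loop. resizeTile() stands in for the existing
// ratio / resize / encode('data-url') logic shown earlier.
@ini_set('zlib.output_compression', 'Off');
while (ob_get_level() > 0) {
    ob_end_flush(); // drop any PHP output buffers so echo reaches the client
}

foreach ($image_links as $image_link) {
    $img_base64 = resizeTile($manager, $image_link);

    echo '<img class="tile" src="' . $img_base64 . '" alt="">';
    flush(); // push this tile's markup straight to the browser
}
// Note: a proxy or FastCGI buffer in front of PHP may still batch the output.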

GrahamTheDev

Cache the images, so resize the image once and then save it to a folder called cached-images or similar. Then use either media queries or the picture element to pull the right image for each screen size.

Once you have that working we can look at automating the process, so when you update an image it automatically creates the cached versions for you.
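
Roughly what I mean by the picture element is something like this (a sketch only; $username, $postId and the -small / -medium / -large file names are placeholders for whatever your cache script produces):

<?php
// Sketch: serve the cached sizes with the <picture> element so the browser
// picks the right file per screen size. Names below are examples only.
echo <<<HTML
<picture>
  <source media="(max-width: 600px)" srcset="/cached-images/{$username}/{$postId}-small.jpg">
  <source media="(max-width: 1200px)" srcset="/cached-images/{$username}/{$postId}-medium.jpg">
  <img src="/cached-images/{$username}/{$postId}-large.jpg" alt="">
</picture>
HTML;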

Ruairi Shanahan

@inhuofficial thanks for the advice.

So I refactored my code, and post successful auth with the IG API I am now doing the following with a cached folder:

  1. Check if a custom user-centric folder exists and, if not, create it, so the result is something like ~/cached-images/{username}; here I store each unique image according to its IG post ID
  2. I then perform my image optimisation and resizing according to the original image ratio, width & height, and create & save the new image at the desired resizing factor. These images are saved to the user's unique cached sub folder (i.e. ~/cached-images/{username}/{post-id}.jpg); see the sketch below
  3. I then output the images to the frontend using a foreach loop
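
Roughly, the caching step looks like this (a simplified sketch; resizeImage() stands in for my Intervention Image ratio / resize logic, and $media is the decoded list of media items from the API):

<?php
// Simplified sketch of the per-user cache step.
// resizeImage() is a stand-in for the existing ratio + resize logic,
// and $media is the decoded media list returned by the IG API.
$cacheDir = __DIR__ . '/cached-images/' . $username;

if (!is_dir($cacheDir)) {
    mkdir($cacheDir, 0755, true); // create ~/cached-images/{username}
}

foreach ($media as $post) {
    $cachedFile = $cacheDir . '/' . $post['id'] . '.jpg';

    // Skip the work if this post has already been optimised and cached
    if (!file_exists($cachedFile)) {
        resizeImage($post['media_url'], $cachedFile);
    }

    echo '<img src="/cached-images/' . $username . '/' . $post['id'] . '.jpg" alt="">';
}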

Observations:

After implementing the above approach there has been a minor performance improvement; however, the page still renders slowly, taking 25-30 seconds. I suspect the foreach loop that does the initial image optimisation and resizing needs to execute in full before producing any output.

Ideally, I need to implement some sort of background process that performs the image optimisation and primes the images into a unique user sub folder whilst the user journeys from the login view to the profile view.

I also need to add some logic that skips the optimisation and caching of the images if it has already been done (along the lines of the file_exists check sketched above) when I revisit the profile view. @ravavyr tagging you in here for the update ;)

GrahamTheDev

Exactly, from what you describe you are still having to run the cache script the first time the page loads. But where you should notice a massive difference is when the page loads a second time; it should be much, much faster. Caching in advance is defo the best bet if you can (perhaps a script that runs on a cron job or when you add a user account).

The other thing you can do is lazy load everything “below the fold”, but do it with JS so you can request those images be generated separately; that way you only need to generate, say, 9 images “on the fly” for the page to load (hope that makes sense).
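
As a rough sketch of the “generate separately” part, the lazy-loaded images could point at a small PHP endpoint that serves from the cache and only resizes on a miss (everything here is illustrative; lookUpOriginalUrl() and resizeImage() are placeholders for your own routines):

<?php
// image.php?user=...&post=... : illustrative on-demand resize endpoint.
// Serves a cached file if it exists, otherwise resizes once and caches it.
$user = basename($_GET['user'] ?? ''); // basename() guards against path traversal
$post = basename($_GET['post'] ?? '');

$cached = __DIR__ . '/cached-images/' . $user . '/' . $post . '.jpg';

if (!file_exists($cached)) {
    $source = lookUpOriginalUrl($user, $post); // placeholder: fetch the IG media URL
    resizeImage($source, $cached);             // placeholder: your resize routine
}

header('Content-Type: image/jpeg');
header('Cache-Control: public, max-age=86400');
readfile($cached);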

Ruairi Shanahan

Thanks again @inhuofficial for the feedback; bouncing ideas off you is very much appreciated, especially as this is a bit of a solo project with just me working on it, so I really do appreciate your input.

It's a bit of a chicken & egg scenario for me as to which should come first when it comes to my app and its UX for a first-time user signing up / auth'ing via IG. There's an expectation from the user that the app will behave and respond like the real thing, hence I really want that first view load / screen render to be really slick and fast, else I suspect a huge bounce rate / drop-off.

You mentioned automation in your previous post; this led me down the PHP CLI rabbit hole, and I think I have a possible solution using either proc_open, exec or shell_exec.

My thinking is that, post successful IG auth, I can kick off a PHP background process / script that does the caching in advance / on the fly whilst the app routes the user to their profile view.

This script would do all the hard work of creating a unique folder per user under the parent cached-images folder, and in that way my profile view can call upon that unique folder and lazy load the images below the fold.
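
Concretely, the kick-off I'm picturing right after auth is something like this (a sketch; warm_cache.php is a hypothetical CLI script that would contain the resize / cache loop, and this assumes a Linux-style host):

<?php
// After a successful IG auth, fire off a detached CLI worker to pre-warm the
// user's image cache while they are routed to their profile view.
// warm_cache.php is hypothetical: it would loop the user's media and save
// each resized image into ~/cached-images/{username}/.
$cmd = sprintf(
    'php %s %s > /dev/null 2>&1 &',
    escapeshellarg(__DIR__ . '/warm_cache.php'),
    escapeshellarg($username)
);

exec($cmd); // returns immediately; the worker keeps running in the background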

Do you think that's a good approach or am I barking up the wrong tree?

GrahamTheDev

Right, so the only concern left now seems to be the first time someone views their profile after setting up an account (correct me if I am wrong, as I do not know the project and am just guessing from our conversations so far).

So I would simplify this bit to:

  • Check the cache.
  • If there is no cached image, serve the original image straight from the source and add the image to a processing queue.
  • Process the queue on a cron job (see if this is helpful: code.tutsplus.com/tutorials/managi...) or, if you use cPanel, just set the cron job up there. A rough sketch of such a queue worker follows this list.
  • Then the next time those images are needed by the user, they should have them cached (step 2).
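
Something like this is what I have in mind for the cron-run worker (a sketch only; the image_queue table, its columns, the DB credentials and resizeAndCache() are all hypothetical stand-ins for your own setup):

<?php
// process_queue.php : hypothetical cron-run worker.
// Assumes an image_queue table with columns: id, source_url, cache_path,
// priority, processed. resizeAndCache() stands in for your resize routine.
$pdo = new PDO('mysql:host=localhost;dbname=shuzzle', 'user', 'pass');

$rows = $pdo->query(
    'SELECT id, source_url, cache_path
       FROM image_queue
      WHERE processed = 0
   ORDER BY priority ASC
      LIMIT 25'
)->fetchAll(PDO::FETCH_ASSOC);

$done = $pdo->prepare('UPDATE image_queue SET processed = 1 WHERE id = :id');

foreach ($rows as $row) {
    resizeAndCache($row['source_url'], $row['cache_path']);
    $done->execute(['id' => $row['id']]); // mark the job as finished
}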

You will lose a little bit of first load speed as the images aren’t optimised (bear in mind we are only going to serve unoptimised “above the fold” images and serve the rest with lazy loading) but it will still be faster. Then on further loads it will be much faster.

As you said, you can move the “adding to the queue” part forward to the point the user is signing up, to start the caching process earlier and (hopefully) have the images cached before they access the page. But if you do the above it should still be fast even when the server is under load.

The last observation / thought: to minimise that initial delay, only create one size of optimised image on the first run (queue the rest separately as “lower priority” using a priority integer in the DB); the less processing you do initially, the faster that first page load will be.

Getting quite advanced there so in case anything isn’t clear I am using the following logic:

  • Minimise how much processing is required for the initial page load.
  • Serve from cache where we can.
  • Prioritise caching for the initial page load.

Hopefully it all makes sense! Sounds like a great project and by the end of it you should have an awesome image loading optimisation library that you can recycle into numerous projects!

Ruairi Shanahan

@inhuofficial thanks again for your feedback and guidance. I refactored my code, merging your suggestions with the optimisations, and got decent performance gains. Must admit I've been learning a lot since you pointed me in the right direction.

So my approach now is:

  • User auths against IG
  • Successful auth routes the user back to my app and, in the background, kicks off an optimising and caching script
  • The app does a once-off "on-the-fly" image optimisation of the first set of images pulled from the API
  • Then, on every subsequent page reload, if the unique user's cached folder exists, it pulls images from that source

Overall, this has considerably improved the initial UX of my MVP, so I'm pretty stoked with what I've achieved to date.

However, looking into the crystal ball, I'm envisioning a few performance issues / concerns.

  • Whenever my dyno restarts, the "cached" folders are lost.
  • I still suspect my current solution may not scale elegantly.

Future approach:

  • Refactor the background optimisation script to store cached images in a cloud storage solution (roughly sketched below).
  • Build on this script to pull, optimise and store more images from an authenticated IG user and display them in my app.
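
For the cloud storage part, I'm imagining something along these lines with an S3-compatible object store (just a sketch; the bucket name, region and key layout are placeholders, and it assumes the AWS SDK for PHP with credentials supplied via the environment):

<?php
// Sketch: after resizing, push the cached image to an S3-compatible bucket
// instead of the dyno's ephemeral filesystem. Bucket, region and key layout
// are placeholders; credentials come from the default provider chain.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'eu-west-1',
]);

$s3->putObject([
    'Bucket'      => 'shuzzle-cached-images',
    'Key'         => $username . '/' . $postId . '.jpg',
    'SourceFile'  => $localCachedPath,   // the file produced by the resize step
    'ContentType' => 'image/jpeg',
]);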

GrahamTheDev

Not sure about the lost cached folders; I guess there must be an easy way to persist them as actual physical folders, but I am a LAMP stack dev so I am not sure I can offer much help here, other than saying that using a cloud storage solution as you suggested sounds like the simplest route.

As for scaling, worry about that when the time comes. If it scales to a point where it becomes a bottleneck then hopefully you will be generating revenue (at which point you can pay someone to consult on optimisation and get a more customised / detailed plan!).

Glad you are making massive improvements, I look forward to seeing it all in action! ❤

Ruairi Shanahan

Morning @inhuofficial, if you want to check it out and you have an IG handle, you can log in via shuzzle.me.

That is true, probs I'll only have to worry about scaling issues when the time comes and the project is paying for itself and a little extra. I'm still intrigued by the cloud object storage and potential BE solution I could implement, so I may spin up a side project to get a proof of concept working.

Thanks again for your insights and guidance. Much appreciated.

Ravavyr

I would not recommend resizing "on the fly" with PHP. In the long run this will crash your server: more users, more images loaded, one server, and PHP will run out of memory eventually. In fact, I bet it would crash if you rendered 100 images on one page and just hit refresh 50 times.

On upload, process the images and save them in a "resized" folder. That way you can keep the original uploads and the resized versions separate. More storage, yes, but you keep the originals, which in some cases matters. If it doesn't, just don't save the originals.

Also, your image resize script is incomplete and doesn't handle all size scenarios.
Look at the conditions in the answer here: stackoverflow.com/questions/146496...
It covers more variations than yours does.
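
For instance, a version that covers both orientations in one go could look roughly like this (a sketch only, using a simple fit-within-a-box rule; the max dimensions are arbitrary):

<?php
// Fit the image inside a maximum box while preserving aspect ratio.
// Works for landscape, portrait and square sources alike.
function fitWithin(int $origWidth, int $origHeight, int $maxWidth, int $maxHeight): array
{
    $scale = min($maxWidth / $origWidth, $maxHeight / $origHeight, 1); // never upscale

    return [
        (int) round($origWidth * $scale),
        (int) round($origHeight * $scale),
    ];
}

// e.g. [$w, $h] = fitWithin(3000, 2000, 1200, 1200);  // gives [1200, 800]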

GrahamTheDev • Edited

"I suspect I need to save or cache the images then process / resize them and then deliver them to the end user."

Yes, that is the technique you will need to employ if you are unable to call smaller versions of the images from the API; trying to resize them "on the fly" will be both slow and computationally expensive, so you only want to do it once!

If you are doing this you should also try and convert the images to .webp format at the same time as resizing as that will also decrease the file size.

WebP output is built in if you have the GD extension compiled with WebP support; see the imagewebp function (Imagick can also convert to WebP).

You can also resize with $finalImage = imagecreatetruecolor($finalWidth, $finalHeight); and then use imagecopyresized to copy the large image into the new image you created, ready for saving.

The only downside to using WebP is that you need to cache both a .webp and a .jpg and use either browser sniffing in JS (if you are loading the images dynamically) or the <picture> element to ensure you still serve .jpg images to browsers that don't support .webp.
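
Put together, the GD side of that could look roughly like this (a sketch; the source path, cache paths and the 1200px target width are just examples):

<?php
// Sketch: resize once with GD and cache both a .webp and a .jpg version.
// Source path, cache paths and the 1200px target width are examples only.
$source = imagecreatefromjpeg('/tmp/original.jpg');

$srcW = imagesx($source);
$srcH = imagesy($source);

$finalWidth  = 1200;
$finalHeight = (int) round($srcH * ($finalWidth / $srcW)); // keep the aspect ratio

$finalImage = imagecreatetruecolor($finalWidth, $finalHeight);
imagecopyresized($finalImage, $source, 0, 0, 0, 0, $finalWidth, $finalHeight, $srcW, $srcH);

// Requires GD compiled with WebP support for imagewebp()
imagewebp($finalImage, 'cached-images/example.webp', 80);
imagejpeg($finalImage, 'cached-images/example.jpg', 80);

imagedestroy($source);
imagedestroy($finalImage);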

Ruairi Shanahan

@inhuofficial many thanks for your prompt response and suggested solution. The imagewebp function & format is one I have not considered. I'll give this a go later and see how I get on ;)

izio38

Hi!

Have you tried making 2 versions of your images, one blurred and one full? I used this technique before, and it reduces the first image size by ~20; I think it depends on the image, but... doable!

While the second image is loading, you show the blurred version. That's not perfect, but it helps the user experience, in my opinion.
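
Since the thread already uses Intervention Image, a tiny blurred placeholder could be generated roughly like this (a sketch assuming the same $manager and $image_link as above; the 40px width and blur strength are arbitrary choices):

<?php
// Sketch: generate a tiny, heavily blurred placeholder with Intervention Image
// and inline it as a base64 data URL; the full image replaces it once loaded.
$placeholder = $manager->make($image_link)
    ->resize(40, null, function ($constraint) {
        $constraint->aspectRatio(); // keep the original proportions
    })
    ->blur(10)
    ->encode('data-url');

// data-full would be swapped in by whatever lazy-load script you use
echo '<img src="' . $placeholder . '" data-full="' . $image_link . '" alt="">';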

Another way is to make an IP filter to only accept people coming from big cities with good connection. Joke.

Good luck with your project! 👋

Ruairi Shanahan

@izio38 haha lols I like your IP filter suggestion ;)

Hmm, that's an interesting approach, to serve up a blurred version whilst the other one loads; it gives the user a better sense that something is happening. I'll take that into consideration, thanks for your input.

Ravavyr

your "blurred" version would essentially be the cached smaller version [it would get pixelated if rendered at the normal size versus its own smaller size]
To save even more bandwidth you could set a 1px image as the source for each image and swap it in with the larger images using a lazy loading method, so you only show images that are near or on the screen instead of rendering all of them on the page.

I'd add Cloudflare to your server so your cached images get cached via their CDN edge nodes around the world and every request doesn't hit your server directly.

Ruairi Shanahan

Hi @ravavyr,

Thanks for your response and suggestion. I had thought about implementing Cloudflare, but another consideration is that currently my app does not store any of the user's media on my server. I'm simply authenticating against the Instagram API, which then allows my app to call on the user's IG media via the media endpoint and push that image to the end user's browser.

So I'm not sure if I can cache images that require authentication to be viewed.

I do like your suggestion of "to save even more bandwidth you could set a 1px image as the source for each image and swap it in with the larger images using a lazy loading method", and will defo implement that.

Ravavyr

What if you store the data after the first time the API is called? [Note: IG may have terms against that.]
What if you hit the API, grab the data and store it in your own text files that you serve to users, to reduce API calls?

Ruairi Shanahan

Thanks for the input, that's a good point; I'll need to check IG's API T&Cs.