Why do you need to enable gzip (or another compression method) to increase page speed?
Now let's get started.
Google Cloud Storage (GCS) serves files uncompressed by default. There is, however, an option to enable gzip compression for selected files, with one catch: only static compression is possible, meaning the files must already be compressed when they are uploaded to the bucket. Some servers support dynamic compression, compressing files on the fly when a client requests them, but GCS, as mentioned, doesn't do this.
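To make static compression concrete, here is a minimal sketch of doing it by hand (the file name app.js and the [BUCKET] placeholder are just examples, not from the original post):

# Compress locally, keeping the original URL-facing name
gzip -9 app.js && mv app.js.gz app.js
# Upload with metadata telling GCS the object is already gzipped
gsutil -h "Content-Encoding:gzip" -h "Content-Type:application/javascript" \
  cp app.js gs://[BUCKET]/app.js

Luckily, gsutil can do the compression step for you, as shown next.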
There are two common ways to upload multiple files to GCS (the Cloud Console and the gsutil command-line tool); we will use gsutil:
gsutil -m cp -r -z css,js,map [LOCAL_DIR] gs://[BUCKET]/[REMOTE_DIR]
-z
This flag compresses files with the listed extensions (here css, js, and map) before uploading them to GCS and sets Content-Encoding: gzip on each object, so compressed files are served to users and your page speed improves.
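If you want every uploaded file gzipped rather than only certain extensions, gsutil also has an uppercase -Z option that applies gzip content-encoding to all files in the upload (same placeholders as above):

gsutil -m cp -r -Z [LOCAL_DIR] gs://[BUCKET]/[REMOTE_DIR]

Avoid this for already-compressed formats like JPEG or PNG, where gzip adds CPU cost for little or no size gain.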
To increase page speed further, also set Cache-Control headers:
gsutil -h "Content-Type:text/html" \
-h "Cache-Control:public, max-age=3600" cp -r images \
gs://bucket/images
This tells browsers (and any CDN in front) to cache the images for an hour (3600 seconds), so repeat visits skip the download entirely and your page speed improves. Note that no Content-Type header is needed here; gsutil infers it from the file extension.
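If the images are already in the bucket, you don't have to re-upload them; gsutil setmeta can update the header in place (the bucket path below is a placeholder):

gsutil -m setmeta -h "Cache-Control:public, max-age=3600" gs://bucket/images/**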
You can see what metadata is currently set on an object by running:
gsutil ls -L gs://the_bucket/the_object
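To quickly check just the headers we care about, you can filter that output (a small convenience, not required):

gsutil ls -L gs://the_bucket/the_object | grep -i -E "content-encoding|content-type|cache-control"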
Finally, check whether gzip compression is actually enabled on your website by inspecting (for example, with your browser's developer tools) whether the response contains the content-encoding: gzip and, for JS files, content-type: application/javascript headers, e.g.:
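You can also run the same check from the command line with curl; the URL below is a placeholder for one of your uploaded files:

curl -sI -H "Accept-Encoding: gzip" https://storage.googleapis.com/[BUCKET]/app.js \
  | grep -i -E "content-encoding|content-type"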
For AWS CloudFront
Open the CloudFront console, locate your distribution, and set Compress Objects Automatically to Yes in the Behavior options; CloudFront then compresses eligible files on the fly, so no pre-compression is needed.
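To verify the setting without the console, one option (a sketch; [DISTRIBUTION_ID] is a placeholder) is to query the distribution config with the AWS CLI:

aws cloudfront get-distribution-config --id [DISTRIBUTION_ID] \
  --query "DistributionConfig.DefaultCacheBehavior.Compress"

This prints true when automatic compression is enabled on the default cache behavior.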
Now enjoy your improved Lighthouse score!
Save this post for future reference