Not just the way you combine them, but also the way you actually write them. If you are using a CSS or HTML feature that isn't supported in all browsers, what happens to users whose browser doesn't support the feature you've implemented?
Based on the results, I sort the sites into three categories: The Good, The Bad, and The Ugly.
Let's begin with the good.
Can you spot the difference? Don't worry, it's quite minimal.
For this page, if any part of it fails to load, it's considered bad.
The purpose of this research is to show that Resilient Web Design wins, as demonstrated by sites like Google, Bing, Netlify, and Wikipedia.
Edited October 9, 2020: Grammar fix and deletion of obsolete canonical URL.
Top comments (20)
The importance of use User Agent to Scraping Data
Hugo Sandoval ・ Apr 30 ・ 6 min read
Thank you for the link @Pacharapol, I'll definitely read the article.
Do you have any resource that I can check for this? Or is it in the linked article?
No, I cite it myself.
GET-based web scraping (e.g. Python requests) reads the raw HTML output (e.g. with Python BeautifulSoup). Otherwise, you will need to resort to a more heavyweight method like Selenium or Puppeteer (Chrome engine).
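A minimal sketch of why the lightweight approach can fall short (standard library only; the HTML snippet is hypothetical): a plain GET returns the server's raw HTML, so content a script would render is simply absent, while the `<noscript>` fallback sits right there in the markup.

```python
import xml.etree.ElementTree as ET

# Hypothetical raw HTML, as a plain GET (e.g. requests.get(url).text) would
# return it. The <script> never executes, so #app stays empty; only the
# <noscript> fallback is present as real markup.
raw_html = """<html><body>
  <div id="app"></div>
  <noscript><p>Please enable JavaScript.</p></noscript>
  <script>document.getElementById('app').textContent = 'Rendered by JS';</script>
</body></html>"""

root = ET.fromstring(raw_html)
app = root.find(".//div[@id='app']")
noscript_msg = root.find(".//noscript/p").text

print(app.text)      # None: the JS side effect never ran
print(noscript_msg)  # Please enable JavaScript.
```

A real scraper would use BeautifulSoup for messier HTML, but the point is the same: if the data you want only appears after JavaScript runs, it will not be in the response body, and a browser-driving tool like Selenium becomes necessary.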
Excellent. Thanks for the info.
Definitely a good security practice to use noscript in my eyes. Once you get used to it there's no need to go back to js on by default.
It's satisfying to see so many adverts, Google analytics and other crud getting blocked by noscript. I get a good sense of the website by how many third party scripts it runs and what they are called.
One major pain point is CloudFront-type hosting. In particular, the AWS console will load entirely necessary content from literally dozens of different CloudFront domains that all need to be trusted individually... Every AWS service can use a bunch of different domains :( You can always just disable noscript for a tab when you hit annoying niche cases like that, though :)
Noscript is great, highly recommend it!
Hahaha I love how you threw DEV in there right at the end
If you turn js off then you need to live with the fact that you can't use the majority of sites!
I remember times when some developers started trying to make it look trendy, cool, and smart to create sites that can be used with JS both on and off. This stupidity only means doubling or tripling the development time and budget, nothing else, which of course those people don't see, and some things are simply not possible without JS anymore. It is the silliest idea...
At that time I had to deal with this issue and face the fact that some people wanted me to create sites like that. It was very annoying, and a lot of time was wasted arguing over needless things. Trying to make things work without JS is not possible anymore, or results in a very poor website.
If we follow this logic, we can try to use a computer with the CPU removed to see if it still works without it... Removing the CPU, by the way, also protects you from malware and viruses LOL
Experimenting is a good thing, but you should still keep your sanity!
I already commented on similar problems before: dev.to/bravemaster619/comment/mok2
How about DuckDuckGo? :)
For the Home Page, the difference is really noticeable.
P.S. If you prefer the second but do have JS enabled, use start.duckduckgo.com/ instead. :)
The beautiful and useful css.gg ❤️
Some comments have been hidden by the post's author.