We are living in a world of bots, and their traffic is unwelcome on our websites for many reasons: it slows down response times, scrapes content, generates fake interactions and inflated traffic figures, and more.
What is your experience: is your website being scraped, and how can you tell? I started working on a bot protection test service. There is a lot more to be done before launch day, but here is my website: https://botmenot.com/
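As a starting point for the "how can you tell?" question, here is a minimal sketch of two common heuristics applied to a web server access log: flagging IPs with unusually high request counts and flagging requests whose User-Agent string looks like a known bot. The log format, thresholds, and patterns below are illustrative assumptions, not part of the service mentioned above.

```python
import re
from collections import Counter

# Illustrative pattern of common bot/scraper User-Agent substrings.
BOT_UA_PATTERN = re.compile(
    r"bot|crawler|spider|scrapy|curl|python-requests", re.IGNORECASE
)

def suspicious_ips(log_lines, threshold=100):
    """Return IPs whose request count exceeds the threshold.

    Assumes the client IP is the first whitespace-separated field,
    as in the common/combined log formats.
    """
    counts = Counter(line.split()[0] for line in log_lines if line.strip())
    return {ip for ip, n in counts.items() if n > threshold}

def bot_user_agents(log_lines):
    """Return log lines whose User-Agent matches a bot-like pattern.

    Assumes the User-Agent is the last double-quoted field on the line,
    as in the combined log format.
    """
    hits = []
    for line in log_lines:
        quoted = re.findall(r'"([^"]*)"', line)
        if quoted and BOT_UA_PATTERN.search(quoted[-1]):
            hits.append(line)
    return hits

# Tiny synthetic sample for demonstration.
sample = [
    '1.2.3.4 - - [01/Jan/2024] "GET / HTTP/1.1" 200 "-" "Mozilla/5.0"',
    '5.6.7.8 - - [01/Jan/2024] "GET /p HTTP/1.1" 200 "-" "python-requests/2.31"',
]

print(bot_user_agents(sample))
```

Real detection is harder than this, of course: sophisticated scrapers rotate IPs and spoof browser User-Agents, which is exactly why dedicated testing and protection tools exist.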