Fact: Did you know that an average person interacts with a bot at least 3-4 times a day? But when? And how? Impressive, isn't it...
It is also said that in the future, a person will interact more with bots than with their spouse.
Let's unfold the truth behind this...
In simple words, bots are dumb machines programmed to do repetitive tasks, which a human automates to save time. This is the definition I arrived at after working on a few bots.
Bots are becoming more and more common because of their accuracy, speed, and increasingly human-like behavior. If you ask me, "Where can I find a bot?", I would reply in just a few words: "Everywhere on the web."
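To make that definition concrete, here is a minimal sketch of a rule-based bot in Python. The rules and canned replies are made up for illustration, but this is the essence of "a dumb machine programmed for repetitive tasks":

```python
# A minimal rule-based "bot": it maps repetitive questions to
# canned answers, the way the simplest chatbots do.
RULES = {
    "hi": "Hello! How can I help you?",
    "hours": "We are open 9am-6pm, Monday to Friday.",
    "bye": "Goodbye!",
}

def reply(message: str) -> str:
    """Return a canned reply, or a fallback for unknown input."""
    return RULES.get(message.strip().lower(), "Sorry, I didn't understand that.")
```

No intelligence involved: the bot just answers the same questions the same way, tirelessly, which is exactly why it saves a human time.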
The most common type of bot that you can experience right now is the voice assistant. Just unlock your phone and say:
- "Hey, Siri" if you have an iPhone.
- "Ok, Google" if you have an Android phone.
Similarly, you can find Alexa or Google Home/Mini, home assistants used for many tasks a person wants to execute within the house, like playing music, reading the news, setting reminders, and many more.
The above examples are advanced bots contributed by giant technology leaders, but there are many small bots that developers write for their own convenience: chatbots, web crawlers, social bots, and even some malicious bots.
In this section, let's dive deep into the nature of bots and how they help us with our day-to-day tasks.
So, bots are developed to automate various repetitive tasks, which turns out to be useful in many ways, but a few are developed to harm your resources. Based on this, bots are classified into good bots and bad bots.
What is a Good Bot?
- Good bots are built to generate profit for a business, and they are beneficial for both businesses and individuals. A simple example: whenever you search for a website, product, or service, you often get near-accurate results. How?
- This is possible because of the search engine's spider bot, also known as a crawler bot. Examples include:
- Slurp Bot [Yahoo]
- Alexa Crawler [Amazon Alexa]
- Reputed companies deploy these bots by following the webmaster's rules for crawling activity and indexing rate declared in the website's robots.txt.
- Besides these search engine crawlers, there are many different third-party bots as well.
By now you should be clear about good bots: any bot that follows the webmaster's rules and regulations, and whose activity results in profit for the business, is a good bot.
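The robots.txt check that well-behaved crawlers perform can be sketched with Python's standard `urllib.robotparser`. The rules and the `MyCrawler` user agent below are hypothetical:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt: the file a good crawler checks before fetching pages.
robots_txt = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A good bot asks permission for each path before crawling it.
print(parser.can_fetch("MyCrawler", "/products/toys"))  # allowed by "Allow: /"
print(parser.can_fetch("MyCrawler", "/admin/login"))    # blocked by "Disallow: /admin/"
```

A bad bot simply skips this check, which is one of the key behavioral differences between the two.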
What is a Bad Bot?
- As we know, bad is always the opposite of good. These bots are built by hackers, cybercriminals, and fraudsters to engage in illegal activities.
- They are programmed to do malicious jobs on the web.
- Let's take an example: you run a toy business and sell a unique toy that you designed yourself. Your competitor may build a scraper bot that collects all your content, product reviews, feedback, and details of the new toy you are working on, and then publishes fake reviews on other websites.
- A second example: bad bots fire thousands of visits at your website within a minimal span of time, which chokes availability for genuine users.
Such bots are highly injurious to brand reputation, which results in hampering the website's search engine ranking.
Good bots are used to generate profit for the business and also help to build your domain and website health. They help by crawling websites for search engine optimization (SEO), collecting information, obtaining marketing analytics, and much more.
Social Network Bots: These bots are managed and supported by social networking sites like Facebook and Twitter. They help give visibility to a brand's website and drive engagement on the platform.
Feedfetcher Bots: These bots collect information from different websites and help keep subscribers updated on products, events, and blog posts.
Partner Bots: These are third-party bots developed and supported by SaaS organizations like Slack, PayPal, Stripe, and many more. They help integrate those services directly with programs within your organization.
Monitoring Bots: These bots are programmed to periodically monitor the uptime and health of a server or website and keep us updated.
Search Engine Crawlers: These are the most common, and perhaps the most used, bots in the modern world. No matter who you are, you rely on search engines to simplify things and get your work done.
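A monitoring bot of the kind listed above can be sketched in a few lines of Python. The helper names are my own, and a real monitoring service would add alerting, retries, and dashboards:

```python
import urllib.error
import urllib.request

def check(url, timeout=5.0):
    """One health check: True if the site answers with a 2xx/3xx status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 400
    except (urllib.error.URLError, ValueError, OSError):
        return False

def uptime(results):
    """Percentage of successful checks, e.g. for a daily uptime report."""
    return 100.0 * sum(results) / len(results) if results else 0.0

# A real monitoring bot would loop forever, roughly:
#   while True:
#       results.append(check("https://example.com"))
#       time.sleep(60)
```

The bot's whole job is to repeat `check` on a schedule and summarize the results, which is exactly the kind of repetitive task bots were defined for earlier.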
Let's look at a few Bad Bots
- Scraper Bots: These bots are used to steal vital information such as prices, updates, and content. This helps a competitor undermine your business strategy and target your company's revenue.
- A point to remember: competitors often use third-party scrapers to perform this illegal act.
- Spam Bots: These bots target community forums, lead-collection forms, and comment sections.
- They usually flood these sections with unwanted promotional advertisements and links, and troll genuine users in the comments.
- Such activity frustrates genuine users who want to comment or use the forum's information. The main motive of these bots is to insert links to phishing pages built to collect critical user information, including bank account details, usernames, and passwords.
- Ticket Scalping Bots: These bots target ticketing websites, purchasing hundreds or thousands of tickets and reselling them through third-party sellers, so the genuine ticket-selling website loses its customers.
- Each and every activity of a bot depends on data, be it training data or real-time data.
- I always recommend not exposing your website over plain HTTP; serve it over HTTPS, and use appropriate measures such as robots.txt rules and rate limiting to restrict such bots from crawling your website.
- In this world of machines and artificial intelligence, I want everyone to learn how bots work and why you might need one.
- If you are keen to learn this new paradigm, "first decode it": that is what I have discussed above in this blog.
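As a concrete starting point for the advice above, a robots.txt file looks like this (the paths and the bot name are hypothetical). It tells well-behaved crawlers what they may index, though bad bots may simply ignore it, which is why rate limiting and other defenses still matter:

```
User-agent: *
Crawl-delay: 10
Disallow: /admin/

User-agent: BadBot
Disallow: /
```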
I hope you enjoyed reading! Stay tuned for Part 2, where I will build a Telegram bot from scratch.
Thank You! Do follow and share 🤗