Determining the exact percentage of bots versus human users on the internet is challenging due to various factors, including the constantly evolving nature of bot technology and the diversity of online platforms. However, it is generally believed that a significant portion of internet traffic is generated by bots.
In my personal opinion, most users on the internet are actually bots. I've coded web scrapers myself in the past for various purposes, and they can be implemented in ways that pass through automated protections. Elon Musk ran into the same thing in his attempts to stop bots on Twitter. Even on this middle-of-nowhere blog, I receive spam comments from Russian, Chinese and American individuals. They have nothing to do with the actual blog posts, but contain affiliate links to external platforms.
Web Crawlers or Spiders
These bots, often employed by search engines like Google, crawl the web to index and catalog web pages for search results. They parse the HTML programmatically and can follow all the referenced pages. As long as the structure of the pages doesn't change, they can extract the content in a structured way.
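As a minimal sketch of the link-following step: a real crawler would fetch pages over HTTP and use a proper HTML parser, but the core idea can be reduced to pulling href attributes out of the markup. The HTML snippet and class name below are made up for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class LinkExtractor {
    // Naive href extraction via regex; production crawlers use a real
    // HTML parser (e.g. jsoup) because regex breaks on messy markup.
    private static final Pattern HREF =
            Pattern.compile("href=\"(http[^\"]+)\"");

    public static List<String> extractLinks(String html) {
        List<String> links = new ArrayList<>();
        Matcher m = HREF.matcher(html);
        while (m.find()) {
            links.add(m.group(1));
        }
        return links;
    }

    public static void main(String[] args) {
        String html = "<a href=\"https://example.com/a\">A</a>"
                + "<a href=\"https://example.com/b\">B</a>";
        // prints: [https://example.com/a, https://example.com/b]
        System.out.println(extractLinks(html));
    }
}
```

A crawler then repeats this on every extracted link, which is exactly the "follow all referenced pages" behavior described above.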
Web View Bots
Example – searching for Java Swing web view components: https://www.google.com/search?q=webview+java+swing
Many programming platforms have web engine components that can load a URL. You can inspect the result, see what was loaded, and extract the data – even if it is loaded dynamically.
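As a sketch of the extraction side: a full web view component like JavaFX WebView can render a page (including running its JavaScript), but the "pull the text out of what was loaded" idea can be shown headlessly with the JDK's own Swing HTML parser. The class name and sample HTML below are made up for illustration.

```java
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import javax.swing.text.html.HTMLEditorKit;
import javax.swing.text.html.parser.ParserDelegator;

public class PageTextExtractor {
    // Collects the visible text of an HTML document using the JDK's
    // built-in Swing HTML parser (note: it does NOT execute JavaScript,
    // unlike a real web engine such as JavaFX WebView).
    public static String extractText(String html) {
        StringBuilder text = new StringBuilder();
        HTMLEditorKit.ParserCallback callback = new HTMLEditorKit.ParserCallback() {
            @Override
            public void handleText(char[] data, int pos) {
                text.append(data).append(' ');
            }
        };
        try {
            new ParserDelegator().parse(new StringReader(html), callback, true);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return text.toString().trim();
    }

    public static void main(String[] args) {
        // prints: Hello Some content
        System.out.println(extractText(
                "<html><body><h1>Hello</h1><p>Some content</p></body></html>"));
    }
}
```

For JavaScript-heavy pages the same extraction would be done against the DOM after the web engine has finished rendering.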
Executing actions on Web Views
There are several technologies that allow executing actions on web pages.
Headless Browsers
Headless browsers, such as Puppeteer (for JavaScript) or Selenium WebDriver (for various programming languages), allow you to automate web interactions, including clicking on links, buttons, and other elements. These tools simulate the behavior of a real browser without the need for a graphical user interface.
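As a sketch of what such automation looks like in Java, assuming the selenium-java dependency and a matching ChromeDriver binary are installed – the URL and the #submit selector are placeholders, not a real site:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.chrome.ChromeOptions;

public class HeadlessClicker {
    public static void main(String[] args) {
        // Run Chrome without a visible window.
        ChromeOptions options = new ChromeOptions();
        options.addArguments("--headless=new");

        WebDriver driver = new ChromeDriver(options);
        try {
            // Placeholder URL and selector for illustration only.
            driver.get("https://example.com/login");
            driver.findElement(By.cssSelector("#submit")).click();
            System.out.println("Page title after click: " + driver.getTitle());
        } finally {
            driver.quit();
        }
    }
}
```

Because a real browser engine is driving the page, JavaScript runs and dynamic content loads just as it would for a human visitor.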
Browser Extensions
Browser extensions, like Tampermonkey or Greasemonkey, provide a way to customize the behavior of web pages. You can write scripts in JavaScript that run within the browser and programmatically interact with the page. This includes clicking on elements.
API-based Interactions
Some websites and programming platforms provide APIs (Application Programming Interfaces) that allow you to interact with their services programmatically. Instead of directly clicking on web pages, you can use these APIs to achieve specific actions like submitting forms or triggering certain events. An example API you could use in Java is java.awt.Robot, which can type text and execute actions with the mouse, including drag and drop.
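As a sketch of the API-based approach with the JDK's built-in HTTP client: instead of clicking a submit button, a bot builds the same form POST the button would trigger. The endpoint URL and the comment field name below are hypothetical.

```java
import java.net.URI;
import java.net.URLEncoder;
import java.net.http.HttpRequest;
import java.nio.charset.StandardCharsets;

public class FormSubmitRequest {
    // Builds (but does not send) a form-style POST – the kind of call an
    // API-driven bot makes instead of clicking a submit button.
    // The endpoint and the "comment" field name are made up for illustration.
    public static HttpRequest build(String comment) {
        String body = "comment=" + URLEncoder.encode(comment, StandardCharsets.UTF_8);
        return HttpRequest.newBuilder()
                .uri(URI.create("https://example.com/api/comments"))
                .header("Content-Type", "application/x-www-form-urlencoded")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest request = build("Nice post!");
        // prints: POST https://example.com/api/comments
        System.out.println(request.method() + " " + request.uri());
    }
}
```

Sending it is one more line with java.net.http.HttpClient, which is exactly how comment-spam bots hit a blog's form endpoint directly, bypassing the page entirely.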
Scammers can use the power of the cloud to create farms of users that act in sync, limited only by the amount of computing power a malicious individual can gather. And with the help of AI, these bots can even use human platforms like social media with varied messages and texts. This makes it even harder to recognize the robot in the room.
Image source: https://pixabay.com/vectors/robot-technology-artificial-5702074/