Web automation tools like Selenium help with website acceptance testing and user flow testing. There is an underutilized area of development that leverages these technologies by stretching their power well beyond that. The intended use case is to open a website that I own and test functionality I programmed into it to make sure it's working correctly. However, Selenium and many tools like it can crawl and interact with any website, and you can inject Python (or other language) logic in the middle of the actions taken on your behalf. I want to explore a tool I created that takes advantage of this, and brainstorm some other uses that might create new business opportunities.
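To make the "inject logic between actions" point concrete, here's a minimal sketch in Python. The URL, credentials, and selectors are placeholders I made up; the point is only that ordinary Python control flow can sit between browser actions.

```python
# A minimal sketch: drive a page with Selenium and branch on what we find.
# The URL, credentials, and selectors are placeholders, not a real site.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")

    # Ordinary acceptance-test style interaction...
    driver.find_element(By.NAME, "email").send_keys("me@example.com")
    driver.find_element(By.NAME, "password").send_keys("hunter2")
    driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()

    # ...with arbitrary Python logic injected between actions:
    # only dismiss the promo banner if the page happened to show one.
    banners = driver.find_elements(By.CSS_SELECTOR, ".promo-banner")
    if banners:
        banners[0].find_element(By.CSS_SELECTOR, ".dismiss").click()

    print(driver.title)
finally:
    driver.quit()
```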
A few years ago, I noticed I was subscribed to a large number of mailing lists, and instead of unsubscribing from them one at a time over weeks, I took 20 minutes and unsubscribed from 50 or so. As I did this, I recognized that the websites they sent me to were all very similar: each required a few silly little actions to unsubscribe, but each was slightly different. It was a prime target for automation, and Selenium was the best tool for the job. So I set up an email address to receive forwarded emails and created a cron job to fill out these forms for others. You can check out the project at unsubscriberobot.com. It was a perfect combination of a web automation problem and a simple task that could be repeated at scale with a small expert system. I built it to great success: I've stopped over 30,000 spam subscriptions.
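To give a flavor of what the expert-system part looks like, here is a rough sketch (not the actual Unsubscribe Robot code) that opens an unsubscribe URL and tries a few common page patterns in order. The rules and selectors are illustrative guesses.

```python
# A rough, hypothetical sketch of a rule-based unsubscriber.
# Each rule is (description, how to find candidate elements, what to do with one).
from selenium import webdriver
from selenium.webdriver.common.by import By

RULES = [
    ("confirm button",
     lambda d: d.find_elements(By.XPATH, "//button[contains(., 'Unsubscribe')]"),
     lambda el: el.click()),
    ("submit input",
     lambda d: d.find_elements(By.CSS_SELECTOR, "input[type=submit]"),
     lambda el: el.click()),
    ("opt-out-all link",
     lambda d: d.find_elements(By.PARTIAL_LINK_TEXT, "unsubscribe from all"),
     lambda el: el.click()),
]

def unsubscribe(url: str) -> bool:
    """Try each rule against the page; stop at the first one that matches."""
    driver = webdriver.Chrome()
    try:
        driver.get(url)
        for name, find, act in RULES:
            matches = find(driver)
            if matches:
                act(matches[0])
                print(f"applied rule: {name}")
                return True
        return False
    finally:
        driver.quit()
```

The nice property of this shape is that handling a new kind of unsubscribe page is just appending another rule to the list.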
There are some other businesses that could be ripe for a technology like this, and I'd like to explore them here. They aren't fully formed, billion-dollar ideas in their current form, just explorations given the current state of the tech. First, crawling for information to build out a knowledge base for a specific industry, or to create a search engine for certain information. A tool like this can access logged-in pages and pull information that isn't generally available to search engines. It can also make intelligent decisions along its crawling path, reducing the number of web requests or data downloads to make the crawl more profitable and useful to you as the crawler.
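As a sketch of what that could look like, here is a hypothetical selective crawler: it logs into a made-up portal once, then only follows links whose anchor text matches the topics it cares about, which is what keeps the request count down. Everything site-specific here (URL, field names, topics) is an assumption.

```python
# A sketch of a selective crawler behind a login, against a hypothetical portal.
from collections import deque
from selenium import webdriver
from selenium.webdriver.common.by import By

TOPICS = ("pricing", "spec sheet", "datasheet")  # placeholder topics we care about

def crawl(start_url: str, max_pages: int = 50) -> dict:
    driver = webdriver.Chrome()
    seen, queue, knowledge = set(), deque([start_url]), {}
    try:
        # Log in once; the session persists for the rest of the crawl.
        driver.get("https://example-portal.com/login")
        driver.find_element(By.NAME, "username").send_keys("crawler@example.com")
        driver.find_element(By.NAME, "password").send_keys("secret")
        driver.find_element(By.CSS_SELECTOR, "button[type=submit]").click()

        while queue and len(seen) < max_pages:
            url = queue.popleft()
            if url in seen:
                continue
            seen.add(url)
            driver.get(url)
            knowledge[url] = driver.find_element(By.TAG_NAME, "body").text

            # The "intelligent decision": only enqueue links that look relevant.
            for link in driver.find_elements(By.TAG_NAME, "a"):
                href = link.get_attribute("href")
                text = (link.text or "").lower()
                if href and any(topic in text for topic in TOPICS):
                    queue.append(href)
        return knowledge
    finally:
        driver.quit()
```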
Another thought is automated survey taking. This might not be moral, but you could imagine a robot that automatically takes surveys on paid survey websites and keeps its answers consistent using a simple expert system and a small amount of NLP. Lastly, auto-commenting to help marketers: identifying relevant prompts and threads with an NLP system and blanketing the internet with a certain brand message. We're probably already at the point where this is valuable to the consuming party, and if the language processing isn't there today, it certainly will be in a few years. Combining ML-generated content with automated crawling would be a powerful tool for marketing agencies.
All in all, I think this tech is ripe for people to come in and use and abuse it to a greater extent. I'd be curious to see what other ideas people have for Selenium and similar libraries. If you want help building yours in the spirit of the Unsubscribe Robot, I'm here to help.