Edge Testing Help From AI
Edge Bot Finding a bug in Edge Bot

Many people have asked for an AI to help them with “edge” testing. Edge test cases try inputs or actions that are uncommon — that are ‘on the edge’ of expectations. Edge tests push the limits of expected user behavior, and test things that the developers likely hadn’t even thought of during design and implementation — or their own testing.

We’ve all seen that AI can handle the ‘basics’, the obvious test cases, but humans want AI to help with brainstorming the more difficult tests, the ones more likely to break things. Not only did the human testers ask the bot for edge test cases, they also wanted the AI to inspire them to come up with even more. Ah, what a wonderful world where AI bots and human testers can work collaboratively, hand in hand.

To that end, we added a new “Edge bot” to the Checkie.AI “coTestPilot” Chrome extension. You can sign up for access here. Yes, there is an API too. Let us see some of what the “Edge Testing Bot” is doing. I’ve learned a few new testing ideas myself — for web pages, and the extension itself.

coTestPilot Showing Bots and Results from “Edge Bot”

Let’s explore the different suggestions the “Edge” bot has for the Google home page.

Long Queries

The Edge bot has suggested trying a query it thinks is longer than most common queries, and even gives the tester a quick value to copy/paste into the search box. This is great, as the tester is now prompted to think about typical query lengths, and to try even longer strings of their own.
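To riff on that idea, here is a minimal Python sketch of how a tester might generate candidate queries at several target lengths. The specific boundary values (including 2048, a historically common URL length limit) are my own assumptions, not values the bot supplied.

```python
# Sketch: generate search queries at a few target lengths, to probe how a
# search box handles unusually long input. The length values chosen here
# are assumptions for illustration, not documented limits.
def long_queries(word="edge", lengths=(100, 1000, 2048, 10000)):
    """Return one query string per target length, built by repeating `word`."""
    base = word + " "
    return [(base * (n // len(base) + 1))[:n] for n in lengths]

queries = long_queries()
query_lengths = [len(q) for q in queries]  # one query per requested length
```

Each returned string can be pasted into the search box; varying `word` and `lengths` lets the tester probe whichever boundaries seem interesting.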

Non-Existent URL

We usually think about testing a search engine’s ability to find something, but what if that something doesn’t exist? Here the bot is suggesting a search for the unfindable. The bot also suggests that humans look for useful error messages and make sure ‘generic error messages’ don’t show up. The human testers might also start thinking about searching for other things that don’t exist, like made-up words, unique numbers, or GUIDs.
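One way a tester might extend that idea in code: a small sketch that manufactures search terms which are effectively guaranteed to match nothing, using random GUIDs and hex strings. This is an illustration of the technique, not the bot’s actual method.

```python
import uuid
import secrets

# Sketch: build search terms that will almost certainly match no indexed
# page, so a tester can check how "no results" is reported. Random GUIDs
# and random hex strings are effectively guaranteed to be unindexed.
def unfindable_terms(n=3):
    terms = [str(uuid.uuid4()) for _ in range(n)]
    terms.append(secrets.token_hex(16))  # 32 random hex characters
    return terms
```

Running each term through the search box and inspecting the resulting message is then a quick, repeatable check for unhelpful or generic error text.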

Multi-Clicks

Here the Edge bot is suggesting the tester try clicking the Store link a few times in quick succession. Maybe this will cause an error, or open multiple store pages? Interestingly, the Google web page’s click/event handling is quite complex and doesn’t use the default browser/HTML href link handling, so who knows what might happen? :) Maybe other links should be clicked rapidly too. Maybe links on the website you are testing…
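The rapid-clicking idea can be sketched generically. The `rapid_clicks` helper below is hypothetical, with `click` standing in for whatever actually drives the page (a browser-automation call, for instance):

```python
import time

# Sketch: fire a click action several times in quick succession and record
# each result, so duplicate-submission or double-navigation bugs can surface.
# `click` is any zero-argument callable that performs one click.
def rapid_clicks(click, times=5, delay=0.01):
    results = []
    for _ in range(times):
        results.append(click())
        time.sleep(delay)  # small gap between clicks; set to 0 for fastest
    return results
```

In a real session, `click` would wrap a browser-automation click on the Store link, and the tester would then check how many store pages or errors appeared.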

Search for CSS

Edge bot is trying to mess up the Google webpage with a strange query: CSS code. CSS is the web code that styles or formats the content of the page. Web developers search for CSS all the time while building websites. The Edge bot is suggesting that CSS searches might accidentally alter the look and feel of the Google search page itself.

Here the Edge bot suggests a short, but dramatically obvious and clever CSS string — if it causes an issue, the Google page would likely turn a glaring red color. Pretty smart CSS test input whose impact would be very visually obvious to a human tester. What other CSS, or other code inputs like HTML or JavaScript might also be interesting to test for?
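For illustration, here are a few style-injection probes of the kind described. These exact strings are guesses at what such an input might look like, not the bot’s actual payload:

```python
# Sketch: candidate style-injection inputs whose effect would be unmissable
# if the page ever rendered them as live CSS or HTML. These exact strings
# are illustrative guesses, not the bot's actual payload.
css_probes = [
    "body { background: red !important; }",
    "<style>body{background:red}</style>",
    "* { color: red; }",
]

def is_visually_loud(probe):
    """Crude filter: keep only probes that target a glaring color."""
    return "red" in probe.lower()

loud_probes = [p for p in css_probes if is_visually_loud(p)]
```

The same pattern extends naturally to HTML or JavaScript probes, as the post suggests, keeping the payloads small and their intended effect obvious to the eye.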

“While testing the Edge Bot, the extension’s body color started changing to red. I couldn’t find ‘red’ in the source code…doh! fixed now.”

Doh!

Summary

Great testers, after a coffee, might think of many of these cases. But it is likely that most testers will see a few testing ideas they hadn’t had before. The Edge bot had many other ideas; these were just the first few. A great example of Humans Collaborating with AIs, in the real world. Stay tuned for a post on how the other AI Testing bots can help us Human Testers.

At Checkie.AI, we are working to make you a smarter, better tester with AI, and it’s now as easy as a single click. If you’d like to try out the Edge bot on your website, in your testing, you can sign up for free access here. Please share your ideas and thoughts in the comments.

— Jason Arbon
