December 14, 2016
QualiTesters have been busy keeping up with what’s new in the world of QA and software testing at conferences around the globe. We wanted to let everyone know about their experiences, and welcome you to attend conferences in the future.
Christopher Livett, a Senior Test Consultant, attended the Selenium Conference UK, which took place on November 15-16. For Christopher, the highlight of the event was hearing Dan Cuellar, creator of Appium, talk about Advanced Appium. Dan covered the origins of Appium and shared his view that just because “experts” have a defined solution to a problem doesn’t mean you cannot find a better one, which made for a very motivational talk. He also highlighted some features of Appium that you may not have known about, such as its ability to drive desktop applications as well as mobile apps. In addition to providing interesting talks and good networking opportunities, the conference provided a welcome break from the regular work routine.
Glenn Tejidor attended the 2016 STARWEST Software Testing Conference in Anaheim, California (next year’s event will be held at the same venue, October 1-6). One of the speakers was Scott Miles, a quality engineer, Scrum Master, automation specialist, and testing enthusiast from Australia. Miles argued that most script failures are due to bugs in the test scripts, not in the product under test, and set out to address the frustration factor of automation frameworks: high cost and low return. His idea is to make your existing framework more adaptive to changes, such as UI elements that have changed or unexpected test interruptions; using what he calls “adaptive elements” and “adaptive actions” is the key to a more stable automation framework. The basic concept is to calculate what percentage of an object’s properties still match, and how far off the changed property values are, after which the user can decide whether to accept the changes at run time. This technique requires neither purchasing new tools nor rewriting existing tests.
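To make the “adaptive elements” idea concrete, here is a minimal sketch in Python of how such property-based matching could work. All names and the threshold value are illustrative assumptions, not taken from Miles’s talk or any real tool:

```python
# Sketch of the "adaptive elements" concept: score each candidate UI element
# against a stored description and accept the best one if enough of its
# properties still match. Names and threshold are illustrative only.

def match_score(expected: dict, actual: dict) -> float:
    """Return the fraction of expected properties whose values still match."""
    if not expected:
        return 0.0
    matches = sum(1 for key, value in expected.items() if actual.get(key) == value)
    return matches / len(expected)

def find_adaptive(candidates: list, expected: dict, threshold: float = 0.6):
    """Pick the best-scoring candidate, or None if nothing clears the threshold."""
    best = max(candidates, key=lambda c: match_score(expected, c), default=None)
    if best is not None and match_score(expected, best) >= threshold:
        return best
    return None
```

For example, if a button’s `id` changed but its tag and text are intact, it still scores 2/3 against the stored description and can be accepted at run time instead of failing the script outright.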
Another STARWEST speaker, Janna Loeffler, has been responsible for testing many of Disney’s Parks and Resorts websites and the My Disney Experience mobile application, developed many of the automated tests for the Walt Disney World website, and served as Project Lead overseeing development operations and software testing. She talked about how it takes more than faith, trust, and pixie dust to test all of the Disney theme park attractions and ensure quality experiences for guests. She emphasized the importance of knowing who the customer really is and of getting out of your comfort zone to deliver a complete quality solution. Her love for what she does came through as she described the testing technologies, or “magic,” as she likes to call them.
Tom Drake, Test Consultant, attended TestEXPO 2016 in London, which focused heavily on Agile methods and DevOps, and had a lot to say about the many seminars he attended.
Andreas Sjostrom of Sogeti, in “Does your testing capability make you predator or prey?”, discussed the importance of high-quality delivery in an information-heavy, fast-moving digital world. Agile is needed to keep up with the marketplace, and testing needs to be fully integrated, as the general public is fickle enough to simply move on (apparently after 3.5 seconds). Automation and DevOps approaches were mentioned as ways to become the predator.
Archie Roboostoff of Micro Focus, in “Business Agility with Enterprise DevOps”, described the benefits of DevOps and of moving from Waterfall to Agile to CI to CD. He described the benefits of automated test processes, especially automated release and deployment processes (including a pitch on how Micro Focus products can assist in all of these situations). He also announced that Micro Focus had agreed to acquire Hewlett Packard Enterprise’s (HPE) software assets.
Jitesh Gosai, Senior Developer in Test at BBC Digital – Mobile Platforms, gave by far the most engaging talk of the day, “Mis-Adventures in Test Automation – iPlayer Mobile”, with really interesting insights into the growth of their testing environments and practices. His team started off heavily automation-focused, found that this caused enormous overhead, and ended up actually testing nothing, just checking lots of things. In the environment they work in (with multiple releases per week), automation helped them slash the repetitive tasks, especially where testing is more of a checking job. They are now moving towards a far more balanced approach and seem to have found a sweet spot.
Gordon Alexander of Seapine, in “Verification and Validation vs Agile Adoption”, asked: can you use Agile methods in a heavily standardized environment, such as safety-critical systems that must meet many ISO standards? The talk began with No, for reasons such as the difficulty of maintaining traceability in Word and Excel, then turned to Yes, you can, by using Seapine’s XYZ products. The moral was that standards can be met within Agile projects, even with a lower level of documentation, provided there is traceability. Just linking a test to a requirement is not enough, but with their tooling you can easily trace, code, test, and confirm requirements together.
Graham Perry of Neotys, in “How to Fit Performance Testing with Agile and DevOps”, described how Neotys’s performance test tools can be integrated at different stages throughout a project, but noted that testing earlier yields cheaper, quicker, and easier fixes (yay to shifting left). While performance testing mostly remains an end-of-project task, perhaps it should be adopted into the shift-left family. Within Agile projects, performance tests should be run as part of every sprint or build. Within DevOps, more people should be up-skilled in performance testing, which can become a focus of Developers and Operations teams, not just stand-alone Performance Testers, leading to the notion of Performance Testers becoming Performance Engineers. Performance can then be built into a product, rather than tested and tweaked once it’s built.
Perfecto Mobile: they offer cloud-based mobile testing. The software seems highly intuitive and simple to set up (far more so than what EggPlant offers), but they are hugely limited in that all devices live on their site, so no testbed testing. It seems far more suitable for app testing than network testing.
Micro Focus: their Silk test management product, which directly competes with something like ALM (QC), seems really good, far simpler, and far more accessible (reports can be accessed via the web by anybody, with no license needed). The integration of requirements also seems far more successful than QC’s ever has been, with really useful TTRM (test-to-requirements matrix) reporting. This makes it really easy to see what regression testing might be needed after a bug fix or a new patch goes live.
All QualiTesters who attended these conferences reported that they enjoyed themselves, learned, and were inspired, and that they look forward to attending next year’s events as well.