In this episode, The Testing Show tackles the Nest Thermostat glitch, the early release of inmates in Washington State due to a computer error, and what to do when your password site gets hacked. These examples help set up the main topic: “how can we develop the needed skills so that quality testing can occur in our organizations?”
We discuss test design, bug advocacy, long-term auditions, and how robots can help you see the way software testers pivot (or not).
Panelists:
- Matthew Heusser
- Justin Rohrman
- Perze Ababa
- Michael Larsen
- Brian Van Stone
References:
- Nest Thermostat Glitch Leaves Users in the Cold
- Apple iOS 9 Update Fixes Security Bug That Lingered For Two Years
- 2 Prisoners Mistakenly Released Early Now Charged In Killings
- Irony alert: Password-storing company is hacked
- The QA job market is a killer, would you stand a chance?
- Seth Godin, Linchpin: Are You Indispensable?
- Whiteboard Testing YouTube Channel
- Agile and Beyond 2016 – May 5-6 – Ypsilanti, MI
Transcript:
This episode of The Testing Show is sponsored by QualiTest. Deploy Software that You and Your Customers Trust with QualiTest Testing and Business Assurance Services. Learn more about QualiTest at www.qualitestgroup.com.
MATTHEW HEUSSER: Hello, and welcome to The Testing Show, a podcast about software testing, brought to you by QualiTest and Excelon Development. I’m your host, Matt Heusser, and we’re here to talk about what’s new and exciting in the world of testing. So, let’s go to Justin Rohrman, who’s coming to us all the way from Nashville, Tennessee.
JUSTIN ROHRMAN: Good morning. Actually, I’m in Houston today. I had to catch a flight out a day early to avoid the one inch of snow that will shut down Nashville today. Well, I’ve got a news item today that you might think is interesting. There is a Nest Thermostat bug that was occurring mostly in the Northeast last week. The Nest was doing something that was causing its battery to completely drain, so people would wake up to a completely cold house in the morning. I think this is a real, live example of what happens with the Internet of Things: software bugs have actual consequences.
MATTHEW HEUSSER: I thought the Nest ran off your house’s electrical system. It doesn’t?
JUSTIN ROHRMAN: There is a very low-voltage line that goes to the Nest to charge its battery, but that can run down. Actually, that happened to me last summer. I woke up and our Nest had shut off and the house was about 80 degrees. I had to take the Nest Thermostat off the wall and plug it into the IGROW USB phone charger. Luckily, I had one of those.
MATTHEW HEUSSER: It reminds me of the iPhone: “Under ideal conditions, the battery can last this long, and we can watch that much video.” As the iPhone 6 keeps getting thinner, the battery gets smaller, because that’s what thinner does. Apple actually has a case for the iPhone 6 that has a battery inside of it. It has to be charged overnight, basically every night. We don’t really ever talk about that in terms of requirements. Like, as a customer, actual battery life wasn’t part of the discussion.
JUSTIN ROHRMAN: The thing that went wrong with my Nest, supposedly, was that it got in some sort of infinite loop, talking back to the router, and that just burned up the battery. I guess, under normal operating conditions that would never be a problem. [LAUGHTER]. It was the unexpected “Thing” that caused it to work out of sync with the design.
MATTHEW HEUSSER: That’d be a really interesting thing to test. Like, how would you find that bug? Look, you could test it and it would work just fine the first 15 times you tested it. You’d have to write a tool, test it 10,000 times, and say, “Oh, gee. It drained the battery in 4 hours instead of 40.”
JUSTIN ROHRMAN: I think it’s a great example of a Black Swan. I mean, nobody expected this thing was going to get into some weird loop that just churns through a battery.
BRIAN VAN STONE: I think what we’ll see them doing now—it seems like what you’re experiencing—is this power threshold where the demands of the device are exceeding the output of this low-voltage line that’s supposed to sustain it. So, I think it’s interesting to try and explore, “What is that threshold, and what kind of performance metrics can we put around when we’re going to exceed that, so we can reliably predict that we’re going to have an issue with the battery running dry?”
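As a rough illustration of the kind of overnight tool Matt describes, here is a minimal soak-test sketch. The device address, endpoint, and battery field are hypothetical stand-ins, not the real Nest API; the point is simply to poll battery level for many hours and flag an abnormal drain rate, the sort of harness that would catch a 4-hour drain instead of a 40-hour one.

```python
import time

import requests  # third-party; pip install requests

# Everything below is a hypothetical stand-in for illustration --
# the real Nest exposes no such documented local status API.
DEVICE_URL = "http://192.168.1.50/status"  # assumed local endpoint
POLL_SECONDS = 60
DRAIN_ALERT = 2.0  # flag if battery drops more than 2% per hour


def read_battery_percent():
    """Poll the device and return its reported battery percentage."""
    resp = requests.get(DEVICE_URL, timeout=5)
    resp.raise_for_status()
    return resp.json()["battery_percent"]  # assumed field name


def soak_test(hours=40):
    """Run overnight (or longer) and flag abnormal drain rates."""
    start = time.time()
    last_level, last_time = read_battery_percent(), start
    while time.time() - start < hours * 3600:
        time.sleep(POLL_SECONDS)
        level, now = read_battery_percent(), time.time()
        rate = (last_level - level) / ((now - last_time) / 3600.0)
        if rate > DRAIN_ALERT:
            print(f"ALERT: draining {rate:.1f}%/hour at {time.ctime(now)}")
        last_level, last_time = level, now


if __name__ == "__main__":
    soak_test()
```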
MATTHEW HEUSSER: Does anybody have an engineering degree? Shouldn’t it be possible to predict how much power you’d ever need and just send it that much?
JUSTIN ROHRMAN: You’re constrained by the existing thermostat wiring. People are just replacing their existing thermostat with these things.
MATTHEW HEUSSER: Oh, so it’s replacing a legacy system?
BRIAN VAN STONE: Yeah.
JUSTIN ROHRMAN: That low-voltage line.
MATTHEW HEUSSER: That seems like something an electrical engineer might have been able to think about. I don’t know about a software tester. So, that other voice you heard on the line was Brian Van Stone, a test architect with QualiTest Group. Good morning, Brian.
BRIAN VAN STONE: Good morning, everybody. How is everyone today?
MATTHEW HEUSSER: What’s new and exciting in the world of testing?
BRIAN VAN STONE: There’s an iOS 9 update that just came out January 21st, fixing a security flaw that was first reported just over two years ago. Basically, captive portals, the authentication screens that pop up when you hit a public Wi-Fi network at Starbucks or Target, had access to the same cookie store as your normal Safari browser, which meant not only could the normal, [LAUGHTER], phishing attacks that people love to do with captive portals still happen, but they could actually have access to information from your browsing history before you ever connected to the portal. Luckily, that has now been addressed, so captive portals get a separate cookie store. We shouldn’t have to worry about that one moving forward.
MATTHEW HEUSSER: So, that would be like they could hijack your session because your session ID is in your cookies?
BRIAN VAN STONE: Yeah. And, they could even hijack a session that you had thought was now terminated and you were moving on to different activity.
MATTHEW HEUSSER: How would you test for that, if you were an Apple Tester?
BRIAN VAN STONE: Anytime you’re popping up a browser through any functionality, be it Safari or not, there’s an understanding that there are shared components throughout complex software like the operating system that goes on your phone. You’re not going to reinvent the wheel 40 times to do the same thing through 40 different features. Especially considering that this was user-reported over two years ago, and unlike the one we just spoke about, I think this is one that a tester could reasonably predict. The reality is that we had in-the-wild exploratory testing happening by end users who were probably trying to break things, which is great; because, when you try to break things, you’re going to find all these edge cases. But, I think that kind of exploratory testing that was happening in the wild could’ve definitely been happening inside Apple.
MATTHEW HEUSSER: So, I want to make sure that I get this right. You’re saying that, when the security popup comes up, that’s actually a browser? Whoever is interfacing with that is going to create that HTML page? So, the guys at Apple who were creating the software knew that someone was going to create an HTML validation page for the login, and they knew that HTML validation page could have JavaScript, and in order to test it, they had to write it? So, they should’ve said, “Hey. While we’ve got access to JavaScript in this document object model, let’s poke into Safari and see what information is available to us?”
BRIAN VAN STONE: Yeah. And, I mean, Apple even creates their own captive portals leveraging the same feature for Apple-supported features. It’s a feature where they enable users to do things that they’re also doing themselves in the way that they build software for the iPhone. So, I think it’s definitely something that’s within the scope of what could’ve been discovered a lot earlier, especially considering internally they’re leveraging that feature to begin with.
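To make Matt’s “poke into Safari” idea concrete, here is a hedged sketch of how a tester might check for exactly this kind of cookie-store sharing, using one throwaway web server standing in for both “a site you visited earlier in Safari” and the captive portal. The host, port, and cookie name are illustrative assumptions, not Apple’s implementation: first open /seed in the regular browser to plant a marker cookie, then join the test network so the fake portal page requests /probe; if the marker comes back, the portal shares the browser’s cookie store.

```python
# Minimal fake captive portal for a cookie-isolation check.
# All names, ports, and paths here are illustrative assumptions.
from http.server import BaseHTTPRequestHandler, HTTPServer

PORTAL_PAGE = b"""<html><body><h1>Guest Wi-Fi Login</h1>
<img src="/probe" width="1" height="1">
</body></html>"""


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/seed":
            # Step 1: open this URL in normal Safari to plant the marker.
            self.send_response(200)
            self.send_header("Set-Cookie", "safari_marker=SEEDED")
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"Marker cookie set in this browser.")
        elif self.path == "/probe":
            # Step 2: the captive-portal page requests this tiny image.
            leaked = "safari_marker" in self.headers.get("Cookie", "")
            print("FAIL: portal sees Safari's cookie" if leaked
                  else "PASS: cookie store looks isolated")
            self.send_response(204)
            self.end_headers()
        else:
            # Anything else gets the fake portal page.
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(PORTAL_PAGE)


if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()
```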
MATTHEW HEUSSER: Again, it’s subject-matter expertise. If you had a tester who was there for six months that was just doing what they were told, there’s no way they’d find it; but, if they really understood the software and thought about it for an hour, they could probably come up with that. I think the fact that they didn’t bother to prioritize the fix for two years is kind of an interesting side story too.
BRIAN VAN STONE: Yeah. You’ve got to wonder what the rationale was there. Hopefully, there was some sort of risk analysis done and hopefully communication to end users as well to “be aware.”
MATTHEW HEUSSER: At Excelon, we’ve been talking about this term “bug advocacy.” I mean, it’s correct, but people don’t really have an emotional response to it. We think it’s a really important skill, but it’s underappreciated; even when you talk to people about “bug advocacy,” they don’t grasp what we’re talking about. What’s the new term you’re kicking around there, Justin?
JUSTIN ROHRMAN: I like to call it “skilled reporting”: knowing how to create, or craft, information to shape it correctly for your audience, and also sort of getting your way. If you have a bug that you think is important, you can craft the report such that it will be persuasive and the thing will get fixed. That might mean talking accurately about who’s going to be affected, customer-wise, by this problem, or how bad the failure will be, or how bad it will make the company look. The end result is you get more bugs fixed.
MATTHEW HEUSSER: And I think something that’s also important is finding the right channels to advocate through. Is that something that you would bundle into the same concept, or would you consider that to be a separate and parallel aspect to the problem?
JUSTIN ROHRMAN: I think it could either be separate or bundled together. It really depends on how your organization is shaped. If you’re on these very closely-knit Agile teams where you have a couple of developers and a tester and you’re all sitting together and working together at a table, most of the time when you find something it’s just going to get fixed right away. Lobbying for a cause becomes less of a developed skill for you. You’re just not going to need it that much. But, if you’re working in an old-school type of environment where there are separations of concerns, where you have developers that work from their queue, testers that work from their queue, and product managers that work in their own separate space, you’re going to have to learn how to bridge these spaces and how to convince people that you don’t work with intimately.
PERZE ABABA: The key to advocacy, too, is really your relationship with your fellow teammates. If we circle back to Jerry Weinberg’s definition of quality, that it’s “value to someone who matters,” if you find that someone that cares about it, then you have a co-champion to help advocate for fixing that bug as well.
MATTHEW HEUSSER: So, I’m not sure I heard Perze right there. We’ve got to convince them that it has quality. We’ve got to convince them that it has value. And, I think, with bugs, a lot of times the value of fixing a bug becomes the absence of a defect. Product owners tend to overvalue the presence of a feature and undervalue the absence of a defect. So, getting it to be, “Here’s the actual use case: twenty-five percent of our core users use this feature three times a day. That’s how many times they’re going to be impacted by this, and this is the experience that they’re going to have. They’re not going to be able to do this core task at all.” That is very different than, “Here’s the bug.”
PERZE ABABA: Right. A lot of testing reports that I’ve been seeing and reviewing tell an incomplete story. They only mention the bug, and you don’t know the risks that come with the bug. So, if you have complete information and point that back to how it will actually affect the user, that would actually help in providing more information with regards to, “How important is this bug, and how soon should it be fixed?”
JUSTIN ROHRMAN: I think if you’re able to identify a bug, you should be able to at least intuitively identify the risk to a person. You understand that this thing is a bug because somebody’s going to be unhappy with it. It’s just one more little logical step, really, to make that explicit whenever you’re talking to people or writing out your report, and actually describe the consequences of this problem.
MATTHEW HEUSSER: What’s new and exciting in the world of testing? Perze?
PERZE ABABA: So, I read this article about a computer glitch that led to the early release of prisoners in Washington State. Apparently, this issue has existed in their system since 2002, and it affected more than 1,000 prisoners. If you get the chance to read the article, they have a link to the actual numbers that the Washington State Department of Corrections gathered and an analysis of who was affected. In the span of four years, they realized that, because of this early release, one of the convicts was charged with committing, [LAUGHTER], vehicular homicide after his early release and another with first-degree murder. I think this is a real, live scenario of how not testing well enough can affect the lives of other people.
MATTHEW HEUSSER: I’m trying to understand. A computer program said, “It’s October 5th. Let Joe Schmo out,” and they didn’t have the Parole Board or anybody to check the records to see if the date was right? They just let them out?
PERZE ABABA: Right.
MATTHEW HEUSSER: [LAUGHTER].
PERZE ABABA: That seems to have been the case.
JUSTIN ROHRMAN: That sounds like a human bug. People should be checking records to make sure that what they’re reading is correct. Doctors don’t go off and do everything that’s printed out in front of them immediately. That would be disastrous.
MATTHEW HEUSSER: This is my concern with “zero tolerance.” What “zero tolerance” says is, “We are suspending all of our judgment and we are strictly following the rules.” For this kind of thing, I like the term “decision support system” (DSS). Your computer can say, “I predict that you’re going to be really low on toothpaste next week, so I want to order a whole bunch.” But, you’ve got to have a manager who says, “No. Somebody mistyped the inventory levels. We’re fine. Don’t order the ten cases of toothpaste.” It amazes me that we keep moving in that direction. As much as I love continuous delivery, a lot of the companies that try to implement it kind of have that problem.
JUSTIN ROHRMAN: Yeah. I think this is a great tie-in to the misunderstandings of test automation, where some people think they can remove human decision-making and judgment completely from the equation because these programs running in the background will make the decisions for you.
MATTHEW HEUSSER: I actually wrote an article last year where I was talking about how to thread this very-hard-to-thread needle, and my editor stuck in, between the title and the body, a little summary piece that said, “Can you remove the human element from testing? Matt Heusser says, ‘Yes.’” And, I was like, “No, I don’t.” [LAUGHTER]. “That’s not. That’s not what I wrote.” Like, “What?” I’m still trying to figure out how to get over that. I took some heat for that, and I didn’t write that. Sorry. I’m just a little bitter, maybe. So, moving on. How’s Michael?
MICHAEL LARSEN: “How’s Michael?” Michael is dealing with the fact that we have a little storm system running through California at the moment. Understand, California has been dealing with tremendous drought for the past several years, so anytime anything falls from the sky, people freak out, [LAUGHTER], but we’re excited to get the rain. This is not necessarily a new topic: with the proliferation of passwords for just about everything under the sun, I’ve started using a system called LastPass, and LastPass is a service that acts as a repository for all of your passwords, which is great. But, what happens when your actual security provider gets hacked? That actually happened recently to LastPass, and it should come as no surprise that, if you are going to be a source of protection, you’re probably going to also be a prime target. If you’re using the same password in multiple places, even if you put it inside of an encrypted password system, that encryption is going to show that it’s the same password. So, you’ve just given a nice, broad entry to somebody who hacks it and gets your account, to say, “Well, it’s nice to know that Michael seems to use this password at 17 of his sites. Let’s see if we can figure out what that is.”
JUSTIN ROHRMAN: To me, this is somewhat reminiscent of that guy from LifeLock who was so confident in his product that he pasted his Social Security number on the side of a truck and had it drive around New York City saying, “I dare you to try and steal my identity.” I think the underlying concept that these share is: we can have this nice, secure system that makes us feel safe, but we’ve created something that is lucrative enough for a hacker that, “If I just break this, I’m going to get so much value out of it.” In some ways, we have to be aware that we’re painting a target on our foreheads to some degree.
MICHAEL LARSEN: It’s a great tool. I want to make very clear, I am not knocking LastPass. I’m a user of it. I love it. I think it’s a great system; but, even great systems need a little bit of help every now and then.
PERZE ABABA: I think password redundancies are definitely key as well. There are a lot of providers now that have two-factor authentication enabled—Google, Amazon—and it really does help. Personal experience: a couple of weeks ago, someone tried to hack into my Yahoo account, and I started getting text messages, because I had turned on my two-factor authentication. Yeah. This person was really persistent, and I think I still had an old account with a really old password that somehow got published in one of these hacks. I believe it was through Gizmodo. It does help to have two-factor authentication in place, because it’s a very good protective layer.
BRIAN VAN STONE: Yeah. And, I think today with everyone having a smart device in their pocket, two-factor authentication is so much more accessible to the average end user than it was in years past. You don’t have to have an RSA token on your keychain anymore. You can just get a text message on your cellphone.
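For the curious, here is a minimal sketch of the math behind the authenticator-app second factor Brian alludes to: time-based one-time passwords (TOTP, RFC 6238). SMS codes like the ones described in this episode are usually random values generated server-side instead, so this is an analogy for how short-lived codes work, not any particular provider’s implementation.

```python
# Time-based one-time passwords (TOTP, RFC 6238), standard library only.
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    """Derive the current one-time code from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // step          # 30-second time window
    msg = struct.pack(">Q", counter)            # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


if __name__ == "__main__":
    # The shared secret would normally come from the provider's QR code;
    # this one is just a sample value.
    print("Current code:", totp("JBSWY3DPEHPK3PXP"))
```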
MATTHEW HEUSSER: One of the sites that I work with, if I get my password wrong three times, they’ll just text me and they’ll say, “Hey. What’s the text that we just sent you? If you can’t get it, you’re locked.” So, there’s none of that have-them-send-you-an-e-mail, go check your e-mail, check your spam folder, follow the link, and reset your password stuff. They just send me a text. I’ve found that helpful. So, today’s main topic is Skill on the Test Team, and I think we actually covered it in some of our news. We have these defects that you need more than just tester skill to find. You need to really understand the customer, or you need to really understand the use cases or the product, or you would’ve had to have written performance testing against the system and run it overnight to find the Nest defect, for instance. QualiTest Group actually has a blog article up, “The QA job market is a killer, would you stand a chance?”, and I find it really interesting. I think as an industry, we are not very good at getting those skills down. It’s mostly junior, mid, senior; scripted manual and automated exploratory; “I know how to use LoadRunner. I know how to use ClickDesk Pro. I know how to use Selenium.” But test design, skilled reporting, test strategy, and subject-matter expertise are extremely hard for us to even think about. When I think about interviews that I’ve had, if you bring those topics up, people are like, “Oh, yes. I am very good. I am awesome at test design.” How do you even interview for that? How do you figure out on your teams what skills you have and what you should be hiring for?
PERZE ABABA: For me, where the rubber hits the road is for them to show me what they can do. If they know how to design tests around, say, an API endpoint, I have them show me that they can actually do that. I’ve had a couple of walk-in interviews where, the moment they, [LAUGHTER], step in the door, I give them a marker and have them talk to me about how they solve a particular problem, specifically around testing and tooling. It’s actually pretty straightforward. The challenge I have with resumes is that they tell you a lot about what people are supposed to test, but they never tell you how they test it. Having an interview process that reveals how they actually think and how they work around given problems is really key. One of my favorite, [LAUGHTER], interview questions is: I bring up the Google homepage and just ask them, “How would you actually test this?” It really depends on what the person’s maturity level is in understanding the technologies behind the Google search page and what you can do with regards to search. You can actually see varying degrees of answers.
MATTHEW HEUSSER: You know, we’ve used auditions in the interview process. One phrase I like to use a lot is, “I think the best interview is an audition. The best audition is one where you do the work. The best audition is one that’s three-to-six months long.” So, [LAUGHTER], we always recommend contract-to-hire to really get to know people. But, how do you decide, “Gee, what we really need right now is someone who knows the technical details of testing APIs,” versus someone who can come up with the test design for APIs because the system is already in place, versus, “We really need someone who understands the publishing process and how that works”? How do you figure out and map your skills on the team and where the gaps are?
PERZE ABABA: There’s a lot of gray area in trying to find the right person that actually fits your context and can blend with your culture. The other layer of challenge is, “How does this person actually behave under pressure?” That’s something that’s very difficult to simulate during an interview process. That’s why a lot of companies have a probationary period to see if this person is worth keeping, because one of the challenges, if you hire a full-time employee, is that after they get through that probationary period, it’s going to take a whole lot of documentation before you can act on, “Oh, this actually doesn’t work.”
MATTHEW HEUSSER: One of the things that I know QualiTest does a lot more than we do is they do a ton of contracting. I’m curious. A lot of the companies I work for actually force me to sign an MSA that says, “I will not hire your people, and you will not hire mine.” And then, every now and again, they ask me to waive it on their side. Is that part of what you guys do? I genuinely don’t know.
BRIAN VAN STONE: We have had situations where we have employees get hired out to customers after a contract, and things have been able to be figured out in certain scenarios in a mutually-agreed-upon fashion. In a lot of situations, that can happen without damaging or straining relationships, and that’s generally good when it can happen. Companies need skills and people have skills, which means companies need people, and sometimes the right move is just to put the right people with the right skills in the right place. But, I think what QualiTest experiences more often is a slightly different dynamic from what Perze is addressing. Because QualiTest is so opportunity-driven, we need to have access to all the skills on a moment’s notice, which means we have to get really creative in leveraging cross-training opportunities—find every free minute, every free hour, every free day that we can to make sure that people are getting the opportunities to learn—and I think it’s really important to make sure that generating learning opportunities for your employees is a high priority so that you can future-proof your skill set.
MATTHEW HEUSSER: Well, I know that Michael has used some of these techniques when hiring and interviewing. Maybe you could speak to some of these things, Michael?
MICHAEL LARSEN: Sure. Happy to. Over the past several months, we’ve been interviewing for software testers. We got these resumes that, on paper, looked fantastic. They had all the right buzzwords. They had all the right exposures. They had done these projects where they’d worked with the tools. And, more times than not, you came out with a very vague sense. It’s like, “I don’t really have an indication of how they would pivot in a differing situation.” One of the things that led to was a module that Henrik Andersson of House of Test put together called Context-Driven Robots; if you just do a search on “Context-Driven Robots,” you’ll find many of Henrik’s talks and workshops on that. And, it’s not so much the Context-Driven Robots scenario, but the fact that what you do is you take something—you can even substitute your own product—and you put in three phases of your testing and you make them somewhat contradictory, because that’s what we deal with in the real world. You have a goal. You have a target. You have a road map.
And then, unfortunately or fortunately, some new piece of information comes along, and some new opportunity comes along, and you scrap what you were working on and go do something else, or you tweak something else based on the new information you have. It’s really interesting to see how many people faltered when all I said was, “Okay. Round two. Oh, by the way, I’m cutting your time by half,” or, “Oh, by the way, we have this new requirement that needs to be worked in. Now what will you do?” The person that we ultimately did hire recently, the reason I was impressed with their approach to the problem was they didn’t just sit down and give a canned answer, because I didn’t allow a canned answer. I changed the scenario, you know, three and four times on them. They got up. They went to a whiteboard. They started blocking out all the ideas that they had. Then, they stopped, erased, and tried some other avenue. They asked me questions. They probed for where we were going with the ideas that we had. And, it was so great, because during the more canned portion of the interview with this particular tester, at first, I was, “I’m not sure if they’re going to be a really good fit for this.” But then, I said, “Let’s try this old game idea that I have,” and they just came to life.
MATTHEW HEUSSER: You know, I wonder. We’ve had a couple of exercises that we like to use where you’re given this problem space and asked to design some test strategy around it. Specifically, “What tests would you run? What are your test ideas?” We could take that for a second round and actually say, “Okay. Given those test ideas, these are the bugs you found. What is your next round of test ideas, or are you done?” I really like the idea of pivoting to see how people can learn from the experience and adapt to change.
JUSTIN ROHRMAN: A couple of themes I’ve noticed so far. One interesting thing to me was the comment Michael made about looking at resumes and trying to pick them apart and discern skill from that. I’m not sure I’ve ever actually been able to look at a resume and figure out whether a person was skilled or not. Obviously, talking mostly about execution and planning, and being heavy on the tooling side, I think those are good heuristics for spotting people that don’t know what they’re doing.
MATTHEW HEUSSER: We should find three, four, five LinkedIn profiles that we think are good and put them in the show notes and say, “We think these guys actually talk about what testing is and what their job is, and I would call these people back.”
JUSTIN ROHRMAN: That’s not a bad idea.
MICHAEL LARSEN: So, I’d like to add a little bit to that. I don’t have a resume—not a real one in any sense. I do have a LinkedIn profile, and I tell people, “Hey. That is my resume, for better or for worse.” What I do have is a blog. I tell anybody who says, “Hey. You know, we’re looking for etcetera, etcetera, and could you send us a resume?” My answer is, “No, but here’s my blog. I would like to encourage you to read five of my entries. Spend 20 minutes. I promise, within 20 minutes, you will know if I am somebody you want to talk further to or if you just want to skip and move on.”
It’s actually proven to be very helpful for me, because, one, I have to actually explain why I’ve got blog posts up there. So, I have to go into some details about what I’m talking about. I want people to understand the things I actually know how to do and the things that I don’t. And, one of the things that I do with my blog is, frankly, I spend a lot of time talking about the things I don’t know or that I am actively learning in public. At least they see what my mindset is. They see how I approach a problem, and it seems to work.
MATTHEW HEUSSER: Yeah. A couple of good points in there. I don’t usually have a resume either. I have a LinkedIn. My LinkedIn says, “If you want to get to know me, read what I’ve written.” I used to actually go into all of their requirements, take my resume, tweak it to fit their requirements, write a cover letter, and send it to them. This was around 2011, when I went independent and we started talking about contract positions. But, companies want resumes in a very specific format so they can compare 100 different candidates and pick the one who is the right combination of price, value, and will do what he’s told. That’s not me. Right? [LAUGHTER].
BRIAN VAN STONE: You’re not a fit for that and you probably wouldn’t be very happy if you actually got the job.
MATTHEW HEUSSER: Right. So, that’s actually a sign to me when they say, “No. We really need a resume.” “Well, here’s my LinkedIn.” “No, we really need a resume.” Luckily, I haven’t been in those phone calls lately. But, yeah, I think that if you can distinguish yourself to the point that people can find out about you in a more meaningful way than a classic, “Here is the list of the technologies that I know,” that’d be the kind of person that I’d want to talk to.
PERZE ABABA: Yeah. I think you guys have some really good ideas on this; because, in my experience, whenever I’ve been involved in interviewing candidates, the biggest challenge for me has always been, “How do I figure out how this person thinks? How do I explore how they solve problems, how they deal with strife?” What you guys are talking about here is basically a way for the candidate to empower the interviewer to be able to explore that side of them; and, when you can do that, I think you’re setting yourself up for success. You’re giving yourself a better opportunity to land the job that you want when you can open yourself up that way.
MATTHEW HEUSSER: Unfortunately, we are running tight on time. So, before we go, Michael had a couple of videos he wanted to talk about.
MICHAEL LARSEN: There are a lot of testing topics where either you watch a screencast that is discussing some tool and you kind of glaze over because you don’t have something that you can directly apply it to, or it’s a talk at a conference where you don’t actually get to see what they’re discussing. I like the fact that there’s something in between that’s recently been put together. This is called Whiteboard Testing. You can follow them on Twitter (@WhiteboardTest), where the tagline is “short, informative videos about testing, from the whiteboard.” Basically, someone is standing in front of a whiteboard, they’re discussing a topic, and they write it out on the board. And so, you get a chance to see exactly what they’re thinking. You can watch it multiple times to absorb the ideas behind it, and I think it’s brilliant.
Anybody can do this. In fact, that’s exactly what they’re encouraging people to do. If you have an idea, just go up to a whiteboard, put your camera on it, discuss what you want to do, and keep it to two or three minutes. It’s beautiful. So, I strongly encourage, if anybody wants to pick it up, definitely check out the Whiteboard Testing Twitter feed and check out their YouTube channel.
MATTHEW HEUSSER: Fantastic. Thanks, everybody. Anything you want to say before we head out? Maybe a conference you’re going to go to?
MICHAEL LARSEN: So, for those who are interested, the Software Test Professionals Conference is going to be held just south of the San Francisco International Airport in the lovely town of Millbrae, California. If people are going to be at STPCon and would like some suggestions as to what to do after hours, you’re in my hometown. I can probably help out, and San Francisco is just 15 miles north. I’m going to be doing a workshop on training new software testers, sharing a lot of the things that we did with the Summer Camp program. The other talk that I’m going to be doing is about accessibility and a related discipline, Inclusive Design. So, if you happen to be in Northern California the first week of April, consider coming to STPCon and sitting in on my talks.
MATTHEW HEUSSER: Thanks, Michael. I, myself, am going to be, in May, at Agile and Beyond in Ypsilanti, Michigan. So, Midwest represent. I’m also going to be at Agile Testing Days 2016 in Germany. For the Software Testing World Cup, we’re going to have the Americas round, North America, June 15th. I think it’s like $10.00 a person or something. It’s really cheap. It’s three hours of testing. If you win and the sponsorship works out, you get a trip to the Worlds. At the very least, you’re going to get a free conference registration. Last year, we got enough sponsorship to provide airfare to the winning team. And, Agile Testing Days Americas is going to be coming to Boston in March 2017. That’s something to put on your calendars.
Thanks everybody for being on the show. We’ll be recording another show next week; and, if you’re in the listening audience, we’re going to publish these about every two weeks.
Thanks, everybody, and we will see you next time. Thank you for listening to The Testing Show sponsored by QualiTest. For more information, please see https://www.qualitestgroup.com. Do you have questions, or is there a topic or an idea you’d like to have covered? Please e‑mail us at: [email protected], and we will answer questions we receive on air.