The Testing Show: Security and Privacy


Sometimes, you can find experts on topics in unusual places. This week we discuss security and privacy with Doug Traser, an Information Security Manager with Five9. He’s also the guitar player for Michael’s band, Ensign Red (or is Michael Doug’s singer? We’re never entirely sure). Regardless, if you have questions about security, OWASP, or policies that drive you crazy, or you wonder whether any of this makes sense, Doug has some answers, and maybe raises a few more questions.

Also, in our news segment: what happens when Amazon Echo might hold the key to a murder trial. Can your personal digital home assistant testify against you in a court of law?




Any questions or suggested topics?
Let us know!


MICHAEL LARSEN: Hello.  Welcome to The Testing Show.  Thank you for joining us.  I’m Michael Larsen, your show producer, and today we have our usual suspects.  Let’s welcome back Jessica Ingrassellino?



PERZE ABABA: Hello, everyone.

MICHAEL LARSEN: Justin Rohrman?

JUSTIN ROHRMAN: Good morning.


MATTHEW HEUSSER: Hi, everybody.

MICHAEL LARSEN: Please welcome Mr. Doug Traser?

DOUG TRASER: Hello, everyone.

MICHAEL LARSEN: Matt, this is your gig as moderator.  So, I’m going to turn the time over to you.  Have fun.

MATTHEW HEUSSER: Thanks, Michael.  Our guest this week is Doug Traser, joining us to talk about “security.”  We’ve never heard his voice on the show, but he is a long-time friend of the show, because you’ve heard him play the guitar on the intro and outro music for how many episodes now, Michael?  All of them?

MICHAEL LARSEN: Yes, all of them.

MATTHEW HEUSSER: I’ve never met the guy or said, “Thank you” or sent him a Starbucks gift card or something.


MATTHEW HEUSSER: Do you drink coffee, Doug?

DOUG TRASER: Absolutely.  Yeah, I’m in IT.  [LAUGHTER].

MATTHEW HEUSSER: Well, I mean, so we start the show with our news segment and then move into Doug’s work as a security manager.  One thing that really impressed me was that you were the senior security manager for Gymboree before you got your current gig.

DOUG TRASER: Yes.  Yes, I was.

MATTHEW HEUSSER: So, I’m a little curious about how much of that is IT and how much of that is physical security, but we’ll dig into that in a minute.  But first, it’s time for that news segment.  Michael dug up a story about Alexa and security and privacy.  Tell me how much of this I get wrong.  There was a murder defendant in Arizona, I think, and he had Alexa.  Alexa is an always-on device.  There’s a famous XKCD comic where the guy walks in and the first thing he says is, “Alexa, order me 10,000 pallets of creamed corn[i]” or something like, “10,000 cases,” just to test, and it’s also because it’s funny.  But the defendant had Alexa on and the police believe that that information could be useful in the murder trial; because, essentially, it’s having an audio recorder on all the time in your house.  I see two pieces of that:  There is the legal piece of getting the data, but also the ethical piece of, “Should Amazon really be recording everything we do?”  So, Michael, tell me what I got wrong about the story?

MICHAEL LARSEN: You got it pretty good.  The story[ii] doesn’t actually mention the literal murder case; it’s about the brief over whether or not Amazon should be able to hand records over for murder trials and such.  That brief was filed in Arkansas.  Other than that, yes.  The way the article leads off is, “Amazon Echo can seem like your best friend until it betrays you, because this device is different from anything else in your house,” in the sense that, after you use a “wake word” to Amazon Services, it’s then recording everything that you say so that it can respond.

MATTHEW HEUSSER: So, it’s not all the time.  It’s only when you wake it up.

MICHAEL LARSEN: It says, “Thankfully, before Alexa can betray you, Amazon is taking steps to push back.  Arkansas police recently demanded that Amazon turn over information collected from a murder suspect’s Echo.  Amazon’s attorneys are contending that the First Amendment’s free speech protection applies to information gathered and sent by the device; and, as a result then, Amazon is arguing that the police should jump through several legal hoops before the company is required to release your data,” and I think that’s good.

MATTHEW HEUSSER: Well, did they have a warrant?  The article is unclear.  Even if they had the warrant, which means the judge signed it, is that enough?

MICHAEL LARSEN: Well, it says here, “If Amazon has its way, the police must prove that the state has a compelling need for the information and that the material can’t be obtained elsewhere,” a receipt in a person’s possession being the example they use.  “Information must be specific and integral to the investigation.  If the police meet this test, the judge will review the information in private and decide what information, if any, should be disclosed.”

MATTHEW HEUSSER: Hmm.  Okay.  So, they’re saying the bar for a warrant should be higher, and they’ve got to know what they’re looking for.  They can’t just do a fishing expedition, “Give us all the tapes, and we’ll see what we can find.”


MATTHEW HEUSSER: That’s where they’re going to have a problem, because that’s exactly what they’re doing.  [LAUGHTER].  It is totally a fishing expedition.  [LAUGHTER].  It is completely and totally, “Oh, let’s hope we find something.”

MICHAEL LARSEN: “You have an Echo.  We would like to check that to make sure that nothing interesting shows up there or that something interesting shows up there.”  You would definitely need to have some kind of a compelling example or reason why you would think that there could be something there.

MATTHEW HEUSSER:  If you had the defendant on tape saying something like, “I don’t know how to wipe my Echo.  I said that one thing.  Oops.”  Then, you would have some evidence that there was some evidence in there.  It’s possible to meet that standard.  That’s a pretty high standard.


MATTHEW HEUSSER: So basically, I just don’t want to have an Echo.  I don’t want to have Alexa.  I don’t want to have any of that stuff.

JUSTIN ROHRMAN: I mean, to eliminate that problem, you’re going to have to stop using, like, any Internet-connected device.  There was a VIZIO TV[iii] that was sending all of your viewing habits back to Vizio, which they were selling to advertisers.  So, like, every device in your house is spying on you to some degree.  The real way is the way of the Luddite and just get rid of all this Internet-connected crap.

MICHAEL LARSEN: Hey, we live in a new world now.  I mean, apparently microwaves can spy on us.



MICHAEL LARSEN: I will leave the context of that as an exercise for the reader.  This is a non-political show.

JUSTIN ROHRMAN: Your Hot Pocket is listening to your conversation.


MATTHEW HEUSSER: It’s also an XKCD comic.  Is that based on some political thing I didn’t know about?





PERZE ABABA: Really, Matt?


JUSTIN ROHRMAN: No, this is computer.

JESSICA INGRASSELLINO: I don’t have TV either, but bruh?

MICHAEL LARSEN: This is Kellyanne Conway talking about the fact that—

PERZE ABABA: Seriously, bruh?

MICHAEL LARSEN: —Trump was wiretapped in his hotel[iv] and that he was being spied on through his microwave.  It’s been quite the Twitter sensation.

MATTHEW HEUSSER: No, I don’t.  That’s weird.  I don’t know.

MICHAEL LARSEN: Okay.  Matt, I am now absolutely convinced, if you have not heard about that or you haven’t run across that, you are probably not going to have to worry about your computer spying on you, [LAUGHTER], man.


MICHAEL LARSEN: Well done, by the way.  I’m impressed.  [LAUGHTER].

MATTHEW HEUSSER: So, with that send up, welcome to the show, Doug.

DOUG TRASER: Thank you.

MATTHEW HEUSSER: So, maybe you should tell us a little bit about what you’re working on and how you view that interplay between security, privacy, power users, and customers in business.

DOUG TRASER: Well, I approach security with the basic triad of confidentiality, integrity, and availability.  Privacy falls under the confidentiality aspect.  It is one of the key points that we focus on most, especially in a retail space.  My time at Gymboree, that’s all it was, was confidentiality.  The integrity part of it, that was somewhat important.  Availability, that’s important to the business.  But, the confidentiality piece is important to people, and that includes the privacy component.  People expect that their information, that their personal life, is going to remain as private as they want it to be.  Not as private as a company or even as a tester or developer wants it to be.  The developer is going to want integrity and availability to be a key focus in most cases.  Their job is to turn around a product on a certain schedule, at a certain time.  People’s privacy gets sacrificed in a lot of instances where some loose code might allow disclosure of their confidential information.

MATTHEW HEUSSER: We had Dan Billing on before, and he was more of a practitioner of security.  So he was, “Let me show you how to do this kind of penetration test.  Just start with the OWASP Top 10[v] and go down it.”  I think that was kind of tactical advice for testers.  It sounds like this could be more holistic, philosophical.  So, if you’re a software tester, what should you know about security—or a test manager—how do those roles overlap?

DOUG TRASER: The OWASP Top 10 is almost the continuous mantra of every security practitioner, when they’re talking to developers.  That’s generally because PCI has become this overbearing governance model that everybody’s using.  It does apply a good set of rules; and, when you’re trying to assess against the PCI Data Security Standards[vi], you’re going to reference the OWASP Top 10.  That’s the checkbox in the compliance audit.  OWASP obviously doesn’t consist of just the Top 10.  It’s a much bigger model and it’s a much bigger framework; but, if those Top 10 are covered, then the application that’s being tested by the pen tester is going to pass, in most cases.  There could be some other underpinning issues in the code that will still allow disclosure of private information; or, in the case of the Alexa, that’s not necessarily a flaw in the software.  That is its specific design.

MATTHEW HEUSSER: I would think that threat modeling at the manager level would be a core part of the role, and my favorite threat modeling example is the company that had outsourced customer service to a developing nation where everything was cheaper, which meant American dollars went a lot further.  They analyzed, “Who has the data?  Who has access to the data?  How are they going to get it?  Is an ex‑wife going to call in and try to find something out?”  Which is actually a remarkably large percentage of the cases in health insurance, where the husband may have a condition, and they’re just trying to hide it.


MATTHEW HEUSSER: Or if they thought he cheated and there’s an STD or something, and it was actually customer service.  You call in and they say, “What’s your social security number?”  You tell them.  “What’s your date of birth?  What’s your full name?”  You tell them.  They said, “If customer service has access to that, they can write it down on a slip of paper and, bam, they can get a credit card in that person’s name.”  So, they just said, instead of the customer service screen showing the SSN, the person who called in would say the SSN.  They would type it in and hit enter and it would say, “Yes” or “No.”

DOUG TRASER: So, DTMF (dual-tone multi-frequency)[vii], when you press the buttons and it senses which buttons you’re pressing.  That’s very popular in a lot of call center software, where they’ll ask you to enter that information instead of speaking it now.

MATTHEW HEUSSER: Oh, right.  Don’t even say it out loud, just push it into a computer, and then when they get to a customer service rep, it’s already done?

DOUG TRASER: Exactly.  That’s the whole purpose of why that emerged.  It’s to prevent that employee that’s taking the call from writing down personal information and using it later.
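The yes/no check described here can be sketched in a few lines, assuming the system stores only a salted hash of the SSN rather than the digits themselves (the hashing scheme is illustrative, not any particular call-center product):

```python
import hashlib
import hmac
import os

def hash_ssn(ssn: str, salt: bytes) -> bytes:
    """Derive a salted hash so the raw SSN digits are never stored."""
    return hashlib.pbkdf2_hmac("sha256", ssn.encode(), salt, 100_000)

def verify_ssn(entered_digits: str, stored_hash: bytes, salt: bytes) -> bool:
    """Compare DTMF-entered digits against the stored hash.

    The caller gets only True/False; the rep's screen never shows the SSN.
    """
    candidate = hash_ssn(entered_digits, salt)
    return hmac.compare_digest(candidate, stored_hash)

# Enrollment: hash the SSN once, with a per-record random salt.
salt = os.urandom(16)
on_file = hash_ssn("123456789", salt)

print(verify_ssn("123456789", on_file, salt))  # True
print(verify_ssn("987654321", on_file, salt))  # False
```

The `compare_digest` call does a constant-time comparison, so an attacker can’t learn anything from how long a mismatch takes.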

MATTHEW HEUSSER: So, the point being that threat modeling is a way to identify who can get the information and who would be motivated to get your information and whether or not it mattered, and that you can put appropriate security in place on top of it.

DOUG TRASER: Exactly.  Identify the threat agent, “How do you reduce the risk that the agent can impose?”
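Doug’s loop of identifying the threat agent and reducing the risk it can impose is often captured with a simple likelihood-times-impact score. A toy sketch, where the agents and the 1-to-5 scales are made up for illustration:

```python
def risk_score(likelihood: int, impact: int) -> int:
    """Simple risk = likelihood x impact, each on a 1-5 scale."""
    return likelihood * impact

# Hypothetical threat agents for the call-center example.
threats = {
    "insider rep copying SSNs": (4, 5),
    "ex-spouse social engineering": (3, 4),
    "random external attacker": (2, 5),
}

# Rank by score so mitigation effort goes to the biggest risks first.
for agent, (like, imp) in sorted(threats.items(),
                                 key=lambda kv: -risk_score(*kv[1])):
    print(f"{risk_score(like, imp):>2}  {agent}")  # highest risk prints first
```

Real threat-modeling frameworks (STRIDE, DREAD) are richer, but the ranking idea is the same: decide which agents are worth mitigating before writing controls.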

MATTHEW HEUSSER: Or you could flip it the other way around—and we’re back to my power users—where you could say, “So, what.  What can they do with that?  Nothing.  So, let’s open that up a little bit more and make our lives a little bit easier.”

DOUG TRASER: I think what we see in the InfoSec world is a lot of times we’re put into the position where we have to put together some type of process in a vacuum.  We don’t involve the right people at the right time.  We create the process based on security.  That functionality piece, that goes out the window; and then, when that comes back in and we’ve learned that we’ve disabled him—we haven’t enabled him, we’ve disabled him—that’s when the policy goes through a maturity cycle, the process becomes more mature, and you actually start getting to what the actual goals should have been in the first place.  There seems to be this disconnect a lot of times in organizations: InfoSec is tasked with protecting data, developers are tasked with writing the code, and testers are tasked with getting what was developed into production and making sure that it’s ready.  More often than not, the three facets just haven’t touched each other.  It’s when that does come together and there’s a good common understanding, that’s when you can get a good policy and procedure in place.

MATTHEW HEUSSER: So, that reminds me of two stories that I think are relevant to testers today, and then I want to throw this to Perze to see if, from his slightly wider vantage point at a much bigger company, he’s got similar stories.  We have a contractor that’s working at an $11 billion company, and she’s been there a year.  Everybody loves her.  They tried to hire her in twice as an employee.  They sent me an e‑mail saying, “Let’s get that contract renewed again.  It’s getting silly now, budget-wise, very short renewals.  We’ve got to process them a lot.”  I sent them the paperwork back and forth, but it didn’t get signed.  She went a week with no legal contract, but she is still showing up every day.  Using her card, logging in, and doing her job.  The contract says, “With opportunity to extend…,” as it has for the last five renewals.

So, they’re going to pay us.  It’s okay.  We’ll figure it out.  She got locked out of the building, because her contract was supposed to end March 6th and it’s March 11th or something.  The weekend came and the security guy gave her a little bit of wiggle room, but you know, she was out of contract for seven days.  He was just doing his job.  I still don’t know the status.  Like, I’ve got a PO back, but it doesn’t have signatures on it.  So, I don’t know.  She might get locked out of her account.  She might not be able to do her job, and that was someone trying to apply the appropriate role-based security based on the contract length.  That’s just a risk you take.  The other risk is, if the contract ends and they come in anyway and do who knows what, that’s a bit of a sticky wicket.  So, I will pass it to Perze.  Do you have similar problems/stories with security?

PERZE ABABA: For me, it’s really around the notion of, I guess, the differentiation between privacy and confidentiality.  My understanding of it is that privacy refers to the person, while confidentiality refers to the data that was taken from the person.  So, there are the methods that were used to gather data, you know, from a given person—for example, in double-blind studies.  I guess my question really is:  Do we actually have adequate provisions around these things so that we know that we can maintain the confidentiality of data?  Is this done at the federal level, the state level, or is that something that each of the companies who are dealing with sensitive data (is it up to the companies) gets to define, of course in relation to existing federal laws?

DOUG TRASER: Every place that I’ve worked, privacy information and what falls under privacy, there are different regulations and different compliance models that you fall under that really kind of define it for you—GLBA[viii], PCI, HIPAA[ix].  A lot of those compliance and regulatory frameworks define, “What is categorized as private information?”  If you are getting three pieces of information that can personally identify somebody, that is private information.  Most organizations should be able to identify that in their datasets and determine, “What do we need to do to protect that data so it does not get disclosed to inappropriate parties?”

A lot of times that’s not the case though, as we’ve seen.  There’s plenty of instances where that information is not being stored correctly because corners have been cut.  Not enough security on the frontend, and the “keys to the kingdom” are right behind that initial door to be able to decrypt that data.  So they may have encrypted it, but the “keys” to decrypt it are right there in the same space and it’s quite easy for anybody that can get past the boundaries.  I would say that the government has defined it, but it’s not been defined in a vacuum.  It’s been defined based on what they know.  We obviously know that Social Security numbers are being used for identity theft.  Even just knowing somebody’s home address and name and phone number falls under the criteria as being “private information.”  There was a brief—

MATTHEW HEUSSER: What?  So, like, that can’t be PCI compliant?

DOUG TRASER: That’s not PCI—right?—but that is privacy information.  It can fall under GLBA.  So most people, if they don’t opt to have their information in the Yellow Pages—you have the option to say, “I don’t want my information listed”—then it has to be protected.  But, if you allow it, then you allow it.  That’s it.  Explicit permission; you have to give explicit permission for that information now.
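Going back to Doug’s earlier “keys to the kingdom” point: the mitigation is keeping the decryption key in a store separate from the ciphertext. A toy sketch, where XOR stands in for a real cipher (it is NOT real cryptography) and an environment variable stands in for a key-management service:

```python
import os
from itertools import cycle

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'cipher' -- illustration only, NOT real cryptography.
    The point being made is where the key lives, not the algorithm."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

def decrypt_record(ciphertext: bytes) -> bytes:
    # The key is fetched from a separate store (here, an environment
    # variable standing in for a key-management service).  It is never
    # written into the same database or filesystem as the ciphertext.
    key = os.environ["RECORD_KEY"].encode()
    return xor_bytes(ciphertext, key)

os.environ["RECORD_KEY"] = "not-stored-with-the-data"  # demo only
ciphertext = xor_bytes(b"SSN 123-45-6789", os.environ["RECORD_KEY"].encode())
print(decrypt_record(ciphertext))  # b'SSN 123-45-6789'
```

If an attacker who gets past the boundary finds only ciphertext, and the key lives behind a separate access control, one breach no longer yields the data.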

JESSICA INGRASSELLINO: So, Doug, I have a question, and it’s based on my experience in a few start-up companies.  At one of the companies where I worked, they wanted security testing in place, and they needed it because of the client contracts that we were signing.  As a tester I was like, “Well, okay.”  Everybody said, “Oh, OWASP Top 10,” blah, blah, blah.  I’m like, “All right.  I got that.”  But then, the company wanted to go a step further, and they kind of said, “How are we going to do this?  What’s it going to look like?”  So, I found myself looking through, kind of, all of the OWASP resources, and I was given to understand that, even though I was tasked with this, I should absolutely be working with an InfoSec professional, but I didn’t have that luxury at the time.

So, I wound up going to the OWASP OpenSAMM Model[x] to try and put together an iterative security approach for what was a very small company that had no dedicated InfoSec staff; although, the company had built in some decent security features and did a good job at preventing DDoS attacks and all that kind of thing.  So, I guess my question is:  From the perspective of the tester that may either want to do this or may be tasked with doing this and have no idea—I found OpenSAMM, but it took me weeks to go through and sift through the information—what would your recommendation be for the person who kind of needs to do this and doesn’t even know where to begin?

DOUG TRASER: It depends on the test that you’re running.  Most organizations, because of the development process, have transitioned from the Waterfall method to an Agile method.  It’s a lot harder to put security checks in the development life-cycle now.  We really have to kind of push the responsibility onto developers and get them involved at the beginning.  When they are writing their code and, let’s say, they get done writing a module and they check that into Git, the most optimal method would be: once that code gets checked into Git, it gets scanned, and static code analysis identifies where the vulnerabilities are.  The developer comes back the next day, and he has a report of where he is violating the OWASP Top 10.

Once it gets done going through all of those iterations, there should be a report that’s done before it gets to your level to identify, “Here’s all the things that are part of the OWASP Top 10 that we’ve taken care of.”  It’s really geared more towards developers to write their code and to write their code better.  Another key aspect—and it’s actually a requirement under the PCI Data Security Standards—is that developers go through secure development training annually.  Every year they have to go through some type of refresher, and you have to provide evidence to the QSA that your developers have undergone this training.  There are some controls being put around it.  Trying to digest the OWASP framework on your own is daunting, because it’s changing.  It can be interpreted one way or another.

JESSICA INGRASSELLINO: And, it’s gigantic.


JESSICA INGRASSELLINO: I mean, it really took me weeks.  [LAUGHTER].  It was worth it.  It was very interesting, but I just scratched the surface and it took weeks.

DOUG TRASER: Yeah.  I don’t know it.  [LAUGHTER].  I have to go to the website every single time to figure out which control within the framework that I’m really talking about when I’m speaking to a developer.  The best suggestion I have is, “Always have an Internet connection to be able to look at it and try to gauge against it.”

MATTHEW HEUSSER: Well, I would say, “Start with the Top 10 and then figure out the tools.”  Like, if you’re just doing penetration testing, which is the testing piece of it, right?


MATTHEW HEUSSER: But, to reframe something, under Waterfall, we could hire a penetration testing team, and they could do pen testing at the end before we released, every six months or once a year.  We’d pay $10 grand or whatever and they’d give us our bugs and we’d fix them.  But now, we’re releasing every two weeks, if not several times a day.  We just can’t, unless you want to put a pen tester on staff, add another stage-gate in the process, and slow everything down.  We just can’t afford to do that.  So, we’ve got to push that responsibility onto the team, which would be developers having an understanding of how to write secure code and then testers figuring out ways to question and explore and look for opportunities that might not have been considered, and I think that’s a pretty mature way of looking at it.

Honestly, I think the big difference is, if you’re an insurance company and the stuff is all inside the firewall, you can do that a lot more informally.  Or, if you can carve out the pieces of your software that aren’t subject to compliance: “These pieces touch private health information.  These don’t.”  That’s the component architecture.  The pieces that don’t, you can security test much more informally, and you might not have to have the proof that you’re speaking about.  If you do need the proof, you’re just going to have to do a lot more work.
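The check-in-time scan Doug described, where static code analysis flags violations overnight, can be approximated with Python’s `ast` module. This is a toy pass with a made-up denylist, a sketch of what SonarQube or Veracode do far more thoroughly:

```python
import ast

RISKY_CALLS = {"eval", "exec", "system"}  # tiny illustrative denylist

def scan_source(source: str) -> list[str]:
    """Flag risky call sites, reporting line numbers like a SAST report."""
    findings = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, ast.Call):
            func = node.func
            # Handle both bare names (eval) and attributes (os.system).
            name = getattr(func, "id", getattr(func, "attr", ""))
            if name in RISKY_CALLS:
                findings.append(f"line {node.lineno}: call to {name}()")
    return findings

checked_in = "user = input()\nresult = eval(user)\n"
for finding in scan_source(checked_in):
    print(finding)  # line 2: call to eval()
```

Wiring a check like this into the commit hook or build is what turns a yearly audit into the daily report Doug describes.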

MICHAEL LARSEN: There are a few tools that you can utilize for this.  There are a couple of resources that I found that are interesting, especially if you are somebody new to the idea of security testing or you just want to get your feet wet and play around with it.  One of them is a professional tool that you pay for, and that’s Veracode[xi], and Veracode has a number of configuration options.  You can set up a static or dynamic scan of your app, and you can choose.  As you were saying, if there are special areas of the app that you actually care about this for, you can designate that you want to look at those.  If that is the case, then you can actually isolate where you are doing your security testing so you can get the most information, which is good, especially if you’re going to be running tests frequently.  If you’re going to be running a test on an entire site, their standard protocol is that it may take anywhere from 3-to-5 days to do the necessary scanning.  If you’re releasing multiple times a day, that model is just not going to work for you.

You’re going to have to bring it in-house, and there are a number of things that you can do.  A book from Packt Publishing that I recently got access to and am currently looking through is Penetration Testing with the Bash shell.[xii]  There’s a lot that you can do with readily available tools such as Kali Linux[xiii], and Kali Linux is something that everybody could have access to on their desktop, if they wanted to explore this.  Of course, I always have to mention this with the caveat that anytime you use any kind of a security tool that has anything to do with penetration testing, you have to be aware of what you’re doing and you have to be alert to the fact that you could inadvertently screw a lot of stuff up and really wreak havoc if you are not careful.

PERZE ABABA: Yeah.  So, one of the tools that we actually use for my group is Qualys[xiv], among others, which I believe is kind of at parity with what Veracode has.  The one good thing that I’ve seen with Qualys is that they do have a free tier of the tools where, you know, you can run an OWASP application audit scan for free, and the good thing is they have APIs for this.  So, if you want to be able to run this in parallel while you’re running your performance checks, for example, or your shallow checks or deep checks, or while you are verifying a PR, these are things that you can just integrate into your build and deploy system so that it can run in parallel with the other things.  I can probably send a link for the Show Notes so that people who are interested can play with it, but I think there’s definitely a concern with how fast we release things.  The good thing is, there are tools out there now that kind of give you the ability to catch up with how fast we want to release things, and you still have a certain semblance of compliance before you push something on to production.

DOUG TRASER: There’s an Open Source version of this.  It is called “SonarQube[xv].”  They do have enterprise levels that allow for team deployment and enterprise deployment models as well.  This actually will go into the code, analyze what parts of the code the vulnerability exists in, and produce a link so you can go right to that piece of code.  It will highlight what part of that code is in violation of one of the OWASP Top 10 and gives the developer basically a map directly to where he or she needs to go to resolve the issue.

PERZE ABABA: Right.  I can actually attest to that, because that’s part of the static tests that we run against our code base.  The beauty for that too is that it does go at the, you know, “What is the method that you actually used?  Did you actually use a deprecated method that is now, you know, insecure?”  Yeah.  It’s pretty straightforward to install, and you can link that to your Jenkins as well.  It’s pretty fast.  So if you use an IDE for development, then it has plugins across multiple languages, and it is a really cool tool as well.
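The build gating Perze describes generally reduces to parsing a scan report and failing the build above a severity threshold. A minimal sketch; the JSON shape here is hypothetical, since Qualys and Veracode each define their own report formats:

```python
import json

def gate_build(report_json: str, max_severity: int = 3) -> bool:
    """Return True (pass) only if no finding meets the blocking severity."""
    findings = json.loads(report_json)["findings"]
    blockers = [f for f in findings if f["severity"] >= max_severity]
    for f in blockers:
        print(f"BLOCKING: sev {f['severity']} - {f['title']}")
    return not blockers

# A made-up report, standing in for real scanner output.
report = json.dumps({"findings": [
    {"title": "SQL injection in /login", "severity": 5},
    {"title": "Verbose server banner", "severity": 1},
]})
print("pass" if gate_build(report) else "fail")  # fail
```

In Jenkins or any CI system, a non-passing result from a step like this is what stops the deploy, so security rides along with the existing pipeline instead of adding a separate stage-gate.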

MATTHEW HEUSSER: We didn’t plan on this, but Veracode is a QualiTest partner.  If you want to get started integrating security into continuous integration and you want to drive it through your test organization, or you want to involve an organization that knows how to do this and has done it a lot of times, give QualiTest a call.  If you want to do it on your own and want to save money, I just found out that Sonar[xvi]—I know Sonar is good for unit tests and code coverage—apparently, you can use it for security tests too, Open Source.  So, that’s awesome.  Before we get going, Doug, is there anything else we should know about security or guitar?

DOUG TRASER: [LAUGHTER].  Well, guitar has nothing to do with security, and it’s much more fun.  [LAUGHTER].  But, in the security realm, it’s really about educating.  It’s discipline.  It’s definitely not a checkbox.  Anytime that an organization tries to do security by checkbox, it’s not necessarily security.  That’s passing an audit.  Discipline has to be something that becomes part of an every-day process/action that is done.  It becomes part of the fabric of everything that you do.  A good InfoSec professional or security tester in your organization will help fold that into the fabric of everything that you do with some acceptable processes.  I always like to say, “It’s easier to change a ‘no’ into a ‘yes’ than a ‘yes’ into a ‘no.’”  Most security professionals will have that mentality.  You’re going to start with everything closed and only open up what is needed.  If you open everything up and then you try to close it down, it’s going to be a never-ending story.  So, at some point, you have to start with a, “No,” and work towards the, “Yes,” and you’ll have a much cleaner product at the end.
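Doug’s “start with everything closed and only open up what is needed” maps directly onto a default-deny allowlist check. A minimal sketch with made-up roles and actions:

```python
# Default-deny access check: nothing is permitted unless explicitly
# opened up, mirroring the "start with 'no', work toward 'yes'" stance.
ALLOWED: dict[str, set[str]] = {
    "support_rep": {"verify_ssn"},          # yes/no check only
    "billing":     {"verify_ssn", "refund"},
}

def is_allowed(role: str, action: str) -> bool:
    # Unknown roles and unlisted actions both fall through to "no".
    return action in ALLOWED.get(role, set())

print(is_allowed("support_rep", "verify_ssn"))  # True
print(is_allowed("support_rep", "view_ssn"))    # False
print(is_allowed("intern", "refund"))           # False
```

The design choice is that forgetting to list something fails closed: a missing entry denies access rather than granting it, which is exactly the “change a ‘no’ into a ‘yes’” direction Doug recommends.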

MATTHEW HEUSSER: Okay.  Thanks for being on the show, Doug.  Where can we go to learn more about you and what you do?  LinkedIn?

DOUG TRASER: Sure.  Yeah.  Yeah, I’m on LinkedIn[xvii].  That’s really kind of my only social media presence now.

MATTHEW HEUSSER: Okay.  What is everybody else up to?  Jess?

JESSICA INGRASSELLINO: Well, the PyCon Education Summit[xviii] is coming up in mid‑May, and so we’re gearing up for a really exciting time.  We’ve doubled the audience allowance.  So, we went from 75 seats last year to 150 seats this year, and we sold out in the beginning of January.  So, people are really excited about this, and we have some great speakers and a lot of great topics.  So, I’m very excited to be sharing it and to see it coming up soon.


PERZE ABABA: Yeah.  So, just a quick plug on the NYC Testers Meetup[xix].  We try to have one every month.  So, just check us out on our Meetup page.  Also, to the south of where I am, the Ministry of Testing in Philly[xx] actually also started a Meetup.  While that Meetup already existed, they just, you know, kind of renamed it, and it’s spearheaded by Mark Tomlinson.  If you guys are in either area, please check us out.

MATTHEW HEUSSER: Michael dresses up like a pirate.


MATTHEW HEUSSER: So, if you live in San Francisco, where can we go to learn more about piracy?

MICHAEL LARSEN: Wow.  Okay.  That’s a segue.

MATTHEW HEUSSER: Well it’s a security episode, right?

MICHAEL LARSEN: Absolutely.  No, that’s a good point.  Well, you’re bringing in a few of my loves in this one.  I do actually work with a living history crew.  We’re called, “Pirates of the Silver Realm[xxi],” and you can learn more about us just by doing a search for, “Pirates of the Silver Realm” or go to, and you can find out a bit about who we are and where we set up.  We bring a historical encampment and a literal 18th Century presence to various events.  If you are in and around California, we have a number of events that we will be participating in.  By the time this show goes live though, the one that I would definitely—if you really want to see “pirates on parade,” so to speak, in a very large setting and in a fun carnival atmosphere—the big event is called, “The Northern California Pirate Festival[xxii],” and that is held on the Vallejo Waterfront across from Mare Island, and it is held Father’s Day Weekend.

It is a tremendously fun event.  There’s lots of vendors.  There are a lot of guilds that participate.  We’re not the only one.  We have dozens of them.  There’s actual nautical battles that take place.  We have a group that fires off actual swivel cannons, and we do mock trials.  A number of us actually get into our character guises, and we go out into the public.  There is the Jack Rackham and Anne Bonny School for Impressionable Youth, which is a great little program that they do for the young ones.  My daughters are actively involved as teachers for this program, and they love it.  So, there’s something for everybody.  If you are in the Bay Area and you happen to have this opportunity to come to the NorCal Pirate Festival, I’d love to see you and talk with you as my alter ego, and you can get to meet “Rash Dowtey” in person.

MATTHEW HEUSSER: Cool.

JUSTIN ROHRMAN: CAST[xxiii] and TestRetreat[xxiv].  Tickets are on sale.  Tutorials are announced.  The keynotes are announced.  So, come to CAST.

MATTHEW HEUSSER: The tickets for TestRetreat are on sale?

JUSTIN ROHRMAN: They are.

MATTHEW HEUSSER: Awesome.  It’s going to be huge.  We will have—AST isn’t going to do it, but I will definitely organize a boat crew for that week.  Probably the night after TestRetreat.  So, please stick around.  It’s going to be good.

JESSICA INGRASSELLINO: Sounds fun.

MATTHEW HEUSSER: I don’t think I have anything particularly new, except I’m doing a bunch more writing lately.  I hope to see you people at Agile and Beyond[xxv] in May and at CAST.  I’ll be at SQuAD Denver in September[xxvi], and there’s a chance I get out to Portland for Pacific Northwest this year[xxvii].  We’ll see.  I think that’s it for the show, folks.  Thanks all for listening.  Thanks, everybody.  We’ll talk soon.

MICHAEL LARSEN: Thank you.

PERZE ABABA: Thanks, everyone.

JUSTIN ROHRMAN: See ya.





[i] XKCD: “Listening”:

[ii] A Murder Case Tests Alexa’s Devotion to Your Privacy:




[iii] What Vizio was doing behind the TV screen:

[iv] Trump Tower wiretapping allegations:

[v] OWASP Top 10 Project:

[vi] PCI Security:

[vii] Dual-tone multi-frequency signaling:

[viii] Overview of the Gramm-Leach-Bliley Act (GLBA):

[ix] Summary of the HIPAA Security Rule:

[x] OWASP SAMM Project:

[xi] Veracode: Application Security

[xii] Penetration Testing with the Bash shell:

[xiii] Kali Linux:

[xiv] Qualys Freescan:

[xv] OWASP SonarQube Project:


[xvi] SonarQube Enters the Security Realm and Makes a Good First Showing:

[xvii] Doug Traser – LinkedIn:

[xviii] PyCon Education Summit:

[xix] NYC Testers Meetup:

[xx] Ministry of Testing In Philadelphia – Meetup:

[xxi] Pirates of the Silver Realm:

Age of Discovery Productions:

[xxii] Northern California Pirate Festival:

[xxiii] Conference for the Association for Software Testing (CAST) 2017:

[xxiv] CAST 2017 TestRetreat:

[xxv] Agile and Beyond 2017:

[xxvi] SQUAD Conference 2017:

[xxvii] Pacific Northwest Software Quality Conference: