Angie Jones, master inventor and automated testing engineer, joins us to talk about bridging the gap between the testing team and the rest of the development process, the challenges and limitations of automated testing, and some of the new and state-of-the-art technologies in testing.
Today Angie Jones, a master inventor and automated testing engineer, speaks with us about what a master inventor is and what it took to receive that title, what test automation is, having testers work in parallel with developers, what developers can be doing to build more testable apps, and finally how modern web development has complicated automated testing.
Angie talks about some common problems when it comes to testing. The test team is often separate from the developer team, and it leads to communication problems. Testers should be working in parallel with the dev team to ensure that from the get-go they are writing a testable app!
Another common problem is that 100% test coverage gets pushed. However, that's the wrong idea. Automated testing is expensive to implement, so Angie talks about how she figures out what'll give the most "bang for your buck" when deciding what tests get automated.
Apps aren't as simple as they used to be, and thicker client-side UIs have made it much harder to implement automated tests. Automation engineering is a fully fledged development position requiring skill across platforms, which is why Angie says that developers shouldn't be the ones leading test automation. There is a lot that automation engineers have to do and separate skills that they need to develop. There's only so much a person can keep up on at once.
"Incorporating Testers with Every Development Phase with Angie Jones" Transcript
Joel Hooks: Hi Angie.
Angie Jones: Hi Joel, how are you?
Joel Hooks: I'm doing great. How are you doing today?
Angie Jones: Wonderful.
Joel Hooks: I'm super excited to talk to you and I want to mainly focus on your expertise in the software testing and automation field, because it's fascinating to me and I think not enough people probably focus on it, and it's something that really helps develop software at scale.
But I was reading your bio and I had a question before we get started in testing automation. You describe yourself as a master inventor, and I wanted to know what a master inventor is.
Angie Jones: Sure, so it's a distinction that's awarded by IBM. I used to work at IBM for several years, and for people who contribute significantly to IBM's patent portfolio, have inventions that are valuable to the company, help them evaluate potential inventions, and mentor others who want to be inventors, this is an honor that they provide.
So it's not something I made up, it's actually a real thing.
Joel Hooks: Yeah. Makes sense. So basically is this software patents then?
Angie Jones: Right. Yes.
Joel Hooks: So my question was, and I've been a software developer for ten years and I don't have any patents, but I've never worked for a company like IBM that actually looks for that sort of thing, but I'm wondering, how do you discover these things? What's the process in terms of discovery when it comes to identifying new inventions?
Angie Jones: Yeah so that was a question I had as well. So when I worked at IBM, the culture there was one that was very innovative and people prided themselves on this sort of thing, and there was a woman I saw who had this plaque on her wall in her office about a patent, and I was like, "What is this and how do I get involved?"
So she pretty much took me under her wing, and it's really just a slight adjustment in your mindset as someone who's already a techie. So for me, I was working in test automation at the time, and the way that I viewed things is I look for problems, and so patenting is pretty much the same way. You're looking for problems or you're looking for holes in a certain product or a technology space, and you simply come up with how do you fill that gap? How do you fill this hole?
And a lot of times it comes from annoyance. So anytime that I feel slightly annoyed, like I think a product should do this, or I wish that I could do some other thing, then I just take a moment, and instead of griping about it, I say, why can't I do this?
And I think about it and say, "Okay, how could I go about doing this? What's missing here?" And I invent it.
Joel Hooks: That's really cool, and I'm wondering, do people outside of a large corporation like IBM, like it kind of makes sense in that context, but as independents, is it something that we can do or can we pursue acquiring patents and creating inventions ourselves in the software space?
Angie Jones: Yeah, you can, but it's, in my opinion, just so much easier to do it through a company. For one, there's a huge financial investment involved in doing so. So for example, just applying for the patent, it can vary based on what country you're in and that sort of thing, but there's, like when I submitted mine, it was like $25,000 just to have them review, and that doesn't guarantee that it's going to be granted. So that could be $25,000 you're spending, you get a "No" back from the patent office.
And then there's fees for legal costs and to have someone draft it up, so I write down my idea in my terms and then turn it over to a patent attorney who can write all of the legal mumbo jumbo, so of course there's costs with that, and all sorts of things.
So in my opinion, I would recommend going through a company, and pretty much every company I've worked at has offered some type of program. It's not always been as widely used as at IBM, but other companies have offered it as well, where they give you an award. Some of them will give you money even for submitting an idea to them internally, like "Hey, I have this idea," and that might be a check.
And then you get some more money if they actually file it and more money if it's issued and more money if it's involved in licensing and all of this, so yeah.
Joel Hooks: Yeah that's pretty cool though. I've never seen anybody that had that or heard much about that process, because I haven't been familiar with it, so it's kind of cool.
So we'll get into test automation though because this is where you've had your focus for the last 15 years of your career, and automation in general to me is such a powerful tool that we have at our disposal. And I'm wondering what specifically is test automation and how do you define test automation?
Angie Jones: Yeah so test automation, it's interesting. A lot of people think of it as a replacement for testing, like manual testing, which it very much is not. There's still definitely a need for manual testing, but what test automation is, is essentially another software development project that's focused on the regression testing of your production application.
Joel Hooks: Right, so does that include the whole scope of what, like, Martin Fowler would call the testing pyramid, where we have unit and integration and end-to-end testing and then the automation that kind of glues all that together?
Angie Jones: I look at that pyramid as examples of test automation. So each one of those layers in the pyramid represents a subset of an entire test automation strategy if you will.
Joel Hooks: Right. Yeah because they're all different, right? Every layer of the test is really there to serve different needs and for different purposes.
Angie Jones: Yeah. I think it was Mike Cohn's pyramid, actually, but yeah Martin wrote about it too.
Joel Hooks: So as a test automation engineer then, what's the area of responsibility that one would find themselves in as an automation engineer?
Angie Jones: So if you're really good at this, you look at the entire automation strategy and determine which test cases we should even automate. Once you determine what we should automate then looking at, "Okay, where's the best layer to target that? Should this be something that's done at the unit test layer?" And that might be something that the developers can tackle, and you kind of assist in the identification of that.
As well as if, "Hey, I thought of all of these tests after the code was already done, but they would probably be better suited at the unit test layer," contributing them there, that's fine as well.
Then doing the whole, "Okay, I want to test this feature, but how do I automate against this?" And so looking at that, sometimes if we test it manually, we'll test it by going to the UI and clicking on all the buttons and verifying the outcomes and everything, but in automation you have to look at this from a lot of different angles, because UI automation is, a lot of times, brittle.
It's much slower than automating at these other layers of the pyramid. So you look at that and say, "Okay. This is what I'm actually trying to test. There's probably some setup here. There's a lot of different things that I need to do to get it in the state that I want it to be in," and a lot of that might be able to be done at the API layer, for example, or by calling in some business logic, and then doing whatever you need to do at the UI.
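To make that concrete, here is a minimal sketch of that hybrid approach: test setup happens through an API call, and the UI is only used for the behavior under test. The site, endpoint, and locator names are hypothetical, and it assumes Selenium WebDriver for Java is on the classpath.

```java
// Hypothetical sketch: set up state through an API, verify only the piece
// under test through the UI. Endpoint and locator names are made up.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class HybridReservationTest {
    public static void main(String[] args) throws Exception {
        // 1. Set up state quickly at the API layer instead of clicking through the UI.
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest createReservation = HttpRequest.newBuilder()
                .uri(URI.create("https://example.test/api/reservations"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"partySize\": 8}"))
                .build();
        HttpResponse<String> response =
                client.send(createReservation, HttpResponse.BodyHandlers.ofString());
        if (response.statusCode() != 201) {
            throw new AssertionError("Setup via API failed: " + response.statusCode());
        }

        // 2. Only the behavior under test is exercised through the UI.
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.test/reservations");
            String confirmation = driver.findElement(By.id("confirmation-banner")).getText();
            if (!confirmation.contains("party of 8")) {
                throw new AssertionError("Expected confirmation for a party of 8");
            }
        } finally {
            driver.quit();
        }
    }
}
```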
So it's such an interesting space. I've been in it for quite a while and always learning something new. And any project I go to, there's going to be these different interesting challenges to overcome, so I really love it.
Joel Hooks: It's kind of like a holistic look at the entire system and how we're going to test that and make guarantees and click testing's always going to be there, but how are we going to make it easier and where we can focus on harder problems and eliminate a whole class of stuff that robots can do for us.
Angie Jones: Exactly. Yep.
Joel Hooks: So I've had a pretty decent run in the enterprise software space, which seems like where I've worked most with dedicated QA or test teams, and to me in almost every case it felt like it was siloed where there was test and there was development, and there was this tension between test and development, versus some sort of integrated team strategy, and I was wondering what your experience was with test and dev and how you think that that should be on a software project.
Angie Jones: Yeah I've had the same experiences where they were siloed groups, and neither understood what the other was really doing. We saw that a lot when we were doing development in the waterfall method, where dev would do their thing and kind of toss it over to test, and then they would do their thing. And as we're moving more towards agile and DevOps and trying to release a whole lot quicker, everyone is realizing, okay, that's not going to work. This whole staged approach is not going to work, and we have to collaborate to be able to do things quicker.
So as far as testing in general, I always encourage testers to be involved in every phase, whether it be planning or design, and I've seen amazing testers that can go into like a design meeting and be able to poke holes in a design and tell them, "Okay this is not going to work because of X Y and Z, or this is not aligned with the rest of the way that we built the product," and be able to identify that stuff before any code is ever written, and it's just so amazing. It saves so much time.
As far as automation engineering, I love to work with developers early on. Especially, let's say we're working in a two-week sprint. So a developer has to write the code; let's say they're not done until a week and a half into that sprint. Well now, I need to write all of this automation code, and remember, this is a software development project in and of itself, and sometimes, I tell people, your automation engineers are writing more code than your developers are. Your developers write this much code for the feature, whereas your automation engineers have to write all of these tests for that feature, so they need time as well.
And I've come up with strategies to be able to collaborate with development so that we can both write code in parallel. So some of that might be, "Let's get together. Let's mock out what this UI is going to look like, so we both have a clear vision in our head. Let's go ahead and talk about how do we make this UI testable? So what IDs and locators do we need for the different elements so that I can begin writing my automation?"
And so if I leave the meeting with that information in hand, I could go ahead and start writing my automation code before dev begins their coding, and that way we get done a lot quicker.
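As a rough illustration of that workflow, here is a small page-object sketch that could be written against locators agreed on in such a meeting, before the UI exists. The element IDs are hypothetical placeholders, and it assumes Selenium WebDriver for Java.

```java
// Sketch of automation code written before the UI is built, against locators
// the team agreed on up front. The IDs below are hypothetical placeholders.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;

public class ReservationPage {
    private final WebDriver driver;

    // Locators agreed on with the developer, so both sides can code in parallel.
    private final By dateField = By.id("reservation-date");
    private final By partySizeField = By.id("party-size");
    private final By submitButton = By.id("submit-reservation");

    public ReservationPage(WebDriver driver) {
        this.driver = driver;
    }

    public void makeReservation(String date, int partySize) {
        driver.findElement(dateField).sendKeys(date);
        driver.findElement(partySizeField).sendKeys(String.valueOf(partySize));
        driver.findElement(submitButton).click();
    }
}
```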
Joel Hooks: Yeah, I mean parallel makes so much sense. I always had a real problem with the idea that development and test was this separate thing that we're doing, and the reality of it should and can be that this is a collaboration and we're working together, and test isn't there, they're not out to get you, developer. They're here to help you and make a good product for the end user.
Angie Jones: Yeah, exactly.
Joel Hooks: Because sometimes as a developer when your tester's like, "Oh this is all broken," and you're just like "NO!" Like that kind of, but it's not [inaudible 00:12:47], it's broken, so we have to fix it.
Angie Jones: Right. I mean it's your work, we're all sensitive about our stuff, but yeah it hurts a whole lot less if you have your testers involved a lot earlier, so they're catching things where you're like, "Oh yeah. Of course I made this mistake. I'm not really even done yet, but yeah I'll go ahead and fix that right now," as opposed to you put your blood, sweat and tears in it, you think that it's beautiful and then people rip it to shreds.
Joel Hooks: Yeah it's not your baby. It's not a child. It's code or whatever, like a feature. We got to step back a little bit.
Angie Jones: Exactly.
Joel Hooks: If the end goal for everybody is to deliver a killer experience, then we're on the same page, I think, at the end of the day.
Angie Jones: Yep.
Joel Hooks: So one of the things that I kind of noticed, just in looking at test automation is that in a lot of ways it is like a developer position, but it's more focused on testing, but it feels to me like especially when you look at the automation part of that job description, it really does have a lot of crossover with what a developer would typically do, because you're writing software, just software to test other software. Is that a good way to look at it?
Angie Jones: Yeah that's exactly right. So I actually worked as a developer for a couple of years. I did automation straight out of college, and I kind of viewed it as this hierarchy where development was superior to automation, just like, "Oh, I want to be a developer." And so I got into development for a couple of years and Joel, I hated it.
Yes, we were both still writing code, but like I talked about with automation, I get to do so much. I get to exercise so many more development muscles, and do the architecture, and I have this complete autonomy over it. I have a larger view of the system and what we're trying to do here, versus in development it would be like, "Oh, I have to create this little widget or something like that." And so my view was just so scoped in development, whereas in automation it was just much broader.
But yeah, we're all writing code. It has to be clean, all of these concepts are still applicable to test automation. We have to maintain it. We go through code reviews. All of that. So it's very similar, yeah.
Joel Hooks: How do you think people listening to this, just Egghead's audience in general is mostly developers, but should developers be picking up more of a test automation mindset and doing some more crossover work in that regard in general?
Angie Jones: You know, Joel, I wrote this controversial piece that says "Why developers should not lead your test automation projects." And a bunch of developers jumped on me as if I was saying that they can't do it. And it was funny because I've worked at quite a few places. I've done consulting. I've not met a whole bunch of developers that even want to do this, but me challenging them and them interpreting it as they could not do it, just got them all up in arms. So it was really funny.
Now, here's my thought on this. I think that a lot of times we have leadership that's trying to push everybody to do all the things, and I just think that sounds cute in theory, but when we're looking at everything that we have to do, it's just so much and one person can't do it all. So I think that it's fine to have everyone collaborate, but I do think that you should have someone designated to focus on test automation.
And like I said in the piece, the developers, they're focused on their craft. They have this huge task where they need to actually build something, and they're constantly trying to update their skills in their area. I find that it's unfair to ask them to also stay on top of all the latest and greatest things in test automation as well, because that's a whole 'nother niche there. There's design patterns there. There's all of these strategies and models and things, so there's only so much a given person can focus on.
Joel Hooks: Yeah for sure.
Angie Jones: So yeah I think that they should let someone lead it who is focused on this space, but they can always contribute to that based on the guidance they get from that leader.
Joel Hooks: What should developers be doing then to help us build more testable apps that flow better, I think is probably the question.
Angie Jones: Yeah that's a great question. Okay so I think that unit tests are really important. Those are going to be your fastest, most reliable test, so if developers take up the bulk of the unit test, that is extremely helpful. And I also think that it's probably better if a developer tries to write one or two more longer form automation tests so that they can really get a feel of how to make this more testable.
I worked at Twitter up until a couple of months ago, and a lot of the engineers that I worked with there were new grads. And so they were great developers, but their courses didn't teach anything about testability and stuff like that.
And so while they were making a nice app, it's like, "Oh guys, I can't automate against this. Or I could do it, because I'm a magician at this, right, but it's making my job a lot harder." And so it was just something that they didn't know: "Okay, I need to put in locators and hooks and things like this and provide seams and APIs and stuff so that our app is more automatable and more testable."
And just explaining that to them and showing it to them, it's like, "Wow, okay, yeah, sure, I can do that." Or in some cases, "Well, I'm using this JavaScript library, so I'm not even developing this widget. I don't have access to put a locator on it." And so we would just communicate and figure out how we could go about adding what we needed at some point in the application.
Joel Hooks: Yeah. I mean that kind of circles back to the idea of during a sprint or during a development cycle, we're all collaborating, and we're not just waiting till the end and being like, "Hey, here's what I developed. Now good luck testing it."
Angie Jones: Right. Right. Exactly.
Joel Hooks: So we can't test everything, right? We want to have a well-tested app, and there was this Google testing blog post that talked about Testivus, have you ever seen that one? I'll forward it to you and link it in the show notes, but it's just this hilarious parable, if you are into testing jokes, where you have Testivus and he sits on the mountain, and the first developer comes up to him and he's like, "They say that we need this much coverage." And he's like, "Well, just write a test and that will be fine."
The punchline is like, my boss says I need 100 percent coverage, and he was like, "Write as many tests as you want, because that's what you'll do anyway." I told it poorly, but it's just a testing joke, and the idea that I took away is that this concept of complete test coverage is something that you hear about and people talk about, but it's just the wrong idea. So what, in your experience, gives the most bang for the buck as far as test automation goes?
Angie Jones: I think that it boils down to risk, especially when we get into test automation. So test automation is just so expensive to create and to maintain. And a lot of times we start the projects and it's like, "Oh, let's automate all the tests, because we don't have time to do manual regression." And that's the wrong frame of thought. You'll quickly find out that automating all the tests is not getting you what you want. You have all of this noise in your build, and heaven forbid you use them as blockers for continuous integration or something like that, so now your build is failing because of something that is really low in risk and value.
You don't even really care about the test.
So I put together this matrix, and I've talked about it. It's called "Which tests should we automate?" It's out on YouTube and stuff like that, but it's essentially a way that you can score your tests. So here are all the tests that I have, and you can go through this model that looks at risk, value, cost-efficiency, and the historical failures around that feature area, and be able to assign a score to any test. That way you can sort your tests by score and say, "Okay, maybe the top 25 percent is definitely things that we should automate. Maybe the bottom 25 percent, just throw that out. Don't even worry about automating it." And then for everything in the middle you can use your judgment, or whatever contextual metrics you have that say, "Okay, we should or should not automate this."
But the beautiful thing about that is that it's all sorted by score, and so you know what we should automate next, and what's our highest priority? So it's really all based on risk and value.
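Here is a toy sketch of that kind of scoring model. The factors mirror the ones Angie names, but the 1-5 scale and the equal weighting are illustrative assumptions, not her exact formula.

```java
// Rough sketch of scoring and sorting test candidates; the factors and the
// equal weighting are illustrative assumptions.
import java.util.Comparator;
import java.util.List;

record TestCandidate(String name, int risk, int value, int costEfficiency, int historyOfFailures) {
    // Each factor scored, say, 1-5; a higher total is a stronger case for automating.
    int score() {
        return risk + value + costEfficiency + historyOfFailures;
    }
}

public class AutomationBacklog {
    public static void main(String[] args) {
        List<TestCandidate> candidates = List.of(
                new TestCandidate("checkout with saved card", 5, 5, 4, 3),
                new TestCandidate("update profile photo", 2, 2, 3, 1),
                new TestCandidate("search with filters", 4, 4, 3, 4));

        // Sort descending so the highest-priority automation work is at the top.
        candidates.stream()
                .sorted(Comparator.comparingInt(TestCandidate::score).reversed())
                .forEach(c -> System.out.println(c.score() + "  " + c.name()));
    }
}
```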
Joel Hooks: Yeah really quantifying the priorities of what we should focus on.
Angie Jones: Yep. Yep.
Joel Hooks: So one of my questions, and something that comes up, people talk about tests and testing and to me, it directly affects user experience, and I was wondering what your philosophy was on that. Is testing part of the UX of an application?
Angie Jones: Yeah, definitely. So we can have functional apps that are not usable, or the user experience is horrible, and we've all seen cases where that product doesn't succeed because of that. The development was tight, but the user experience just was not. I've just started working at Applitools, which focuses on visual validation. And as I'm learning in this role, I'm seeing a lot of cases where in test automation, you might have automated some tests and functionally, the test passes.
For example, I was in Raleigh last week, and I wanted to make reservations for a large party because I used to live there and I wanted to meet with some friends, so I went on OpenTable to make a reservation for a large party. And you know how they show you the times available, so I went to click on one of the times, and there was a modal that popped up. All I could see was two buttons that said Select. They were poorly aligned and I couldn't see any text that went with those buttons.
So I had to dig in the DOM. I was in Chrome DevTools trying to figure out, what do these buttons correspond to? And eventually I found the labels way off in the upper right corner or whatever. And so I was like, "Oh, this is horrible." But I could totally see where if I automated that test, that test would pass. That would be green in the build because the buttons technically are there. The labels are there, and so everything works, but as a human being, I wasn't able to use that without digging in dev tools, which most people are not going to do.
So the user experience, the usability, failed, and that would have been missed with test automation.
Joel Hooks: Yeah that's interesting too, and it's like the other side of it, right? We can automate all we want, but perfection's probably not going to be something we achieve.
Angie Jones: Yeah.
Joel Hooks: So you've been in this for a long time. You've been doing test automation for 15 years. The web, in the past 15 years, if we go back to the early 2000s compared to now, has changed so much. We've got Vue and Angular and these kind of thicker client-side UIs, and I'm wondering, how has that affected your job as a test automation engineer?
Angie Jones: Yeah, oh my God. Yeah. It definitely has. So before, I called it the simple web, where we had a page for every state and it was much easier to be able to get to what you needed to get to, but now we have single-page applications, and we have all this dynamic content, so it makes it much more interesting in the automation space. There's a lot more coding that's needed. We have to do things like wait intelligently for things to appear and disappear in the DOM, and so there are a lot of new tools coming up that kind of build a lot of that stuff in, so that we don't have to worry about it.
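For example, an explicit wait like the sketch below lets a test wait intelligently for dynamic content instead of sleeping for a fixed time. This assumes Selenium WebDriver for Java, and the locators are hypothetical.

```java
// Minimal sketch of waiting intelligently for dynamic content; locators are hypothetical.
import java.time.Duration;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class DynamicContentExample {
    public static void waitForResults(WebDriver driver) {
        WebDriverWait wait = new WebDriverWait(driver, Duration.ofSeconds(10));

        // Wait for the loading spinner to disappear from view...
        wait.until(ExpectedConditions.invisibilityOfElementLocated(By.cssSelector(".loading-spinner")));

        // ...and for the dynamically rendered results to actually be visible.
        WebElement results =
                wait.until(ExpectedConditions.visibilityOfElementLocated(By.id("search-results")));
        System.out.println("Results rendered: " + results.isDisplayed());
    }
}
```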
Also, we used to preach to dev, "Oh, put IDs on all of your elements so that I can automate against it," and a lot of times I'm seeing cases now where, with these new fancy UIs, as I call them, even having an ID is not enough.
So as an automation engineer, you really have to level up and be able to work with CSS selectors and XPath selectors, and be able to write an XPath query that basically says, "This exists in the DOM, but it's not really visible on the screen," and so how do I craft a selector that says that?
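A small sketch of that kind of check, again assuming Selenium WebDriver for Java and a hypothetical XPath: the element may be present in the DOM, but the test also has to confirm it is actually visible to a user.

```java
// Sketch: an element being present in the DOM is not the same as it being visible.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;

public class VisibilityCheck {
    public static void assertLabelIsVisible(WebDriver driver) {
        // Hypothetical XPath that finds the label even inside a third-party widget.
        WebElement label = driver.findElement(
                By.xpath("//div[contains(@class,'modal')]//label[text()='Select a time']"));

        // Present in the DOM is not enough; the user has to be able to see it.
        if (!label.isDisplayed()) {
            throw new AssertionError("Label exists in the DOM but is not visible on screen");
        }
    }
}
```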
Joel Hooks: Plus not only are you doing that, but you're dealing with an extremely asynchronous UI that can be changing by the second.
Angie Jones: Exactly. So it makes it interesting.
Joel Hooks: Yeah I mean it's job security too. If everything stayed the same, we'd just automate everything and we'd all be out of a job.
Angie Jones: Yeah, but that goes back to what you were saying about how automation engineering is very much software development. Like, think of that challenge, and having to be able to write a test that needs to be repeatable and work across different environments and take all of that into consideration.
Joel Hooks: Mm-hmm (affirmative). That's what I was saying. To automate this stuff, you have to kind of be a cross-systems expert, really, and you have to understand the whole scope of the project, what everybody's doing, and all the systems and how they all work together, the front end and the back end and [inaudible 00:27:54] and thick clients, and the whole thing is a real challenge, really.
Angie Jones: Yep. Yep. Exactly right.
Joel Hooks: So what's the current state of the art in test automation?
Angie Jones: So Selenium WebDriver is like the king of UI automation. There's been a lot of talk lately, though, of Cypress. So Cypress.io, JavaScript. A lot of JavaScript folks are looking into this. Developers seem to actually like it and want to write tests with Cypress, which I find fascinating.
There's also a huge demand for API automation, so getting away from the UI. We have a lot of people that are developing web services and going from monolithic apps to microservices and things like that, so they're looking for people who can automate against that. That's a totally different skill, being able to thoroughly test a web service, so that's something that's on the rise as well.
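As a minimal example of what API-level automation can look like, here is a sketch that exercises a hypothetical reservations endpoint directly with Java's built-in HttpClient, asserting on the contract instead of the UI.

```java
// Small sketch of automating against a web service with no UI involved;
// the endpoint and expected fields are hypothetical.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ReservationApiTest {
    public static void main(String[] args) throws Exception {
        HttpClient client = HttpClient.newHttpClient();
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.test/api/reservations/42"))
                .header("Accept", "application/json")
                .GET()
                .build();

        HttpResponse<String> response = client.send(request, HttpResponse.BodyHandlers.ofString());

        // Assert on the contract: status code and the shape of the payload.
        if (response.statusCode() != 200) {
            throw new AssertionError("Expected 200, got " + response.statusCode());
        }
        if (!response.body().contains("\"partySize\"")) {
            throw new AssertionError("Response missing partySize field");
        }
    }
}
```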
People are also looking at things like visual validation, so like that UX thing I talked about earlier, being able to put eyes on your automation to be able to actually do the UX piece of this. So a lot of interesting things.
There's also a lot of people that are looking into AI and testing and how that might be able to help us in these automation efforts. So some of the challenges are things like the locators changing as the application is continuously being developed upon. Using AI to be able to still find elements even though the locator has changed, or helping with test data strategies and things like this. I'm also looking at AI to be able to identify, "Oh, this is a shopping cart page. I don't care what application this is. I'm able to determine that this is a shopping cart page. So here are some out-of-the-box automated tests that you could run against that."
So it's really- yeah real interesting.
Joel Hooks: Yeah. So I want to close it out, I saw, I think it was a talk that you were giving or are going to give that describes testing as a game and I thought that was really intriguing, and I was wondering if you could kind of expand on that, and tell me what that means.
Angie Jones: Yeah, so I did that as a keynote at Selenium Conference a couple of weeks ago. It was called "Level Up: Playing the Game of Test Automation." I actually studied game design. I developed a web app that's a game, and I talked about how having this game designer mindset helps me to view some of the challenges within test automation and how to navigate those.
So some of the things I talked about are, like, how if I design a game, a game has to have a goal. A lot of times we get into test automation and the first thing someone says is, "Oh, let's build a proof of concept," without even thinking about, "What's the goal of this?" You wouldn't start a game without knowing what the goal and the objective are. If you do, you're just aimlessly wandering around. You don't know which play to make. Same thing in test automation.
And I talked about different rules: how do you define rules? How do you navigate those? Game flow and strategy. And then I talked about Richard Bartle's taxonomy of player types. This was a huge hit at the talk, where I talked about how he defined all the players that come into a game and was able to categorize them into four different categories, and so I talked about how people can identify with these, what they are doing, and what they should do to improve.
So it was a really good talk. Really interesting. People seemed to like it.
Joel Hooks: Yeah people have to check that out. Is it available online?
Angie Jones: Yeah it should be on YouTube by the end of the week.
Joel Hooks: Cool. Well we'll look out for that and we'll add a link to it.
And I had one other question. I was curious about TestBash and the Ministry of Testing. I've seen you've been involved with them for a while, and I was wondering if you could tell us what that's about. I know that there's a conference upcoming, if you wanted to mention that; it might actually be before this comes out, but at least give people some awareness that it exists.
Angie Jones: Yeah so Ministry of Testing is an organization. They throw these conferences, as well as other things, it's pretty much like a community, so they have testing courses and blogs and things like that, but in addition to that, they also do these conferences around the world. And for the very first time this year, it's coming to San Francisco, so I'm helping them organize that and bring that to San Francisco November 8th and 9th.
And I put together the program. It's an amazing program. Being in San Francisco, I wanted some high-tech types of talks, some really interesting things, things that we don't talk about enough in testing. Some cutting-edge stuff, so things like testing serverless cloud apps, A/B testing, testing voice-first apps like Alexa. There's even a developer who's coming to speak about how the testers have gotten into his head so much that when he's writing his code, he can essentially hear the things that they're going to say, and it helps him develop better.
So yeah, it's going to be great. I love this conference. I speak all over the world, at so many conferences, and I have to say that TestBash is my favorite, just because of the community feel there. You just get to meet so many wonderful, supportive people.
Joel Hooks: Yeah. I know Rosie Sherry, and she's always been about the whole idea of community first. I've never worked as a test automation engineer, but just the community around it and the love that that group of folks shares is really amazing and inspiring to me in terms of community development. So it's cool.
Angie Jones: Yeah yeah. She's amazing.
Joel Hooks: Yeah. Well Angie, thank you so much for taking time out of your day to talk to me. I really, really appreciate it. I think that this stuff is important and I think developers should be more aware of what the test team is doing and I think that it shouldn't be developers and testing. I think we should all just be one team and we should build great apps that are well tested to deliver to our users.
So thank you so much.
Angie Jones: Thanks for having me, Joel.
Joel Hooks: Cheers.
Angie Jones: All right. Bye.