Agile Testing Days – Day One

The Agile Testing Days conference is now in full swing here in Potsdam, Germany! Noted below are my summaries and thoughts on the talks from day one of the conference.

Scott Ambler – Disciplined Agile Delivery: The foundation for scaling agile

Agile teams should adapt to their respective context when working on delivering software. For example, a team of 5 people will work very differently from a team of 50. That’s not to say that either team is working the wrong way; it’s simply a case of doing what is best for that specific team.

Similarly, if you take the case of a distributed team spread across several time zones, they will perhaps place more emphasis on electronic information in their sprint tracking system than a co-located team would, sitting next to each other in the office. The truth is that there is no one-size-fits-all approach: if something doesn’t work for your team and your stakeholders, where is the value?

I enjoyed this talk. It was presented in a very honest and open fashion, and the points made throughout were not sugar-coated as I might have expected beforehand. It brought up very interesting statistics on the adoption of software practices; all the surveys cite their sources and can be found here.

Jan Zdunek – Going Agile with Automated GUI Testing: Some personal insights

I was immediately drawn to this session when I first eyed up the conference roster for day one.

I’ve been working on the client side of our products at Caplin for the best part of three years now, so I was very keen to hear what others had to say on this topic, as automated GUI testing is a very big problem to tackle in agile. Needless to say, it was pleasing that a lot of the ‘findings’ didn’t come as a surprise to me.

Don’t test your business logic via the GUI – This should be done in your business logic layer.

Jan suggests writing a dedicated business testing API to act as the layer between your tests and the business logic, one which testers can then use to construct their acceptance tests. This is exactly how we go about testing our blades using Verifier, so it was good to see other teams using this approach.
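As a rough sketch of that layering, here is what an acceptance test written against such a business testing API might look like. All of the names (TradingTestApi, placeOrder and so on) are hypothetical and invented for this example – they come neither from the talk nor from Verifier. The point is simply that the test exercises business behaviour without ever touching the GUI:

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

// Hypothetical test-facing facade: in a real suite its methods would delegate
// to the business logic layer directly, bypassing the GUI entirely.
class TradingTestApi {
    private double lastOrderTotal;

    void placeOrder(String instrument, int quantity, double price) {
        // Stubbed so this sketch is self-contained; a real implementation
        // would call into the application's business layer here.
        lastOrderTotal = quantity * price;
    }

    double lastOrderTotal() {
        return lastOrderTotal;
    }
}

public class PlaceOrderAcceptanceTest {
    @Test
    public void orderTotalIsQuantityTimesPrice() {
        TradingTestApi api = new TradingTestApi();
        api.placeOrder("GBPUSD", 100, 1.25);            // no GUI involved
        assertEquals(125.0, api.lastOrderTotal(), 0.001);
    }
}
```

Tests written at this level stay readable for the whole team and don’t break every time the GUI changes.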
Avoid ‘record and replay’ tools – Using tools such as QTP is not really a solution, not in today’s world.

The tool does not know what you want. Record-and-replay software is also known to have problems with asynchronous applications and is unable to recreate timing issues.
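To make the asynchronicity point concrete, here is a minimal hand-written sketch of how an explicit wait states the condition you actually care about, where a recorded script would simply replay a fixed pause and fall over whenever the timing changes. I’ve used Selenium WebDriver as an assumed example (it wasn’t a tool named in the talk), and the URL and element id are made up:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.openqa.selenium.support.ui.ExpectedConditions;
import org.openqa.selenium.support.ui.WebDriverWait;

public class AsyncGuiCheck {
    public static void main(String[] args) {
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("https://example.com/prices");   // hypothetical page

            // A record-and-replay script would pause for a fixed time here and
            // break whenever the server is slower than it was during recording.
            // An explicit wait expresses the real condition: the price is visible.
            WebDriverWait wait = new WebDriverWait(driver, 10);  // wait up to 10 seconds
            WebElement price = wait.until(
                    ExpectedConditions.visibilityOfElementLocated(By.id("price-gbpusd")));

            System.out.println("Price rendered: " + price.getText());
        } finally {
            driver.quit();
        }
    }
}
```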

Choosing the right tool – The most important thing when choosing a tool to automate your GUI interactions is to research and properly understand its limitations – because you WILL reach them (e.g. you may need it to interact with the operating system).

I found myself more often than not agreeing and nodding my head to a lot of what was said in this talk; it’s always nice to have your approaches reinforced by other people 🙂

Andreas Grabner – Performance land mines and how to address them

Performance is important, we all know this. Let’s take a look at Amazon – they make about $67 million a day on their website. If a page is delayed by 1 second, they stand to lose an extremely large sum of money simply because they failed to meet a user’s performance expectation; to put that in perspective, losing even 1% of that revenue would cost roughly $670,000 a day.

Users don’t like to wait – The perception of ‘slowness’ can be, and often is, a deal-breaker for businesses. If we look at the online retail space, a delay in a page loading may well be the trigger that makes the end user look elsewhere at a rival company. It’s also not uncommon for them to want to share this experience with the rest of the world on their Twitter stream.
This talk covered many ‘real-life’ examples and case studies, which was very insightful. One of the key lessons to take away was the importance of always testing your production environment, both internally and externally.
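In that spirit, a check against production doesn’t have to be heavyweight. Below is a minimal sketch of an external response-time probe in plain Java; the URL and the one-second threshold are assumptions made up for the example, not figures from the talk:

```java
import java.net.HttpURLConnection;
import java.net.URL;

public class ResponseTimeCheck {
    public static void main(String[] args) throws Exception {
        URL url = new URL("https://www.example.com/");   // hypothetical production endpoint

        long start = System.nanoTime();
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        int status = conn.getResponseCode();              // forces the request to complete
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        conn.disconnect();

        System.out.println("HTTP " + status + " in " + elapsedMs + " ms");
        if (elapsedMs > 1000) {
            System.err.println("Warning: response took longer than one second");
        }
    }
}
```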

If you’re going to be testing performance, or have been bitten by performance issues in the past, I would definitely recommend checking out the DynaTrace blog.

They also have a free web book on Java Enterprise Performance which may be of interest to the Java developers out there; it can be found at book.dynatrace.com.

This was definitely one of my favourite talks of the day.

Janet Gregory and Lisa Crispin – Debunking Agile Testing Myths

Janet and Lisa presented a very well received talk on a lot of the myths floating around about agile testing.

Going along with the theme of myths, the presentation was filled with pictures of unicorns, werewolves, a video of Frankenstein coming to life, and I think there may even have been a picture of a donkey. Without going into detail on all the myths that were covered, here are a couple of the more interesting ones:

Myth – “Test is dead”

Alberto Savoia caused quite a stir at the Google Test Automation Conference (GTAC) last year by claiming that “test is dead”. Janet and Lisa were quick to stress that this must be taken in the context of the point being made. Of course you still need to test your software, but Alberto argues that when, where and by whom it is done should be driven by “doing the right ‘it’” as opposed to “doing it right”.
For more information, check out this video from the GTAC 2011 keynote.

Myth – “Testers must be able to write code”

It’s true that if you look at a lot of the job postings for tester roles these days, there is a very widespread demand for testers with development skills. Even at Caplin, if I look around at our QA team, all of us are able to write code – but that’s not to say there aren’t other qualities which contribute to testing on agile teams. Exploratory testing is just as (if not more) important than automation.

Testers don’t necessarily need to be able to write code. They do, however, need technical awareness: the ability to communicate with the developers to create a more comfortable working partnership, and the ability to look at some code and read it to see what it’s trying to do.

Myth – “Agile = Speed”

The reality is that new teams who attempt to adopt an agile way of working go slower – A LOT slower. In this transition, the teams need time to refactor their existing code and also be given the buffer to avoid additional technical debt building up. The challenge is that this can be a hard sell: the development team has to go to the business (or customer) and convince them of the long-term benefits of avoiding technical debt when, as far as they are concerned, things seem to be ‘going OK’.

The focus on quality is with a view to having speed further down the road: “speed is the by-product, not the goal”.

Andreas Grabner – Product demonstration of Compuware APM (application performance management) in action

This session provided an in-depth overview of ‘Compuware APM’, which boasts some very powerful performance diagnostic and monitoring features. In a nutshell, you embed “agents” into your application and, with the software installed on your servers, these agents are then able to log and monitor stack calls all the way through your entire end-to-end system. The user is provided with a customisable dashboard which allows you to really drill down into the details of your full-stack performance.

Whilst there are other free tools available to help with this type of monitoring, I have to say I was very impressed by the level of polish and the extensive set of features in the application. I may have to see if I can grab myself a trial copy to try with our products here at Caplin.

Alexandra Schladebeck – Why World of Warcraft is like being on an Agile team

Being a bit of a gamer myself, my curiosity got the better of me on this one and I felt I had to attend this talk. Alex took the inner workings of the very popular MMORPG – the in-game concepts of skills, races, quests and raids – and related them back to the different roles in an agile team.

The question posed to the attendees towards the end of the talk was: what role does a tester play in an agile team – a healer, a damage-dealer or a tank?

I’m sure you’d get plenty of interesting answers if you went around and asked different members of the team! I found myself in agreement with Alex in that I see testers as the healers in the team. We get our tests running from the start, apply ‘healing’ over the duration of the sprint, and help keep the rest of the team productive towards its goals.

Lasse Koskela – Self Coaching

This talk was very different from all the others on the day. It was focused on self-coaching and self-improvement through a better understanding of how and why we react to things the way that we do. For example, how we may be less willing to do chores while our favourite football team is playing on TV, and the impact this has on our relationships with those around us when we go against what we know to be ‘the right thing to do’.

The role of a coach is to ‘help clarify other people’s thinking so that they are then better equipped to make better decisions with a broader mindset’. This can be relatively straightforward when helping another individual, but when trying to apply the same coaching to yourself, it can be a lot more difficult.

Some are better at it than others, but the point put across was that if we want to be able to coach ourselves to achieve a desired behaviour or outward attitude, we need to be more mindful and conscious of triggers or signals which give us a ‘warped’ view of the world.

The trick is to be able to ‘catch yourself’ as you find yourself slipping into an undesired way of thinking or a negative attitude; having a greater sense of self-awareness should allow you to make a calmer, more level-headed decision. This talk may well merit a blog post of its own as it’s a very interesting topic.
What I will say is that this was a very eye-opening talk and was hugely well received by all the attendees I spoke with afterwards.
