This time I knew it would be great since Gojko Adzic was the teacher and the subject was something that I find really interesting and useful - Specification by example.
It would be completely impossible to write down even a fraction of all the stuff I picked up during the two days, but I thought I'd stop at the main points and write down what stood out for me.
As I've been following the BDD and Specification by example community intensively for two years, I had quite a lot of knowledge beforehand. The main things I wanted to get out of this were practical tips on how to facilitate and handle the early phases in the lifetime of a specification done in this manner. Being a developer, I have focused mostly on the later parts up to now.
Here are the main topics we discussed (not necessarily in this order, as far as I remember):
- Communicating with examples
- Key process patterns of successful teams
- What makes a good specification
- Fitting into a development process
- Adoption strategies and patterns
Communicating with examples
We started this part with a simulation (a Blackjack game) in which we tried out what happens when we do "traditional" testing in a short development cycle. And our group fell into all the traps that I've been preaching against for so long: developers and testers stopped communicating, we didn't involve the customer, and we focused on the implementation rather than on business value. We even got the acceptance test cases but returned them because we thought they had been sent to us by mistake. No group asked the customer (Gojko) what he was expecting.
From there we looked at how hard it is to understand each other and how the telephone game plays out in a software process. Even for something as simple as how many points a given card symbol is worth, there were a lot of disagreements.
We saw how using concrete examples could clarify that a great deal. And when we went back to our simulation and applied examples to the requirements we'd been given, we actually found a whole lot of inconsistencies in them. It was really powerful to discover business concepts, such as BlackJack and Busted, through the examples.
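As a sketch of my own (not from the course material), here is roughly how a couple of those Blackjack examples might look written down in Gherkin-style notation:

```gherkin
Feature: Scoring a Blackjack hand

  # The business concepts "Blackjack" and "Busted" emerge
  # directly from concrete examples like these.
  Scenario Outline: Scoring the player's hand
    Given the player is dealt <cards>
    Then the hand is worth <points> points and is <status>

    Examples:
      | cards     | points | status    |
      | Ace, King | 21     | Blackjack |
      | 10, 9     | 19     | valid     |
      | 10, 9, 5  | 24     | Busted    |
```

The point is not the notation but that each disputed rule becomes a concrete, checkable row.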
Two kinds of exercises for creating examples as a group stood out to me:
- Diverge and merge - where you intentionally let several groups work on the same example for a while and then merge the groups to compare and learn from each other.
- A feedback exercise that resembled planning poker a bit: write down a case, then have each person write the expected outcome. Compare and learn.
The main thing you want here is probably not the examples themselves but rather the learning and the shared understanding of the problem at hand. The examples are a great way to get there, and also a great way to document that understanding.
Key process patterns of successful teams
This was the part I knew the most about going into the course, since I've read the book and done presentations on these patterns. I still learned things, of course:
- All the teams that were successful in implementing Specification by example specified collaboratively, and did so using examples
- Key examples are not all examples. It's probably 10-15, not 500
- Put up a couple of examples and try to break them (drill a hole in them) to see if all the relevant key examples have been found
We then talked about some common collaboration patterns and when they might be applicable:
- All-team workshops - when stakeholders are available, you need loads of knowledge transfer (you don't know much about the domain), or you want to explore new ideas. Pretty expensive though.
- Three Amigos - get a business analyst, a developer and a tester together and do a mini-workshop. Funnily enough, I drew this a while back without knowing it had a name. This is useful when you have a mature product that your team knows a bit about.
- Ad-hoc conversations, where you simply walk over and ask the people involved. This requires people being nearby and you knowing a lot about the product.
- Write + Review - you write a specification and then have somebody else review it. Good if you are distributed and have a hard time getting hold of the people needed to answer questions.
What makes a good specification
I loved this part and learned loads from it. The main activity was dissecting and discussing a big batch of example specifications (a great idea - using examples to understand how to write examples). Together we came up with a big list of what was good or bad about them.
Here are the good parts, as we finally summarized them:
- A descriptive title (what you would Google for to find this document)
- Has a context under which the example executes
- No technical details such as database ids or web page class names
- Short - in fact, most of the good ones were considerably shorter than the bad ones
- Precise - talks only about the example at hand
- A structure in which the test is separated from the examples (a Scenario Outline in Cucumber features, for example - see the sketch after this list)
- Boundaries included - probe with different values to find the boundaries of the example
- More than 1 example (see above)
- Uses the business language - the domain language
- Has the right abstraction level. This is a hard one, but you've reached it when you cannot remove anything without destroying the specification. Another way is to ask someone to summarize what you have written down; if they can, and the result is still understandable - use that summary.
- Clear and measurable expectations
- No new concepts introduced simply because we're going to test this with a tool
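To make the list concrete, here is a small sketch of my own (the feature and the values are invented, not from the course) that tries to tick most of those boxes:

```gherkin
Feature: Volume discount
  Regular customers get a discount once an order is large enough.

  Scenario Outline: Applying the volume discount
    Given a regular customer
    When she places an order worth <order total>
    Then a discount of <discount> is applied

    # 99.99 vs 100.00 marks the boundary of the business rule.
    Examples:
      | order total | discount |
      | 99.99       | 0%       |
      | 100.00      | 5%       |
      | 500.00      | 10%      |
```

Descriptive title, a one-line context, business language, boundaries and more than one example - and nothing about databases or web page elements.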
Other things that came up in Gojko's presentation were:
- Don't write workflow scripts - write WHAT should be tested, not HOW it should be tested. 90% of the teams failing with BDD do this. 90%!!! (See the sketch after this list.)
- If you find yourself writing about technical concepts - try to rephrase them in terms of what the technical thing does
- Try to find breaking examples to find the boundaries
- Show the example to someone else and see if they understand it (given domain knowledge). If so - then you have the right level of self-explanation.
- Write the description of the test as a description of how to read the examples.
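Here is a sketch (my own, not Gojko's) of the WHAT-versus-HOW difference: the first version scripts how to exercise the system, the second states the behavior we actually care about.

```gherkin
# HOW - a workflow script, coupled to the user interface
Scenario: Log in through the web form
  Given I open the "/login" page
  When I fill in "username" with "bob"
  And I fill in "password" with "secret"
  And I click the "Sign in" button
  Then I should see the text "Welcome, Bob"

# WHAT - the behavior, free of interface details
Scenario: A registered user gets a personalized start page
  Given Bob is a registered user
  When Bob logs in with valid credentials
  Then Bob sees his personalized start page
```

The first version breaks as soon as the form changes; the second survives any user interface you put in front of the rule.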
Fitting into a development process
This was basically a number of case studies showing how different teams had fitted the ideas of Specification by example into their process. It was quite interesting, as it ranged from fully fledged agile teams to very rigid waterfall-type processes.
A few tips I picked up:
- Set aside time before the planning (be it sprint planning or whatever) to prepare some key examples. That will make the actual planning much smoother, instead of halting on the first question.
- The more stuff that is unknown the further ahead the initial team should work. Work a sprint ahead if needed.
- Get just bullet points with acceptance criteria from the BAs if they are very busy
- Don't try to write out full specifications at workshops - who writes the specification is not important.
- Emphasize collaboration and shared understanding
- Define tests as early as possible
- Make sure that you get a mindset of collective ownership for the specification
Adoption strategies and patterns
This was something that our group kept coming back to. How do you introduce these concepts? How do you sell them to the team or the stakeholders?
Firstly, you can say that this is a general problem that applies to any change. For that I would recommend the book Switch - How to Change Things When Change Is Hard, which covers a lot of strategies for change management. A quote that stuck with me: "People don't resist change. People resist being changed." If you get people to think that it's their idea, you have gained a lot.
Gojko had three points:
- Change the team culture to a culture of collaboration. Collaborate on specifications and tests, which will build trust among team members. Focus on delivering business features rather than functionality, which will build trust with stakeholders
- Remove waste from the process - make things precise early to establish a clear definition of done. Validate frequently and strive for a single source of truth - the examples. This will make other things you do unnecessary (such as writing separate documents for specifications and tests)
- Facilitate change - make sure that the examples become the main source of information (send links to them to answer questions), and document business processes (do not "write tests")
This section also contained a lot of other material, but it came out of discussion and is hard to write down here.
Conclusion
This post was my brain dump. I don't expect anyone to get it if you weren't there. If you still did, I'm very happy. If you didn't - please ask a question below and I'll answer to the best of my knowledge.
Again - thank you Gojko for a great course. I learned loads from you. Again.
6 comments:
Nice write-up! I wonder if you could explain a little more on the concepts of 'finding boundaries' and 'breaking tests' (a.k.a. 'finding holes in them'). Best greetings! :)
Hi Pawel -
When you write a scenario or example, you want more than one example in order to show the different values and their expected results.
But when should you stop? At 3? 4? 1000? 1001?
That is a very hard question to answer in general terms as "it depends".
You should probably strive to have as few examples as possible and have them represent the different values that are interesting for the feature under test.
So if the example is about people getting free delivery when they buy 5 books, then examples with 4 and 5 books might be a good start. That shows that you don't get free delivery for 4 books but do get it with 5. That's the boundary.
But what more can we do? Can we "drill a hole" in the examples, break the knowledge they represent?
Yes, maybe... What about something other than books? Let's write an example with 1 CD. Should the customer get free delivery then? "No, it's just for books" - or maybe "Yes, it's for CDs and books".
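Written out as an example table (assuming the "just for books" answer), that might look like this:

```gherkin
Scenario Outline: Free delivery
  Given a customer has <books> books and <cds> CDs in the basket
  Then delivery is <delivery>

  # 4 vs 5 books shows the boundary; the lone CD is the
  # "drill a hole" example that probes the scope of the rule.
  Examples:
    | books | cds | delivery |
    | 4     | 0   | paid     |
    | 5     | 0   | free     |
    | 0     | 1   | paid     |
```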
By using this technique we have now strengthened the examples and extended our knowledge.
I hope this clarifies things a bit.
Yes, absolutely! Thanks for the clarification :)
Hi Marcus,
thanks very much for sharing your impressions from the workshop and all the advice you put together in your post. It was very inspiring to our team (teamaton.com) and we had a great spec-writing session the very day after reading your post :-)
Today, though, a tough question came up that my colleague and I were not able to answer for ourselves, even after almost an hour of lively discussion.
Our problem is with the boundaries and responsibilities of a given feature and its scenarios. My colleague argued that writing a specification for a given feature should imply that there is no more behavior than what is specified in the scenarios for that feature, and that the tests should guarantee that. I argued that a set of scenarios cannot possibly guarantee anything outside of its own scope - a scenario can't even say anything about any other scenario in the same feature. For example, if we test that the product page of our portal contains certain elements, let's say a header and a picture, then in my opinion that does not imply that there can't be other elements as well, e.g. a picture subtitle, unless I explicitly specify so. My colleague argues that this should be implicit, since otherwise we could produce a situation where all our scenarios succeed but we see something on the product page that was never meant to be there. This in turn would mean that we cannot rely 100% on our specifications to tell us whether the current development state is ready to deploy. How do you approach this kind of problem?
We would really appreciate your opinion on this hot topic!
Cheers, Oliver
Hello Oliver,
brrr - being referred to as "the one who shall settle our argument" feels scary. I'll give you my point of view and you two will have to fight it out. :)
I think you're right in that a scenario can only "guarantee" what is within its own context.
One can also wonder if it's feasible to write scenarios for every thinkable possibility...
I like Gojko's notion of "key examples", which I interpret as "enough to understand the key business rules". So the specifications you write on this level should not (in my opinion) test every thinkable permutation, but rather be examples that are representative of the business logic and help you understand how it behaves.
You could of course write tests for all possible permutations (or maybe not all, but a lot of them :)), but I see no reason to automate those end-to-end. Most likely those permutations will be faster and easier to cover with mock objects, setting up a known state and testing against that.
Examples of this are how a page is rendered, or advanced business logic and calculations, etc.
So, in my careful nature (:)), you are both right. But I think it's better to focus the scenarios on the key examples you need to understand the business rules.
Well, that's my view on the matter.