Wednesday, 4 February 2015

The Evolution of Tools

In my last post I mentioned some exciting conversations in the pipeline with the testing team of our new owners. I had one such conversation last week, in a meeting where some of the team demonstrated a range of testing tools that they had developed in-house and that were available for us to use.

Over the course of a one-hour web meeting, two very pleasant chaps from San Diego talked us through three of the tools that they had developed in-house for database testing tasks. Particularly interesting for me was the fact that, for a couple of these tools, we had equivalent in-house developed tools, and I was keen to look at the similarities and differences between their tools and our equivalents.

  • SQL Generation

    I spent a chunk of my time last year looking into random SQL data generation. As a good excuse to teach myself Ruby at the same time, I developed a prototype SQL generation utility that could produce a range of SQL syntaxes based on a model defined in input files. I was therefore particularly interested in looking at the SQL generation tools that the team had developed. As the San Diego team talked through their utility it was clear that, predictably, their tool was at a much higher level of maturity than mine. At the same time, I started to notice some very familiar structures in the configuration files. The way that functions were defined, the parameterisation of specific variables and the definition of data schemas all bore a close resemblance to those in the prototype that I had worked on. I recognised structures which reflected what I had done myself to solve the problems that I'd faced in attempting to generate correct, relevant SQL queries.

  • Test harness

    As we moved on to discussing a more generic test harness, the parallels to our own testing harness were even more apparent. The way that workflows, parallel scheduling and multi-machine tasks were defined all had direct parallels in our in-house harnesses. There were some differences in the mechanisms used, however the underlying structural elements were all present and the nature of the solutions overlapped extensively. The location and means of definition of test data, metadata and scheduling mechanisms across servers were similar solutions to challenges that we'd tackled in our own harness as it evolved in line with our product over the past few years. I found myself standing at the screen, pointing out to my colleagues the elements of the harness structure which could be mapped directly to our own.
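To give a flavour of the model-driven approach described above, here is a minimal Ruby sketch of random SQL generation from a schema model. Everything in it is illustrative: the schema, the generator names and the query shape are my own invented stand-ins for the input files and configuration structures the post refers to, not the actual tools.

```ruby
# Illustrative sketch only - a toy version of model-driven random SQL generation.
# The SCHEMA hash stands in for the schema defined in the tool's input files.
SCHEMA = {
  "orders"    => { "id" => :integer, "total" => :decimal, "placed_on" => :date },
  "customers" => { "id" => :integer, "name" => :string }
}

# Parameterised value generators, one per column type, so generated
# predicates always compare a column against a value of the right kind.
VALUE_GENERATORS = {
  integer: ->(rng) { rng.rand(1..1000).to_s },
  decimal: ->(rng) { format("%.2f", rng.rand * 500) },
  date:    ->(rng) { "'2015-01-#{format('%02d', rng.rand(1..28))}'" },
  string:  ->(rng) { "'cust_#{rng.rand(1..99)}'" }
}

# Build a syntactically valid SELECT with a random projection and predicate,
# drawing only on tables and columns that exist in the model - this is what
# keeps the generated SQL "correct and relevant" rather than random noise.
def random_select(schema, rng: Random.new)
  table, columns = schema.to_a.sample(random: rng)
  projection = columns.keys.sample(rng.rand(1..columns.size), random: rng)
  pred_col, pred_type = columns.to_a.sample(random: rng)
  value = VALUE_GENERATORS.fetch(pred_type).call(rng)
  "SELECT #{projection.join(', ')} FROM #{table} WHERE #{pred_col} = #{value};"
end

puts random_select(SCHEMA, rng: Random.new(42))
```

Seeding the random number generator, as in the last line, is a common touch in tools like this: it makes any failing generated query reproducible from its seed.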

Convergent evolution

My conference talk at EuroSTAR 2011 is testament to the fact that I am fond of applying evolutionary principles to the development of tools and processes. The fact that these very similar tools had been developed by independent teams in separate organisations reminded me of the idea of homoplasy - a process sometimes known as convergent evolution, in which separate species evolve highly similar characteristics not present in a common ancestor, to fill the same ecological roles.

For example, many Australian marsupials bear a remarkable resemblance to placental mammals from other parts of the world, despite having evolved those characteristics completely independently.

There are many other examples, such as the similarities between cacti and euphorbias, or between mantidflies and mantises, whereby evolution arrives at the same or similar solutions to environmental problems.

Much as with evolution in nature, tools and processes evolve based on needs that arise within organisations. Working in similar product markets, it is understandable that our tools have developed independently to meet the same needs. What struck me when comparing my tools with those being demonstrated was the extent of the similarities - there were many familiar structures that had clearly been implemented to address the same challenges, within the context of a specific testing problem, that I or my team had previously encountered.

Time wasted?

The examples here are representative of what I have seen elsewhere - I imagine that it must be a common phenomenon. The 'convergent evolution' of tools must be going on throughout development organisations around the world. Taking SQL generation as an example, through my research on the subject I know that the teams behind SQL Server, DB2 and MySQL have all developed similar utilities.

One obvious question that arises here is 'is this time wasted?'. If parallel efforts are going towards creating equivalent tools then would it not make sense to collaborate on these? Surely it would be more efficient if these teams combined their resources to create common tools rather than duplicating effort? Or perhaps that's a rather naive perspective.

  • Tools are valuable assets. Microsoft, at least, have never released their RAGS tool outside of their organisation. I suppose that the theory here is that releasing the tool would provide an advantage to others developing competitive products who haven't invested the time in developing them. A good tool can provide a competitive edge in just the same way as a well designed feature set or a talented team. For an open standard such as SQL, Microsoft have little incentive to release tools that allow others to develop competitive products. By way of an interesting comparison, for Microsoft's own standard of ODBC they have made tools freely available - perhaps the improved adoption of their own standard provides sufficient incentive to give tools away in this situation.

  • Joint ventures are prone to failure. As part of my higher education I studied joint ventures between organisations, with a focus on examining the high failure rate of this kind of collaborative project between companies. Based on the examples we studied, such ventures rarely deliver to the satisfaction of both contributors. Reasons for failure include an inability to relinquish full control of the project, and an imbalance between the level of input and the benefit gained by the contributing parties. Given the relatively small and well encapsulated nature of tool development, it makes more sense to take this on in-house to maintain ownership and avoid the pitfalls of joint development.

  • Ownership and control are powerful motivators. Another option for avoiding parallel efforts on tools is for an organisation to develop and maintain a tool commercially for others to licence and use. Whilst SQL generation may not have sufficient commercial application to be attractive here, I've seen many scenarios where teams develop their own tools even when there are good commercial alternatives. The incentive behind such an approach can be that only a simpler toolset is required, so the cost of expensive commercial options is not justified. Another likely reason is that having control of your tools is a compelling advantage - it certainly is for me. One of the main reasons for developing our own tools has been the knowledge that we can quickly add new features or interfaces as the need arises, rather than waiting for a commercial supplier with conflicting priorities.

A positive outcome

I came away from the session feeling rather buoyant. On one hand, I was pleased that in-house tools development received such a strong focus in my new organisation. The team that we spoke to were clearly very proud of the tools that they had developed, and rightly so. A strong culture of developing the appropriate tools to assist your testing, rather than trying to shoe-horn your testing to fit standard testing tools, is an approach that I have believed in for some time. Whilst the approaches taken by the two teams were somewhat different, what was clear was that we shared a common belief that the ability and flexibility to develop your own tools is a powerful testing asset.

More importantly, I felt proud of what my team and I had achieved with our tools. With a fraction of the manpower of our new owners, we had evolved tools which compared favourably with those of a much larger and better resourced organisation. Whilst there was clearly some overlap, such that moving to some of the new tools would make sense over continued development of our own, in some cases the strength of our own tools meant that this was not a foregone conclusion. As I wrote about in my post on the Facebook Effect, we're never quite sure how our work stands up in the market, given the positive angle that most folks come from when discussing their work in social media. Given this chance of a more honest and open comparison, I was justifiably pleased with how well the work of my team compared.


1 comment:

  1. Good post! It is interesting how different companies and/or locations, or same company but different groups or locations, do tend to come up with similar solutions/tools to the same problem. I've always believed in looking around to see what other people have done. This helps me find something similar and adapt it to my needs and vice versa.

    The mentality of "we'll do it all ourselves" can be a costly one. I tend to steal... er, I mean borrow from other people in order to help me get my job done. Why re-invent the wheel or try to build a better mouse trap when one may already exist?

    Jim Hazen


Thanks for taking the time to read this post, I appreciate any comments that you may have:-