Friday, December 3, 2010

From the dark ages, to test automation

Some days inspiration comes so thick that all your ideas begin to stick together like pancakes with butter, maple syrup, and strawberries.  What follows is a second post, one that began inside my earlier posting: Where in the world are all the software testers?  So if some ideas from that post are also found mixed in here, that is why, but I did my best to refactor the 2,500-some-odd words of the initial post to create what I feel are two very distinct posts.  While the first one is my thinking about Alan Page's blog on forward thinking, this one is a bit of an experience report, albeit a bit of a history about my first experience in software, and a bit about some of my more recent challenges.  I describe the trial of working without a source code repository, and even describe some of my frustrations at having to build using clearly outdated technology.  Sometimes it is hard to realize how valuable any given tool is unless you have had to work on a project without that tool.  This is where I began, and to forget how I first set foot in the industry would be to forget how far I've come, how much I've learned, and how much more there is to learn and do...

When I first started in software, testing was seen as something I did when time and necessity called for it.  Though I was initially hired as an integration developer, focused on piecing module code from one place into another in the stable product line, testing seemed as if it was just part of the job, but not necessarily the most important part.  Maybe it's the way software engineering is glamorized in the courses I took at WVU, I'm really not sure; nevertheless, as I began to build understanding of the product, its structure, and its purpose, it was not long before I found myself in a "Test First, Test Often" mindset.

The reality of my first professional job seems almost barbaric compared to recent projects I have worked on.  There was no source code repository, no Visual SourceSafe, Subversion, or CVS in general use, which meant that new modules were always developed against what was deemed the last stable development build.  Many times the build the features were added to comprised a number of new features and bug fixes, so porting that code had to be done laboriously by hand.  If I had not been so green as a developer at the time, and had seen the value of source code repositories before my professional time, maybe we could have changed how things worked.

In any case, it was my job to kick the tires on these features by porting them into the latest and greatest build, and then seeing what the new additions actually did.  Suffice it to say, I highly recommend a source code repository for projects, especially when you have people working remotely or at different sites.  However, this was the situation I found myself in as a young software practitioner.  Finding all the right bits of code was not easy at first.  Sometimes I would get complete source and have to scour every class for changes (and really, with little background on the project, it is quite possible some of those changes had been made not for that module, but in earlier bug fixes in the new build).  So I began pushing back to have some way of seeing and marking what areas the developers had changed, so I could focus there and not on other parts of the code.  The wrap-around comment with tags for each module became the first thing I searched for when integrating.
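To give a flavor of what those wrap-around tags bought us: once every change was fenced by a begin/end comment naming the module, finding the regions to port became a mechanical search instead of a class-by-class hunt.  Here is a minimal sketch of that idea in Python; the tag format (BEGIN/END plus a module name) is purely illustrative, not the exact convention we used.

```python
import re

# Hypothetical wrap-around tag format -- the markers here are illustrative
# only; the real project's convention differed in detail.
BEGIN_TAG = re.compile(r"//\s*BEGIN\s+(\S+)")
END_TAG = re.compile(r"//\s*END\s+(\S+)")

def find_tagged_regions(source, module):
    """Return (start_line, end_line) pairs for blocks tagged with `module`."""
    regions, start = [], None
    for lineno, line in enumerate(source.splitlines(), 1):
        begin = BEGIN_TAG.search(line)
        if begin and begin.group(1) == module:
            start = lineno
            continue
        end = END_TAG.search(line)
        if end and end.group(1) == module and start is not None:
            regions.append((start, lineno))
            start = None
    return regions

sample = """\
class Foo {
    // BEGIN MOD-42
    void NewFeature() { }
    // END MOD-42
    void Untouched() { }
}
"""
print(find_tagged_regions(sample, "MOD-42"))  # -> [(2, 4)]
```

Even without a tool like this, simply grepping for the module tag gave us the same benefit: the integrator's attention went straight to the changed regions instead of everywhere at once.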

That's how the process was initially described to me, so that's how I went about my tasks early in that assignment.  This turned out to be a harder, more time-consuming method, though.  With developers deployed overseas, sometimes on different operating system setups, even code that worked on their systems did not always work correctly on ours.  Were these regression errors that crept in as features kept being added?  Or were they simply the result of the different platforms we were using?  To this day I am still not certain, but the obvious solution came when we began requiring a demonstration build of each feature.  First, it would let us see if the build actually worked on our production systems, which were perhaps closer to what our customers actually used, and second, it would let us see whether the feature worked as expected before putting in the time-consuming 'monkey work' of copying code from module A to class B.  So I found myself doing about 50% more testing than actual clerical/coding work on our project, and found a lot of bugs in the process.  Suffice it to say, I look back on those first steps as very raw, hard ones in my professional development, and am thankful that I have learned and moved beyond such outdated techniques.

It has been just over five years that I have been on this latest assignment, and one area I have found myself thrust into has been testing.  What others on our team realized, though I couldn't see it through my eagerness to help out the team in any way, was that I was actually a pretty good tester.  I'm not sure where this skill, or perceived confidence in me as a tester, came from, but as a colleague said to me just this past week, "He's a testing wiz."  I smiled and accepted the compliment graciously, but deep down, while others think I have it all figured out, I know that I have a long way to go before I consider myself any kind of expert at testing or software development.  Truthfully, I wonder if it is even possible to be an expert in anything software related, given how fast the state of the art moves in some areas.

So what does this have to do with forward thinking?  Before the last year I had not seen myself as a tester, just a software developer, eager to do whatever the team required to succeed on our projects.  I stepped up to an offer of a different role as a developer of test automation, and a tester in general, and I embraced the chance to learn and do something new.  I did not enter this assignment knowing how long I would be kept on the project, and I entered it having done perhaps more coding work than actual testing on recent assignments, but I didn't let that stop me.  Having recently learned from Spencer Johnson's classic book "Who Moved My Cheese?" to not only anticipate change but to embrace and relish it, I plunged into what initially seemed like a rather dull, not to mention highly time-consuming, engagement.  I could see why they needed someone to focus on test automation; it frankly took too much time to have the regular testers doing it.

So I embraced the change, the challenge, and at times the boredom of test automation, and learned how to string together tests in the version of the test tool we used.  While I looked for and anticipated change, I had set my mind to get 'comfortable' performing these tasks the way they were demonstrated to me.  I didn't know at the time that anyone else had tried and given up on the effort in the past, nor did I have any prior experience or research to back me up in finding a better way, but it was a start.

As I pushed forward through these challenges, I took time when I could to research the unique problems I encountered along the way, and at some point I stumbled onto a blog.  I don't recall which blog it was that I first read, but I began reading it and using its challenges to learn more about testing in general.  Then one blog linked to another, Google Reader began suggesting similar blogs, and suddenly a new world of testing was open to me.  As I read and absorbed new ideas and techniques, they transformed my understanding and view of the field.

Now I am engaged on a new project, one where I am not constrained to a particular tool set, but rather constrained only to doing the best job I can in testing the application to find bugs, to learn, and to apply newfound technology as never before.  Some of my tasks have afforded me the chance and the need to develop a test harness utilizing both C# and Selenium.  Perhaps later I shall try to share some examples of the generic process I went through to create and build a test harness for the automation tasks that make sense for this project.  For now, it feels good to no longer be back in the dark ages, but to be in the present, ever-living moment, applying new ideas as I encounter them, building both my knowledge and my wisdom about when and where to use certain testing ideas.
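To hint at the shape of that generic process without getting into the real C#/Selenium code: the skeleton of any harness I build tends to be the same three parts, a way for tests to register themselves, a runner that executes each one in isolation, and a simple report of passes and failures.  Here is a minimal sketch of that structure in Python; all the names are illustrative, and the "test" stands in for what would really be Selenium driving a browser.

```python
# Minimal harness skeleton: register tests, run them, collect results.
# This is an illustrative sketch, not the project's actual harness.

registry = []

def harness_test(fn):
    """Decorator: register a function with the harness."""
    registry.append(fn)
    return fn

def run_all():
    """Run every registered test, trapping failures into a report."""
    results = {}
    for fn in registry:
        try:
            fn()
            results[fn.__name__] = "pass"
        except AssertionError as e:
            results[fn.__name__] = f"fail: {e}"
    return results

@harness_test
def test_login_page_title():
    # In the real harness this assertion would come from Selenium
    # (e.g. checking the browser's page title); here it's a stand-in.
    title = "Login"
    assert title == "Login", f"unexpected title {title!r}"

print(run_all())  # -> {'test_login_page_title': 'pass'}
```

The point of the structure is that the browser-driving details stay inside the individual tests, so the runner and report never have to change as the suite grows.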

Where in the world are all the Software Testers?

One of the blogs I read on a regular basis is Alan Page's Tooth of the Weasel Blog.  Recently I have been particularly engaged in thought on the question of forward thinking as it relates to software testing.  There have been discussions on Twitter recently about just these sorts of things, and as Alan wonders on his blog entry Careers In Test:  "What are the new ideas in testing? What is our role in the future of quality software? How do we advance the state of the art in testing?"

These are interesting questions, and initially a lot of this discussion focused on concepts other authors have been writing about for years.  The problem seems to be that in many places testing has the appearance that it hasn't changed, that it isn't any different than it was twenty years ago.  Now, I have only been in the technical field of software professionally for about eight or nine years, so I do not have much first-hand experience to go on about what the common practices in testing were fifteen or twenty years ago.  However, I have been able to get a peek back by reading every morsel I could get my mind's fork into, gobbling up old software test articles and books that give some shading and hue to the background of the field I now find myself thoroughly engaged in.

Once upon a time, testing was just one sliver of what I did in my role on a software project, but now testing is very much a key part of everything I do, for it is by testing that we learn about the projects that we build.  Which brings me back to Alan Page's recent entry.  Where are the forward thinkers?  I don't know if I qualify as a forward thinker just yet, but I am certainly more aware and active in my role in testing than I was before.  Alan raises an interesting question: why don't we see more forward thinkers?  I personally believe they are out there, though where I could not guess, and there certainly are some forward-thinking minds that are now practically famous in the testing community, too many for me to list by name.  However, it got me thinking about why things may have been different twenty years ago, and one word came into my head: the net.

Twenty years ago even dial-up service was hard to find in some areas, and the so-called "Information Superhighway" had not yet materialized into the tangible, valuable asset it is today.  When I think about testing, and the norms of any workplace, I can't help but picture that many are just working in their current area to earn a buck, to put food on the table, to continue to exist with some level of lifestyle familiar to them.

I find that many people are developers or testers by day, but by night they put off that cloak and become something else: a husband, a wife, a father, a mother, a friend, a sportsman, a couch potato, a bachelor, whatever.  Many of these people are very bright in technical areas, and people I hold a great deal of respect for.  I imagine there are many a Joe or Melissa, average testers out there striving to make their little corner of the software world cleaner, like a broom trying to get the last few kernels out of a corner.  So I found myself this morning commenting on Alan's blog, and found inspiration for my first blog of December.  (Thanks Alan!)  Here is the crux of what my thinking was this morning: my original comment is here.

After writing some more this morning I had further thoughts, so I'd like to take it one step further and issue a challenge to my fellow testers out there.  Some may have already done this, but earlier in the year I put out a question on Twitter about any testing societies within my home state.  As of right now, there are none that I can find, which is troubling to me.  One of my hopes and goals, should I continue to work as I have here in West Virginia, is to find a way to establish a community of testers, either within just my corner of southern West Virginia or perhaps the entire state.  The problem is how to start it.

The first step is to reach out to testers that you know.  Offhand, I can think of four people at work who are testers on other projects, and there may be others.  The challenge, I think, is to find a way to network with those we know and those we may come in contact with, and invite them into this world that is growing online.  My challenge is this: what can you as a tester do to grow our reach as a community?  I challenge you to invite one person, one possible tester, into this world.  Invite them to check out at least one part of our growing community, whether it is the articles and blogs on Software Test Pro, or the site for the Association for Software Testing.  Link them to the blogs and sites of other testers like Michael Bolton, Lanette Creamer, James and Jon Bach, Matt Heusser, Alan Page, or maybe some other corner of our community.  Let's do our part to build up the testing community, and through that maybe we will find out where in the world are all the software testers.


(I forgot to link to Alan Page's blog post.  Thanks, Michael Bolton, for catching me on that; I will try to do better with my citations in the future.)

Friday, October 22, 2010

An End, and a new Beginning...

A lot can happen in two months' time: nations have fallen, court proceedings have been completed, and sports seasons are nearly done.  When I last posted in August, I wasn't quite expecting to go all of September without a blog entry.  I knew that based on our Cub Scout Pack, soccer, and work schedules, October could be a heck of a month, but that was still almost six weeks away.

I had just completed a research project and come to the determination that if we wanted test automation that was easier to maintain, and of long-lasting value, we would very likely have to move away from the older tool we were using on the project.  Then came the exploration of a slightly newer version of the tool, and yes, it provided some new bells and whistles, but when it came down to it, I still felt that given the way our website worked, that tool was not meeting our needs.

It’s a shame that it took me nearly a year to come to that conclusion.  Oh, I knew the tests I was capturing, building, and running were brittle, and sometimes that’s not easily avoidable given the development practices on any given team, but it had become crystal clear to me that building and maintaining automation did not have to be so hard.  Well, it is sheer irony that I came to this conclusion just a couple of weeks before I was told they would not be keeping me on for the next contract year.  I thanked the project manager for allowing me to learn so much from the time I spent with their team, but realized that shifting gears at that point was not a logical possibility.  So I resolved to capture as many tests as I could in the short span of time I had left, and to leave behind documentation of the struggles and conclusions I had come to, so that in the future better solutions could be chosen for their team.  I am hopeful that work was not done in vain.

It was an interesting project, and I am thankful for every moment, every topic that came up in the pursuit of becoming better at testing, and specifically test automation as I worked to test for that project.  I really enjoyed working with the great group of people that comprised that team, and I hope I may one day be able to work with some of them again, but it was good to move on.   

When I took the step of faith to step out of my cushy box as a software developer, to walk on the other side of the cube so to speak, it was a bit of a risk.  I had done some testing off and on before; it seemed to always be something I would be called on to perform now and then, but without any formal training or much relevant experience to fall back upon, I did not know if I was equal to the task.  However, I was eager to learn new things about software development, and was determined to become as solid a tester as possible on this project.  In hindsight, I believe I achieved that goal, and as a result opened a vault of untapped knowledge relating not just to software development, but to testing as well.

The experience truly has transformed my thinking, although I now feel somewhat like a software development mutt, caring about increasing my skills as both a developer and a tester.  The new project I am on promises to let me grow and apply some of the techniques and skills I learned about testing, even some that were not applicable to the kinds of testing I was required to do before.  Plus, as a bonus, I may get to learn more about a new technical area in which I have always had a small interest, but little time to really dig into until now.

So the old project is done, a new one is on my plate, and I couldn’t be happier for the change; change truly is good, and I thank God for the opportunity to continue to build knowledge related to testing, even as I find myself straddling a fine line between being a pure developer and a pure tester.  There will be more posts coming about lessons I’ve already learned and applied on this new project, but for now I am happy with where I am professionally, and I can’t wait to see what’s around the next corner.  So while I’ve come to an end, I’ve discovered that it’s really just a new beginning.

Wednesday, August 25, 2010

Navigating self learning

I have recently pondered how things have changed in my quest to educate myself about the software field, testing, and other areas of interest.  When I was still a youth, I remember going to the mall and visiting Waldenbooks, and thinking what a great place it was; they had so many books.  Oh, the days of browsing just for fiction, thinking life was good.

Then, moving on to college, I found libraries with thousands of tomes, perhaps a million books on shelves, and bookstores that carried plenty of books besides texts.  This is also where I learned the evils of the textbook industry.  Some books you might want to sell back, finding little utility in them, may get stuck in your possession due to a new edition, or a semester where no class is using the text.  I learned early in my college career to be wary of where I spent my hard-earned money on books related to technical interests.

At a trade show, I once bought a new hard drive for my computer, a book on Access 97 published by Que, and a Java book that was pretty much worthless the moment I bought it.  It was a hard lesson to learn, and for a time after graduating I steered clear of books as much as possible, not knowing how to pick a good book from a bad one.

How many volumes can be written about a piece of productivity software? I don't know, but it was frustrating at times to find that Books-A-Million had two shelves full of books on Office, Outlook, and Word, but considerably less devoted to areas that actually would have helped me as a young software developer.  We can look at publishers, Microsoft Press or O'Reilly, and hopefully develop a familiarity with how their books are written.  This book may be a quick-start guide; this one claims to teach you in 24 hours, or 30 days.

The reality is often different.  Some books are good solid references, almost printed documentation of the language you are coding in, which can be useful when there is no quality documentation online (such as MSDN).  Others spend a great deal of time on one set of features, yet fail to cover the one area of advanced use that you are really looking for.  So how do you determine what is a good book and what isn't?

One way is to look at ratings.  You can go to Amazon or some other bookseller's site and read the reviews of the book.  Do the reviewers mention the parts you were hoping to learn from, and do they pan them?  Do they list the table of contents, or the index, to give you a partial idea of the topics covered?  Do they describe the book as far too long to cover so little?  Yet the ratings game, like search engine optimization, can still be played in the online marketplace, just as it can in a brick-and-mortar store, where a book with a better price may actually contain significantly less substance than a book costing just a few dollars more.  So it is not always clear whom you can trust as an author or publisher.

When I first came on board as a full-time tester on this project, I had not read a significant text related to testing in several years.  I dare say the last mention was probably in a software engineering textbook, one that I have opened now and then just to refresh my knowledge.  I learned to browse the shelves of fellow developers, and sometimes to ask if they could recommend a good book.

As strange as it may sound, word of mouth was responsible for a number of quality books I bought when I first moved to Hinton, WV.  A lot of information can be found online these days, and that's great for helping you through tricky areas of the work, but sometimes nothing beats a good old-fashioned dead tree to plop in your lap and devour.  Sure, I may read faster in an online format than I do in print, but there is something to be said for having those pages a few fingertips away.

However, what if no one you know has a text in the area you wish to explore?  That is the hard part.  How do you figure out whom to read when there may be so many texts, some containing ideas that may not be all that useful for what you or your team wish to accomplish?  Recently I've discovered a new and better way to learn about books: through webinars, YouTube videos, blogs, and Twitter.

I can't count the number of interesting blog entries I've read in the last few months, or the thousands of tweets and YouTube videos I have browsed.  I know it has been a treasure trove of information, both about the areas I have been researching and about the individuals themselves.  I admit I was a bit of a skeptic about Twitter in the past.  I found Facebook to be more trouble than it was usually worth, and was not sure quite how Twitter would really help.  At some point I actually tried following some sportswriters and bloggers on Twitter, hoping to get insight into that year's NFL Draft.  A lot of the commentary linked to places that required paid subscriptions to read, and over time I realized it was not as valuable as I would have thought.

Then I began to watch webinars by several testers and software development 'coaches', as I'll call them, and I realized that there was more going on on Twitter than a character-limited version of Facebook status messages.  I think Lanette Creamer was the first tester I followed on Twitter, and shortly thereafter the Bach brothers, Marlena Compton, Michael Bolton, David Burns, Adam Goucher, and a host of others.  I admit I have never had a good understanding of how to network with people in my field, especially given the geographic region I work in.  But Twitter has turned out to be a godsend, not just for meeting people and listening to conversations related to testing, but as a major tool in expanding what I know and learning how to be better at what I do.

Then I found out that some of these folks had actually written whole texts, or contributed parts of any number of them.  So I began my quest to seek out these tomes, to see what had been written, and over time to read and absorb as much as I possibly could.  Like many testers, I've not had any formalized training, barring a few webinars and free web courses I've taken.  I had been thrown into the fire, reacted the best that I could, and tried to be as thorough as I could be as a tester.

It's interesting, because word of mouth brought me to purchase the first few books I acquired after moving here.  Now it is not just the word of mouth of coworkers and friends, but of the authors, editors, and even other readers whose opinions I have come to value highly as I consider what tomes to add next to my shelf.

To conclude, if you find the search for better technical publications a bit of a maze to navigate, I highly recommend seeking out individuals in the field.  Look for papers they've written, articles in magazines, or even follow them on Twitter.  Getting to know an author is almost like establishing a relationship with a trusted person in the community you live in, and it grants that extra bit of confidence.  I've found it makes the learning far more enjoyable.  Thanks to Twitter, my desire to expand my knowledge into new horizons has risen again, and I hope that many others will find the fire of Prometheus rekindled in their own desire to learn.

Monday, August 23, 2010

Testing as an inter-disciplinary skill

I know it has been a while since I last wrote a blog entry.  Unfortunately, lack of sleep and a cramped schedule deprived me of much writing time of any kind these last few weeks.  Hopefully I can find a way to pry some time free for this purpose, with a crowbar if need be, before my ideas go stale.  (Note to self: start carrying the notebook again.)

With the obligatory apology completed, now to some discussion.  This past weekend I had some time to spend with my father.  He's a Chemical Engineering graduate of West Virginia University, and has worked in a number of different environments.  Without giving you an entire biographical sketch, let me just say that he began working for FMC in South Charleston, WV shortly after graduation, and now, through an ironic quirk of fate, works as a Chemical Operator at the same plant, which has since been spun off and changed owners a few times.

My dad often reflects on how things are at his plant, a place I even had the privilege and honor of interning at for a summer, where I learned a great deal about just how complex an industrial plant can be.  That experience gives me a much better idea of the parts of the plant he's referring to; even though it seems they've repurposed some of the land for other things, from his descriptions I can usually get at least a basic understanding of what he is talking about.

I, of course, am not a Chemical Engineer; my last chemistry class was Chem 16, and I was glad to be done with it.  The likes of organic chemistry was not something I felt I needed to learn, although at one time I had considered that field.  So while I may not understand all the complexities of how a plant is laid out and run, I found some interesting parallels to experiences I have had in the software field.

For example, take a particular process, any process you can think of, be it software or industrial.  Imagine this process produces a raw or virtual product.  The process has rules and procedures that are to be followed by the floor workers, the programmers, the so-called 'QA' group, and so forth.  Maybe they have cross-functional teams, or maybe they work in segmented, distinct sections of the 'plant' to produce the product.  Whether they are stamping out bolts and panels for a car or controlling the flow of material inputs, you can see how the manufacturing sector and the software sector may seem to have a lot in common, at least on the surface.

What does this have to do with testing?  Well, one area of interest to me is how to improve the process by which software is developed on the teams I have worked in, and I have worked in several different situations.  The first, and perhaps most frustrating, happened near the end of my college career.  I was taking an Operating Systems class, and the class divided into smaller teams to design and code the projects.  Somehow I ended up with a trio of other students, two of them grad students.  Well, we began by planning and figuring out how to divide up the workload.  Things initially seemed to be going well, and then the worst possible thing happened: the one other undergrad dropped the course, which increased our individual workload, and soon the grad students disappeared from class too.  I suddenly found myself in a situation I never wanted to be in: alone, trying to solve a massive project.  Fortunately I was able to transition to another team later in the semester, but there are so many things I could have learned better if I had just been with them at the start.  The team literally imploded under its own weight as people were pulled in different directions.

The same thing happens on software teams in industry.  How many people have I known who wear multiple hats?  Lead architect, chief tester, database engineer, technical writer, system configuration admin, etc.  Those are just generic hats; when you add in technologies like AJAX, jQuery, Flash, and other libraries and techniques, the knowledge in a team can be spread thin.  This may not necessarily be a bad thing, but it can hurt teams when people are retasked to other projects or move on to other opportunities.

On one particular project, I was given the responsibility of picking up a section of the site, a search page, and porting it to the latest version of .NET.  I had seen the search in action before and thought I had a pretty good idea of what was going on; well, that was my first mistake.  Once I dug under the hood, I realized the classes and data connections behind this page were very complex.  The change to the latest version of .NET meant the manner in which data was passed around had changed, and it was not just a simple matter of pointing to a different data source.  In short, maintaining that now 'legacy' piece of code became a headache, one that I prayed I'd never have to work through again.

Note, it wasn't that this particular page was prone to error; it was literally a Swiss Army knife, with a multitude of possibilities, and that was before you started saving various search configurations.  The drawback was that it was a difficult mechanism to extend, and that ultimately led us to redevelop the module and take advantage of the Reporting Services technology inherent in the version of SQL Server we were running at the time.

My father described what seemed to be a similar situation: changes in how they ran the plant, where or how certain inputs were calibrated, and how those changes ultimately had certain effects on the outcome.  Most teams in software will encounter bumps in the road that they have to learn to overcome, and it seemed this was a similar process for my dad's company.  If you encounter a problem in the process, you examine it, try to determine how to fix the problem, and hopefully develop a procedure to avoid repeating the mishap.

There's just one problem that, in my dad's case, gave me concern.  What if there was a situation where an error happened, but the process was not at fault?  In software we can tweak and tweak until the cows come home, but with each new procedure, each new piece of red tape, how much do we slow our ability to produce and test code as we go?

My father went on to describe some of the different teams that work in the plant.  They have their managers, their QA people, their maintenance guys, their operators in various sections of the plant, and so forth.  I found myself asking my father what he thought quality was.  Truth be told, I blame Jerry Weinberg for the question, as I recently borrowed an old copy of Quality Software Management: Systems Thinking from a friend at work, thinking it was out of print.  (It turns out I was mistaken on this.  Thanks to Jerry for pointing out that it actually is still in print, just not in Amazon's roster.  It is still available from Dorset House: Quality Software Management, Vol. 1: Systems Thinking by Gerald M. Weinberg.  My apologies for listing it as out of print in error.)

I found myself wondering how my dad, a chemical engineer doing operator work, saw these things, and then I described how Jerry in QSM defines quality simply as value to some person.  Now, I've seen a number of other tester bloggers wrestle with ideas concerning quality, though none of those entries were fresh in my head on this particular day.  One point I did remember from the chapter I had just finished, however, was that because quality is subject to interpretation by some person, there are going to be different definitions and expectations depending on who that person is.  Jerry does an excellent job of describing this phenomenon in the first chapter of that book.

This then prompted another question: is the Quality Assurance group the only group that needs to be concerned with quality in the plant?  I was trying to drive home that perhaps one of the issues in his plant was the belief that quality is something only that group in the labs is concerned with, while everyone else is just a laborer doing what they are told.

I'll be honest: I've never understood that mentality.  I have always pushed myself to do the best of my ability, and in a team setting to do whatever I can to make the group successful.  The situation reminds me of one I've seen on paper in a few webinars.  A software shop may have a group of business analysts who try to figure out the requirements.  They pass those on to an architect or developer group, who try to implement them in the chosen coding convention.  Then those same requirements are passed along, together with the builds for the project, to the testers, who have to parse those requirements and try to figure out how to determine whether the product 'passes' or 'fails'.

Sound familiar?   My dad basically described this situation, where the analysts would get a high-level idea of something they'd like to see happen, but often they don't know the capabilities of a particular piece of equipment or hardware, or whether it has constraints that may limit how it can be used.  I'm reminded of the complaints about highway plans where I went to college.  So many described the roads as poor and disorganized, often as if there had been no plan at all.  I often heard people say that, like the roads, the surest way to fail is to start by failing to plan.

Having lived through some similar situations, I can attest that it can be very difficult.  In my first role as a software developer, I never imagined that they'd require me to do quite so much testing.  Software engineering classes for BS computer engineers focused more on testing as a separate phase done by a somewhat independent group.   Yet there I was, soon after starting my first professional job for pay, discovering that I had to test this module.  Sometimes all I had was the name of the module, with no requirements or notion of how it should work.   Even worse, initially I was only given the code, and had to parse through it by hand to figure out what had changed, plug it into a newer build, and then test it.

It didn't take me long to realize that this was an untenable situation.  How much time and effort was being wasted integrating someone's code into our project, only to have to back it out again when we discovered it was not as mature a feature as we had thought?  I prefer not to think about that, but I began to push back and ask for at least some basic requirements or a description of the features, and eventually we began requiring a demonstration build of the project as a proof of concept that I could put through its paces and explore, to see if it lived up to what we were expecting.

Honestly, it was a very hard experience.  My dad says that engineers go to college to learn how to think and how to teach themselves about the disciplines they encounter, but that you don't really have any clue how you will use what you've learned until you are out working in industry.  In short, as a fresh-out-of-college graduate, I didn't really know jack about how to do my job well.  There were so many things I hadn't encountered that I had to learn in those first couple of years, and without any real guidance from the more senior developers on the team, I was left to figure things out on my own.  Fortunately I am a fast learner, but even then I know I probably made more mistakes that first year than I ever imagined possible.

This is not me being critical.  It's like looking back at stories I penned before I finished high school: today I can barely understand what I was writing, or how I formed one line of text and wove it into another.  Truly they were embarrassing times, but they were learning times.  They drove me to work harder, to try to be better at each little thing I did, and I felt I was making progress up until I got retasked to the Tech Room.  In truth, it was probably for the best; there were a number of glitches in our process for including and releasing new builds, and I was fortunate that my ability to work with people on the phone enabled me to switch hats, yet still learn a great deal about our product.  It was that connection to the customers that finally made things start to click.

In any event, just as things were in flux in that first assignment, I wonder how such changes affect my dad at his work.  How can they expect to keep their production levels high when a key component goes down?  How would I expect a web site to operate if a key database or server went out?  This is what I was driving at with my dad: that maybe, just maybe, the ideas of systems engineering for software, of testing and development, do not exist in their own bubble, but are shades of things that could help improve his plant.

I really enjoyed the chat with my father.  I wish I could have more such talks with him, to exchange ideas about how to think through problems as an engineer or a tester.  One thing is for sure: the more I read about software development, the more I wonder why more workplaces are not striving to improve their processes instead of staying at the ad-hoc level.

PS. Thanks to Gerald Weinberg for writing Quality Software Management: Systems Thinking, and thanks to my friend Craig for letting me borrow it from his bookshelf back when I believed the book was out of print due to its age.  I look forward to finishing it, even though some of the ideas presented within may be a bit dated.  One of the joys that being a full-time tester has brought back to me is the joy of digging for new ideas and knowledge through books, blogs, and the internet.  I'm not sure I would have had this wonderful conversation if not for having started that book, so Jerry, thanks a bunch.

Wednesday, August 4, 2010

My take on Adam Goucher's Six Shocking Automation Truths.

Adam Goucher has a wonderful blog entry over at Software Test Professionals relaying Six Shocking Automation Truths.  If you have not yet read the article, I wholeheartedly recommend it, as it provides some high-quality, grade-A discussion material.

According to Adam the six shocking automation truths are:
  1. Truth One: You do not need to automate everything
  2. Truth Two: Going forwards in reverse is still going backwards
  3. Truth Three: The automation language does not have to be the same as the development language
  4. Truth Four: There is not one tool to rule them all
  5. Truth Five: There is no 'right' or 'wrong' way to automate (though there are better and worse)
  6. Truth Six: Your automation environment is not production
These six ideas coalesce around several myths that many in our industry have not yet come to recognize as myths. I find I agree with most of these truths, or at least the blanket summary Adam gives them, but I'd like to take a moment to expound upon what these truths mean to me.

The first one, "Truth One: You do not need to automate everything," is easy.  You can't automate everything, and that rings very true for me.  Some parts of a piece of software may not be candidates for automation, either because of how they are designed or because of the nature and quality of the tools available.  What's more, if you are like me, with several hundred test plans comprising thousands of steps that cover the modules of an existing product, trying to automate everything could take a really long time, likely much longer than the client is willing to pay for, which leads me into the next truth.

The second, "Truth Two: Going forwards in reverse is still going backwards," follows from the first.  If you have a large number of tests in need of automating, and only limited time to script, record, code, and set up the automated tests, then you have to be judicious about where you actually apply automation.  Now, some may argue that it makes more sense to start with the most stable, older portions of a code base.

I can understand the deceptive and seductive nature of this argument.  Repeating these test scripts by hand every iteration may seem like a waste of time, especially if they rarely find defects to report.  The argument goes that that section of the application must always be regression tested, and is somehow of more value despite the lower chance of defects, so automation is desired even if no feature change actually touches it.  With this I find myself in disagreement.

First, the old tests used to regress the software are not always relevant to the current build and release.  I have worked on projects where test plans from release A or C were changed, or completely rebuilt from the ground up.  Should a team continue to run, and build automation upon, test cases that are now obsolete?  My answer is no.  There is nothing more costly than trying to automate tests that are invalid, obsolete, and not an accurate reflection of the current software's behavior. So if your definition of 'old' really means dated, perhaps obsolete, regression checks, then maybe that's not what you want to automate.

Now, Adam argues that the best place to start automating is the new sections of an application.  I can understand that thinking, but it is not always possible, and it may depend on the kinds of tests included in your automation framework.  If you are just starting unit testing, for example, it makes loads of sense to focus on code that is currently in development rather than trying to cover old, dated code.  If, however, you are using a different type of automation, that may not make sense, especially given the pace at which code may change in a particular feature while it is being developed.

So what then is the middle ground?  I think a better approach in many cases is to focus on the areas of the software that were most recently released.  A recent release may imply stability, and though it is a recent artifact, it is most likely the freshest in the minds of the team.  Newer modules may also have a higher probability of being touched again as their functionality is expanded in subsequent releases.  This will not always be the case, of course, but the chief concern of this truth is to remember the Pesticide Paradox.

The Pesticide Paradox, simply stated, is that "defect clusters have a tendency to change over time, so if the same set of tests is conducted repeatedly, they will fail to discover new defects."  Or at least that's the paraphrased definition from an online course I recently completed.  As another tester explained it to me, as bugs are found around particular code segments, reported, fixed, and retested, the old tests will simply prove, each time they are run, that those bugs are still gone.  The problem is that these old proof tests may give a false sense of confidence about the stability of some sections of a site, leading the team to focus on testing and developing the more raw parts of the application.  This is why we must maintain, update, and tweak even old tests as new releases come out, in order for them to remain relevant.
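The paradox can be sketched with a toy example.  This is only an illustration, not code from any real project; the `apply_discount` function and its long-fixed bug are hypothetical:

```python
def apply_discount(price, percent):
    """Hypothetical function under test.  An old bug (discounts over
    100% produced negative totals) was fixed long ago by clamping."""
    discounted = price * (1 - percent / 100)
    return max(discounted, 0.0)  # the clamp that fixed the old bug

def test_discount_never_negative():
    # An old regression check: it re-proves the same long-dead bug
    # is still dead, every single run...
    assert apply_discount(10.0, 150) == 0.0

def test_ordinary_discount():
    # ...while only newer checks like this probe current behaviour.
    assert apply_discount(100.0, 25) == 75.0

test_discount_never_negative()
test_ordinary_discount()
print("all checks passed")
```

The first check will pass forever, which feels reassuring but tells the team nothing about where today's defects cluster; that is the false confidence the paradox warns about.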

The third truth, that you need not test using the same language the code is written in, seems rather obvious to me, but then I come from a development background.  It stands to reason that if multiple languages can accomplish the same tasks, the tester need not be fluent in the product's language, and in some ways this may even help enforce separation between the development and testing areas.
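To make that concrete, here is a minimal sketch of driving a program through its command-line interface from Python; the program under test could be built from C#, C++, or anything else, since the test only observes behaviour.  Here `echo` stands in for a real binary, and `run_cli` is a hypothetical helper name:

```python
import subprocess

def run_cli(args):
    """Run a command-line program and capture its exit code and output.
    The test never sees the implementation language, only behaviour."""
    result = subprocess.run(args, capture_output=True, text=True)
    return result.returncode, result.stdout.strip()

# `echo` is a stand-in for a binary written in any language.
code, output = run_cli(["echo", "hello"])
assert code == 0
assert output == "hello"
print("CLI check passed")
```

The same black-box idea applies to web automation: a Python or Ruby script can exercise an ASP.Net application over HTTP without sharing a line of its language.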

The fourth truth, like the third, also seems obvious to me: there is no one-size-fits-all tool.  I remember when I was a young Boy Scout, another scout showed me his Swiss Army knife. That thing had twenty-five or more gadgets and was so wide I couldn't hold it in my hand to cut.  Contrast that with the two pocket knives I used as a boy: a basic five-gadget one, complete with can opener, awl, bottle opener, large and small blades, and a corkscrew, and a simple carbon-steel knife with three blades of differing lengths.  I got more use out of those two knives, and they provided all the basic functions I needed from a knife at that time.  Today I carry a set of folding pliers, one large and one small, that also have screwdrivers, scissors, and a blade, but I still find myself using that regular knife blade more than anything.  So it doesn't matter if a tool has more functions than its competitors if it's difficult or cumbersome to use, or if it doesn't cooperate with the tools other developers are working with.  (I remember using the Ankh extension for Visual Studio several years ago and having to uninstall it because my install would crash unexpectedly when it was in use.)  The same is true for testing tools.

Truth five is, in my opinion, the hallmark of good testing, especially for those who subscribe to the context-driven school.  No test exists in a vacuum, so the environment, the parties who will use the application, and the risks involved should all be considered when testing approaches are mapped out.

The last truth, "Truth Six: Your automation environment is not production," is the only one I take some issue with.  Sometimes it is easier to understand a piece of software, especially one you've only recently been brought into, if you see the actual data, or a good facsimile of it.  I agree that it does not necessarily make sense to lock down a local networked test instance behind secure HTTPS, but I am not ready to say the software should never be tested on a production-like instance.  If your client or process rules require your test instance to be exactly as it will be in production, I can see why a team may have no choice but to do things that way.  My takeaway from truth six, though, is that doing so should always be approached with caution, keeping in mind the importance of keeping the application as testable as possible.

To conclude, Adam Goucher's Six Shocking Automation Truths are concepts that all automation testers, and the stakeholders planning to leverage automation in their projects, should consider before they have the testers hunkered down in their makeshift bomb-shelter cubes putting the software through its paces.  I think remembering these things will save many headaches for both the testers and the consumers of their testing efforts.

Tuesday, July 20, 2010

Finally, a First Post, and Some Background

As an aspiring writer, I tried my hand at blogging a couple of years ago on my old domain.  At some point I lost interest, probably because what I was blogging was more of a rant about things as they were, and really wasn't where my true passions lay. That blog had been an outgrowth of my desire to share my opinions on certain matters after the loss of several forum communities of which I was a part.

I gave up on the craft of blogging and stuck to other forms of prose, namely a play-by-forum game called Battletech-Mercenaries, where I had the opportunity to craft stories and write collaboratively with the individuals who had joined my unit, the Hellstorm Hussars.  This was a continuation of other writing experiences stretching back to a group of fan-fiction stories that I and several other fan writers composed around the PC game Starsiege; I even joined a guild until my interest in that niche waned.

Even before that I had always enjoyed telling a good story, and at one point I made a rather poor attempt at novelizing a story idea during high school.  That was before computers became a driving passion in my life.  Going back to my earliest days in school, I can remember having an interest in computers.  I once thought that I would like to be a so-called 'expert' in computers when I grew up.

While that expression was perhaps a bit too simple at the time, there was something about these machines that captivated my interest.  I can remember co-writing a small adventure/quest program with one of my best friends on his older brother's Commodore, and writing some BASIC on an old TI whose model number I have since forgotten, though I do remember that the games that came with it were cartridge based.  You could say I have been working with computers since I was a young child, but that would not be entirely accurate.

You see, growing up my family didn't have a personal computer (or an Apple, for those so inclined), and when my father was displaced from his job of fifteen years, he went and bought not a computer that could serve multiple purposes, but a Brother WP-75 word processor.  Oh, it was a decent machine for word processing, using a daisy-wheel printer and small three-and-a-half-inch floppy disks (double density, if my memory serves) to store a small number of files.  It didn't provide me the opportunity to explore programming concepts in my middle school and high school years, but it did provide a means for learning to type and for beginning to explore my thoughts in a few story ideas.

I remember writing quite a few stories, reports, and papers for classes on into high school.  Some were solid, some generally lacking in a few areas, but they were like a painter's first sketches or chalk rubbings: a beginning, a testing, a start at building my writer's craft.  It is pure irony, I suppose, that I still find joy in writing. Back in middle school and high school, writing was not something that came easily to me. In fact I struggled with English and literature classes more than any other, largely because of the mass of literature they expected us to remember for the tests.  After I got a B in honors English as a freshman, my parents got me a tutor for the summer, who helped me begin to hone my writing skills.

She gave me simple essay assignments about various types of boxes, and through those exercises I learned and grew as a writer, a student, and a person.  Though I never again took honors or AP-level English, my ability to write grew, and what once was a weakness became a strength in college, where I scored an A+ in my freshman Composition and Rhetoric (English) class.  It was then I realized that it wasn't necessarily grammar or writing structure that gave me problems in high school; it was the mass of material they expected you to learn across seven courses over a full year.

Having turned writing into a strength, I added to my skills in college through a number of reports, essays, and lab papers.  As I began some aspiring writing on the side for Starsiege, I found that I matured as college progressed.  Now writing is a hobby, a way by which I make sense of the world, even though up till now most of my work has been fiction.

Truly, writing fiction, even in already established universes, is a challenge.  As Mark Twain once said, "The difference between truth and fiction is that fiction must make sense or nobody will believe it." As I look at some of the things happening in our world today, those words ring truer than at any time in my life.  Real life does not always make sense, and how we tackle and learn from our experiences is as much of our life's journey as the day-to-day tasks that consume our somewhat meager existence.

That brings us to now, and the reason I have taken up the mantle and gauntlet of blogging.  In recent years I came to Hinton to work as a software developer, primarily developing a web-based application with ASP.Net and C#.   One of the skills I have long possessed is the ability to quickly learn new source material, to pick up and leverage new ideas into improving myself and my craft as a software craftsman.  In the course of this journey, I found myself assigned to a project for a partner company not as a programmer, but as a tester.

This was not the first time testing had come into focus in my life.  My first professional job for pay was with a company called Stenovations, which makes a computer-aided transcription program called Digital CAT, used by court stenographers to quickly transcribe and mark up depositions.  One of my first tasks there was to integrate software modules, developed by programmers contracted outside the company, into our current builds.

I attacked my duties as I did anything else, trying to absorb as much as I could, to learn how the system worked and what it did.  However, I learned over time that my job's first duty was not necessarily the parsing of the code, but testing the modules that were handed to me.   Many modules would arrive, and though I could scour the code for every artifact that had changed, the modules would not seem to work.  So I changed tactics and began to first test the demonstration build provided by the developers, and only then, if it appeared to work, go through the process of integrating that feature into our product.

This proved a very smart decision. It allowed us, and me in particular, to actually test the feature said to have been added, to see if it worked as expected.  In many cases I discovered that even in their builds it had not worked as we expected.  The lack of defined requirements was a problem at times, so it was suggested we should provide a list of requirements, the desired features, and let them build to that list for us.

It became apparent, though, that some of the developers were suffering from a perceived communication gap.  I'm not sure whether it was differences in culture or something else proving an impasse, but this was my first step into the hat of a tester. It was through this mechanism that I first began my story as a tester, doing exploratory testing, even though I would not realize it would become quite the buzzword till much later.

In this I became a tester first and a programmer second, because if the tests I ran failed to produce the advertised and expected outcome, my programmer's toolbox would remain locked and closed.  I couldn't very well fix code I had not written myself to work in a new build when that same code did not produce the expected effect.   At some point I was moved out of the development side of the business and spent a year or more as part of the help desk and support team.  It was a bit discouraging at the time; I had signed up to actually write code and use what I learned while enrolled at the university.  But what I discovered was that in conversing with these customers I actually learned more about their needs and the problems they needed help solving, and that would make me a better developer and tester.

In fact, at one point I decided to start writing some simple test scripts.  They weren't all that fancy, just a Microsoft Word document with a table of, I believe, two or three fields: one for the item to be tested, another for whether it passed or failed, and another for comments.  (Now that I think about it, the pass and fail may have been logged in a single comments column, but it has been a few years.)  Of all the things I did at Stenovations, I feel that perhaps that step of creating a test script and actually putting the software through its paces as a help desk technician was one of the few lasting contributions I made to the company.

In 2005, I came to work for ManTech here in Hinton, WV.  I was primarily an ASP.Net developer, but one of the things I love about ManTech is the way they encourage us to continue to learn and grow as individuals.  I made the transition to C# in a matter of months, and then helped our team port our product from .Net 1.1 to .Net 2.0.  During my time with ManTech, testing has been something I've occasionally volunteered or been tapped to assist with, whether it was testing the Flash-based lessons for our e-Learning system or testing our site at various stages of development.   I expanded my knowledge of databases as I began working with SQL Server.  Another area of interest was documentation and online help: I helped write our first FAQ system, and learned to use Adobe (formerly Macromedia) Captivate for recording interactive help.

I've worn many different hats while with ManTech, and currently I work as a tester, with primary duties relating to automated web testing for one of our clients/partners, NISC, now an IBM company.  As I strive to improve at the craft of testing, I've reached out and begun reading as much material as I can absorb on testing, agile methods, and project development.  During these explorations I came across the blogs of several knowledgeable people in the testing field, among them (in no particular order): Adam Goucher, Matt Heusser, Michael Bolton, Lanette Creamer, James Bach, Jon Bach, and Cem Kaner.  There are others, of course, but mentioning all the fine blogs I've discovered in the last few months would likely take up more space than I care to devote at this time.

I really owe my return to blogging to these and other fine, upstanding testing citizens of the world wide web.  I find their posts and tweets to be highly inspirational and, above all else, great learning experiences.   If there is one motto I've put to use in life that stands above everything I do professionally, personally, or in civics, it's that I always want to learn.  Learning, to me, is the real substance of life.  I've often joked that the day I quit learning is the day I retire, because that's how I feel.  I'm a lifelong learner, whether it pertains to the technical field in which I work, the people with whom I associate, scouting as I step back into the organization as an adult leader for the first time, or even home with my family.  I strive to learn at least one new thing every day.  Most days I learn a lot more than a single item, and sometimes more than could be compressed into a concise statement, but the point is that I learned it.

So as I set out on this exploration of learning experiences in this blog, I hope to discuss areas of technical interest, with an emphasis on testing in particular.  I may be new to blogging, and growing as a new full-time tester, but I look forward to the learning adventures to come.