Thursday, January 30, 2014

Book Consideration: Rethinking Systems Analysis & Design

To be clear, this is my second book by Gerald M. Weinberg, and I’m not reading his books in the order he published them.  I like the author’s style of thinking, but in the Introduction to Systems Thinking book he was very general in his descriptions, touching on a great many subjects with his thoughts.  That is of course a big part of systems thinking – the idea that you can apply what you know in one field to many different fields using a form of logical thinking and general rules.  While some of the design of the book could have used some refactoring, it was a good start.

In Rethinking Systems Analysis &amp; Design, the focus is mostly on the development of software systems and how this kind of thinking can be applied directly to software.  While some of the ideas are fairly standard (and the book was published more than 10 years ago), it provides some interesting insights.  He starts out by considering analysis vs slogans.  He talks of how we sometimes overuse history as a method for predicting the future.  While history is valuable, analysis will provide data not found in history.  That is to say, if X is good, X+1 might not be better, depending upon the attributes that we want out of X+1.  However, we might have a slogan, “Bigger is better [for X]”, which might be true up to a given point, where the system then fails to scale for some reason.  He warns of how software developers are particularly vulnerable to certain methodologies because [in my reading of his opinion] we as an industry don’t have a long history, and even a 5% increase in productivity looks great when we consider the number of failed projects in the past.  Unlike the physical world, where you have to build each new bridge from scratch even when the design is very similar, in software we only ever build something new; otherwise we could just copy the previous build for “free”.
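
To make that scaling point concrete, here is a toy model of my own (not something from the book) showing how “bigger is better” can hold for a while and then stop being true.  The contention and crosstalk numbers are made up for illustration, and the formula is borrowed from Gunther’s Universal Scalability Law rather than anything Weinberg gives:

    # A toy model (mine, not Weinberg's) of "bigger is better... up to a point".
    # Relative throughput with n workers, with simple contention and coordination
    # penalties (the form follows Gunther's Universal Scalability Law).
    def throughput(n, contention=0.05, crosstalk=0.01):
        return n / (1 + contention * (n - 1) + crosstalk * n * (n - 1))

    for n in (1, 2, 4, 8, 16, 32, 64):
        print(f"{n:3d} workers -> relative throughput {throughput(n):.2f}")

    # Throughput climbs, flattens, and eventually falls; the slogan holds only
    # until the system stops scaling, which is what analysis (rather than
    # history) is supposed to reveal.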

Chapter two talks of what actually makes up a [software] system, with a fairly reasonable picture: the external world, the organization, training materials, training done on the job, wetware, existing files, test data, job control – and all of that before you even get to the program and the hardware (pg. 34).  I thought it was a good point that part of the investment in software is the people and the wetware (knowing which code lives where because you built it, which is faster than reading code or documentation).  He also talks about how we need to use our history more wisely, as we often don’t understand why something is the way it is.  While not an example given in the book, I would say the Unicode set(s) are a good example: they are designed for different forms of optimization, but are fairly ironic since they were in fact an attempt to solve a very real-world issue ASCII had, because computers were originally designed for English.  I also think of Joel’s amazing article on Martian Headsets (it is worth reading; go now, I’ll wait).
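
As a small aside of my own (this sketch is not from the book), the “computers only need English” judgment is easy to demonstrate in a few lines of Python; the name used is just an arbitrary example:

    # A minimal sketch of the "computers only need English" assumption breaking
    # down once real-world text shows up (my illustration, not the book's).
    name = "Søren Åberg"  # an ordinary name that does not fit in ASCII

    try:
        name.encode("ascii")
    except UnicodeEncodeError as err:
        print(f"ASCII assumption failed: {err}")

    # The same text is fine once we stop judging the future by ASCII's past.
    utf8_bytes = name.encode("utf-8")
    print(f"UTF-8 stores it in {len(utf8_bytes)} bytes for {len(name)} characters")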

Chapter 3 is about how the observer often fails to observe anything of value (including thinking they saw X when they actually saw Y), or at least observes the wrong things (like how magicians have you look the wrong way).  In QA this is an important point, and one I have suffered through more than once.  I actually talked about this a little in a presentation I gave about changing one’s perspective, a valuable thing to do.  He talks about studying the existing system to understand it rather than to criticize it.  This is an important lesson I hope to take more to heart: as a QA engineer, I have to be careful not to be hurtful to those who created the code, but to act more as an impartial observer, stating what I saw and why it seems wrong, not just to me, but from a broader perspective.  In some ways this reminds me of a fair witness.

Chapter 4 talks of self-validating questions, that is to say, questions whose answers confirm that the question itself was understood.  He talks of the question “Does that contain special characters?” which gets the answer “no”.  He assumes that means the data is alphanumeric and nothing else, while the person answering may have meant something narrower.  In this day and age “special characters” would mean something else again, and asking a yes/no question without a follow-up can get you into trouble.  He speaks of the problem that programmers tend to be defensive in what they do and how they act, because users accuse them of deliberately causing trouble if the program says “don’t do X” but the user demands it anyway.  On the other hand, if they just ignore the user and do what they want, they can stubbornly miss important details.  This is a tricky problem I also talked about in my talk on dealing with users.  I felt he had good insights, and this chapter alone was worth the effort.
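
To illustrate that ambiguity with a sketch of my own (not an example from the book), here are two perfectly reasonable definitions of “special character” giving opposite answers to the same yes/no question; the patterns are just assumptions for the sake of the demonstration:

    import re

    data = "O'Brien, 2nd floor"

    # Two plausible but different definitions of "special character":
    # the analyst's (anything that is not a letter or digit) versus the
    # user's (only symbols like @, #, $ count as special).
    analyst_special = re.compile(r"[^A-Za-z0-9]")
    user_special = re.compile(r"[@#$%^&*]")

    print("Analyst's answer:", bool(analyst_special.search(data)))  # True
    print("User's answer:   ", bool(user_special.search(data)))     # False

    # Same yes/no question, opposite answers -- a self-validating question
    # would force both sides to show they mean the same thing.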

The last few chapters are on the design of software, including the philosophy, the trade-offs, and the mind of a designer.  I really don’t have much to say about these chapters.  It is not that they are bad, but they were not the most exciting to me.  He talks of being aware of your designing: don’t over-design, but don’t ignore the reality of the world either.  I find these not exactly contradictory, but certainly a narrow path to follow.  To me, he could have written an entire book on design, but I suspect his strong point is systems analysis, not design.  He suggests designing for understanding (know your audience), striking a balance between variation and selection (don’t be too original or groundbreaking for your audience), etc.  He speaks of trade-offs and how to represent them as curves, which is a somewhat novel way of displaying the information, but fairly obvious to anyone who knows that famous triangle of fast, good, and cheap (pick two).

There are three other things I would like to mention, though I have not been able to find which chapters they came from:

  1. He speaks of spell check, which is ‘obviously good’ – but is it?  It seems to him that the cost of the grievous errors might be greater than that of the minor typos.  He noted examples where this caused confusion, as he had intentional mistakes that were fixed by editors.  He actually complains about one editor using glue and razors to fix an intentional bug in a different book, which I found pretty funny.
  2. He speaks of students in his class using systems analysis to gain ‘masters level’ knowledge in a subject: on average some subjects took just 3 months, but others took longer (he didn’t say how long).  He specifically said English was a subject that took longer, and computer science was one of the shorter subjects time-wise.  I have my doubts, but it is interesting nonetheless, particularly when I recall an (apocryphal?) story of a man who said he could pass any test for his degree.  He was studying something like psychology, but claimed he could pass any test, and so was given a test in dentistry, on which he got a B, earning his degree.
  3. I found it interesting that in the author’s epilogue, he states: “basic human needs… air, water, food, sex… the need to judge other people.”  What I find interesting about it is that I have had several talks about this subject with various people, one of whom claims not to judge, when a judge is defined as “a person able or qualified to give an opinion on something.”  The author continues, “To write a book… you have to have an uncontrollable urge to snoop and pass judgment… to study what people do and tell them how to redesign their activities.”  It seems to me that judgment is of utmost importance at both the high level and the low level.  At the high level, we build something we judge to be useful to others, in spite of never having perfect information (making us not qualified?).  There is an egomaniacal belief that we are good enough to play a sort of demigod, handing design down from on high, with the assumption that what we do will be good, or at least better than having done nothing.  We design without ever truly knowing what our users really want; at best we can develop heuristics about it, but not hard and fast rules.  At the low level, we design the system’s structures with limitations based upon our best understanding of what will affect it, judging how the system will be used (e.g., computers will only need English characters) and who will use it, not really knowing what the future will bring.  The trick, in my opinion, is to be wise enough not to intend harm with our judgments, nor to be too harsh; for oftentimes we end up blinded by the judgment, unable to let go of our assumptions.  Judgments, it seems to me (based upon the author’s words), are just another heuristic, not a truth.  In my own view, judgments are non-binding, unenforceable guesses (with a probability weight behind them) as to how something works, based upon previously observed factors.

In my opinion, read the systems thinking book first and then this one: systems thinking gives you a broader base of theory, while this book is more practical and does have value.
