Thursday, October 13, 2011

Norms of Validation and the Agile Community

Had a great dinner with Mike Cottmeyer tonight in Bean Town.  At one point we got onto the subject of the nature of research and validation expectations.  I was reminded that someone once told me that different sciences have different validation and correlation expectations, that is, different thresholds at which they will reject the null hypothesis.

In high energy physics, the expectation is for the coefficient of determination to be at least 99.9999%.
Depending on which engineering community, 95-99% or greater is expected.
In the social sciences, it is not unusual to accept R² values of 30-50% as "good enough".  Human interactions are considered so complex that when R² values exceed 90%, one has to check that you aren't just measuring the same construct in different ways.
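
To make these numbers concrete, here is a minimal sketch in Python of computing a coefficient of determination for a noisy linear relationship.  The data, noise level, and seed are all made up for illustration; they are only chosen so that the resulting R² lands roughly in the social-science "good enough" range.

import numpy as np

# Hypothetical data: a noisy linear relationship, as one might see in a
# social-science study relating a team practice to some outcome measure.
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=100)
y = 2.0 * x + rng.normal(0, 8.0, size=100)  # large noise keeps R^2 modest

# Fit a least-squares line and compute R^2 = 1 - SS_res / SS_tot.
slope, intercept = np.polyfit(x, y, 1)
y_hat = slope * x + intercept
ss_res = ((y - y_hat) ** 2).sum()
ss_tot = ((y - y.mean()) ** 2).sum()
r_squared = 1.0 - ss_res / ss_tot

print(f"R^2 = {r_squared:.2f}")  # lands somewhere around 0.3-0.6 with this noise

A physics experiment would demand that this number be essentially indistinguishable from 1; a social-science study might publish it as a solid result.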

These communities have other norms as well.  Two of them are that research must be falsifiable and replicable.  Another typical norm is that the minimum level of validation is peer review by people with PhDs and publication in a journal associated with the discipline.

These are the norms of these communities.  We can argue about them; we can even argue about whether these communities hold themselves to these norms all the time.

It struck me that the agile community has its own set of norms for considering work valid and something to be built upon.  I would argue that the agile community currently has validation norms well below those of the physical sciences, engineering, and the social sciences.  Sometimes a poorly done case study (by the standards of the social sciences) or a claim published in book form (but not peer reviewed) is sufficient for validation in this community.  Sometimes it is just a blog post by a well-paid consultant.

As agile software development enters these other communities (such as the engineering systems community), the agile community shouldn't be surprised if it is expected to reach new levels of validation before its findings are accepted.

This is a great opportunity for research.  Agile is relatively new, and by now there should be enough cases out there on which to build serious research.

Thursday, May 5, 2011

Agilists strike again

Continuing my critique of research in agile methods, which started with this (relatively) popular post on integrated concurrent engineering compared to agile software methods.

I ran across this post by Dean Leffingwell and this post by Chad Holdorff, extolling the "proper" mixture of component vs. feature teams on a project (definitions are included, somewhat, in those posts).  The problem with these posts is that there are no studies showing a correlation between the two variables on Dean's graph, the proposed mixture of component and feature teams, and the productivity and/or quality of output of teams that do and do not follow the proposal.  Additionally, there is no notion of context: what types of systems does this model work for?  Where does it break down?  At best, this mixture is simply a hypothesis that needs to be tested through proper study.  The graphs should be marked "notional" and for illustrative purposes only (i.e., they are fiction).

Now compare this notional way of structuring teams with the work on using tools like the Design Structure Matrix (DSM) to organize and improve global product development.  A good starting point would be Steve Eppinger's paper, A Model-Based Method for Organizing Tasks in Product Development.  Here we have a formalized model that has been applied and whose outcomes have been studied for productivity changes.  There are other, dare I say, mature ways to manage dependencies.
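
For readers unfamiliar with the tool, here is a toy sketch of the basic DSM idea (this is my own illustration, not Eppinger's method): tasks and their information dependencies go into a square matrix, and resequencing the tasks so the matrix is as close to lower-triangular as possible minimizes feedback and rework.  The task names and the simple ordering heuristic below are hypothetical.

# Toy Design Structure Matrix (DSM): dsm[i][j] == 1 means task i needs
# information from task j. Tasks and dependencies are made up for illustration.
tasks = ["requirements", "architecture", "detailed design", "test plan"]
dsm = [
    [0, 0, 0, 0],  # requirements depends on nothing
    [1, 0, 0, 0],  # architecture depends on requirements
    [1, 1, 0, 0],  # detailed design depends on requirements and architecture
    [1, 1, 1, 0],  # test plan depends on all of the above
]

# A simple topological ordering: repeatedly schedule a task whose remaining
# dependencies are already scheduled. Real DSM methods also partition
# coupled blocks (cycles); this sketch assumes there are none.
scheduled = []
remaining = set(range(len(tasks)))
while remaining:
    for i in sorted(remaining):
        if all(dsm[i][j] == 0 or j in scheduled for j in range(len(tasks))):
            scheduled.append(i)
            remaining.remove(i)
            break
    else:
        raise ValueError("coupled tasks (a cycle) - needs block partitioning")

print(" -> ".join(tasks[i] for i in scheduled))

Real DSM work goes much further, partitioning coupled blocks of tasks and studying the effect on rework, but even this toy version shows the difference between a formal, analyzable model and a notional picture.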

The agile crowd needs to move past heuristics and case studies with no notion of context when describing what works and what doesn't.  Further, the definition of what works needs to include a notion of output quality (and not just the absence of defects).  There are better ways than simple heuristics and one-off case studies.

Wednesday, May 4, 2011

System Architecture Principle 8: Beware of software

Tagline: Beware of software.

Descriptive version: Software grows very complicated very quickly.  It can provide high leverage, but it can also be dangerous precisely because of how complicated it can become.  Further, software does not have "laws of physics" the way physical systems do, making it difficult to reason about using models and the like (the code is the model!).

Prescriptive version: When deciding to allocate functions to software, be aware that you are substantially increasing the internal complexity and number of operating modes of your system.

Discussion: I am not sure this is an architecture principle yet, but I am certain it will be one with enough time.  Certainly, of my principles, it is the one that has had the shortest lifespan.

Software is a very new thing that can quickly grow substantially more complicated than the physical system in which it operates.  The number of operating modes of software has been estimated as the number of inputs times the number of outputs, raised to some power (Ayaswal2007).  Further, software has a tendency to creep from being something embedded in a component, helping that component deliver its functions, to being a system bus that interacts with just about every component in the system.  On top of all of this, the work of software is largely hidden in complicated code, in bits on software developers' computers, making it even harder to reason about.
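
As a rough illustration of how quickly that estimate blows up, here is a trivial back-of-the-envelope calculation.  The exponent is a placeholder I chose for illustration, since the estimate above only says "to some power"; the point is the growth, not the particular numbers.

# Rough illustration of how the estimated number of operating modes grows.
# The exponent k is an assumed placeholder, not a value from the citation.
def estimated_modes(inputs: int, outputs: int, k: int = 2) -> int:
    return (inputs * outputs) ** k

for n in (2, 5, 10, 20):
    print(f"{n} inputs x {n} outputs -> ~{estimated_modes(n, n):,} estimated modes")
# 2 -> 16, 5 -> 625, 10 -> 10,000, 20 -> 160,000

Even a modest component with twenty inputs and outputs lands in the hundreds of thousands of estimated modes, which is exactly why the prescriptive version above counsels caution when allocating functions to software.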

Citation
B. K. Ayaswal and P. C. Patton. Design for Trustworthy Software. Prentice Hall, 2007.

Tuesday, May 3, 2011

System Architecture Principle 7: You can't escape the laws of physics

Tagline: You can't escape the laws of physics (Augustine1996).

Descriptive version: No amount of being clever will allow your system to violate the laws of physics.

Prescriptive version: You can't change the laws of physics; use them, obey them, but don't think for a moment that as an engineer or architect you can escape them.

Discussion: This is one of the principles that reflects my background, with my undergraduate degree being in physics.  I spent the better part of five years (I started in graduate school and decided it was not for me at the time) studying how the universe works and the models that explain why things happen all around us.  As you know, this attempt to understand the universe is very much related to engineering, but in some ways very different from it.  Engineering seems to be more about how, given the set of laws that govern the workings of the universe, we (engineers) can leverage them to affect the world around us.  It is very tempting to confuse this ability to "engineer" the world with the ability to "engineer" the laws of physics.  Falling into this confusion would likely lead to very undesirable consequences.

Citation
N. R. Augustine. 1996 Woodruff Distinguished Lecture Transcript. http://sunnyday.mit.edu/16.355/Augustine.htm, 1996. Section title: "Conceptual Brilliance Doesn't Blind the Laws of Physics."

Monday, May 2, 2011

System Architecture Principle 6: System design drives life cycle costs

Tagline: System design drives life cycle costs (Blanchard2011) (Crawley2010).

Descriptive version: The systems architecture, and hence the work of the architect, has the largest impact on system life cycle costs.

Prescriptive version: The work of the system architect can drive significant changes in the expected life cycle costs of the end product.  It is important for the architect to understand the life cycle cost constraints (based on the perceived value to the acquiring organization and the desired financial margins of the supplying organization) and to make sure the system architecture fits within those targets.

Discussion: This principle is supported by the notion that the highest management leverage is at the very beginning of a project, when the least amount of money has been committed.  That is, the very first step, deciding on the system architecture, is the biggest opportunity to steer a project toward a desired life cycle cost.

Citations
B. S. Blanchard and W. J. Fabrycky. Systems Engineering and Analysis. Prentice Hall, 2011.

E. Crawley. Esd.34 lecture 1, September 2010.

Friday, April 29, 2011

System Architecture Principle 5: Systems exhibit emergent behavior

Tagline: Systems exhibit emergent behavior.

Descriptive version: As the elements of a system are brought together and interact, processes (function) and other intrinsic properties will emerge.

Prescriptive version: An architect must pay as much, if not more, attention to the functions and the combination of functions as to the items of form.  It is the proper combination of these internal functions that will result in the externally delivered functions that provide value to the beneficiaries.  These emergent functions will not simply be the sum of the sub-functions.

Discussion:
During (Crawley2010), it was stated that "form by itself delivers no value".  With this I somewhat agree and somewhat disagree.  It has been demonstrated, by products like the iPhone, that some users will perceive a higher benefit from a system when the system has an appealing "industrial design".  This could be an argument for form providing value.  On the other hand, one could argue that the function of the device, the iPhone in this example, provides the base value and form provides additional perceived value.  With this argument, the statement "form by itself delivers no value" might best be restated as "form provides additional value assuming that functional needs are satisfied".

Citation
E. Crawley. Esd.34 lecture 1, September 2010.

Thursday, April 28, 2011

System Architecture Principle 4: Systems exist to solve a need

Tagline: Systems exist to solve a need (Shapira) (Crawley2010a).

Descriptive version: Systems exist because humans or organizations have a need and are willing to trade something of value for that need to be satisfied.

Prescriptive version: It is important for the system designer / architect to understand the needs of the key stakeholders and design a system that satisfies those needs.  Further, the perceived benefit of the system must exceed the perceived cost to obtain and operate the system.

Discussion: In the end, systems are acquired by an exchange of money or other things of value in return for the benefits of the system.  The system must provide perceived value that is greater than the perceived costs, the difference being the perceived relative benefit to the acquiring organization or person.  Note the reliance on the perceptions of the acquiring person or organization; this is very much in line with the principle that good architecture is in the eye of the beholder.

This model gives the architect two levers to pull in order to increase the relative perceived value.  The first is to focus on the external interfaces, external form, and externally delivered function in order to increase perceived value.  The second is to manage the internal system design so as to decrease the system life cycle costs.

Citations
Y. Shapira. Principles of system architecture. http://en.wikipedia.org/wiki/User:YoavShapira/Principles_of_system_architecture.

E. Crawley. Esd.34 lectures on value exchange model, 2010.

Wednesday, April 27, 2011

System Architecture Principle 3: What can go wrong will go wrong

Tagline: What can go wrong will go wrong.

Descriptive version: It is very rare for a system of greater than medium complexity to operate without failure. This applies both to satisfying the intended needs and to anticipating future needs.

Prescriptive version: Robust design, flexibility in design, and design for contingency and emergency operations are critical to the success of a system.

Discussion: The potential history of this "law" dates back to an 1877 meeting of the Institution of Civil Engineers (Holt1878):
It is found that anything that can go wrong at sea generally does go wrong sooner or later...
Various other written references to the law turned up over time, including in the context of stage magic, mountaineering, and as a name for the second law of thermodynamics. The law has been attributed to Capt. Ed Murphy, an engineer from the Wright Field Aircraft Lab (Bloch1977), and to an unnamed theoretical physicist (possibly from the California Institute of Technology, aka "The Other Technical School"). While it is difficult to pinpoint its origins, the law quickly spread throughout various aerospace engineering cultures (Unknown), into the engineering and science communities, and eventually into popular culture.

In the context of systems architecture, it is important that one consider all possibilities in the design and implementation of the system. Tools and techniques to do this include considering contingency and emergency operations, designing for robustness to variability in operating conditions, and building flexibility into the design so a system can accommodate future, unanticipated needs and operating scenarios. This should also be expanded to include consideration of human factors; humans are unpredictable, messy system elements, and all care must be taken in the design and operation of systems that require humans to execute any of the system functions.

Citations
A. Holt. Review of the progress of steam shipping during the last quarter of a century. Minutes of Proceedings of the Institution of Civil Engineers, LI:2-10, 1878.

A. Bloch. Murphy's Law, and Other Reasons Why Things Go WRONG. Methuen Paperbacks Ltd, 1977.

Unknown. http://www.catb.org/jargon/html/M/Murphys-Law.html.

Thursday, April 7, 2011

System Architecture Principle 2: You can't do everything for everyone

Tagline: You can't do everything for everyone.

Descriptive version: A system of any size will have many stakeholders with many needs to be satisfied; it will be very difficult or very expensive to completely satisfy all of them.

Prescriptive version: An architect must make trades between which needs will be fully satisfied and those that will be partially satisfied or not satisfied at all.

Discussion: Projects of any significance will have multiple stakeholders, each with multiple needs to be satisfied.  It is up to the architect to determine the most important stakeholders and needs.  Sometimes this means that needs that are very important to a stakeholder will not be satisfied.  It is important for the architect to use high quality (low variability) information when deciding which needs are most important to satisfy and which should be partially satisfied or considered wholly out of project scope.

As a supplement to this principle, according to Ed Crawley: "Very many factors will influence and act on the conception, design, implementation, and operation of a system."

(As a corollary to this principle, the stakeholder with the need that has the least importance will be the most vocal stakeholder, causing no end of grief for the architect.)

Wednesday, April 6, 2011

(My) Definition of Good Architecture

Tagline: Good architecture is in the eye of the beholder.

Descriptive version: Ultimately, architecture is a blend of art and science, with a tendency to be more art than science. Like all art, good architecture is in the eye of the beholder; that is, the judgment of what makes good art depends on the tastes and preferences of the person observing it.

Prescriptive version: As an architect, one must understand all the stakeholders that will observe their work over its lifespan, and make an attempt to either create something that is "good" within their eyes or make a conscious decision not to appease that stakeholder. All of this must be done while satisfying the system requirements, working within the system and project constraints, and without violating the laws of physics.

(My) System architecture principles

I recently had cause to be explicit about the principles of systems architecture that I think are important.  These are very personal, meaning these have worked for me in the past when architecting a system.  I thought I would share them here, in a series of blog posts, for all the world to see.  These influence how I think about systems, how I design them, how I critically judge systems, and the guidelines I use to evolve an existing system.  I would welcome any feedback on your experiences and thoughts on architecture principles.

Principle 1:   (My) Definition of Good Architecture.
Principle 2:   You can't do everything for everyone.
Principle 3:   What can go wrong will go wrong.
Principle 4:   Systems exist to solve a need.
Principle 5:   Systems exhibit emergent behavior.
Principle 6:   System design drives life cycle costs.
Principle 7:   You can't escape the laws of physics.
Principle 8:   Beware of software.

Tuesday, March 22, 2011

Integrated Concurrent Engineering vs. Agile Software Development

Agile-inspired software development is certainly all the rage.  One could argue that those processes have even crossed the chasm, with mainstream companies adopting various forms of Scrum, XP, DSDM, OpenUP and the like.  I certainly fall into the camp of people who started using Agile techniques as soon as they began to understand them (starting as early as the first publications on the c2.com wiki).

Meanwhile, over about the same time period (starting in 1994 at JPL, according to Wall in "Reinventing the Design Process: Teams and Models"), Integrated Concurrent Engineering (ICE) techniques were being tried and adopted in space systems development.  Also, some early work on concurrent engineering (CE) was published in 1996 by Prasad in Concurrent Engineering Fundamentals, Volume I: Integrated Product and Process Organization.  (Note, this was 3 years earlier than Extreme Programming Explained by Beck.)  Prasad described eight fundamental principles of concurrent engineering: "Early Problem Discovery, Early Decision Making, Work Structuring, Teamwork Affinity, Knowledge Leveraging, Common Understanding, Ownership, and Constancy of Purpose".  ICE includes provisions and support for having a co-located customer and co-located, cross-functional teams; further, ICE is a tool for lean engineering.  (A history of ICE is available starting on page 68 of http://esd.mit.edu/people/dissertations/avnet.pdf.)

There has been much study (peer-reviewed and published) of the benefits of ICE (faster time to completion, lower risk of missing interfaces) and of its potential drawbacks (using CE when plain-old sequential engineering would do). There are almost no studies of the same type in the Agile development world, despite it emerging over about the same time period.  Why?

Thursday, March 17, 2011

Words have meaning

Why is it that so many people feel it is acceptable to take words and redefine them for convenience?  Over on Herding Cats, Glen laments the borrowing and redefinition of words by the agile crowd (a crowd of which I was a part until a few years back).  I am experiencing one such redefinition: Enterprise Architecture.

I'll start with my quick and dirty definition of system architecture:  the function and form of a system and how they relate through the system concept.

To me and some others, enterprise architecture is the structure of a system whose scope is the enterprise (e.g., a for-profit corporation).  The enterprise architecture also describes the enterprise's relationship to outside entities such as capital markets, labor, suppliers, and customers.  Further, it describes how the enterprise system will operate and evolve over time.

To others I know, Enterprise Architecture is defined as the information technology architecture at the scope of the enterprise; this is another example of IT co-opting a well-defined term.  I personally prefer CISR's notion of how enterprise architecture and IT relate: "A firm’s architecture describes a shared vision of how a firm will operate—thus providing a shared understanding of the role of IT."  In this way IT is a sub-system of the enterprise system.

Monday, March 14, 2011

Uncertainty Framework

Over the weekend I came across a paper by McManus and Hastings, "A Framework for Understanding Uncertainty and its Mitigation and Exploitation in Complex Systems".  The authors present a taxonomy covering uncertainty, risks & opportunities, mitigations & exploitations, and outcomes.  Then the authors cover the existing methods for dealing with uncertainty and which parts of the taxonomy are used in each method.

This is a pretty short paper and well worth the read if you are at all interested in dealing with uncertainty.

Tuesday, February 22, 2011

White board drawing of the day

This was seen in my office.  If you want projects and products and people to be successful, this might be some of the most important advice:

Thursday, February 17, 2011

Deactivating Twitter Account

Today I deactivated my Twitter account.  I think Twitter is a great service for some, but for me it became more of a time suck: delving through pages and pages of 140-character exchanges with nothing really significant being said.  This is not meant as a negative toward the people I follow; I would much rather read your blog or see you at a conference where we can interact in a more meaningful, unconstrained way.

Now is the time for me to unplug.

Wednesday, February 16, 2011

Quote of the day

You don't get what you want by trying to eliminate what you don't want.