Monday, December 12, 2011

A New Software Engineering Philosophy

Business-IT alignment is a dynamic state in which a business organization is able to use information technology (IT) effectively to achieve business objectives, typically improved financial performance or marketplace competitiveness. Some definitions focus more on outcomes (the ability of IT to produce business value) than means (the harmony between IT and business decision-makers within the organization). It is important to consider the overall value chain in technology development projects, as the challenge of value creation is increasing with the evident growth of competitiveness between organizations (Bird, 2010). The concept of value creation through technology is heavily dependent upon the alignment of technology and business strategies. While value creation for an organization is a network of relationships between internal and external environments, technology plays an important role in improving the overall value chain of an organization. However, this improvement requires business and technology management to work as a creative, synergistic, and collaborative team instead of a purely mechanistic span of control. Technology can help the organization recognize improved competitive advantage within the industry in which it resides and generate superior performance at greater value, according to Bird (2010). Excerpts from Wikipedia on Business-IT Alignment

We all understand the impact of software technology and how it has changed the world. However, what most do not understand is the enormous potential of software when it is developed under the right philosophy. For example, what are the potential gains from a software development philosophy that aligns software systems with business needs? This is called Business-IT alignment. The reasons for the inability to align business needs with system needs are hierarchic. At the conceptual level, it is the wrong software development philosophy and approach. At the project level, the inability to align the system with the business occurs because of conflicting objectives. Software engineers are trained and rewarded to deliver requirements specifications on time and within budget. Although “technically correct”, those delivered systems often provide very limited business and customer value. Business customers, however, want high quality software systems that meet business purpose, add customer value, satisfy users, and increase quality, productivity, and innovation. Because of these conflicting objectives and opposing views, software projects are often characterized by negative developer/customer interactions, including finger pointing, lack of support and user involvement, conflicts, excessive politics, personnel resistance, and lack of cooperation and communication.

Of course, the objectives of the business units must prevail. No project has ever been initiated in order to determine how well the development organization could meet requirements on time and within budget. But the development community does not see it this way, and the question is why. Why the extraordinary obsession with schedules? What makes software developers go so far as to implement immature systems that often do not work, require extensive changes to make them work, or produce marginal results when projects fall too far behind schedule?

The answer is the underlying software approach. A software approach represents a particular way of developing software systems. It is the equivalent of a software development conceptual framework, ideology or philosophy. It is a philosophy in that it represents a particular system of software engineering thought that includes the theories and concepts, or set of theories and concepts, beliefs, principles, and mental models that underlie the development of large-scale software systems. It integrates the belief systems, worldviews, perception, practices and behaviors that software engineers use to realize software systems. It is like the software engineering “collective consciousness” that conveys how software practitioners collectively think, judge, decide, act, behave and practice.

A software approach consists of three major components: a perception, a paradigm, and practices. The perception describes the conscious or unconscious understanding of software work. For example, the current traditional understanding is that software development is a linear system, while the new software perception is that software development is a complex adaptive system. The paradigm describes the theoretical and behavioral elements involved in software engineering work. The theoretical elements include the underlying theories, concepts, philosophies and mental models that direct the human-social behaviors and technical practices. The behavioral elements include the project’s cultural, managerial, and organizational factors. The practices are the technical aspects of the work that include the technology, processes, and artifacts (methods, techniques, models, and tools) that transform requirements into software systems.

Gaining the ability to align software with the business requires replacing the mechanistic paradigm with the systems paradigm. The mechanistic paradigm dictates how software developers think, judge, decide, act, behave and practice under the traditional software engineering approach. The mechanistic paradigm disavows all that “gooey soft stuff”. It dictates that software projects operate as closed systems, behave like machines that ignore human behaviors, focus singularly on technology and process, and, above all, emphasize efficiency through meeting schedules and budgets. As closed systems, projects have minimal relationships with the environment. Purpose is loosely coupled, functionally oriented, and generic in nature. The project purpose is to develop the requirements specified in the requirements document.

As a result, software projects exhibit machine-like behaviors rather than human behaviors and focus more on internal measurements than external ones. Developers are trained and rewarded to meet schedules and budgets while external environmental measures, like business purpose achievement, are ignored. The mechanistic paradigm underlies the Software Development Life Cycle (SDLC). It is defined by the phases, activities, and tasks needed to transform requirements into software systems. The SDLC uses a monolithic approach, much like a pseudo assembly line, to transform requirements into specifications, designs, code, tests and implementations. The end result is a manufactured software system, a tool designed to be used until broken. Like other manufactured tools, a bucket for example, its quality is measured by the defects that prevent it from functioning correctly. The SDLC also encompasses Agile, Lean, CMMI and Six Sigma because of their underlying scientific management principles.

The systems paradigm dictates how software developers think, judge, decide, act, behave and practice under the new organic approach called the Organic Software Engineering Approach (OSEA). The OSEA suggests that software projects are part of an ecosystem that grows software system services. The concept of the ecosystem emphasizes the need for harmony between the IT developers and the business managers, customers, and users within the project. OSEA projects deliver organic software systems that are like living systems such as animals, plants, fungi, or micro-organisms. For example, organisms are responsive to stimuli and capable of reproduction, change, maintenance and growth. Software systems respond to stimuli, often reproduce on other platforms, incur changes, grow in size and attempt to maintain system integrity throughout the process. Tools share no such characteristics with software.

The systems paradigm dictates that projects operate as open systems, behave like socio-technical systems, focus on all aspects of project and system performance, and emphasize the joint optimization of time, costs, quality and effectiveness. Ludwig von Bertalanffy, the father of systems theory, cautioned that overemphasis on one variable in a function can sub-optimize the whole. He is correct; an overemphasis on schedules and budgets is the core reason for project failures. As an open system, the OSEA is tightly coupled and specific to the needs of its environment, the business units. There is a strong functional bond, an alignment with the business units that is philosophically and ideologically beyond the capability of the closed-system SDLC. In the OSEA, purpose consists of a distinct mission, vision, values, and objectives. It is tightly coupled, environmentally cohesive, and specific to the needs of the business. Quality is also different. Instead of being limited to defects, quality is categorized by the achievement of purpose, satisfaction, commitments, and system performance. Included in system performance are functionality, system support services, human factors, security, software quality factors, system operational performance, and defects.

Summary

It is time for a change. The CMM/I has proven that. The CMM/I, the epitome of mechanistic software thought, was widely heralded as the “solution”. However, like the rest of the progression of “silver bullets”, the expectations went unrealized. Instead of the expected transformational benefits, improvements are disappointingly modest when measured by return on investment. Clearly, it has not lived up to the original hype and is at an evolutionary dead end. When organizations reach Level 5 there is nowhere else to go, yet the major software problems remain. In addition, a 2003 GAO study estimated that the DOD wastes forty percent, $8 billion, of its annual $21 billion software budget on rework due to quality issues. This despite the fact that the DOD outsources development contracts to organizations with higher CMM/I levels.

The purpose of this blog series is to identify another way of developing large-scale software systems and the transformational results that will occur once the change is implemented. The underlying mechanistic paradigm must be replaced. It is the strength of the mechanistic dogma that causes software engineers to continue to develop software in the same way, despite the futility of their efforts. Every five years, the size of software projects increases by an order of magnitude, so the software problem must be solved. Large systems are essential to the economy because business and government use these systems to run their organizations. The inability to plan, organize, manage and direct moderate to large-sized software projects is a major problem. If the software problem remains unresolved, the increasing importance of software systems and the growth in software size and complexity will put the world's economy at risk. The increases in project and system size will drive costs up exponentially, and large projects will experience even higher costs, lower productivity, and greater failures.

Only strategic change can rectify this problem because the problem is systemic, philosophical, and cultural. It is systemic because virtually all moderate to large-scale software projects fail to meet expectations, often double or triple their schedules and budgets, or are cancelled. It is philosophical and cultural because the adherents of the current software development practices accept that the vast majority of their efforts, including virtually all large-scale system development, will result in failure, yet do not change. The problems associated with the development of large-scale software systems require philosophical and cultural change. The philosophical change includes a new perception, conceptual framework, behaviors, and practices that will create new software engineering cultures with the capability to develop large-scale, high quality systems. In other words, we must create new management systems, project structures, work systems, decision and information databases, human resource skills, work settings, reward systems and artifacts, processes, and technologies. There is also the need for a capability support system that supports the four project subsystems: software engineering, project management, project design, and human resources.




Wednesday, December 7, 2011

Advanced Software Estimating with Near Zero Variance

Near zero-variance is defined as the ability to achieve schedule and budget predictability with an absolutely minimal variance between project estimates and actuals. It is achieved through an organic software estimating approach that is contextually based. Organic software estimating is a component of the Organic Software Engineering Approach, which consists of a new software engineering perception, paradigm, behaviors, and practices. The source of the estimating problem is the software incarnation of Frederick Taylor’s scientific management. We call it the SDLC; its goal is efficiency, achieved through budget and schedule attainment. As a result, estimators rely on empirical cost models that use data from “best in class” projects. It is the software equivalent of Taylor’s time and motion studies. The only problem is that you cannot perform time and motion studies on the human brain.

Any proposed solution to the software problems must deal with estimating. The ability to meet schedules and budgets is a core problem that must be tackled. Achieving both system and project predictability is contingent upon the accuracy of the estimating system. Current estimating models grossly underestimate resource requirements, so most projects are doomed from the start. The carnage caused by poor estimating practices is devastating. In economic terms, it can be calculated in the billions and even trillions of dollars in wasted software costs and lost economic opportunity. Socially, it can be calculated in terms of voluntary and involuntary terminations, demotions, and careers put on hold. The same mechanistic paradigm that gave the world sweatshops gives software developers death marches.

Estimating has always been problematic. The idea that some cost model based on empirical data could accurately estimate all projects, past, present and future, discounts the uniqueness of software development organizations and their projects. In order to improve estimating accuracy, estimators have been researching and creating various quantitative models to describe the software development process. However, the models are too generic and lack the granularity needed to estimate specific software projects. The default parameters cannot accurately represent the wide variability of software project contexts. All of this is understandable: without organizational, technical, and project-specific data, the accuracy of a cost model is limited. These organizational, technical and project-specific variables include but are not limited to:

1) Software organizations use different organizational factors, management policies, and technologies to create development cultures with different behaviors, attitudes, norms, and values. The culture determines attitudes towards organizational objectives, the amount of enthusiasm and energy applied to tasks, how work is performed or not performed, how internal groups support or hamper project teams, and ultimately how things get done or do not get done.

2) There are significant productivity differences in software engineering approaches like the OSEA and SDLC, methodologies, and artifacts. This includes the waterfall, incremental, evolutionary, RAD, Agile or other life cycle models. The life cycle approach represents the difference in the project's course of development and hence the project's phase, events, activities, work products and tasks. It significantly impacts effort, schedules, and costs.

3) Project category, source, and type have a major estimating impact. Whether a team is developing software, purchasing a package, replicating a system, or converting a system has a significant impact on productivity. So does the project source: internal development, development under contract, product development for marketing, and many others. Project technical types include embedded systems, time-critical real-time, scientific or engineering, system programming, distributed and network, data processing and database, expert and artificial intelligence, image and pattern recognition, and large-scale simulation and modeling.

4) Language groups and generations, including 5GL, 4GL, 3GL and assembler languages, have a major impact on productivity.

5) System size and complexity have a major impact on productivity. As the size of the software product increases linearly, project resource requirements for development, communications, integration, coordination, and documentation increase exponentially.
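To make the size effect concrete, here is a minimal Python sketch of a superlinear effort model. The constants a and b are illustrative placeholders, not calibrated values; published models such as COCOMO fit them from real project data:

```python
# Illustrative sketch of superlinear effort growth with system size.
# The constants a and b are hypothetical placeholders, not calibrated values.

def effort_person_months(kloc: float, a: float = 3.0, b: float = 1.12) -> float:
    """Effort grows faster than linearly in size when the exponent b > 1."""
    return a * kloc ** b

# Doubling size more than doubles effort when b > 1:
small = effort_person_months(50)
large = effort_person_months(100)
print(large / small > 2)  # superlinear: the ratio exceeds 2
```

The exponent b > 1 captures the communication, integration, and coordination overhead described above: twice the code costs more than twice the effort.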

6) The alignment between the development system actually used and the development system the project needs is important. The greater the contradictions or redundancies between the development system and project reality in terms of needed technology, processes, and artifacts, the lower the productivity.

7) Staff caliber, including skills, knowledge, experience, and motivation, varies widely and affects productivity. Development team context includes development team experience, project support, project management experience, management support, technical support, and project team quality.

8) User context includes user knowledge, experience, attitudes, support, and organizational turbulence. User context is a key productivity element. User context and the complexity of the problem will help determine the stability of the project and the subsequent volatility that causes rework.

9) Problem context includes problem size, problem complexity, precedent, problem domain maturity, and partial functionality. In the problem context, the major cost driver is precedent. Precedent refers to domains that are new to the organization, including technology, computer science, and application domains. The uncertainty that comes with lack of precedent is a major reason for estimating errors. In many situations, it takes twice the effort to accomplish unfamiliar tasks as familiar ones.

10) Business context includes purpose instability, business importance, and business integration. Purpose instability leads to product instability including changes in scope and requirements. High business importance requires extensive and often unproductive business management scrutiny. High integration requires extensive coordination, communication, collaboration, and cooperation.

11) Platforms have the potential for major differences in productivity. Platforms and their languages have different characteristics that affect productivity. Mainframe, PC, server, Internet/Web, embedded real-time, robotics, mobile, network, handheld, and game systems all have the potential for significant variations in productivity.

12) The quality philosophy has a significant impact on productivity. In the SDLC, the primary quality issue is defects: does the tool work or not? Organic software estimating is customer oriented, so quality includes purpose achievement, customer value, satisfaction, and business productivity and performance. OSEA performance measures include functionality, system support, human factors, security, software quality factors, system operational performance, and defects.
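As an illustration of how a subset of these context variables could be captured for similarity matching against a history database, here is a hypothetical sketch; the field names and the attribute-counting scheme are invented for this example, not taken from any specific tool:

```python
from dataclasses import dataclass, asdict

# Hypothetical sketch: a project-context record covering a subset of the
# context dimensions above, usable as a similarity key against a history
# database. Field names and values are illustrative assumptions.

@dataclass(frozen=True)
class ProjectContext:
    approach: str             # e.g. "OSEA", "SDLC/waterfall", "agile"
    project_type: str         # e.g. "embedded", "database", "real-time"
    language_generation: int  # 3 for 3GL, 4 for 4GL, ...
    platform: str             # e.g. "web", "mainframe", "mobile"
    team_experience: str      # e.g. "low", "medium", "high"

def matching_attributes(a: ProjectContext, b: ProjectContext) -> int:
    """Count how many context attributes two projects share."""
    da, db = asdict(a), asdict(b)
    return sum(1 for k in da if da[k] == db[k])

new = ProjectContext("SDLC/waterfall", "database", 3, "web", "low")
old = ProjectContext("SDLC/waterfall", "database", 4, "web", "high")
print(matching_attributes(new, old))  # 3 shared attributes
```

A real estimating database would of course carry many more dimensions, but the principle is the same: history projects that share more context attributes with the new project make better reference points.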

Information Architecture

A key concept of organic software estimating is that organizations share common characteristics with living systems, including plants and animals. People make up organizations, so thinking of an organization as a living system instead of a machine enables managers to harness the natural, organic tendencies of organizations. For example, like any other living system, organizations send outputs to their environment and receive energy inputs from it. High-order living systems also need training, feedback from experience, and a memory in order to survive, grow, and proliferate. Because organizations are living systems, it is intuitive that in order to survive, grow and flourish, they need feedback-control mechanisms and a corporate memory in the form of databases.

The information/decision architecture contains the project management assets, including the databases, tools, methodologies, processes and methods for estimating, planning, scheduling, monitoring, and managing software projects. History databases provide the data needed to create the profiles, relationships, and models. Estimates are more accurate because the estimator does not have to make the assumptions inherent in cost model development. Estimators can align both the technical and project variables so that the software organization can statistically listen to its own organizational data, identify what was said, and apply the knowledge towards better management of the software development project. Databases allow a manager to find similar sets or clusters of projects that most closely match the new project and use statistical analysis routines to identify the projects with the highest degree of similarity and relevance. The results are a set of role-model projects that become the basis of the quantitative models that guide the project team towards project success.

The databases contain history data from previous projects and are used to create environmental profiles, relationships, and models based upon variable projects or clusters. Variable projects are selected by the estimator at estimating time, while clusters involve a semi-fixed set of statistically similar projects in the project database. An environmental profile consists of characteristic and distribution profiles that typify a normal project in a cluster. Characteristic values identify standards in terms of process, performance, product, and quality. Distribution profiles describe standards in the form of the allocation of effort, duration, staff, and costs. Relationships are equations derived from statistical analysis of similar history projects in the cluster. Relationships predict the values of unknown factors based upon known factors. For example, from software size, an independent variable, we can create equations to estimate effort, duration, and staff counts. Models are developed by averaging the behavior of similar sets of history projects in the local database. Models describe the typical behavior of a project over time.

Software Functional Elements

Software functional elements are the basic estimating elements used in organic software estimating. They are also used in organic software requirements engineering and organic software development. Software functional elements consist of systems, subsystems, software functions, capabilities and features. They are hierarchic systems whose relational components are themselves hierarchic systems. Each system, subsystem, software function, capability and feature consists of eight objects: purpose, environment, boundary, inputs, components, outputs, restrictions and feedback control. Features, as the prime software estimating element, can either complement or replace function points and other related size measures. Features can be categorized by the number of object elements, such as the number of inputs, process services, and outputs.
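To illustrate the hierarchy, here is a hypothetical sketch of a functional element that carries the eight objects and may contain child elements. The class structure and example names are invented for this illustration:

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical sketch of the hierarchy described above: each functional
# element (system, subsystem, function, capability, feature) carries the
# eight objects and may contain child elements. Names are illustrative.

@dataclass
class FunctionalElement:
    name: str
    level: str                      # "system", "subsystem", "function",
                                    # "capability", or "feature"
    purpose: str = ""
    environment: str = ""
    boundary: str = ""
    inputs: List[str] = field(default_factory=list)
    components: List[str] = field(default_factory=list)
    outputs: List[str] = field(default_factory=list)
    restrictions: List[str] = field(default_factory=list)
    feedback_control: str = ""
    children: List["FunctionalElement"] = field(default_factory=list)

    def feature_count(self) -> int:
        """Count features, the prime estimating element, in this subtree."""
        own = 1 if self.level == "feature" else 0
        return own + sum(c.feature_count() for c in self.children)

billing = FunctionalElement("Billing", "subsystem", children=[
    FunctionalElement("Invoicing", "function", children=[
        FunctionalElement("Print invoice", "feature"),
        FunctionalElement("Email invoice", "feature"),
    ]),
])
print(billing.feature_count())  # 2
```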

The software functional element is the basis of organic software development because of its user orientation. Customers and users do not know anything about software requirements, but they do understand the capabilities and features they need to do their work. In addition, each software functional element has emergent quality needs that often cannot be decomposed to a lower-level software functional element. Finally, software functional elements are self-organized systems that are designed to support a specific external function, such as a business function or a system function.

Organic design is based upon the concept that software functional elements are more like biological functions than machine functions, although machine functions do exist in living organisms; muscles and bones are good examples. Generally, biological systems contain vertical, multi-level relationships in a system hierarchy. Think in terms of the cardiovascular, endocrine, digestive, lymphatic and respiratory systems. Biological systems are self-organized systems in which each system supports a body function. For example, the respiratory system is the system of organs responsible for the intake of oxygen and the expiration of carbon dioxide. In mammals, it consists of the lungs, bronchi, bronchioles, trachea, diaphragm, and nerve supply. Organic design structures projects and systems like biological organisms. Just as the respiratory system supports the intake of oxygen and the expiration of carbon dioxide, software functional elements support specific business elements, including processes, sub-processes, functions, activities and tasks.

Software functional elements can be estimated top-down, where the requirements are determined for the systems, subsystems, software functions, capabilities, and features. They can also be estimated bottom-up, as the lower-level software functional elements are used to create higher-level elements. The lowest-level software functional element, and the core estimating element, is the feature, which provides the services needed by the business community. The new quality engineering approach is based upon the organic service model. The objective is the creation of adaptable, flexible, customer-specific, and self-organized systems that are designed to provide the services that meet the specific needs of the business. Services require a high level of interaction with their customers, so software projects must be designed for a high degree of user interaction. Software services tend to grow, so there is a need for design plasticity and flexibility. Instead of rigid software products, developers should build self-organized systems that are designed for self-maintenance, self-renewal, and self-evolution.
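A minimal sketch of the bottom-up direction, with invented numbers: feature-level effort estimates roll up through the functional-element hierarchy into function- and subsystem-level totals:

```python
# Minimal sketch of bottom-up estimating: effort estimated per feature
# rolls up through the functional-element hierarchy. All numbers and
# names are invented for illustration.

estimates = {  # function -> {feature: person-days}
    "Invoicing": {"Print invoice": 12, "Email invoice": 8},
    "Payments":  {"Post payment": 10, "Refund": 6},
}

# Roll feature estimates up to function totals, then to a subsystem total.
function_totals = {f: sum(feats.values()) for f, feats in estimates.items()}
subsystem_total = sum(function_totals.values())

print(function_totals["Invoicing"], subsystem_total)  # 20 36
```

The top-down direction works the other way: a subsystem budget is apportioned down to functions and features using the distribution profiles from the history database.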

Project Specific Modeling

Project specific modeling applies a business analytics (BA) approach to estimating. Wikipedia defines BA as follows: “Business analytics (BA) refers to the skills, technologies, applications and practices for continuous iterative exploration and investigation of past business performance to gain insight and drive business planning. Business analytics focuses on developing new insights and understanding of business performance based on data and statistical methods. In contrast, business intelligence traditionally focuses on using a consistent set of metrics to both measure past performance and guide business planning, which is also based on data and statistical methods.”

Project specific modeling uses a human approach to estimating through the comparison of similarities and dissimilarities. Project specific modeling provides profiles, relationships and models that are based upon the needs of a particular project team in a particular organization, staffed with a particular set of people who possess particular skills, knowledge and character attributes. The project team uses particular technology, artifacts, and processes to develop a particular system for particular users who execute particular business processes in a particular business environment. Similarity is based upon the concept that, in the same organization, similar types of projects will evolve in similar ways. Similarity is a key concept in the development of models, profiles, and relationships. The greater the similarity, the more similar the experiences and the more comparable the evolution of the project. The more similar the experiences and comparable the evolution, the more consistent the productivity and other relationships. The more consistent the relationships, the more accurate the derived quantitative models. The more accurate the quantitative models, the greater the predictability of future project outcomes. The greater the predictability of future project outcomes, the greater the likelihood of project success.

Similarity is a common concept. Appraisers use similarity in assessing the market value of homes. Consider the data on recent home sales as local data, your neighborhood as the development organization, the size of the house as the size of the software product, and the features as special software cost drivers. The appraiser looks at recent sales of homes with similar square footage and numbers of bedrooms and baths in the neighborhood. Then special features and other amenities are added to the primary estimate. The estimated market value is validated by comparable homes sold in the particular zip code or neighborhood. This is in contrast to empirical estimating methods, which estimate home prices on the basis of country-wide or world-wide sales.

In project specific modeling, the estimator selects similar reference projects by searching the database for subsystems that share the same attributes as the new subsystem. An estimator will select multiple sets of projects with similar independent variables from the history database and use the set of data points with the highest correlation coefficient to estimate the size and productivity of the new project. The selected similarity sets are passed to a statistical model that displays the regression equation, R-squared value, and correlation coefficient for each similarity group. The group of similar projects with the highest R-squared value has the strongest statistical relationship. The derived relationships are then used to estimate other size factors, productivity, duration, staff counts, dollars and schedules. The results are a set of role-model projects that become the basis of the quantitative models that guide the project team towards project success.

Then the project team must make adjustments. For example, to estimate a high-risk project, an estimator develops an attribute profile of the new project and searches the database for similar projects or a specific cluster. In this case, the estimator looks for technical attributes, contextual attributes, and project-specific attributes. There are no exact matches, but the more similar the characteristics the better. For this example, the project team consisted of junior developers and was contextually characterized as having low development team experience. Let’s say that Insight, the advanced project management prototype that is the basis of this article, identified these reference projects as the most statistically significant. The estimating equation would then be based upon the referent history projects where teams had very low development experience.

The context is called “Low Development Team Experience”. If the project manager can replace the juniors with more senior staff, then this particular human resource variable is controllable. Otherwise, if he is stuck with the juniors, the variable is uncontrollable and mitigation is required. Mitigation is the process by which systems adjust relational elements in order to compensate for a risk or deficiency in another element. In this case, the project’s four subsystems (software engineering, project management, human resources, and project design) are adjusted. For example, the software engineering subsystem options call for more detailed plans with an increased sampling rate of inspections, code readings, and walkthroughs. The project management subsystem options call for a longer development cycle and an increase in schedule and duration. The human resource subsystem options call for the availability of technical leads with good technical mentoring skills to help the juniors. The project design subsystem options call for smaller teams so that the juniors can get the attention they need from the technical leads.
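As an illustration, the mitigation step in this example could be sketched as a lookup from an uncontrollable risk to per-subsystem adjustments. The rule contents mirror the example above, but the table structure and names are invented:

```python
# Illustrative sketch of context-driven mitigation: an uncontrollable risk
# factor triggers compensating adjustments in the four project subsystems.
# The rule contents mirror the example in the text; the structure is invented.

MITIGATIONS = {
    "low development team experience": {
        "software engineering": "more detailed plans; higher inspection rate",
        "project management": "longer development cycle; extended schedule",
        "human resources": "assign technical leads with mentoring skills",
        "project design": "smaller teams for closer technical-lead attention",
    },
}

def mitigate(context_risk: str) -> dict:
    """Return per-subsystem adjustments for an uncontrollable risk."""
    return MITIGATIONS.get(context_risk, {})

plan = mitigate("low development team experience")
print(len(plan))  # adjustments across all four subsystems
```

A fuller rule base would cover every uncontrollable context variable identified during estimating, one entry per risk.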