Monday, January 21, 2019

Feature Definition with a Quality Model

Defining all that goes into a feature can sometimes stop at the functionality, without taking into account the various system attributes that combine to make the feature actually successful for the user. Attributes like Usability, Performance, and Security add to or take away from the user experience, so understanding them as a whole is essential.

This post outlines the use of a Quality Model that takes into account both functional and cross-functional requirements to provide a template that can be used as a definition of done.

Definition of Done

The quality model is really useful for understanding the scope of what needs to be done for a single feature or a theme of many features together. The application you are building is just a collection of features that work together, and how they work together is ultimately how the user experiences them. How fast, how secure, and how easy they are to use is what is key. These attributes are sometimes referred to as 'cross-functional' because they affect all functionality in the system. The term 'non-functional' is also sometimes used, but it seems misleading: these attributes all do something, and it's hard to convey the importance of working on something labelled 'non-functional'.

Where are these defined in practice?
These constraints are sometimes included in issues or explained in a specification. It's important to be consistent and treat these as the requirements they are. So, to that end, use acceptance criteria from the domain requirements to outline the compliance targets for the quality requirements.

This is applicable from the large to the small: from Product Architecture to Application Architecture to designing a single Feature. The template can be used to fix a bug, define a feature, or define an entire product. These constraints provide the definition of done, or feature completeness.
Are you ever done with the product? Hopefully not! The quality model gives you a high-level glimpse of what it does today. It combines the attributes of the user-facing functionality with everything around it, so it provides a much clearer picture of what is required to get a feature to a done state.

In its simplest version, it's the basic release of any software:
Functionality: by theme, using domain research
Usability: goals and experience
Performance: general feel and response times
Portability: dev, staging, production
Serviceability: observable logging, exceptions, query times, dependency health and status.
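The simple release checklist above can be captured as a data structure, so that sign-off progress per attribute is explicit rather than implicit. This is only an illustrative sketch: the attribute names come from the list above, and the `incomplete_attributes` helper is a hypothetical name, not part of any real tool.

```python
# A minimal quality-model template as data; entries mirror the
# checklist above, with placeholder descriptions of each target.
release_quality_model = {
    "functionality": "by theme, from domain research",
    "usability": "goals and experience",
    "performance": "general feel and response times",
    "portability": ["dev", "staging", "production"],
    "serviceability": ["logging", "exceptions", "query times", "dependency health"],
}

def incomplete_attributes(model, signed_off):
    """Return the quality attributes that still lack sign-off."""
    return sorted(set(model) - set(signed_off))
```

A definition of done then becomes a one-liner: the feature is done when `incomplete_attributes` returns an empty list.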


  • Functional
    • Business/human-facing attributes that satisfy the business rules and problem domain: a set of attributes that bear on the existence of a set of functions and their specified properties, where the functions are those that satisfy stated or implied needs.
    • Match Personas/Jobs-to-be-Done with the Features that get their job done.
    • Domain requirements from subject matter experts.
    • Suitability
      • Are there relevant business requirements?
    • Accuracy
      • Is the data shown to the user accurate?
    • Compliance
      • Are there rules/regulations constraining the functionality? Are these known and declared?
  • Usable
    • Is the system usable without a lot of training and previous knowledge? How intuitive are the controls and workflow? How does the User Interface 'look and feel'? This is the domain of user experience.
      • How does the app match the job workflow? Is it as similar and intuitive as what the user would do themselves?
      • Match Jobs-to-be-Done to the key problem: set the requirements, then fulfil the solution.
    • This is the 'user interface', the 'look and feel'. In print and publications, this is called the 'design'. In software, this is the 'aesthetic design' and is really the tip of the iceberg. You only see about 15-20% of a software system, but it's a very important part. It has to be a comfortable place to get things done efficiently.
      • Survey users
        • UI/UX, HCI research (human computer interaction)
      • Accessibility - the application is a web application, so it needs to be accessible from common browsers and not just desktop browsers, but tablets and phones.
      • How intuitive are the controls and workflow?
      • Is the system usable without a lot of training and previous knowledge?
      • How does the User Interface 'look and feel'?
      • Usability Compliance
        • Accessibility requirements could be legally required.
  • Configurable
    • Configuration for: System (Dependencies), Product (Features), User (Permissions)
      • properties for environment - networking, logging, dependencies
      • properties for domain - domain rules, constraints
      • properties for product - feature flags
    • Permissions: 
      • Permissions seem similar to feature flags, but they actually depend on them: permissions depend on the user's Role in the system, while features depend on the product. Whether a feature is enabled is decided before determining whether a particular user has access to it.
  • Performance
    • This is a measure of how responsive a system is. Is it fast? And what does that mean? Do things take a long time to load? Do some searches take a long time to run? When building a system that does a lot of things, it is an equally large or larger challenge to make it do those things in a timely manner.
    • Instrument the system so you know how long each operation takes, and ask about responsiveness during usability reviews.
    • Does it use an acceptable amount of memory, and does memory use scale linearly with increased usage?
    • Set targets to start
      • pages load within a 3-second response time
      • data ingestion speed
        • high latent sources: 3 days
        • medium latent sources: 4 hours
        • low latent sources: 10 seconds
  • Security
    • If a user is going to hand over their data, whether it's their personal details or business information, they don't want to have to worry about the safety of that data. They don't want their account hacked, or issues where their confidence in the system's integrity comes into question. It comes down to Trust.
    • Define a security policy, implement threat mitigation, and audit the system regularly.
    • Review security aspects of new features, review operational security issues and review results of security testing for Authentication (Identity) and Authorization (Roles and Permissions)
    • No unauthenticated access to client specific data
    • Questions:
      • Is there private user data in the system?
      • How can the different roles access the system?
      • Is it deployed in a secure environment? (Assess the network security and use of standardized technologies)
      • Is the data persisted to a secure environment?
      • Is there any secure data in logs or other outputs?
  • Maintainable 
    • To be able to work on the system with others, the code and artifacts need to be versioned, and the operator needs to be able to quickly diagnose issues. The code base needs to be maintainable so that features can be added without breaking existing functionality.
    • Source code should be written in a manner that is consistent, readable, simple in design, and easy to debug. A set of attributes that bear on the effort needed to make specified modifications.
      • Can a user other than the developer run the system? Are the tools to do that usable and coherent?
      • Source code should be written to facilitate test-ability.
      • Does the implementation follow the intention of the design?
        • is there a design?
        • The design of reusable components is encouraged. Component reuse can eliminate redundant development and test activities (i.e. reduce costs).
      • Does the code comply to a code standard?
      • What amount of the system can be verified with tests?
        • How many of those tests pass?
  • Extensible
    • Requirements are expected to evolve over the life of a product. Thus, a system should be developed in an extensible manner (i.e. perturbations in requirements may be managed through local extensions rather than wholesale modifications).
    • Can you change the functionality for future use? Is there a common coding standard?
    • Can similar functionality be implemented with the same component?
  • Observability/Serviceability
    • While the system is running, are issues easy to diagnose? If something goes wrong, will someone know immediately? How is it monitored?
    • Can a user other than the developer run and configure the system?
      • Are the tools to do that usable and coherent?
    • Can you change the functionality without rebuilding the application, and can you add to it later?
    • Observability capabilities should include:
      • on-demand query of all systems go
      • real time alert of subsystem failure
      • ability to see errors, warnings and info messages
  • Availability/Reliability 
    • The system needs to be able to run for long periods of time without degradation. Memory usage and resource allocations need to be sustainable and system loads predictable.
    • Reliability Compliance
      • Is there a declared/contracted availability?
      • Does it run and not degrade over time?
    • Deployment Execution.
      • How do you know when deployment is done?
      • Are the steps to deploy clearly explained and documented?
  • Portability 
    • The system needs to adhere to standards so that it will be able to run on publicly available 'clouds' without modification. There are many candidates and an agile project cannot be locked into a single vendor relationship.
    • Source code should be portable (i.e. not compiler or linker dependent). A set of attributes that bear on the ability of software to be transferred from one environment to another.
      • continuous integration and automated deployment help here
      • Will the application run on other environments? Other operating systems and networks? Have a developer version and testing version with recent data. 
      • Does the application run on the target environment? Does it only work on the development environment? Is there a process for incident and change management? (ITIL standards are a good start here)
  • Interoperability 
    • Does it play well with other systems?
      • using standard protocols?
    • Can the system survive with its dependencies in a bad state?
    • Are all the dependencies identified? (Assess network connections and other dependency couplings)
  • Efficiency/Scalability
    • The relationship between the level of performance of the software and the amount of resources used, under stated conditions.
    • Scaling: does it use an acceptable amount of memory, and does memory use scale linearly with increased usage?
    • Can the application share resources, or does it need its own machine/cluster?
    • Time Behavior, Resource Utilization, Resource Allocation
    • Under what conditions does the application leak memory or hog CPU cycles?
    • Efficiency Compliance
      • What is the current performance benchmark? What's the next target?
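The feature-flag/permission relationship described under Configurable above can be sketched in a few lines. Everything here (`FEATURE_FLAGS`, `ROLE_PERMISSIONS`, `can_use`) is a hypothetical illustration, not a real API: the point is that the product-level flag is checked before the user's role-based permission.

```python
# Hypothetical product-level feature flags and role-based permissions.
FEATURE_FLAGS = {"export_csv": True, "bulk_delete": False}
ROLE_PERMISSIONS = {
    "admin": {"export_csv", "bulk_delete"},
    "viewer": {"export_csv"},
}

def can_use(feature, role):
    # The feature must be enabled for the product before any
    # per-role permission check even applies.
    if not FEATURE_FLAGS.get(feature, False):
        return False
    return feature in ROLE_PERMISSIONS.get(role, set())
```

Note that an admin still cannot use `bulk_delete` while its flag is off; enabling the feature is a product decision, access is a permission decision.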
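Instrumenting the system, as suggested under Performance, can start as small as a timing helper that compares an operation against a target such as the 3-second page-load budget. This is a minimal sketch under assumed names; the `timed` helper and the target values are illustrative, not a real library.

```python
import time

def timed(target_seconds, fn, *args, **kwargs):
    """Run fn and report (result, elapsed seconds, within target?)."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed, elapsed <= target_seconds
```

Wrapping key operations this way gives you the benchmark numbers the sign-off step below asks for, instead of a "general feel".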
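The "no unauthenticated access to client-specific data" rule under Security can be expressed as a guard that every data access passes through. A hypothetical sketch: the user/store shapes are assumptions made for illustration only.

```python
def fetch_client_data(user, client_id, store):
    """Return client data only for an authenticated, authorized user."""
    # Authentication (identity) is checked first...
    if user is None or not user.get("authenticated"):
        raise PermissionError("authentication required")
    # ...then authorization (roles and permissions) for this client.
    if client_id not in user.get("clients", set()):
        raise PermissionError("not authorized for this client")
    return store[client_id]
```

Centralizing the check makes it auditable, which supports the regular security audits mentioned above.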
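The Observability/Serviceability capabilities listed above (an on-demand "all systems go" query and visibility of subsystem failure) can be sketched as a simple health-check aggregator. The names and shapes here are illustrative assumptions, not a real monitoring API.

```python
def system_status(checks):
    """checks: mapping of subsystem name -> zero-argument callable
    returning True when that subsystem is healthy."""
    results = {}
    for name, check in checks.items():
        try:
            results[name] = bool(check())
        except Exception:
            # A check that raises counts as an unhealthy subsystem.
            results[name] = False
    return {"all_systems_go": all(results.values()), "subsystems": results}
```

Exposing this from an endpoint gives the operator the on-demand query; wiring it to an alerting tool covers real-time notification of subsystem failure.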

Application Architecture in the Large

The system satisfies the requirements through its various features. Use the quality model for a specific feature, or for a collection of features within a product.
Understand technical debt from a high level. It's quality debt, but like financial debt it can be useful for getting you to the goal; it just needs to be managed effectively.

Integrating into the process
All these points encompass Quality Standards that need to be validated by Quality Engineering.

Sign-off
For each quality requirement listed below, get sign-off from the owner as part of the architecture review. An understanding of the current benchmark and achievable targets will properly frame expectations.
  • Functionality - Product Manager using backlog stories/acceptance criteria
  • Security - Sec Audit using security guidelines/standards
  • Usability - UI/UX - using usability standards
  • Performance - Performance Engineering using reliability/availability standards
  • Maintainability - Tech Lead using Code Guidelines/Standards
  • Portability - Operations using deploy and configuration Standards
  • Manageability - Operations through monitoring standards
  • Planning - Program Manager - using ADM/Scrum standards
  • Serviceability - Support using support standards

Wrap it up

You need some level of design to be successful for a system of any size, but too much architecture can slow down, or stop, the implementation. Use consistent models and views that take into account the entire system, not just the UI and some functional stories.

In the end, the design will happen anyway. If it isn't formalized somehow (at least in some words and a diagram), then it's in the heads of whoever wrote the code. I have made some fairly large systems this way; but paid for it in the complexity of conveying that design to more people than myself. In the end the complexity of that mental model overwhelmed me. By using a simple template it becomes easy to account for what is going into the system, and much easier to share as a result.

Start with a lean definition of this for your needs; you don't have to re-invent the wheel here, just use the template.


