As companies seek greater efficiencies from their application portfolios and look to reduce the costs associated with development and maintenance, more and more organizations are considering Managed Services providers for application development.

This trend, coupled with development teams' move toward Agile, has created a challenge for the customer: how to hold the vendor accountable for sustainable and consistent software output when Agile is so fluid and development work is so variable. This white paper explores a method to put vendors in a position to be accountable for output.

Executive Summary

Managed Services clients can hold vendors accountable for software output in an enhancements-only Managed Services contract where Agile is the methodology of choice. The key is to use velocity increases as the commitment SLA.

Main Body

The Application Support lifecycle contains many facets. However, you can break down those facets into six core areas:

  • Monitoring
  • Service Desk for Response
  • Triage
  • Repairing
  • Tuning or Proactive Improvement
  • Projects and Enhancements

When considering supporting an application via a Managed Services engagement, understanding these core areas is important, because Managed Services is best applied to routine and measurable work.

Analyzed further, the six core areas don't always fit this definition perfectly. Let's look at them again in detail:


Monitoring
  Routine: Yes; hours monitored are routine.
  Measurable: Yes; alert trends provide a measure.
  Comments: Pricing options exist for devices and applications under contract.

Service Desk for Response
  Routine: Yes; hours staffed are routine.
  Measurable: Yes; ticket count trends provide a measure.
  Comments: Additional pricing models include ticket-based and user-based pricing.

Triage
  Routine: Yes; hours staffed are routine.
  Measurable: Yes; ticket count trends provide a measure.

Repairing
  Routine: Yes; through average effort per ticket type by impact severity.
  Measurable: Yes; via resolution time and average effort.

Tuning or Proactive Improvement
  Routine: No; proactive opportunities are often discovered and acted upon with available bandwidth.
  Measurable: Yes and no; time invested can be measured, but a better measure would be how much improvement was gained from the action.

Projects and Enhancements
  Routine: No; enhancements are feature requests generated by users or by regulatory changes that cannot be foreseen or estimated before they are known.
  Measurable: Yes, but difficult; a function point system must be in place to measure. Some measure via releases; however, releases vary in size, which makes them a marginal measure of output.

Of the six areas detailed in the application lifecycle, five would be considered "application management" and one (Enhancements) would be considered "application development." Enhancements are not routine, and they are difficult to predict and measure. Those characteristics are precisely the issue organizations face in holding Managed Services vendors accountable for software output within the engagement model.

However, the challenge gets more difficult when the organization is using the Agile methodology. Why? The simple answer is that function points and Agile don’t blend well.

Function points are a system for assessing software in predetermined units. The system is agreed upon between client and vendor and needs to be precise; it ultimately establishes an objective, standardized unit of measurement.

The problem is that Agile doesn't use function points; it uses story points. Story points differ from function points in that they are subjective. This is by design: scrum teams develop a relative sense of size as they work together. Agile practitioners and textbooks advise against disrupting this subjective assessment system; in fact, they encourage the relative dynamic.

So the issue becomes clear. To measure software output reliably, you need a function point system. However, a function point system is bad for Agile. What is the solution?

The solution is to leverage measured aspects of an Agile development team that relate to output. One such measure is velocity. Velocity is a metric that predicts how much work an Agile software development team can successfully complete within a two-week sprint (or similar time-boxed period) (Agile-velocity, 2013).

Since velocity predicts the amount of work to be completed over a sprint, it can also be measured after a sprint is complete: simply sum the story points completed during the sprint. That sum is the velocity. The way to hold a vendor accountable to output is to have them commit to velocity increases over a period of time. For example, have each team commit to a 5% increase in velocity by the fourth quarter of contract year one, and then do the same for year two.
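The mechanics above can be sketched in a few lines of code. This is an illustrative example, not part of any vendor contract: the function names, sprint figures, and six-sprints-per-quarter assumption are all hypothetical, and a real SLA would define its own baseline and measurement windows.

```python
from statistics import mean

def velocity(completed_story_points):
    """Velocity for one sprint: the sum of story points completed."""
    return sum(completed_story_points)

def meets_velocity_sla(baseline_sprints, target_sprints, required_increase=0.05):
    """Check whether average velocity grew by the committed percentage.

    baseline_sprints / target_sprints: per-sprint velocities for the
    baseline period (e.g. Q1) and the measurement period (e.g. Q4).
    """
    baseline = mean(baseline_sprints)
    target = mean(target_sprints)
    return (target - baseline) / baseline >= required_increase

# Hypothetical team: Q1 of contract year one vs. Q4, six sprints each.
q1_velocities = [40, 42, 38, 41, 43, 39]
q4_velocities = [44, 45, 43, 46, 42, 45]
print(meets_velocity_sla(q1_velocities, q4_velocities))  # did Q4 beat Q1 by 5%?
```

Averaging velocities over a quarter rather than comparing single sprints smooths out sprint-to-sprint noise, which matters because story point estimates are subjective and individual sprints vary.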

An organization can realize real output value with this mechanism without disrupting the story point estimation process or the growing productivity of maturing Agile teams. Through increasing velocity commitments, the client receives more software at higher quality for the same price, thus realizing true value.


Managed Services is an engagement model that works best with routine, measurable scope. However, it can also be applied when a good partner creatively finds ways to be held accountable, thereby delivering value and increasing client satisfaction.


Agile-velocity. (2013, July 15). TechTarget. Retrieved from http://whatis.techtarget.com/definition/Agile-velocity

