Value & Impact
Delivery efficiency measurement (DVMS), impact KPIs, benefits realisation, and industry benchmarking.
Overview
Too many government ICT programmes focus governance on input costs — labour rates, contract pricing, compliance processes — while leaving the far more consequential question unasked: what did we actually deliver for that money, and did it make a difference?
ADDA benchmarking across government programmes shows that input costs (labour rates) vary only slightly across the market, but output costs (dollars per feature delivered to production) vary up to 50-fold between best and worst. The leverage in procurement lies not in negotiating hourly rates down by 5%, but in understanding how efficiently those hours are converted into production capability.
GoSource brings a disciplined approach to both sides: ensuring delivery teams understand why they are building and what success looks like (impact), while continuously measuring how efficiently they convert investment into capability (value).
The Value Gap
ADDA benchmarking data illustrates the scale of the opportunity:
| Procurement Model | Tech Stack | Cost per Feature |
|---|---|---|
| Big vendor, fixed price | Proprietary | ~$1,000,000 |
| Timesheets, contractor skills | Proprietary | ~$100,000–$150,000 |
| Feature-based, outcomes-focused | Open source | ~$20,000–$25,000 |
The 50-fold variation is not driven by labour rate differences. It is driven by architectural complexity, dependency management, procurement model, technology choices, and delivery practices.
How We Deliver
Impact KPIs
We define project-specific KPIs at the outset of every engagement, anchored to the client’s strategic goals and desired user outcomes. KPIs are not a post-hoc evaluation exercise — they are a constant compass for day-to-day decision-making, visible to the delivery team throughout the project.
Good KPIs are quantifiable, outcome-oriented (measuring the effect on users, not team activity), baselined against current-state performance, and given explicit target thresholds. They are reviewed at each delivery milestone and adjusted if early evidence shows they aren't measuring what matters.
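As a minimal sketch of what a baselined, targeted KPI record might look like in practice (the field names and the example figures are illustrative assumptions, not a prescribed schema):

```python
from dataclasses import dataclass

@dataclass
class ImpactKPI:
    """One project-specific impact KPI: quantifiable, baselined, targeted."""
    name: str
    baseline: float   # current-state performance before the engagement
    target: float     # desired value at a defined milestone
    unit: str

    def progress(self, current: float) -> float:
        """Fraction of the baseline-to-target gap closed so far."""
        return (current - self.baseline) / (self.target - self.baseline)

# Illustrative example: halve clearance time from 48 to 24 hours.
kpi = ImpactKPI("clearance time", baseline=48.0, target=24.0, unit="hours")
kpi.progress(36.0)  # 0.5 — halfway to target
```

Keeping KPIs in a structured form like this is what makes them reviewable at each milestone rather than a post-hoc narrative.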
DVMS — Delivery Value Measurement Standard
DVMS provides continuous visibility into how efficiently investment is converted into delivered capability.
Feature classification. Each feature is classified using three fields:
| Field | Purpose | Options |
|---|---|---|
| Function type | What the feature does | UI: Add, Edit, View, Delete, List, Report; Integration: API Add, Edit, View, Delete, List |
| Change type | Nature of the change | New, Update, Remove |
| Complexity | Structural difficulty | Simple, Medium, Complex |
These produce a standardised feature point value, enabling like-for-like comparison across teams, programmes, and against industry benchmarks.
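To illustrate how the three classification fields might combine into a single point value, here is a hypothetical scoring sketch. The base points, factors, and the multiplicative combination are illustrative assumptions for exposition only; they are not the actual DVMS values.

```python
# Illustrative DVMS-style scoring. All numbers below are assumed
# placeholders, not the real DVMS point tables.
FUNCTION_BASE = {
    "UI Add": 3, "UI Edit": 3, "UI View": 2, "UI Delete": 1,
    "UI List": 2, "UI Report": 4,
    "API Add": 2, "API Edit": 2, "API View": 1,
    "API Delete": 1, "API List": 2,
}
CHANGE_FACTOR = {"New": 1.0, "Update": 0.5, "Remove": 0.25}
COMPLEXITY_FACTOR = {"Simple": 1.0, "Medium": 1.5, "Complex": 2.5}

def feature_points(function_type: str, change_type: str, complexity: str) -> float:
    """Combine the three classification fields into one feature point value."""
    return (FUNCTION_BASE[function_type]
            * CHANGE_FACTOR[change_type]
            * COMPLEXITY_FACTOR[complexity])

feature_points("UI Add", "New", "Simple")  # 3.0
```

Because every team classifies against the same tables, the resulting points are comparable across programmes regardless of technology stack.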
Three layers of measurement:
- Delivery efficiency — cost per feature point, delivery speed, cross-programme comparison against ISBSG benchmarks. Are we building things well?
- Estimation quality — three-way comparison between calculated estimates, product owner overrides, and actuals. Are our estimates reliable?
- Benefits realisation — tiered assessment of whether delivered capability achieves its intended outcomes. Are we building the right things?
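The first two layers reduce to straightforward arithmetic once features carry points, estimates, and actuals. A minimal sketch (the `Feature` record, blended-rate costing, and mean-relative-error metric are assumptions chosen for illustration, not the DVMS-specified calculations):

```python
from dataclasses import dataclass

@dataclass
class Feature:
    points: float            # standardised feature point value
    calculated_hours: float  # model-derived estimate
    override_hours: float    # product owner override
    actual_hours: float      # recorded delivery effort

def cost_per_feature_point(features: list[Feature], blended_rate: float) -> float:
    """Delivery efficiency: total delivery cost divided by total points."""
    total_cost = sum(f.actual_hours * blended_rate for f in features)
    total_points = sum(f.points for f in features)
    return total_cost / total_points

def estimation_error(features: list[Feature], field: str) -> float:
    """Estimation quality: mean relative error of an estimate vs actuals.

    Run with field="calculated_hours" and field="override_hours" to get
    the three-way comparison against actuals.
    """
    errors = [abs(getattr(f, field) - f.actual_hours) / f.actual_hours
              for f in features]
    return sum(errors) / len(errors)
```

Tracked per sprint and per team, these two numbers feed the cross-programme comparisons against ISBSG benchmarks described above.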
Dashboard views are tailored to each audience:
| Audience | Focus |
|---|---|
| Senior leadership | Portfolio cost per feature point, programmes above/below benchmark, cost trajectory |
| Programme manager | Cost trends by team, delivery progress vs plan, estimation accuracy |
| Team leader | Hours per feature point, sprint output, estimation reliability |
Three-Tier Benefits Realisation
Not all benefits can be reduced to dollar values. Our framework uses three tiers: directly monetised benefits (cost savings, processing time reductions), proxy-monetised benefits (risk reduction, compliance improvement), and scored strategic benefits (sovereign capability, strategic flexibility). This avoids either ignoring non-monetary benefits or assigning them fictitious dollar values.
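The key design point is that scored strategic benefits are never summed into the dollar totals. A minimal sketch of that separation (the record shape and summary function are illustrative assumptions, not the framework's actual data model):

```python
from dataclasses import dataclass
from enum import Enum

class Tier(Enum):
    MONETISED = "directly monetised"  # e.g. cost savings, in dollars
    PROXY = "proxy-monetised"         # e.g. risk reduction, dollar-valued via a proxy
    SCORED = "scored strategic"       # e.g. sovereign capability, as a score

@dataclass
class Benefit:
    name: str
    tier: Tier
    value: float  # dollars for the first two tiers, a score for the third

def summarise(benefits: list[Benefit]) -> dict:
    """Sum the monetary tiers; report strategic tiers as a separate score."""
    dollars = sum(b.value for b in benefits
                  if b.tier in (Tier.MONETISED, Tier.PROXY))
    scores = [b.value for b in benefits if b.tier is Tier.SCORED]
    avg_score = sum(scores) / len(scores) if scores else None
    return {"monetary_total": dollars, "strategic_avg_score": avg_score}
```

Keeping the two summaries distinct is what avoids both failure modes: strategic benefits are visible in the assessment without being converted into fictitious dollar values.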
Principles
- Measure outputs, not just inputs. Without measuring what is produced for the money spent, programmes cannot distinguish efficient delivery from expensive waste.
- Transparency builds improvement. Delivery efficiency metrics are shared openly. When cost-per-feature-point trends upward, the response is investigation and support, not blame.
- Aligned incentives. Where possible, we prefer outcomes-based or milestone-based billing so that we are paid for delivering value, not consuming time.
Policy Alignment
- OECD Digital Government Policy Framework — Impact KPIs aligned with the OECD’s six dimensions for evaluating digital government.
- ISO/IEC 20926 (Function Point Analysis) — DVMS feature points are derived from the international standard for measuring functional software size.
- ISBSG Benchmarking Repository — Delivery efficiency benchmarked against the largest industry dataset for software delivery cost and productivity.
- ADDA Efficiency and Value Framework — GoSource is a founding member of the Australian Digital Delivery Alliance.
Evidence
- DVMS Framework — GoSource-developed Delivery Value Measurement Standard. Author: Steven Capell.
- Case Study: ABF Trade Modernisation — Measurable value metrics (clearance time, revenue compliance, strike rate) traceable to the ABF mission; secured capital funding.
- Case Study: ABF Trade Modernisation — Value metrics framework as the foundation for the business case; demonstrated values significantly exceeding costs.
- Case Study: National Parks e-Ticketing — Project cost repaid 4x in the first 3 months; $10M+ annual revenue processed.
- Case Study: Export Assurance ML — Pain point analysis across the $80B agricultural export market; only 20% of audits resulted in corrective actions.
- Reference: ADDA Recoding Australia — Identifies “Process Over Outcomes” as a root cause of ICT failure; advocates value-driven measurement.
- Staff: Steven Capell — Author of DVMS; designed value metrics frameworks for ABF, Home Affairs, and DAFF.
Tools & Technologies
- Feature Point Classification: Azure DevOps custom fields, Jira custom fields
- Benchmarking: ISBSG repository, IFPUG Function Point Analysis (ISO/IEC 20926)
- Dashboards: Power BI, Azure DevOps Analytics
- Benefits Assessment: Three-tier model (monetised, proxy-monetised, scored strategic)