I’ve been asked several times this year about measuring a software developer’s productivity, quality, and outcomes, especially when leadership promotes hybrid working models.
But here’s the reality that tech organizations face when it’s difficult to hire and retain great software developers: Talented software developers bristle at the idea of being closely managed, and many will leave jobs where there is a culture of micromanagement.
Asking a developer to report to a manager with no software development experience can spark fears of process bureaucracy. Some agile software developers who embrace the extremes of self-organizing principles want full autonomy and may rebel at any sign of leadership attempts to measure productivity, quality, or other performance considerations.
If software developers detest micromanaging, many have a stronger contempt for yearly performance reviews. Developers target real-time performance objectives and aim to improve velocity, code deployment frequency, cycle times, and other key performance indicators. Scrum teams discuss their performance at the end of every sprint, so the feedback from yearly and quarterly performance reviews can seem superfluous or irrelevant.
But there’s also the practical reality that organizations require methods to recognize whether agile teams and software developers meet or exceed performance, development, and business objectives. How can managers get what they need without making developers miserable?
What follows are seven recommended practices that align with principles in agile, scrum, devops, and the software development lifecycle and that could be applied to reviewing software developers. I don’t write them as SMART goals (specific, measurable, achievable, relevant, and time-bound), but leaders should adopt the relevant ones as such based on the organization’s agile ways of working and business objectives.
Some of these may only be relevant at the team level, while managers could use others to measure their direct reports.
Define objectives and key results that are aligned with business and technical objectives
Defining objectives and key results (OKRs) is a discussion that product owners, development managers, and architects can have with their teams to align on measurable success criteria. Ideally, it’s a collaboration between the leaders and the team, with the leaders defining the objective and the full team discussing, debating, and deciding the key results.
A best practice is to define OKRs on a meaningful cadence. If too frequent, the overhead of defining and measuring OKRs may be expensive; if too infrequent, teams may lose sight of the objectives. Two examples:
- The objective of “improve application reliability” may include results such as reducing page response time, improving app availability, or reducing error rates by a meaningful percentage.
- The objective of “improve deployment reliability” may include results such as increasing test automation or reducing build time by meaningful percentages.
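The value of a key result comes from giving it a baseline and a target so progress is unambiguous. As a minimal sketch of that idea (the metric names, baselines, and targets below are invented for illustration, not prescribed values):

```python
from dataclasses import dataclass

@dataclass
class KeyResult:
    """A measurable key result with a baseline, a target, and a current value."""
    name: str
    baseline: float
    target: float
    current: float

    def progress(self) -> float:
        """Fraction of the way from baseline to target, clamped to [0, 1].

        Works whether the metric should go up (availability) or down
        (response time), since the span can be negative.
        """
        span = self.target - self.baseline
        if span == 0:
            return 1.0
        raw = (self.current - self.baseline) / span
        return max(0.0, min(1.0, raw))

# Hypothetical key results for the "improve application reliability" objective
key_results = [
    KeyResult("p95 page response time (ms)", baseline=900, target=450, current=700),
    KeyResult("app availability (%)", baseline=99.0, target=99.9, current=99.5),
]
for kr in key_results:
    print(f"{kr.name}: {kr.progress():.0%} of the way to target")
```

A sketch like this also makes the cadence discussion easier: the team can re-score `current` at whatever interval the OKR review runs on.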
Meet sprint and release commitments consistently
Scrum runs on a foundation of cadences and meeting commitments, so meeting deadlines is one way to measure a team’s discipline and alignment to standards. I don’t expect teams to meet every sprint’s commitments perfectly, but leaders can set a high and low bar of expectations aggregated across several sprints.
For teams performing releases on defined cadences (daily, weekly, every four sprints, etc.), I recommend reviewing whether teams release on schedule and meet quality benchmarks. Hitting the release date but causing outages, security incidents, or significant production issues is an obvious problem.
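Aggregating commitments across sprints can be as simple as the sketch below; the story-point figures and the 80%/95% low and high bars are illustrative assumptions, not a standard:

```python
# Each sprint records committed vs. completed story points (made-up data).
sprints = [
    {"committed": 30, "completed": 28},
    {"committed": 32, "completed": 32},
    {"committed": 28, "completed": 20},
    {"committed": 30, "completed": 29},
]

def commitment_rate(sprints) -> float:
    """Fraction of committed points completed, aggregated over all sprints.

    Completions are capped at the commitment so over-delivery in one sprint
    cannot mask a missed commitment in another.
    """
    committed = sum(s["committed"] for s in sprints)
    completed = sum(min(s["completed"], s["committed"]) for s in sprints)
    return completed / committed

rate = commitment_rate(sprints)
# A leader might set a low bar of 80% and a high bar of 95% across sprints.
print(f"Aggregate commitment rate: {rate:.0%}")
```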
Capture satisfaction of product owners and stakeholders
The agile manifesto identifies “customer collaboration over contract negotiation” as a core value. While we shouldn’t expect agile developers to deliver flawlessly on time, scope, and cost (the proverbial iron triangle), we can seek to capture independent customer satisfaction metrics.
A satisfaction survey is one tool that larger development organizations can use to capture feedback for agile developers and teams. Some questions might cover:
- Collaboration when brainstorming problems and documenting solutions
- Delivery on scope and satisfaction of the results
- Quality of feedback when planning and estimating features
The key is to bring customer feedback back to the developer and agile teams so they can reflect on the results from the customer’s perspective and improve performance.
Quantify peer reviews of design, documentation, and ease of use
Ask a developer how easy it is to use another team’s APIs, upgrade another developer’s code, or learn a new application architecture from the available documentation. Unfortunately, you’re unlikely to get a positive response or a happy developer, especially when working on legacy code or in a monolithic architecture.
So how do you determine whether developers are doing a better job today of developing maintainable code, useful documentation, and microservices that are easy to consume? How could you measure this progress or regression?
While there may be tools or analytics to get at these metrics, I believe the simplest approach is to create a process for peer reviews. Peers can comment on code readability when reviewing a pull request, provide ratings on documentation, and respond to surveys when integrating microservices or APIs.
Peer reviews should supplement the feedback from code review and quality analysis tools that can provide real-time, granular feedback on code quality, security, and related issues.
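One lightweight way to quantify such peer ratings is to average them per review area and track the averages over time; the rubric categories and 1-to-5 scores below are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical 1-5 peer ratings collected during pull request reviews
# and integration surveys.
ratings = [
    ("code readability", 4), ("code readability", 3), ("code readability", 5),
    ("documentation", 2), ("documentation", 3),
    ("API ease of use", 4), ("API ease of use", 4),
]

# Group scores by review area.
by_area = defaultdict(list)
for area, score in ratings:
    by_area[area].append(score)

# Average per area; comparing these quarter over quarter shows
# progress or regression rather than a one-off snapshot.
for area, scores in by_area.items():
    print(f"{area}: {mean(scores):.1f}/5 from {len(scores)} reviews")
```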
Select non-negotiable key performance indicators for devops
Product owners and peers provide important feedback, but managers must also ensure that developers and development teams review and respond to operational feedback. The feedback should include specifics around site reliability engineering, security practices, and responsiveness to IT services management (ITSM) incidents, requests, and other tickets.
Devops, ITSM, and infosec have highly mature KPIs, and leaders should select a meaningful and manageable number for software development teams to focus on. For development teams working on cloud-native applications, I recommend defining service level objectives and using them to manage error budgets. For other development groups, measuring reductions in change failure rates and mean time to recover (MTTR) from incidents are commonly cited top KPIs.
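To make the error-budget idea concrete, here is a hedged sketch of the arithmetic behind an availability SLO; the 99.9% target and request counts are made-up numbers:

```python
def error_budget_remaining(slo_target: float,
                           total_requests: int,
                           failed_requests: int) -> float:
    """Return the fraction of the error budget still unspent for the period.

    With a 99.9% SLO, the error budget is 0.1% of requests; each failed
    request spends part of that budget.
    """
    budget = (1.0 - slo_target) * total_requests  # allowed failures
    return max(0.0, 1.0 - failed_requests / budget)

# Hypothetical month: 10 million requests, 99.9% availability SLO,
# 4,000 failed requests observed.
remaining = error_budget_remaining(0.999, 10_000_000, 4_000)
print(f"Error budget remaining: {remaining:.0%}")
```

A team that burns its budget early in the period has a quantitative reason to slow releases and invest in reliability; a team with budget to spare can take more delivery risk.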
Demonstrate impacts from learning, experimenting, and mentoring
Today, more businesses recognize the importance of supporting continuous learning, promoting safe environments for experimentation, and enrolling participants in mentoring programs. While all of these are important goals, managers should review how developers put these guidelines into practice and where they deliver business impacts. Managers should help developers create a career development plan and provide feedback on how their learning, mentoring, and participation in experiments and proofs of concept align with the employee’s career goals.
Ask developers to propose work-life goals and objectives
In the Dice 2021 Technologist Sentiment Report, 36% of respondents rated their burnout a four or five on a five-point scale, and 48% reported they are likely to change employers.
I don’t believe CIOs, CTOs, delivery leaders, and software development managers want to see their software developers burn out and join the Great Resignation. So while I suggest several ways to manage software developers, leaders should be empathetic to today’s working environment and to every developer’s personal situation.
One way to strike a balance is to work with human resources on defining work-life goals and objectives. Developers should personalize these goals, and the organization and managers should keep them confidential. A work-life goal can create a balance many developers need today to feel supported.
Ultimately, managing and measuring performance requires frequent discussions between manager and employee. Are we aligned on goals and the criteria for success? Do we understand the standards and constraints? Even when metrics provide indicators, it’s often the discussion and follow-up actions that lead to improved performance. That’s just how people work.
Copyright © 2022 IDG Communications, Inc.