EBM Practices
EBM Practices and Implementation
Hypothesis-Driven Development
Definition
Formulating assumptions about user needs and testing them through experiments to validate decisions with data.
Process
- Identify assumption about user behavior or value
- Define hypothesis with measurable outcome
- Design experiment to test hypothesis
- Collect data from experiment
- Analyze results and adapt
Example
- Hypothesis: "Adding dark mode will increase daily active users by 10%"
- Experiment: A/B test with subset of users
- Measure: Compare DAU between control and test groups
- Adapt: Roll out if validated, pivot if not
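The analysis step of the experiment above can be sketched with a two-proportion z-test; the user counts here are hypothetical, and the 1.96 threshold corresponds to a conventional 5% significance level:

```python
import math

def two_proportion_z(active_a, n_a, active_b, n_b):
    """Two-proportion z-test: is the test group's active rate significantly higher?"""
    p_a, p_b = active_a / n_a, active_b / n_b
    p_pool = (active_a + active_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se  # z-score; z > 1.96 ~= p < 0.05 (one comparison)

# Hypothetical results: daily-active counts in control vs. dark-mode group
z = two_proportion_z(active_a=400, n_a=1000, active_b=460, n_b=1000)
validated = z > 1.96
print(f"z = {z:.2f}, hypothesis validated: {validated}")
```

If `validated` is true, the "adapt" step rolls the feature out; otherwise the hypothesis is rejected or refined and a new experiment is designed.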
Sprint Retrospectives and EBM
Connection to EBM
- Provide structured opportunity for process reflection
- Encourage open communication and feedback
- Enable data-driven improvement decisions
- Focus on outcomes, not just activities
Best Practices
- Review metrics from the sprint (velocity, defects, cycle time)
- Identify what the data tells us about the team's process
- Create hypotheses for improvements
- Measure impact in future sprints
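A minimal sketch of the metrics review, using hypothetical sprint records (the field names and numbers are illustrative, not a standard schema):

```python
from statistics import mean

# Hypothetical per-sprint records: story points completed, defects found,
# and cycle times (days) for the items finished in that sprint
sprints = [
    {"points": 38, "defects": 4, "cycle_times": [3, 5, 4, 6]},
    {"points": 42, "defects": 2, "cycle_times": [2, 4, 3, 5]},
    {"points": 45, "defects": 3, "cycle_times": [2, 3, 3, 4]},
]

velocity = mean(s["points"] for s in sprints)
avg_cycle = mean(t for s in sprints for t in s["cycle_times"])
defect_trend = [s["defects"] for s in sprints]

print(f"avg velocity: {velocity:.1f} pts")
print(f"avg cycle time: {avg_cycle:.1f} days")
print(f"defects per sprint: {defect_trend}")
```

Numbers like these give the retrospective a factual starting point: a proposed improvement becomes a hypothesis ("pairing on reviews will cut average cycle time below 3 days") whose impact is measured in the following sprints.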
Blameless Post-Mortems
Purpose
Analyze failures constructively without assigning blame, fostering a culture of openness and continuous improvement.
Key Principles
- Focus on systems, not individuals
- Ask "What happened?" not "Who did this?"
- Identify systemic improvements
- Share learnings openly
- Create psychological safety
Benefits for EBM
- Increases transparency (pillar of empiricism)
- Enables honest data collection
- Reduces fear of experimentation
- Improves innovation capability (the Ability to Innovate, or A2I, Key Value Area)
Daily Stand-ups and Data
Evidence-Based Stand-ups
Share meaningful metrics during stand-ups:
- Team velocity and cumulative flow
- Sprint burndown progress
- Blockers with impact data
- Cycle time for current work
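One way to surface cycle time for current work at the stand-up is to flag items by age; the item keys, dates, and 7-day threshold below are hypothetical:

```python
from datetime import date

# Hypothetical in-progress items with their start dates, reviewed at stand-up
today = date(2024, 6, 14)
in_progress = {"AUTH-12": date(2024, 6, 10), "UI-07": date(2024, 6, 3)}

ages = {key: (today - started).days for key, started in in_progress.items()}
for key, age in ages.items():
    flag = "  <- aging, raise as a blocker" if age > 7 else ""
    print(f"{key}: {age} days in progress{flag}")
```

This keeps the conversation on flow ("UI-07 has been in progress 11 days, what is blocking it?") rather than on individual status reports.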
Anti-patterns
- Individual performance metrics (creates fear)
- Vanity metrics without context
- Status reporting without action focus
Engagement Strategy
Rotate leadership of stand-ups among team members to increase ownership and engagement.
Organizational Alignment
Challenge
Aligning strategic goals with team-level initiatives.
Solution
Regular alignment meetings involving both management and Scrum teams:
- Share organizational goals and context
- Review team metrics against strategic KPIs
- Identify alignment gaps
- Adapt team goals to support strategy
Why This Works
- Creates two-way communication
- Ensures teams understand the "why"
- Enables strategic contribution from teams
- Maintains autonomy while ensuring alignment
KPI Selection
Best Practices
- Align with goals - KPIs should connect to strategic objectives
- Context matters - Different projects need different metrics
- Balance KVAs - Don't focus on just one Key Value Area
- Avoid gaming - Choose metrics that can't be easily manipulated
- Review regularly - KPIs should evolve with the organization
Anti-patterns
- Same KPIs for all projects regardless of context
- Only financial KPIs
- Individual-focused metrics
- Too many KPIs (analysis paralysis)
Historical Data for Forecasting
Using Evidence for Prediction
- Analyze past sprint velocities for estimation
- Review cycle time trends for delivery forecasting
- Use throughput data for release planning
- Identify patterns in defect rates
Example
"Based on 10 sprints of data, our velocity averages 42 points with standard deviation of 5. We can forecast delivering 40-45 points next sprint with 68% confidence."
Fostering Data Culture
Actions
- Make data visible - Dashboards, information radiators
- Celebrate learning - Even from failed experiments
- Share metrics openly - Transparency builds trust
- Train on data literacy - Help team interpret metrics
- Model behavior - Leaders use data in decisions
Signs of Success
- Teams discuss data in retrospectives
- Decisions reference evidence
- Experiments are common practice
- Failures are learning opportunities
Interview Questions
Q: What is hypothesis-driven development?
Q: How do Sprint Retrospectives contribute to EBM?
Q: What's the goal of blameless post-mortems?
Q: How should teams approach KPI selection?