How ‘User Stories Applied’ inspired my API Connect Team’s agile journey
Reading Mike Cohn’s User Stories Applied gave me the inspiration to transform our IBM API Connect team, which consisted almost entirely of newly hired developers. The book’s ideas about writing user stories and improving collaboration encouraged me to move our team to a competitive Agile model. Scrum practices have helped us work more efficiently and grow toward becoming an optimal, self-sufficient team.
This blog explores the foundational Agile practices, including user stories, sprint commitments and the Definition of Done (DoD), that became the building blocks for our transformation.

Part 1: User Stories: The Heartbeat of Agile Teams
User Stories Explained
A user story is a concise, user-centric description of a feature, written to encourage conversation and shared understanding:
“As a [user role], I want [feature], so that [benefit].”
I will use an example from API Connect Analytics as a reference to support my explanations:
“As an API Product Manager, I want to view detailed traffic analytics for my APIs so that I can make data-driven decisions to optimise API usage and performance.”
This simple format promotes clarity and ensures the team stays focused on user value.
Applying INVEST to the Example
Independent: The analytics dashboard can be developed and deployed without waiting on other stories.
Negotiable: The level of detail and types of analytics can be adjusted based on stakeholder feedback.
Valuable: Provides key insights for the API Product Manager to drive usage and improve APIs.
Estimable: The team can estimate effort once the scope of metrics and UI components is defined.
Small: The story can be broken into sub-tasks (e.g. traffic volume view, filter by date).
Testable: Acceptance criteria can confirm whether the analytics data displays correctly.
Applying SMART to the Example
Specific: Display API traffic analytics including request count, response time and error rates.
Measurable: Users should be able to filter data by API and time range with response in under 2 seconds.
Achievable: Within team capacity, using existing API Connect analytics infrastructure.
Relevant: Helps the product manager make informed decisions, aligning with business goals.
Time-bound: Deliverable within the upcoming sprint for initial release.
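To show how these SMART criteria could translate into automated acceptance checks, here is a minimal Python sketch. The `fetch_traffic_analytics` function and its sample data are hypothetical stand-ins, not the real API Connect analytics API:

```python
import time
from datetime import datetime

# Hypothetical sample of API traffic events (stand-in for the analytics backend).
SAMPLE_EVENTS = [
    {"api": "orders-api", "timestamp": datetime(2024, 5, 1, 10), "status": 200, "response_ms": 120},
    {"api": "orders-api", "timestamp": datetime(2024, 5, 2, 11), "status": 500, "response_ms": 340},
    {"api": "billing-api", "timestamp": datetime(2024, 5, 1, 12), "status": 200, "response_ms": 95},
]

def fetch_traffic_analytics(api_name, start, end):
    """Return request count, average response time and error rate
    for one API over a time range (Specific + Measurable)."""
    events = [e for e in SAMPLE_EVENTS
              if e["api"] == api_name and start <= e["timestamp"] <= end]
    if not events:
        return {"request_count": 0, "avg_response_ms": 0.0, "error_rate": 0.0}
    errors = sum(1 for e in events if e["status"] >= 400)
    return {
        "request_count": len(events),
        "avg_response_ms": sum(e["response_ms"] for e in events) / len(events),
        "error_rate": errors / len(events),
    }

# Acceptance check: filter by API and time range, respond in under 2 seconds.
started = time.monotonic()
report = fetch_traffic_analytics("orders-api",
                                 datetime(2024, 5, 1), datetime(2024, 5, 3))
elapsed = time.monotonic() - started
assert report["request_count"] == 2
assert report["error_rate"] == 0.5
assert elapsed < 2.0  # Measurable: response in under 2 seconds
```

Checks like these double as the Testable half of INVEST: the acceptance criteria become executable assertions rather than prose.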
This combination aligned perfectly with APIC’s complex, cross-domain product features.
Part 2: Commitment to Sprint Goals and Definition of Done (DoD)
Commitment as a Team Principle
For our API Connect Analytics example, the team commits to completing the user story. This commitment means:
- Breaking down the story into frontend, backend and data integration tasks.
- Ensuring everyone understands the acceptance criteria and dependencies.
- Collaboratively estimating the story points and aligning on realistic delivery within the sprint.
Definition of Done (DoD): Our Quality Gate
To ensure “done” means the same for everyone, the user story is tied to these DoD elements:
- Code reviewed and merged.
- Unit and integration tests passed, validating metrics accuracy.
- API performance benchmarks verified (response time <2s).
- Documentation updated to guide product managers on using the new analytics view.
- Feature deployable and showcased during sprint review.
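One lightweight way to make the DoD act as a quality gate is to encode it as an explicit checklist that must be fully satisfied before a story counts as done. The item names below mirror our list and are purely illustrative, not a real tool:

```python
# Hypothetical DoD checklist for the analytics user story.
DEFINITION_OF_DONE = [
    "code reviewed and merged",
    "unit and integration tests passed",
    "performance benchmarks verified",
    "documentation updated",
    "feature deployable and showcased",
]

def story_is_done(completed_items):
    """A story counts as done only when every DoD item is satisfied."""
    return all(item in completed_items for item in DEFINITION_OF_DONE)

# Four of five items complete: the story is still NOT done.
partial = set(DEFINITION_OF_DONE[:-1])
assert story_is_done(partial) is False
assert story_is_done(set(DEFINITION_OF_DONE)) is True
```

The all-or-nothing check is the point: a story with one unmet DoD item carries over, rather than being counted as partially done.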
Note: The scope of the DoD can change depending on the organization's structure. For example, if there is a separate testing team, the DoD may focus more on development deliverables and integration with testing workflows instead of full validation within the Scrum pod. In that case, our velocity follows the adjusted DoD.
Next: Read Blog 2 to see how story points and team evolution bring consistency to Agile delivery.