
    Agile Software Development Resource Center

    Writing Effective Agile Software Requirements

    The ability to write agile software requirements – user stories – is perhaps the most critical skill for maximizing development velocity. User stories that are too large, too vague, or lacking a clear definition of done all confuse the Scrum team. And confusion means the team wastes valuable time during development Sprints gaining clarity rather than doing its job.

    Requirements Documents Suck!

    Market Requirements Documents (MRD). Product Requirements Documents (PRD). Functional Requirements Documents (FRD). You may have written them. You may have read them. Either way, you probably hated them. And so did everyone else.

    Reading requirements documents is almost as painful as writing them

    The objective of defining software is straightforward. Figure out what customers/users need and communicate it clearly and concisely to the team that builds the software.

    Even Steve Jobs didn’t get everything “right” the first time. Apple built the failed Newton years before the iPhone. And they built the Macintosh Portable – referred to by those who worked on the project as the “lovable luggable” – years before the MacBook.

    Building great products, whether hardware or software, requires iteration. Try something. Give it to customers and gather feedback. Improve it. Repeat.

    Iteration does NOT mean that a product failed in the market. Even the most successful products go through iterations continually.

    Thus, when you consider that successful software development is a direct result of iteration, it’s logical to conclude that traditional requirements documents work against it for a number of reasons:

    • A single document capturing requirements for an entire product release prevents a development organization from prioritizing, adding, or removing individual features during a development cycle (a version/release).
    • They take significant time to write, review, and get approved by every stakeholder. By the time the document is final, customer requirements may have changed.
    • They are tied directly to development processes that view a software increment as an entire release rather than a few features. Thus, the gap between drafting requirements and the point that customers can give you feedback about what you built is months rather than days.
    • They create a culture of blame and accountability rather than collaboration.

    User Stories

    User Stories are how software requirements are documented using the Scrum Method. Unlike Waterfall Method-style requirements documents in which all software requirements are contained in one document, in the Scrum Method each User Story is a separate “document”.

    Prior to the existence of software products such as Jira and Asana, User Stories were written on Post-It Notes and stuck to a wall or whiteboard. The board had areas for the Product Backlog, the Sprint Backlog, and the Sprint workflow stages. The Post-It Notes were moved around the board as they progressed through the planning and implementation process.

    User Story With Correct Structure And Meets I.N.V.E.S.T. Criteria

    User Stories have a specific format that includes four fields (three illustrated in the picture above):

    • Who is the user persona for the feature?
    • What does that user persona want the software to do?
    • What goal does the user persona have that the feature will achieve?
    • The “Definition of Done” in the form of listed acceptance criteria.

    The term “user persona” simply means a description of the prototypical user of the feature. A software product may have one or several user personas.

    The format of the story is:

    As a [user persona],

    I want to [what the user persona wants the software to do],

    So that [the objective the user persona wants to achieve].

    Acceptance Criteria for User Stories

    Definition of Done (Acceptance Criteria) For a User Story

    The Impact of Poorly Written User Stories

    Poorly defined user stories are the primary driver of lower-than-desired velocity.

    Why?

    Because once the Sprint begins, rather than handling their tasks, the Scrum team must spend time clarifying requirements. Worse, stories that lack a sufficient definition of done (acceptance criteria) typically also prove to be stories that should have been split. Once the Scrum team starts clarifying requirements, one story may become multiple, each with its own acceptance criteria. By the time the Sprint is completed, the Scrum team has wasted a measurable portion of its time doing the product owner’s job.

    Drafting Acceptance Criteria for Functional Requirements

    Acceptance criteria for functional requirements are use cases or logic related to use cases. One use case is the “happy path”… what happens if the feature works as expected. The others are “negative use cases”… what happens when the software doesn’t do what the user wants.  For example, take a “login” feature:

    • The happy path is that the user inputs a username/email address and a password and gains access to gated functionality.
    • Negative use cases are what happens when the username/email is invalid; the password is invalid; both are invalid; and, in the case of a mobile app, when there’s no connectivity.

    It is critical that functional acceptance criteria be thorough and detailed. Not only do they communicate clearly to the developers how to handle every possible outcome of the usage of a feature, but they also become test cases that a quality assurance engineer can use to evaluate whether the developed software has the “expected result” or not. If not, a Sprint defect (not to be confused with “escaped” or “production” defects) exists. Learn more about how software testing is integrated into development Sprints.
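    To make the idea concrete, here is a minimal sketch of how the login acceptance criteria above could become automated test cases. The `authenticate` function, its arguments, and its return strings are hypothetical, invented purely for illustration; a real system would call an actual authentication service.

    ```python
    # Hypothetical login check used only to illustrate acceptance criteria
    # becoming test cases. Real credentials handling would differ.
    VALID_USERS = {"ana@example.com": "s3cret"}

    def authenticate(username, password, online=True):
        """Return an outcome string for each acceptance criterion."""
        if not online:
            return "error: no connectivity"      # mobile app, offline case
        if username not in VALID_USERS:
            return "error: unknown username"     # bad username/email
        if VALID_USERS[username] != password:
            return "error: wrong password"       # bad password
        return "ok: access granted"              # happy path

    # Each acceptance criterion becomes one test case with an expected result.
    assert authenticate("ana@example.com", "s3cret") == "ok: access granted"
    assert authenticate("bob@example.com", "s3cret").startswith("error")
    assert authenticate("ana@example.com", "wrong").startswith("error")
    assert authenticate("ana@example.com", "s3cret", online=False) == "error: no connectivity"
    ```

    If any assertion fails, a Sprint defect exists – exactly the mapping from acceptance criteria to QA test cases described above.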

    Non-Functional Acceptance Criteria

    Acceptance criteria should also be written for non-functional requirements. Although it may seem logical to include them on individual stories, you’ll find that many of the non-functional requirements are identical for every story. An example is performance at scale. A product owner may specify that a given feature needs to support 20,000 people simultaneously using the feature. Would the next feature really have a different requirement?

    At CodeStringers, we incorporate acceptance criteria into release plans as well as individual stories so that the non-functional acceptance criteria that are common to all features are documented once versus on every User Story.

    Criteria for Well Written User Stories

    Well-written user stories meet six criteria, articulated as the acronym “INVEST”:

    INDEPENDENT: Each story must not have functionality that overlaps others. Login is independent of password reset, for example.

    NEGOTIABLE: Perhaps the most forced word in this acronym, “negotiable” simply means that the team can understand the story and acceptance criteria.

    VALUABLE: Each story creates value for the customer.

    ESTIMABLE: The team can size the effort to deliver the story.

    SMALL: Stories are small enough to be completed in one sprint… perhaps even smaller. Some organizations dictate that stories must take less than a few days. Stories that are too large are often too complex and need to be “broken down”. 

    TESTABLE: The acceptance criteria for the story can easily be translated into test cases, each having an expected result. When tested, if the expected result does not occur, a defect exists. The story is not “done” until all defects are cleared.

    Splitting Large User Stories

    Splitting happens when the team estimates that a story in the Product Backlog would take too long to implement.

    User Story That is Too Large and Must be Split
    Large User Story After Being Split

    How is “too long” defined?

    The Scrum methodology states that a story must be small enough to complete in a single sprint. However, many development organizations are stricter, stating that User Story estimates may never exceed a set number of Story Points.

    Regardless of the rule for “too long”, every Product Owner will at some point need to split a User Story.

    Turning Acceptance Criteria into Individual Stories

    Doing so requires analyzing the current story’s use cases – acceptance criteria – to determine whether individual acceptance criteria, or groups of them, could meet the INVEST criteria on their own.

    Let’s use the following example to illustrate the process.

    Your company is building a product that Human Resources departments use to keep track of their companies’ employees. One area of the product shows a data table of all current and past employees along with columns of information about each:

    • Employee name
    • Employment status (current or former)
    • Start date
    • End date
    • Most recent job title
    • Most recent supervisor
    • Most recent salary

    Users can click on each row to view that employee’s details, which would include another data table where each row is a job the employee held at the company along with other information such as disciplinary actions.

    Initially, the Product Owner writes a single User Story for this “feature”:

    “As a human resources employee, I want to view a list of current and former employees along with data about each employee so that I can quickly find the records for an employee when I need to update information.”

    While the User Story is in the Product Backlog without detailed acceptance criteria, the team estimates it at 8 Story Points, comprised of sub-tasks to build a database table, an API to retrieve information from the database, and a frontend capability to render the data in a tabular layout.

    User Story Refinement

    As the story becomes higher in priority (moves up in the Product Backlog), the Product Owner “refines” the story (adds acceptance criteria) including:

    • The table can be sorted by every column.
    • The table can be filtered by start date, end date, supervisor and employment status.
    • Users can update values directly in the data table (the table is read AND write).
    • Users can search the data by employee name.
    • Search and filter work in tandem, meaning that a user can search for a first name and then refine those results with one or more of the filters.
    • Users click on each row to open a details page (the team has already decided that the details page itself will be defined independently of this User Story, but the action to initiate opening the page is a criteria for this feature so that the User Interface includes the interaction capability).
    • Users can add rows for new employees.
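    The “search and filter work in tandem” criterion is the kind of behavior that benefits from a concrete sketch. The following is a minimal, hypothetical illustration – the field names, sample rows, and `search_and_filter` helper are all invented for this example, not taken from any real product:

    ```python
    # Tiny in-memory stand-in for the employee data table (fields hypothetical).
    employees = [
        {"name": "Ana Lee",  "status": "current", "supervisor": "Kim"},
        {"name": "Ana Cruz", "status": "former",  "supervisor": "Raj"},
        {"name": "Bo Chen",  "status": "current", "supervisor": "Kim"},
    ]

    def search_and_filter(rows, name_query=None, **filters):
        """Search narrows by name first; filters then narrow the found results."""
        if name_query:
            rows = [r for r in rows if name_query.lower() in r["name"].lower()]
        for field, value in filters.items():
            rows = [r for r in rows if r[field] == value]
        return rows

    # Search for "ana", then filter the found results by employment status.
    found = search_and_filter(employees, name_query="ana", status="current")
    print(found)  # only Ana Lee remains
    ```

    Writing the criterion this precisely is what lets QA turn it into a test case with an unambiguous expected result.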

    The team re-estimates the story with the added acceptance criteria at 80 Story Points. The Velocity (amount of work the team can do in one sprint) is 35 Story Points.

    This User Story must be split.

    Splitting the User Story

    Looking at the acceptance criteria, we can easily pull out multiple capabilities that could individually meet the INVEST criteria and thus become User Stories on their own. At the most extreme, the following could be a resulting list of independent User Stories:

    1. View the data table with the columns.
    2. Sort by first name.
    3. Sort by last name.
    4. Sort by employment status.
    5. Sort by start date.
    6. Sort by end date.
    7. Sort by job title.
    8. Sort by supervisor.
    9. Sort by most recent salary.
    10. Search by first or last name.
    11. Filter by start date range (user inputs two dates and the table displays any employee whose start date falls in that range).
    12. Filter by end date.
    13. Filter by supervisor.
    14. Filter by employment status.
    15. Update first name.
    16. Update last name.
    17. Update end date.
    18. Update employee status.
    19. Create a new employee.
    20. See an option to open a details page for each employee.

    The team looks at this list and tells the Product Owner the following:

    • Item 1 is 8 Story Points.
    • Items 2 through 9 can be grouped as a single story and total 5 Story Points.
    • Items 11 through 14 are each individual stories and each one is 5 points, totaling 20 Story Points.
    • Items 15 through 18 can be grouped into one story and total 21 Story Points.
    • Item 19 is 21 Story Points.
    • Item 20 is insignificant and can be added to the story for item 1 without impacting the estimate.

    We now have 8 individual User Stories totaling 75 Story Points and, if the team pushes hard, they think they can complete them in two Sprints.
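    The Sprint arithmetic behind that conclusion can be sketched in a few lines. The estimates below are the ones from the example; by-the-book planning rounds up to whole Sprints, which is why finishing in two Sprints requires the team to exceed its normal Velocity:

    ```python
    import math

    # Story Point estimates for the 8 stories after the split (from the example):
    # item 1 (+ item 20), items 2-9 grouped, items 11-14 individually,
    # items 15-18 grouped, and item 19.
    estimates = [8, 5, 5, 5, 5, 5, 21, 21]
    velocity = 35  # Story Points the team completes in one Sprint

    total = sum(estimates)                      # 75 Story Points
    sprints = math.ceil(total / velocity)       # rounds up to whole Sprints
    print(total, sprints)                       # 75 points -> 3 Sprints at normal pace
    ```

    At a steady 35 points per Sprint the plan calls for three Sprints; delivering in two means averaging 37.5 points per Sprint, the “push hard” case.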
