diff --git a/patterns/everything-as-code.md b/patterns/everything-as-code.md
index 6f83f274..664c1467 100644
--- a/patterns/everything-as-code.md
+++ b/patterns/everything-as-code.md
@@ -46,7 +46,7 @@ A code review involves another member of the team looking through a proposed cod
 
 Many teams consider code which has been written [as a pair](https://martinfowler.com/articles/on-pair-programming.html) to already have been reviewed, and do not require a separate review.
 
-Robert Fink provides an excellent description of the [motivation and practice of code reviews](https://medium.com/palantir/code-review-best-practices-19e02780015f). Some key points from this and other sources ([Google](https://github.com/google/eng-practices/blob/master/review/reviewer/index.md), [SmartBear](https://smartbear.com/learn/code-review/best-practices-for-peer-code-review/), [Atlassian](https://www.atlassian.com/agile/software-development/code-reviews)) are:
+Robert Fink provides an excellent description of the [motivation and practice of code reviews](https://blog.palantir.com/code-review-best-practices-19e02780015f). Some key points from this and other sources ([Google](https://github.com/google/eng-practices/blob/master/review/reviewer/index.md), [SmartBear](https://smartbear.com/learn/code-review/best-practices-for-peer-code-review/), [Atlassian](https://www.atlassian.com/agile/software-development/code-reviews)) are:
 
 #### Egalitarian
 
diff --git a/patterns/little-and-often.md b/patterns/little-and-often.md
index 41cf7f87..204a028f 100644
--- a/patterns/little-and-often.md
+++ b/patterns/little-and-often.md
@@ -48,7 +48,7 @@ This pattern is in essence very straightforward; the power comes in applying it
 
 - **Delivering software.** The trivial and obvious example of this pattern is that it is better to deliver software in small increments than to do it in one [big bang](https://hackernoon.com/why-your-big-bang-multi-year-project-will-fail-988e45c830af).
 - **Planning.** Start by doing just enough planning to forecast the size and type of team(s) you need to get the job done roughly when you want it to be done by. Incrementally refine that plan through (typically) fortnightly backlog/roadmap refinement sessions.
-- **User-centred design.** User research and design activities ([SERVICE-USER](https://www.gov.uk/service-manual/user-research), [SERVICE-DESIGN](https://www.gov.uk/service-manual/design)) occur in all phases of an agile delivery: [discovery](https://www.gov.uk/service-manual/agile-delivery/how-the-discovery-phase-works), [alpha](https://www.gov.uk/service-manual/agile-delivery/how-the-alpha-phase-works), [beta](https://www.gov.uk/service-manual/agile-delivery/how-the-beta-phase-works) and [live](https://www.gov.uk/service-manual/agile-delivery/how-the-live-phase-works) ([SERVICE-PHASES](https://www.gov.uk/service-manual/agile-delivery)). Delivery in all phases is done using [build-measure-learn](http://theleanstartup.com/principles#:~:text=A%20core%20component%20of%20Lean,feedback%20loop) loops, with the whole multi-disciplinary team working closely together in all three activities. This approach means that rather than having a big up front design, the design is iteratively refined throughout all phases ([SERVICE-AGILE](https://www.gov.uk/service-manual/agile-delivery/agile-government-services-introduction#the-differences-between-traditional-and-agile-methods)).
+- **User-centred design.** User research and design activities ([SERVICE-USER](https://www.gov.uk/service-manual/user-research), [SERVICE-DESIGN](https://www.gov.uk/service-manual/design)) occur in all phases of an agile delivery: [discovery](https://www.gov.uk/service-manual/agile-delivery/how-the-discovery-phase-works), [alpha](https://www.gov.uk/service-manual/agile-delivery/how-the-alpha-phase-works), [beta](https://www.gov.uk/service-manual/agile-delivery/how-the-beta-phase-works) and [live](https://www.gov.uk/service-manual/agile-delivery/how-the-live-phase-works) ([SERVICE-PHASES](https://www.gov.uk/service-manual/agile-delivery)). Delivery in all phases is done using [build-measure-learn](https://theleanstartup.com/principles#:~:text=A%20core%20component%20of%20Lean,feedback%20loop) loops, with the whole multi-disciplinary team working closely together in all three activities. This approach means that rather than having a big up front design, the design is iteratively refined throughout all phases ([SERVICE-AGILE](https://www.gov.uk/service-manual/agile-delivery/agile-government-services-introduction#the-differences-between-traditional-and-agile-methods)).
 - **Technical design and architecture.** While some up front thinking is generally beneficial to help a delivery team set off in the right direction, the output design is best viewed as first draft which will be refined during delivery as more is discovered about technical and product constraints and opportunities. See [Evolutionary Architectures](https://evolutionaryarchitecture.com/precis.html).
 - **Team processes.** Great team processes come about by starting with something simple and practising [continuous improvement](https://kanbanize.com/lean-management/improvement/what-is-continuous-improvement) to find ways of working, definitions of done and so on which are well suited to the particular team and environment.
 
diff --git a/practices/feature-toggling.md b/practices/feature-toggling.md
index c405b743..b8b7086d 100644
--- a/practices/feature-toggling.md
+++ b/practices/feature-toggling.md
@@ -106,11 +106,12 @@ Toggles are intended to be short-lived unless explicitly designed to be permanen
 
 ### Best practice lifecycle
 
 1. **Introduce** the toggle with a clear purpose and target outcome.
-2. **Implement** the feature behind the toggle.
-3. **Test** the feature in both on/off states.
-4. **Roll out** gradually (e.g., canary users, targeted groups).
-5. **Monitor** the impact of the feature.
-6. **Remove** the toggle once the feature is stable and fully deployed.
+2. **Keep it tidy** by creating a PR for the toggle's removal, named `cleanup/feature_flag_name`, as soon as the toggle is introduced.
+3. **Implement** the feature behind the toggle.
+4. **Test** the feature in both on/off states.
+5. **Roll out** gradually (e.g., canary users, targeted groups).
+6. **Monitor** the impact of the feature.
+7. **Remove** the toggle once the feature is stable and fully deployed.
 
 Document toggles in your architecture or delivery tooling to ensure visibility and traceability.
@@ -149,3 +150,4 @@ Best practices:
 - [Best practices for coding with feature flags](https://launchdarkly.com/blog/best-practices-for-coding-with-feature-flags/)
 - [Defensive coding](https://docs.flagsmith.com/guides-and-examples/defensive-coding)
 - [An example tool for feature toggling](https://docs.flagsmith.com/)
+- [How to use feature flags without technical debt](https://launchdarkly.com/blog/how-to-use-feature-flags-without-technical-debt/)
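
To make the lifecycle added to `practices/feature-toggling.md` more concrete, the sketch below shows roughly what steps 3 and 4 (implementing behind the toggle, then testing both states) can look like. It is illustrative only: the `new_checkout_flow` flag, the environment-variable lookup and the `checkout` function are hypothetical stand-ins for whichever flag service (for example Flagsmith or LaunchDarkly) and feature a real team would use.

```python
import os

# Hypothetical flag name used throughout this sketch; a real team would
# register it with their flag service rather than read an environment variable.
NEW_CHECKOUT_FLAG = "new_checkout_flow"


def is_enabled(flag_name: str, default: bool = False) -> bool:
    """Defensive flag lookup: if the flag is missing or unreadable,
    fall back to the supplied default (normally 'off')."""
    raw = os.environ.get(f"FEATURE_{flag_name.upper()}")
    if raw is None:
        return default
    return raw.strip().lower() in {"1", "true", "on", "yes"}


def checkout(basket: list[str]) -> str:
    # Lifecycle step 3: the new behaviour sits behind the toggle,
    # and the existing code path remains the default.
    if is_enabled(NEW_CHECKOUT_FLAG):
        return f"new checkout flow for {len(basket)} items"
    return f"existing checkout flow for {len(basket)} items"


if __name__ == "__main__":
    # Lifecycle step 4: exercise both toggle states before rolling out.
    os.environ.pop("FEATURE_NEW_CHECKOUT_FLOW", None)
    assert checkout(["socks"]).startswith("existing")  # toggle off (default)

    os.environ["FEATURE_NEW_CHECKOUT_FLOW"] = "true"
    assert checkout(["socks"]).startswith("new")       # toggle on
    print("Both toggle states behave as expected.")
```

Defaulting to the existing code path when the flag cannot be read mirrors the defensive-coding guidance linked in the references: a missing or misconfigured toggle should degrade to current behaviour rather than expose an unfinished feature.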