In a bid to ensure digital teams and agencies build high-quality government services, the Digital by Default Service Standard was launched, and in June 2015 it was refined to 18 criteria. We were over the moon to see a public body take such a proactive approach to setting digital standards, an approach that is very much in line with our own process.


You will have heard us ramble on about our collaborative working approach time and time again; we are extremely proud of it, and it has helped us develop award-winning websites over the years. Our research begins at proposal stage, where we take time to understand our clients’ marketplace and competitors’ USPs, enabling us to provide suggestions and advice for the project moving forward. During our on-site discovery sessions, we become fully immersed in our clients’ business, getting to know the typical end users and stakeholders, prioritising content and wireframing key user journeys for the new site.

Digital by Default

At this stage, we develop project KPIs with the client team and evaluate the current CMS and system requirements. This allows us to assess whether there are opportunities for efficiencies or improvements to be made.

You may have heard us harp on about it before, but we’re big lovers of the agile methodology. In fact, we closed the agency for two days and put the entire company through agile training and their Professional Scrum Master I (PSM I) certifications, to ensure it runs through our core. As a result, the team is split into multi-disciplinary sub-teams on a project-by-project basis, depending on the individual skillsets and resources required. Each team is led by a digital product owner who is responsible for liaising with the client and ensuring their expectations are met.

  • Research and understand user requirements to facilitate the design of the service.
  • Put in place a multi-disciplinary team that can design and build the service, led by a suitably skilled member of the team with decision-making ability (such as our Digital Product Owners).
  • Evaluate what tools and systems will be used to build, host, operate and measure the service and how to procure them.
  • Evaluate what user data and information the digital service will be providing or storing, and address the security level, legal responsibilities, privacy issues and risks associated with the service.


Our agile process adheres to the ‘fail fast, learn fast’ mantra. We place people (both our internal team and client stakeholders) at the forefront of our projects, allowing for regular feedback and the flexibility to accommodate inevitable changes and overcome hurdles. Because the agile process works in small, regular releases (increments), risk is reduced and predictability is increased. Using agile methodology throughout the design and build stages of the project supports a closer working relationship and ensures collaboration with our client throughout.

  • Build the service using the agile, iterative and user-centred methods set out in the project specification.
  • Build products that can be iterated and improved upon frequently, and ensure your team has the capacity and technical flexibility to do so.
  • Make all source code open and reusable and publish it under appropriate licences.
  • Create a service that is simple and intuitive enough that users succeed first time.


The agile methodology dictates that we work in two-week sprints with a sprint review at the end of each, where the increments are released to our clients for review and testing. As such, we test continuously, catching any bugs or errors early and allowing quicker updates and fixes to be implemented.
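To give a flavour of what that continuous testing can look like in practice, here is a minimal sketch of an automated smoke test that might run against each increment on a staging site; the base URL and page paths are hypothetical placeholders rather than details from a real client project:

```python
# A minimal smoke test run against a staging increment after each sprint.
# The base URL and page paths below are illustrative placeholders.
import pytest
import requests

BASE_URL = "https://staging.example.com"

# Key pages on a typical user journey (hypothetical paths).
KEY_PAGES = ["/", "/services", "/contact"]

@pytest.mark.parametrize("path", KEY_PAGES)
def test_key_page_loads(path):
    """Each key page should respond successfully on the staging site."""
    response = requests.get(BASE_URL + path, timeout=10)
    assert response.status_code == 200
```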

Following the web development stage, projects are tested across multiple browsers and devices in our UA testing area. Common browsers include Chrome, Firefox, Internet Explorer, Safari and Opera, and we also test across multiple devices, such as iPhone, Android and Windows smartphones and tablets.

Additionally, any new browser versions released during the lifecycle of the project are tested before the website is launched into the live environment. The staging site is then handed over to the client for their own user testing against the original user journeys.
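As an illustration of how a key user journey might be scripted so the same checks can be repeated across browsers, here is a simplified sketch using Selenium WebDriver; the staging URL, expected title and field name are hypothetical, and our actual UA testing also involves plenty of hands-on, manual checking:

```python
# Simplified sketch: repeating one key user journey across several browsers
# with Selenium WebDriver. The URL, title and field name are placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

STAGING_URL = "https://staging.example.com"

def run_key_journey(driver):
    """Load the homepage and perform a simple search (hypothetical journey)."""
    driver.get(STAGING_URL)
    assert "Example" in driver.title
    search_box = driver.find_element(By.NAME, "q")
    search_box.send_keys("opening hours")
    search_box.submit()
    assert "search" in driver.current_url

# Run the same journey in each desktop browser installed locally.
for make_driver in (webdriver.Chrome, webdriver.Firefox, webdriver.Safari):
    driver = make_driver()
    try:
        run_key_journey(driver)
    finally:
        driver.quit()
```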

  • Be able to test the end-to-end service in an environment identical to that of the live version including on all common browsers and devices.
  • Test the service from beginning to end with the minister responsible for it.


Developing beautiful websites with great UX is what makes us tick. However, we also want our clients to see a return on investment and recognise the added value of our services. As such, we implement Google Analytics across all our sites to collect insightful user data that informs future site improvements. Additionally, for many of our client projects, we conduct six-monthly and yearly reviews to track performance and identify opportunities for growth.
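As a rough sketch of the kind of data pull that might feed one of those reviews, the snippet below queries sessions and bounce rate per page from the Google Analytics Reporting API (v4); the credentials file and view ID are placeholders, and the exact metrics we report on vary from project to project:

```python
# Sketch: pulling sessions and bounce rate per page from the
# Google Analytics Reporting API (v4) ahead of a periodic review.
# The credentials file and view ID are placeholders.
from google.oauth2 import service_account
from googleapiclient.discovery import build

SCOPES = ["https://www.googleapis.com/auth/analytics.readonly"]
credentials = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=SCOPES
)
analytics = build("analyticsreporting", "v4", credentials=credentials)

response = analytics.reports().batchGet(
    body={
        "reportRequests": [
            {
                "viewId": "XXXXXXXX",  # placeholder GA view ID
                "dateRanges": [{"startDate": "180daysAgo", "endDate": "today"}],
                "metrics": [
                    {"expression": "ga:sessions"},
                    {"expression": "ga:bounceRate"},
                ],
                "dimensions": [{"name": "ga:pagePath"}],
            }
        ]
    }
).execute()

# Print a simple per-page summary for the review document.
for row in response["reports"][0]["data"].get("rows", []):
    page = row["dimensions"][0]
    sessions, bounce_rate = row["metrics"][0]["values"]
    print(page, sessions, bounce_rate)
```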

  • Use analysis tools that collect performance data to analyse the success of the service and to translate this into features and tasks for the next phase of development.
  • Report on performance.

We have used the agile approach to great success when working with clients including EDF Energy Group, The Courtauld Institute of Art, Young Vic and the University of Southampton, and all our future projects will follow our agile process, ensuring that we practise what we preach as a “people business”. We’re happy to share more about our agile approach to projects; just drop us an email at [email protected]