Critical Design Strategy (CDS)

A structured method for evaluating visualisation designs

What is CDS?

The Critical Design Strategy (CDS) comprises three sequential stages: Overview, Detail, and Review. Each stage guides the appraiser through a structured critique of a visualisation artefact, helping them think critically, reflect deeply, and identify opportunities for design improvement.

The person conducting the critique is the appraiser—often the designer or developer of the visualisation. The subject of critique is the artefact, which might be a sketch, dashboard, interactive tool, physicalisation, or poster. It presents data, is designed for a user, and may involve contributions from multiple roles such as designer and developer.

Sometimes, a single individual performs all these roles—for example, a student may design, implement, and appraise their own visualisation using the CDS.

  1. Overview
    After preparing the artefact, the appraiser gives it a name, summarises its essence, and holistically critiques it by selecting five descriptive words from a fixed list.
  2. Detail
    The appraiser conducts a thorough critique by responding to 30 heuristic questions across six perspectives.
  3. Review
    Finally, the appraiser reflects on both the overall critique and the detailed analysis to identify the next steps.

Stage 1: Overview

The first stage helps the appraiser build a deep understanding of the challenge and context, preparing for critique by reviewing the artefact, data, and task. The appraiser then assigns a short name and summarises the essence of the design. Finally, they select five words from a set of twenty to capture their initial holistic impression.

  • Name the artefact – create a short, meaningful title (2–3 words).
  • Summarise its essence – a brief description or theme.
  • Select 5 impression words – e.g., “beautiful”, “confusing”, “useful”.
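As an illustration, the Stage 1 word selection can be checked programmatically. This is only a sketch: the twenty-word list below is hypothetical (seeded with the three example words above), as the CDS defines its own fixed vocabulary, and the function name is ours.

```python
# Hypothetical stand-in for the CDS's fixed list of twenty impression words.
WORD_LIST = {
    "beautiful", "confusing", "useful", "cluttered", "clear",
    "engaging", "dull", "innovative", "dated", "intuitive",
    "complex", "simple", "informative", "misleading", "elegant",
    "crowded", "accessible", "slow", "responsive", "memorable",
}

def select_impressions(words):
    """Validate a holistic first-impression selection: exactly five
    distinct words, all drawn from the fixed list."""
    chosen = set(words)
    if len(chosen) != 5:
        raise ValueError("Select exactly five distinct words")
    unknown = chosen - WORD_LIST
    if unknown:
        raise ValueError(f"Not in the fixed list: {sorted(unknown)}")
    return chosen

impressions = select_impressions(
    ["beautiful", "useful", "cluttered", "engaging", "simple"])
```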

Stage 2: Detail

This stage involves a comprehensive critique across 30 heuristic questions, structured into six key perspectives: User, Environment, Interface, Components, Design, and Visual Marks.

Each question is scored on a Likert scale (from -2 to +2) to represent quality or suitability, and serves as both a reflection guide and evaluation tool. Responses can be compiled into a report or summary of findings.
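The scoring bookkeeping described above can be sketched in code. The six perspective names come from the CDS itself, but the assumption of five questions per perspective (30 questions split evenly across six perspectives) and the helper names are ours.

```python
# Illustrative sketch of recording Stage 2 responses on the -2..+2 scale.
PERSPECTIVES = ["User", "Environment", "Interface",
                "Components", "Design", "Visual Marks"]
VALID_SCORES = {-2, -1, 0, 1, 2}
QUESTIONS_PER_PERSPECTIVE = 5  # assumption: 30 questions / 6 perspectives

def record_response(responses, perspective, question, score, note=""):
    """Store one Likert score, with an optional free-text note that can
    later be compiled into a report or summary of findings."""
    if perspective not in PERSPECTIVES:
        raise ValueError(f"Unknown perspective: {perspective}")
    if score not in VALID_SCORES:
        raise ValueError("Score must be an integer between -2 and +2")
    responses.setdefault(perspective, {})[question] = (score, note)
    return responses

responses = {}
record_response(responses, "User", 1, 2, "Audience clearly identified")
record_response(responses, "Visual Marks", 3, -1, "Overplotting in scatter view")
```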

Stage 3: Review

The final stage focuses on synthesis and reflection. The appraiser calculates an overall score (e.g., average of Likert scores), revisits initial impressions, and evaluates strengths and weaknesses across the six perspectives.

Based on this reflection, the appraiser identifies next steps – such as redesigning, refining layout or interface, or planning usability testing or user feedback. This stage ensures insights are actionable and valuable for improving the artefact.
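The synthesis in this stage can be illustrated with a short sketch: averaging the Likert scores per perspective and overall, then ranking perspectives from weakest to strongest to suggest where the next steps should focus. The scores below are invented for illustration, not real CDS results.

```python
from statistics import mean

# Illustrative Stage 2 scores: five responses per perspective, -2..+2.
scores = {
    "User":         [2, 1, 2, 1, 2],
    "Environment":  [1, 0, 1, 1, 0],
    "Interface":    [0, -1, 1, 0, 0],
    "Components":   [1, 1, 0, 1, 1],
    "Design":       [2, 1, 1, 2, 1],
    "Visual Marks": [-1, -2, 0, -1, 0],
}

# Average per perspective, plus an overall score across all 30 responses.
per_perspective = {p: mean(s) for p, s in scores.items()}
overall = mean(v for s in scores.values() for v in s)

# Rank perspectives from weakest to strongest to prioritise redesign work.
priorities = sorted(per_perspective, key=per_perspective.get)
print(f"Overall score: {overall:.2f}")
print("Focus first on:", priorities[:2])
```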

Using the CDS in the Classroom

The Critical Design Strategy (CDS) is more than a critique checklist – it is a structured thinking aid designed to help learners engage deeply with their own and others’ visualisation work. In teaching, it can be used both as a formative exercise during project development and as a summative reflection at the end of a task.

Instructors can integrate the CDS into lectures, workshops, and peer review sessions, using it to guide discussion and encourage constructive critique. Over time, students develop a more nuanced understanding of design trade-offs, gain confidence in articulating feedback, and learn to make more deliberate design decisions.

Using the CDS in Assessments

Students can apply the CDS either as part of ongoing project work or as a formal assessment. One possible approach is to structure the assessment in two stages.

In the first stage, students select their own dataset, analyse it, and conduct a design study, for example using the Five Design Sheets (FdS) method. They then apply the CDS to their initial designs to evaluate their suitability, identify strengths, and highlight potential weaknesses.

In the second stage, students develop and implement an appropriate visualisation solution for their data. They perform a self-evaluation of their final output using the CDS, and present their findings as a technical report. This report should include the CDS results, organised into the three main stages, with detailed discussion across the six perspectives of the second stage.

Using the CDS in Project Work

Students can also use the CDS as a structured framework within larger project work, particularly when developing a new visualisation tool or artefact.

In a project-based context, the CDS can be applied throughout the design and development process — from the initial concept through to the final evaluation. A typical workflow might proceed as follows:

  1. Initial Concept and Data Understanding – The student begins by identifying the problem space, defining the intended audience, and selecting or sourcing relevant data. This includes examining the dataset’s composition, structure, and variables, and clarifying the intended tasks the visualisation should support. Contextual understanding is crucial — such as the creator’s intent, the environment in which the tool will be used, and the constraints of the medium.
  2. Design Study and Prototyping – Before committing to a final solution, the student develops design ideas — potentially using structured methods such as the Five Design Sheets (FdS) approach. This stage encourages exploration of multiple ideas, consideration of different interaction paradigms, and reflection on design trade-offs before coding begins.
  3. Initial CDS Evaluation – Once early designs are prepared, the student applies the CDS to evaluate them holistically. In Stage 1 (Overview), they name the design, summarise its essence, and select five semantic differential words to capture their first impressions. In Stage 2 (Detail), they respond to the 30 heuristic questions across six perspectives, identifying potential strengths and weaknesses in usability, aesthetics, data representation, task alignment, and more. This early critique provides guidance for refinement before significant development effort is invested.
  4. Implementation – With insights from the initial critique, the student implements the visualisation tool — coding interactive features, integrating datasets, and ensuring alignment with the intended user tasks and context of use.
  5. Final CDS Evaluation and Reflection – After the artefact is complete, the student applies the CDS again to evaluate the finished product. This includes reviewing the scores and observations from each perspective, comparing them with the initial evaluation, and identifying where improvements have been made or new issues have emerged.
  6. Reporting – The student writes up their process in a technical or research-style report. The report includes:
    • A description of the dataset, design process, and implementation decisions.
    • CDS results from both evaluations, structured by the three main stages and with depth across the six perspectives in Stage 2.
    • Reflections on the overall design process, key learnings, and proposed future improvements.
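The comparison in step 5 can be sketched as a simple score diff between the two evaluations, showing per perspective where the design improved or regressed. The perspective subset and scores below are illustrative only.

```python
from statistics import mean

# Illustrative initial and final Stage 2 scores for two perspectives.
initial = {"User": [0, 1, 0, 1, 0], "Interface": [-1, 0, -1, 0, 0]}
final   = {"User": [2, 1, 1, 2, 1], "Interface": [0, 1, 0, 1, 0]}

# Per-perspective change in average score between the two evaluations.
deltas = {p: mean(final[p]) - mean(initial[p]) for p in initial}

for perspective, delta in deltas.items():
    verdict = ("improved" if delta > 0
               else "regressed" if delta < 0
               else "unchanged")
    print(f"{perspective}: {delta:+.2f} ({verdict})")
```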

Using the CDS in this way embeds critical thinking throughout the project lifecycle. Rather than serving as a one-off evaluation tool, it becomes a mechanism for continuous design reflection — helping students identify strengths, uncover weaknesses, and make informed decisions that lead to more effective, user-focused visualisations.

Using the CDS in a Commercial Setting

In industry, the CDS can be used as part of a professional design and development workflow to evaluate visualisation products before deployment. This is especially valuable in settings where the visualisation is a core component of a business tool — for example, dashboards for business intelligence, interactive analytics platforms, or data-driven client presentations.

A commercial project might proceed as follows:

  1. Project Brief and Data Context – The client or internal team defines the objectives of the visualisation. This includes clarifying the business problem being addressed, the decisions the visualisation will support, the types of data available, and any compliance or privacy considerations.
  2. Design and Concept Proposals – Designers and developers create multiple visualisation concepts, exploring different layouts, interaction models, and visual encodings to address the brief. The CDS is applied at this stage to identify potential usability or clarity issues before development begins.
  3. Prototype Evaluation – Early prototypes are tested internally or with stakeholders. The CDS is used to capture structured feedback across the six perspectives, ensuring that both the functional requirements and the client’s brand/aesthetic expectations are met.
  4. Implementation and Pre-Launch Review – Once a near-final version is developed, the CDS is applied again to ensure no critical issues have been overlooked. This stage often highlights small refinements — such as adjusting colour schemes for accessibility, improving interactivity, or refining data labelling.
  5. Delivery and Continuous Improvement – After launch, the CDS can be used periodically (e.g., quarterly reviews) to evaluate the tool’s ongoing performance, especially if datasets, user needs, or business requirements evolve. This supports a cycle of continuous improvement and ensures that the visualisation remains effective and relevant over time.

By integrating the CDS into commercial workflows, organisations can ensure that their visualisation products are not only technically functional but also strategically aligned with business goals, easy to interpret, and engaging for their intended audience.