Akrivia Health is an Oxford University spin-off that operates a mental health research platform built on more than four billion clinical data points collected over seven years. The healthcare data platform aggregates structured fields, longitudinal assessments, medication records and free-text notes from mental health services. It is used for clinical research by NHS teams, academic groups and pharmaceutical partners who need to work with real-world patient records at scale.
This project is part of our continued work in healthcare data platforms and clinical research software, where evidence-based UX, data governance requirements and analytical workflow design shape interfaces for sensitive medical applications.
The project was to design the core user experience for this clinical research software. The interface had to support advanced healthcare analytics while remaining usable for clinicians and researchers who do not consider themselves data specialists. At the same time, the medical software UX had to respect data governance, ethics and audit requirements around sensitive clinical data.
For product leaders, the goal was not only usability but research reliability. Teams needed a system where they could define complex cohorts, return to them months later and understand exactly how each was constructed. The platform therefore had to combine mental health expertise, healthcare UX design and a robust model of provenance in a single application.
We applied Dynamic Systems Design, a method that grows solutions through embedded experimentation, resolves tensions between local optimization and system coherence, and stewards implementation until organizations gain independence.
Academic Literature Review
Information Architecture
Option Space Mapping
Cohort Builder Design
Interactive Prototyping
Usability Testing
Data Visualization Architecture
Governance Model Design
UI Design
Design System
Engineering Alignment
Implementation Partnership
Before defining screens, the team reviewed the academic literature on electronic health records and healthcare analytics. Across thirty-two papers, including several from journals such as the Journal of Biomedical Informatics, eight studies were identified as directly relevant to interface decisions. These eight studies analysed how clinicians and researchers search within EHR systems, how often they lose context during long sessions and where EHR interface design fails to make provenance visible.
These studies described concrete behaviours. Users often move back and forth between structured clinical data and narrative notes. They rely on temporal patterns in the patient record but lose track of what filters are active. When queries are refined repeatedly, the history of decisions becomes opaque, which undermines reproducibility. Clinical data is technically rich but cognitively fragile.
The findings were translated into requirements for the medical research software. The healthcare data platform needed clear provenance cues, visible query history and a stable view of what patient data was currently in scope. EHR interface design principles from the literature were used as constraints rather than decoration. The platform had to help users understand where they were in the data and how they arrived there.
Interviews and prior research showed that cohort construction is the central task in this kind of clinical research software. A typical study might look for adults diagnosed with major depression between 2016 and 2020, who received a specific antidepressant class, showed a Hamilton score above a threshold, had no recorded bipolar diagnosis and experienced symptom relapse after dose changes. This is one query, but in practice it is refined many times.
The query builder in the healthcare data platform therefore had to support up to eight nested levels of logic without losing readability. Conditions combine diagnostic codes, medication sequences, rating scale scores, service use patterns and free text markers. In healthcare UX design terms, this is not a simple filter bar but a visual model of analytical reasoning.
To support both data scientists and non-technical researchers, the interface keeps the structure of each cohort visible at all times. Logical blocks can be grouped, reordered and duplicated as hypotheses evolve. Patient data analytics becomes an explicit chain of decisions rather than a black box. This visibility allows researchers, supervisors and governance teams to audit cohorts and confirm that they match the intended inclusion and exclusion criteria.
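One way to picture this nested query model is as a small tree of logical blocks. The sketch below is purely illustrative: the class names, fields and clinical values are assumptions for this article, not the platform's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Condition:
    """A single clinical criterion, e.g. a diagnosis code or score threshold."""
    field_name: str
    operator: str
    value: object

@dataclass
class LogicBlock:
    """Conditions or sub-blocks joined by AND/OR, optionally negated (exclusion)."""
    combinator: str                               # "AND" or "OR"
    negated: bool = False
    children: list = field(default_factory=list)  # Conditions and/or LogicBlocks

    def depth(self) -> int:
        """Nesting depth of this block, used to keep queries readable."""
        child_depths = [c.depth() for c in self.children if isinstance(c, LogicBlock)]
        return 1 + (max(child_depths) if child_depths else 0)

# The example study from the text, expressed as explicit nested blocks
# (toy values, not real codes):
cohort = LogicBlock("AND", children=[
    Condition("diagnosis", "=", "major depressive disorder"),
    Condition("diagnosis_date", "between", ("2016-01-01", "2020-12-31")),
    Condition("medication_class", "=", "specific antidepressant class"),
    Condition("hamilton_score", ">", 17),
    LogicBlock("AND", negated=True, children=[
        Condition("diagnosis", "=", "bipolar disorder"),
    ]),
    LogicBlock("AND", children=[
        Condition("event", "=", "dose change"),
        Condition("followed_by", "=", "symptom relapse"),
    ]),
])

print(cohort.depth())  # → 2
```

Because every inclusion and exclusion rule is an explicit node in the tree, the whole cohort definition can be rendered, audited and edited block by block rather than hidden inside a query string.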
Through Sandbox Experiments, a two-week discovery phase combined qualitative research and task analysis with users from three environments. Fourteen individual interviews and three focus groups brought together twenty-four participants, including NHS analysts, academic researchers and pharmaceutical research staff. Each group worked within different institutional constraints and approval processes, but all needed to perform patient data analytics on the same mental health datasets.
Academic teams described lengthy ethics and data access approvals before they could even log into clinical research software that touched real patient records. Pharma teams had more room for early exploration but faced strict reporting and audit obligations later in the project. NHS analysts used similar tools for service evaluation and needed clear boundaries between research and operational use. These realities shaped the design more than any generic persona description.
Task analysis mapped the sequence of actions in a full study journey, from initial idea to final extraction. The research confirmed that confusion often appears during handovers between people or between stages of governance. This insight led to a strong focus on workflow continuity and clear states, so that the same healthcare data platform could support very different approval paths without fragmenting the experience.
To understand the baseline for clinical research software, nine commercial tools were benchmarked in depth. These were not academic prototypes but real healthcare analytics products used in hospitals, research institutes and industry. The evaluation looked at query builders, EHR interface design, workspace models, audit trails and how each system exposed the logic of patient cohort selection.
Several recurring problems emerged. Some tools showed only the final result of a query, leaving users unsure which conditions were actually applied. Others forced researchers into fixed-step procedures that did not map to the way mental health studies evolve over time. Provenance was often hidden behind technical logs rather than presented as part of the user experience. Even where functionality was rich, the medical software UX made it hard to trust the outcome.
The benchmark did not simply criticise competitors. It clarified which patterns users already knew, such as familiar filter controls, and which structural issues had to be avoided. The Akrivia platform was positioned as a healthcare data platform that exposes the reasoning behind results and respects the cognitive and regulatory burdens of mental health research, rather than following generic business analytics conventions.
Based on research and benchmarking, five distinct interaction models for cohort building were proposed through option space mapping. One behaved like a wizard, guiding users through sequential steps. Another presented the query as nested blocks of logic. A third organised conditions around the timeline of the patient record. The remaining two emphasised reuse of cohort fragments and side-by-side comparison of variants. Each represented a different hypothesis about how clinical researchers think.
These models went through six design cycles with increasing fidelity, from wireframes to interactive prototypes. Eight usability sessions with NHS, academic and pharma users tested realistic tasks, such as building a treatment resistant depression cohort or adjusting an existing cohort to new inclusion criteria. Participants were observed as they tried to understand past decisions, modify conditions and explain their logic to a colleague.
The final query builder in the clinical research software is a convergence of these experiments. It retains the readability of the nested model, borrows temporal cues from the timeline model and incorporates fragments that can be reused across projects. In healthcare UX design terms, it offers freedom to explore without sacrificing traceability, which is critical for governance and for scientific review.
Beyond cohort selection, the platform had to support analysis of clinical data inside the same environment. The healthcare data platform integrates modules for descriptive statistics, correlation exploration and comparative views between cohorts. Researchers can inspect distributions of key measures, follow outcome trajectories and compare treatment responses without exporting data prematurely to external tools.
Visualisation follows a clear grammar tailored to medical research software. Time-based charts help teams see how symptom scores evolve before and after treatment changes. Comparison views show differences in medication patterns or service use between cohorts. These views are not decorative dashboards but instruments for clinical reasoning. They are designed so that a statistician, a psychiatrist and a data governance officer can all understand what is being shown.
By embedding these analytics modules, the platform reduces the number of tools needed for patient data analytics. It also keeps more of the analytical journey inside an environment designed for data security, provenance and NHS governance. For many teams, this is as important as the visual design itself.
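As a rough illustration of what "descriptive statistics between cohorts" means in practice, the sketch below compares a rating-scale measure across two cohorts. The cohort names and scores are invented sample data, and the summary fields are assumptions, not the platform's actual analytics API.

```python
import statistics

def describe(scores):
    """Minimal descriptive summary of one cohort's scores."""
    return {
        "n": len(scores),
        "mean": round(statistics.mean(scores), 1),
        "median": statistics.median(scores),
        "stdev": round(statistics.stdev(scores), 1),
    }

# Hamilton scores after a treatment change, per cohort (toy numbers)
cohorts = {
    "medication_switch": [14, 11, 18, 9, 13, 16, 10],
    "dose_increase": [19, 22, 15, 20, 17, 21, 18],
}

for name, scores in cohorts.items():
    print(name, describe(scores))
```

Keeping even simple summaries like this inside the governed environment means the numbers a team reports can always be traced back to a specific, auditable cohort definition.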
Because Akrivia serves multiple institutions, the platform had to behave as a multi-team healthcare data system rather than a single project tool. Workspaces, projects and permission levels were defined so that NHS trusts, academic groups and pharma partners could share the same clinical research software without blurring governance boundaries. Each study sits inside a clearly delimited context with its own approval and data access rules.
Data governance officers were involved in shaping the model for access requests, approvals and auditing. The interface makes it clear which datasets a user can see, what role they have and which actions are permitted at any given time. This is essential for GDPR compliance around sensitive health data. Healthcare UX design here is not about convenience but about preventing inappropriate access without the need to memorise complex policy documents.
The platform also maintains an explicit audit trail of analytical actions, so that governance teams can review how a cohort was constructed and how clinical data was used. This reduces the burden of compliance reporting and gives institutions more confidence when opening their datasets to wider research use.
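The combination of role-based permissions and an append-only record of analytical actions can be sketched as follows. The roles, action names and log fields here are assumptions made for illustration, not the platform's actual governance model.

```python
import datetime

# Append-only log of analytical actions, reviewable by governance teams
AUDIT_LOG = []

# Hypothetical role-to-permission mapping
ROLE_PERMISSIONS = {
    "researcher": {"build_cohort", "run_analysis"},
    "governance_officer": {"review_audit"},
}

def record_action(user, role, action, detail):
    """Check the role's permissions, then append the action to the audit trail."""
    if action not in ROLE_PERMISSIONS.get(role, set()):
        raise PermissionError(f"role '{role}' may not perform '{action}'")
    AUDIT_LOG.append({
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "role": role,
        "action": action,
        "detail": detail,
    })

record_action("j.smith", "researcher", "build_cohort",
              "added condition: hamilton_score > 17")
record_action("j.smith", "researcher", "run_analysis",
              "descriptive statistics on cohort")

# Governance review: reconstruct how the cohort was built, step by step
for entry in AUDIT_LOG:
    print(entry["timestamp"], entry["user"], entry["action"], "-", entry["detail"])
```

The point of the sketch is that permission checks and audit logging happen in the same place, so no analytical action can reach the data without leaving a reviewable trace.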
The visual system for the Akrivia platform was treated as a piece of healthcare UX design in its own right. Most screens present a neutral, quiet surface for concentrated work with clinical data. Typographic hierarchy is clear, helping users distinguish between structure, content and controls without conscious effort. Interaction patterns are consistent across modules so that researchers can transfer understanding from cohort building to analytics and workspace management.
Colour is used sparingly and with defined meaning. In the query builder, it separates logical groups and highlights active conditions. In analytics views, it corresponds to cohorts or outcome states rather than decorative palettes. The result is a clinical interface design that stays readable over long sessions, supports supervision and review, and does not compete with the content.
For medical software UX, this restraint is a strategic choice. The environment must feel reliable to NHS staff, academics and pharma researchers who rely on the application for serious decisions. The design language supports that trust by favouring clarity, consistency and legibility over expressive visual effects.
From the beginning, designers and engineers treated the Akrivia platform as long-lived healthcare software, not a short-term prototype. The product is a web-based clinical research platform that must integrate with existing data pipelines and operational systems. Technical workshops at the start of the project clarified constraints around performance, security and deployment, so that interaction models did not conflict with architectural realities.
In parallel, a design system was created to support the implementation and future roadmap. It defines components for query blocks, patient record views, analytics panels, workspace management and navigation, each with precise behaviour rules and states. For developers, this library acts as a contract. It links healthcare UX design decisions with concrete implementation details in a form that is stable over time.
During build, the design team stayed involved to answer questions, adjust patterns where engineering uncovered edge cases and ensure that the clinical research software behaved as intended in real environments. This avoided the usual gap between concept and production and gave Akrivia a foundation for several years of product evolution.
At the end of discovery, Akrivia and the design team agreed on a clear scope for the first release of the healthcare data platform. The initial interactive prototype of the clinical research software was delivered four weeks later, allowing stakeholders to test real workflows with real mental health data. The full interaction design and design system for the alpha release followed over the next two months.
Because engineering had been involved from the start, implementation of the core features stayed on schedule and within the agreed scope. The design system now supports further work on analytics modules, new mental health datasets and future NHS research projects without requiring a fresh redesign. For product managers, this reduces the cost and risk of extending the application.
Most importantly, researchers now work in a system that makes their analytical logic visible and auditable. Cohorts can be reconstructed and reviewed. Governance teams see how sensitive patient data is used.
The organization gained intangible resources: judgment about what matters in mental health data analysis, shared product intuition about how clinical research platforms should expose reasoning and maintain provenance, and reasoning capability that allows teams to extend analytics features without fragmenting the governance model. The system maintains competitive position by making research reproducible and auditable, while competitors who prioritize visual sophistication over analytical traceability struggle to serve institutions working under strict data governance and scientific review requirements.
The Akrivia platform has become a piece of clinical research software that reflects the realities of mental health research, rather than asking researchers to adapt to generic business tools.
First clickable prototype delivered in 4 weeks
Alpha release design delivered in 2 months
Seamless handover to the engineering team
Complete design system delivered for the long-term vision
No deadline missed over 3 months