Codix is a video analysis platform designed to support continuous video coding using the Creative Sense-Making (CSM) and Observable Creative Sense-Making (OCSM) cognitive frameworks. Built for researchers and practitioners who need a rigorous way to capture interaction dynamics as they unfold, Codix makes it possible to translate rich qualitative observation into structured, quantitative data—without losing the temporal nuance of real behavior.
Developed by Nicholas Davis, PhD at Co-Creative AI Consulting, Codix enables users to code participant behavior and interaction patterns across a wide range of research contexts, including experimental studies, user studies, and ethnographic fieldwork. It is especially well suited to improvisational and temporally fluid domains such as dance, drawing, pretend play, and music, where meaning emerges through moment-to-moment coordination rather than static outcomes. Because Codix supports multiple coding schemes and allows schemes to be edited, it enables flexible, theory-driven analysis while remaining compatible with systematic data collection.
Codix is simple to use while remaining analytically powerful. The user begins by uploading a video, selecting either the CSM or OCSM framework, and pressing play to start the coding session. Coding is performed continuously using a slider: as the user adjusts the slider value, Codix highlights the corresponding selection in the coding scheme table (displayed at the top right), making it immediately clear which code is currently active. Once the session is underway, Codix records the selected code once per second, producing a continuous time series that reflects how the interaction evolves over time.
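The once-per-second recording described above can be sketched as a simple resampling step: take the stream of slider changes and hold the most recent value at each whole second. This is an illustrative sketch, not Codix's actual implementation; the function name and event representation are assumptions.

```python
from bisect import bisect_right

def sample_codes(slider_events, duration_s):
    """Resample a stream of slider changes into a 1 Hz code time series.

    slider_events: list of (time_s, code) pairs, sorted by time, one per
    slider adjustment. The active code at second t is the most recent
    value set at or before t; before the first adjustment, no code is
    active yet (None).
    """
    times = [t for t, _ in slider_events]
    series = []
    for t in range(duration_s):
        i = bisect_right(times, t)  # index of the last change at or before t
        series.append(slider_events[i - 1][1] if i > 0 else None)
    return series

# For example, moving the slider to 2 at t=0 and to 5 at t=3 over a
# 6-second clip yields [2, 2, 2, 5, 5, 5].
```

Holding the last value (rather than interpolating) matches the continuous-coding idea: a code stays active until the coder moves the slider again.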
As coding proceeds, Codix dynamically updates a set of visualizations that make the structure of the session legible in real time. The code application chart beneath the video displays coded values through time, allowing users to see shifts, transitions, and periods of stability at a glance. Alongside it, the code count display accumulates the total number of times each code has been applied across the session, providing an immediate summary of distribution and emphasis. Codix also renders a Creative Sense-Making curve, computed as the cumulative sum of code scores through time, offering a compact view of how the overall trajectory of sense-making develops across the video.
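Given the 1 Hz series, the two summary views described above reduce to straightforward aggregations: a per-code tally for the code count display, and a running cumulative sum of code scores for the Creative Sense-Making curve. A minimal sketch, assuming codes are numeric scores and uncoded seconds contribute zero (both assumptions, not documented Codix behavior):

```python
from collections import Counter
from itertools import accumulate

def code_counts(series):
    """Total number of seconds each code was active across the session."""
    return Counter(c for c in series if c is not None)

def csm_curve(series):
    """Creative Sense-Making curve: cumulative sum of code scores over time."""
    return list(accumulate(0 if c is None else c for c in series))

# A session coded [2, 2, 5] gives counts {2: 2, 5: 1} and a curve [2, 4, 9].
```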
In addition to continuous coding, Codix supports event-based annotation through an actions timeline. Users can add actions during review, and these events appear on a timeline in the bottom right of the interface. This feature is useful for capturing qualitative events that matter to interpretation—key moments, turns, disruptions, breakthroughs, or thematic markers—alongside the continuous coding stream. In practice, this supports mixed-method workflows where continuous coding captures regulatory dynamics and interaction flow, while event annotations help identify categories and themes that may later support Grounded Theory or Thematic Analysis.
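The actions timeline amounts to a second, event-based data stream alongside the continuous one: timestamped labels that can later be queried by interval when identifying themes. A hypothetical sketch of such a structure (class and method names are illustrative, not Codix's API):

```python
from dataclasses import dataclass, field

@dataclass
class ActionEvent:
    time_s: float  # position on the session timeline, in seconds
    label: str     # qualitative marker, e.g. "breakthrough" or "turn"

@dataclass
class ActionTimeline:
    events: list = field(default_factory=list)

    def add_action(self, time_s, label):
        """Record an event and keep the timeline in chronological order."""
        self.events.append(ActionEvent(time_s, label))
        self.events.sort(key=lambda e: e.time_s)

    def events_between(self, start_s, end_s):
        """Return events in [start_s, end_s), e.g. around a coding shift."""
        return [e for e in self.events if start_s <= e.time_s < end_s]
```

Keeping events separate from the 1 Hz series lets the two streams be joined later, for instance to ask which annotated moments coincide with shifts in the coded values.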
References to the CSM and OCSM coding techniques are provided directly within the Codix interface (bottom left) for users who want to ground their analysis in the underlying methodological literature.
Davis, N., Hsiao, C. P., Singh, K. Y., Lin, B., & Magerko, B. (2017, June). Creative sense-making: Quantifying interaction dynamics in co-creation. In Proceedings of the 2017 ACM SIGCHI Conference on Creativity and Cognition (pp. 356-366).
Davis, N. (2024). Creative Sense-Making: A Cognitive Framework for Quantifying Co-Creative AI. To appear in F. Tigre-Moura (Ed.), AI, Co-Creativity, Creativity. Routledge.
Deshpande, M., Trajkova, M., Knowlton, A., & Magerko, B. (2023, June). Observable Creative Sense-Making (OCSM): A Method For Quantifying Improvisational Co-Creative Interaction. In Proceedings of the 15th Conference on Creativity and Cognition (pp. 103-115).