
This semester, my research is not just about design itself; it’s about how the brain processes what we see and how we can actually visualize that data. Do people visually engage with design differently?
Since a lot of you might not be familiar with my research, I thought it would be nice to do a quick catch-up. Last semester I spent my time deepening my understanding of neurodesign. Neurodesign sits at the intersection of design, cognitive science, and neuroscience. So instead of evaluating design purely through aesthetics or intention, we as designers ask:
What happens in the brain when we experience design?
This includes processes such as attention, perception, and decision-making, all of which influence how visual information is interpreted. Importantly, these processes are largely automatic and unconscious (Posner, 1980). This means that what we think we see and what we actually process can differ significantly. For designers, this could create a shift: from designing based on intuition to designing based on measurable cognitive responses. In a world where artificial intelligence becomes more and more advanced, diving deeper into human cognitive responses could mean immense progress for designers, helping make designs more relatable. Users being able to experience designs that feel tailored specifically to them could mean a new way of connecting.
In neurodesign research, perception is understood as context-dependent and shaped by prior knowledge (Eisma, Eijssen & de Winter, 2022). This directly connects to my central comparison: designers (trained visual literacy, pattern recognition) versus non-designers (intuitive, less structured viewing behaviour). Research suggests that expertise fundamentally alters how visual information is processed (Lohmeyer et al., 2014). Designers often scan strategically, while non-experts rely more on visual salience.
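To make this comparison concrete, here is a minimal sketch of how such viewing behaviour could be compared once fixation data has been exported from an eye tracker. All names, coordinates, and durations below are hypothetical and purely illustrative, not data from my study.

# Minimal sketch: comparing two hypothetical groups of eye-tracking data.
# Assumes fixations are already exported as (x, y, duration_ms) tuples per
# participant; the numbers below are made up for illustration only.
import statistics

fixations = {
    "designers": [
        [(120, 80, 210), (400, 95, 180), (640, 300, 240)],   # participant D1
        [(130, 70, 190), (410, 110, 200), (650, 310, 220)],  # participant D2
    ],
    "non_designers": [
        [(300, 250, 420), (310, 260, 390)],                   # participant N1
        [(280, 240, 450), (500, 400, 380), (505, 410, 360)],  # participant N2
    ],
}

def mean_fixation_duration(participants):
    """Average fixation duration (ms) across all fixations in a group."""
    durations = [d for p in participants for (_, _, d) in p]
    return statistics.mean(durations)

def mean_saccade_amplitude(participants):
    """Average pixel distance between consecutive fixations (rough scan span)."""
    amplitudes = []
    for p in participants:
        for (x1, y1, _), (x2, y2, _) in zip(p, p[1:]):
            amplitudes.append(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5)
    return statistics.mean(amplitudes)

for group, data in fixations.items():
    print(group,
          f"mean fixation: {mean_fixation_duration(data):.0f} ms,",
          f"mean saccade: {mean_saccade_amplitude(data):.0f} px")

Longer fixations and shorter saccades serve here only as a rough proxy for less structured viewing; a real analysis would use proper eye-tracking toolkits and far more participants.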
Attention, Cognitive Load, and Ignored Design
Another important concept within neurodesign is cognitive load. As we all know by now, the brain has limited processing capacity, which means not all visual information receives equal attention. When designs become too complex, users may engage in selective attention, ignoring parts of the visual field entirely (Spinks & Mortimer, 2016). At the same time, research shows that guiding attention (through hierarchy, contrast, or motion) can significantly improve comprehension (Rodemer et al., 2022).
More design ≠ more understanding
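One way to see this selective attention in data is to turn raw fixation points into a smoothed attention map and check which regions of a layout are effectively never looked at. A minimal sketch, assuming fixation coordinates from an eye tracker; the screen size, blur width, and threshold are arbitrary illustrative choices:

# Minimal sketch: turning fixation points into a smoothed attention map and
# flagging regions that received (almost) no attention. Screen size, fixation
# coordinates, and the threshold are all hypothetical.
import numpy as np
from scipy.ndimage import gaussian_filter

WIDTH, HEIGHT = 800, 600
fixations = [(120, 80), (400, 95), (640, 300), (410, 110)]  # (x, y) in pixels

# Accumulate fixation counts on a pixel grid, then blur to approximate the
# spread of foveal attention around each fixation point.
density = np.zeros((HEIGHT, WIDTH))
for x, y in fixations:
    density[y, x] += 1
attention_map = gaussian_filter(density, sigma=40)
attention_map /= attention_map.max()

# Anything below a small fraction of peak attention counts as "ignored".
ignored = attention_map < 0.05
print(f"Share of the layout that was effectively ignored: {ignored.mean():.0%}")

The regions flagged as ignored are exactly the parts of a design that, however carefully crafted, never enter the viewer’s processing at all.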
However, this raises an important question: if attention can be guided, to what extent can perception actually be controlled? While design strategies such as hierarchy and contrast allow designers to direct visual flow, they do not guarantee uniform interpretation. Individual differences such as prior experience, cultural background, and emotional state continue to influence how information is processed. This suggests that design operates within a space of probability rather than certainty. Designers can increase the likelihood that specific elements are noticed or understood, but they cannot fully determine how a visual message will be received.
This becomes particularly relevant when considering the role of artificial intelligence in contemporary design processes. AI systems are highly effective at optimizing visual output based on existing data patterns. They can predict where users are likely to look, which compositions perform best, and how to structure information for maximum clarity. In this sense, AI aligns closely with principles of cognitive efficiency, often reducing cognitive load by streamlining visual complexity. However, optimizing for efficiency does not necessarily equate to optimizing for experience.
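To illustrate what such prediction looks like mechanically, here is a small sketch of a classical bottom-up saliency model (the spectral residual approach, often attributed to Hou & Zhang, 2007), used only as a simple stand-in for the trained gaze-prediction models that real AI tools rely on. The toy layout and all parameters are made up for illustration:

# Minimal sketch of a bottom-up saliency map (spectral residual method) as a
# simple stand-in for "predicting where people look". Production systems would
# use trained gaze-prediction models; this only illustrates the idea on a
# grayscale image array.
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def spectral_residual_saliency(gray):
    """gray: 2D float array (a grayscale image). Returns a normalized saliency map."""
    spectrum = np.fft.fft2(gray)
    log_amplitude = np.log(np.abs(spectrum) + 1e-8)
    phase = np.angle(spectrum)
    # The "residual" is what remains after removing the smooth, expected part
    # of the log spectrum; statistically unusual structure stands out.
    residual = log_amplitude - uniform_filter(log_amplitude, size=3)
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    saliency = gaussian_filter(saliency, sigma=8)
    return saliency / saliency.max()

# Toy example: a flat layout with one high-contrast square "element".
layout = np.full((200, 300), 0.5)
layout[80:120, 200:240] = 1.0
saliency = spectral_residual_saliency(layout)
peak_y, peak_x = np.unravel_index(np.argmax(saliency), saliency.shape)
print(f"Predicted attention hotspot around (x={peak_x}, y={peak_y})")

The prediction here emerges purely from statistical regularities in the image, which is exactly the pattern-based logic discussed below.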
From a neurodesign perspective, engagement is not solely driven by clarity or ease of processing. Elements such as ambiguity, surprise, and even minor inconsistencies can capture attention and sustain interest. These factors introduce a level of cognitive tension, encouraging deeper processing rather than immediate recognition. While AI tends to minimize such irregularities in favor of optimized outcomes, human designers may intentionally incorporate them as part of a more nuanced design strategy. This highlights a fundamental distinction: AI operates primarily through pattern recognition and prediction, whereas human designers integrate interpretation, intuition, and contextual awareness. As a result, the integration of AI into design workflows does not eliminate the need for human input, but rather shifts its focus. Designers are no longer only responsible for producing visual outcomes, but increasingly for evaluating, selecting, and contextualizing them.
In this evolving landscape, understanding cognitive processes becomes even more critical. By grounding design decisions in knowledge about perception, attention, and cognitive load, designers can engage more deliberately with both human users and computational systems. This creates the potential for a hybrid approach, where AI supports efficiency and scalability, while human designers maintain responsibility for meaning, relevance, and experiential quality.
Ultimately, neurodesign does not seek to replace intuition with data, but to expand it. By making cognitive processes more visible and measurable, it allows designers to reflect on their decisions in new ways, bridging the gap between subjective experience and objective analysis. In this sense, the future of design may not lie in choosing between human or machine-driven approaches, but in understanding how both can operate together within the same cognitive and perceptual frameworks that shape how we see, interpret, and connect with the world.
Sources:
Eisma, Y.B., Eijssen, D. & de Winter, J.C.F. (2022) What attracts the driver’s eye attention as a function of task and environment. Information (Switzerland), 13(7).
Lohmeyer, Q., Matthiesen, S. & Meboldt, M. (2014) Task-dependent visual behaviour of engineering designers – an eye-tracking experiment. DESIGN Conference.
Posner, M.I. (1980) Orienting of attention. Quarterly Journal of Experimental Psychology, 32(1), pp. 3–25.
Rodemer, M. et al. (2022) Dynamic signals in instructional videos support students to navigate through complex representations. Applied Cognitive Psychology.
Scene Grammar Lab (2023) Eye-tracking research overview.
Spinks, J. & Mortimer, D. (2016) Lost in the crowd? Using eye-tracking to investigate information processing in choice experiments. BMC Medical Informatics and Decision Making.
