CASE STUDY
Perception
Translating emotion into color with AI
PRODUCT TYPE
0→1 product
INTELLECTUAL PROPERTY
3 patent applications (published)
AI FOCUS AREA
Color Intelligence
KEY CAPABILITY
Emotion → Color

What is Perception?
Perception was built to translate feelings directly into color (and vice versa). Instead of relying on traditional color harmony rules, this AI tool uses color psychology and machine learning to generate palettes from words, images, or existing palettes. It gives you options ranked by their emotional weight, so every design choice starts from a more meaningful and intuitive place.
01
My Role
I originated the concept for Perception and helped shape it from idea to working product.
My work included:
- defining the product vision
- translating color psychology research into product features
- collaborating with researchers at Utah State University
- expanding the perception dataset through large-scale studies
- shaping the AI system behavior and outputs
- leading the product design and UX
- developing the product name and brand identity

02
The Problem
Have you ever tried defending a color choice when your only reason was "it just felt right"?
We all know color has emotional power. Certain shades of green are even used in prisons to create a sense of calm. Yet as designers, we often rely on personal taste alone, which makes our decisions hard to justify.
Perception turns color psychology into a practical AI tool, giving you a solid "why" behind every palette you create, based on data and the emotional intent you want for your brand and audience.

03
The Gap
Most design tools treat color like a formula, pulling palettes from rigid theory or from images and assuming, wrongly, that those colors will evoke the same emotional response.
But color is pure feeling, and this approach misses the emotional connection. Search for colors that feel “precise” and you'll likely get a palette sampled from a dartboard image: results that don't reflect color psychology and are totally disconnected from the feeling we're after.


04
Research Beginnings
I’ve always been fascinated by Kobayashi’s Color Image Scale research (1987-1992), a system originally developed for the fashion industry. It analyzed 130 base colors and over 1,000 color combinations, organizing colors along two simple axes:
Warm ↔ Cool and Soft ↔ Hard.
These axes create eight emotional groupings like “Elegant” or “Natural.” The system was considered "fringe" in design circles and, surprisingly, was never developed into a proper digital tool.
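To make the two-axis idea concrete, here is a minimal sketch of mapping a point in a Warm ↔ Cool / Soft ↔ Hard plane onto one of eight angular sectors. The coordinates, sector boundaries, and group names below are illustrative placeholders, not Kobayashi's published assignments.

```python
# Illustrative sketch of a two-axis emotional color space, loosely modeled on
# the Color Image Scale. Group names and their placement around the plane are
# hypothetical stand-ins, not the published mappings.
import math

# Eight groupings placed counterclockwise, starting from cool-and-soft.
GROUPS = ["Elegant", "Natural", "Pretty", "Casual",
          "Dynamic", "Gorgeous", "Modern", "Chic"]

def emotional_group(warm_cool: float, soft_hard: float) -> str:
    """Map a point in the two-axis space to one of eight 45-degree sectors.

    warm_cool: -1.0 (cool) .. +1.0 (warm)
    soft_hard: -1.0 (soft) .. +1.0 (hard)
    """
    angle = math.atan2(soft_hard, warm_cool)            # range -pi .. pi
    sector = int(((angle + math.pi) / (2 * math.pi)) * 8) % 8
    return GROUPS[sector]

print(emotional_group(0.8, -0.5))   # warm and soft -> "Casual"
print(emotional_group(-0.5, 0.8))   # cool and hard -> "Modern"
```

With this layout, a warm, soft point lands in a "Casual" sector and a cool, hard point in a "Modern" one, echoing the brand examples later in this case study.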
05
Applying the Model
- I had been applying the Color Image Scale across design contexts for years, from brand experiences to digital app interfaces.
- Aligning color choices with emotional intent consistently improved how products communicated tone, story, and brand personality.

Four examples shown:
- McDonald's = "Casual/Enjoyable"
- Nike = "Dynamic/Active"
- Whole Foods = "Natural/Fresh"
- Mercedes-Benz = "Modern/Precise"


06
Building the System
- Perception was created to translate this framework into a computational model.
- We conducted large-scale perception studies (shown) with a colleague at Utah State University, using Amazon Mechanical Turk (MTurk) to expand the dataset connecting color and emotional language.
- Machine learning models were then used to compare language vectors with palette vectors, allowing the system to generate palettes aligned with a given emotional prompt.
- The platform could also analyze imagery, extracting representative palettes based on meaningful color relationships rather than simple pixel sampling.
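The language-vector/palette-vector comparison can be sketched as follows. The embeddings here are hand-made toy values, and cosine-similarity ranking is one common choice for this kind of comparison, not necessarily Perception's exact method.

```python
# Toy sketch: rank palettes against an emotional prompt by comparing vectors
# with cosine similarity. The 3-d vectors below are hand-made stand-ins; a
# real system would learn them from perception-study data.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Hypothetical 3-d "emotion space": (warmth, softness, brightness).
WORD_VECTORS = {
    "calm":    (-0.4, 0.8, 0.3),
    "dynamic": (0.7, -0.6, 0.9),
}
PALETTE_VECTORS = {
    "sage-and-cream": (-0.3, 0.9, 0.4),
    "red-and-black":  (0.8, -0.7, 0.5),
}

def rank_palettes(word):
    """Return (palette, score) pairs sorted by similarity to the word."""
    wv = WORD_VECTORS[word]
    scored = [(name, cosine(wv, pv)) for name, pv in PALETTE_VECTORS.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)

print(rank_palettes("calm")[0][0])   # the sage palette ranks first
```

The same scoring also runs in reverse: given an existing palette, scoring it against every word vector produces the ranked emotional qualities the product surfaces.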
07
The Product
- Perception turns this research into a simple design workflow.
- Designers can start with a word, image, or palette.
- The system generates color combinations and ranks the emotional qualities they express.
- Instead of guessing, designers can begin with palettes grounded in perception data.
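For the image input in the workflow above, one generic way to extract a representative palette, grouping pixels by color relationships rather than sampling them individually, is k-means clustering in RGB space. This is a sketch of that common technique, not Perception's patented extraction method; the `kmeans_palette` helper and its fake pixel data are hypothetical.

```python
# Minimal k-means sketch: cluster an image's (r, g, b) pixels and return the
# cluster centers as a representative palette. Generic technique, not the
# product's actual method.
import random

def kmeans_palette(pixels, k=3, iters=20, seed=0):
    """Cluster (r, g, b) tuples and return k cluster-center colors."""
    rng = random.Random(seed)
    centers = rng.sample(sorted(set(pixels)), k)  # k distinct starting colors
    for _ in range(iters):
        buckets = [[] for _ in range(k)]
        for p in pixels:
            # Assign each pixel to its nearest center (squared distance).
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            buckets[i].append(p)
        for i, bucket in enumerate(buckets):
            if bucket:  # Move each center to the mean of its assigned pixels.
                centers[i] = tuple(sum(ch) / len(bucket) for ch in zip(*bucket))
    return [tuple(round(ch) for ch in c) for c in centers]

# Fake "image": mostly green, with some red and off-white pixels.
pixels = ([(30, 160, 60)] * 50 + [(200, 40, 40)] * 30 + [(245, 245, 240)] * 20)
print(kmeans_palette(pixels, k=3))  # three centers near green, red, off-white
```

Clustering pulls out the dominant color regions, which is closer to "meaningful color relationships" than grabbing individual pixels at random.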

08
Patented Technology
The system behind Perception introduced a new way to translate emotional language into color palettes using machine learning.
The technology analyzes relationships between colors and emotional descriptors, then ranks how strongly a palette expresses specific qualities such as calm, modern, or refined.
Three patent applications were filed to protect the core methods used to generate, analyze, and rank emotion-driven color palettes.
Patent applications (pending):
US 63/385,175
US 63/385,176
US 63/385,180
09
Outcomes
- Perception transformed decades of color psychology research into a generative AI design tool.
- The project resulted in three published U.S. patent applications (pending) covering systems for translating language, imagery, and color inputs into palette generation and analysis.
- More importantly, it explores how AI can support creative decision-making by grounding design choices in human perception data.

30s & 90s Video Spots

