While colorblindness is one of the most widely known visual impairments, there are few resources that help colorblind people in daily life. For example, my boyfriend is colorblind, and he often texts me images of things, such as a shirt he’s thinking about wearing or an item he’s considering buying, to ask what color they are. This web app lets users point their camera at an object and find out what color it is.
There are many products that help address larger-scale issues that come with disabilities, but few that focus on small-scale, everyday problems. However, these small-scale products often have a large impact because of how frequently they are used, and they are often easier to create. In this project, I aim to explore small-scale solutions to everyday problems that people with visual impairment face. In my first iteration of the app, I used p5.js. To identify colors, I used an API that names colors based on their RGB values. In this first prototype, the screen shows the live camera feed. When the user clicks a point on the screen, the name of that pixel’s color appears.
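The core of this first prototype is just reading one pixel out of the camera frame at the clicked coordinates. The prototype itself was written in p5.js, but the idea can be sketched in Python with numpy; `pixel_color` and the synthetic frame here are illustrative names, not the app’s actual code:

```python
import numpy as np

def pixel_color(frame: np.ndarray, x: int, y: int) -> tuple:
    """Return the (R, G, B) value of a single pixel.

    `frame` is assumed to be an H x W x 3 array in RGB order,
    with x as the clicked column and y as the clicked row.
    """
    r, g, b = frame[y, x]
    return int(r), int(g), int(b)

# Tiny synthetic "frame": a 2x2 black image with one red pixel at (x=1, y=0)
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[0, 1] = (255, 0, 0)
print(pixel_color(frame, 1, 0))  # (255, 0, 0)
```

The sampled RGB triple is then what gets sent to the color-naming API.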
For my second prototype, after realizing that I wanted simpler color names, I decided to look into the most distinct colors humans can perceive. After finding a site that lists the most distinct colors for accessibility purposes, I used the RGB values of those colors to name colors in my app.
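Naming a sampled color against a small fixed palette amounts to a nearest-neighbor lookup: pick the palette entry with the smallest Euclidean distance in RGB space. A minimal sketch, assuming a placeholder palette (the values below are illustrative, not the accessibility list the app actually uses):

```python
import math

# Placeholder palette of visually distinct colors; the real app uses
# RGB values taken from an accessibility-focused list of distinct colors.
PALETTE = {
    "red":    (230, 25, 75),
    "green":  (60, 180, 75),
    "blue":   (0, 130, 200),
    "yellow": (255, 225, 25),
    "black":  (0, 0, 0),
    "white":  (255, 255, 255),
}

def nearest_color_name(rgb):
    """Name a color by its closest palette entry (Euclidean distance in RGB)."""
    return min(PALETTE, key=lambda name: math.dist(PALETTE[name], rgb))

print(nearest_color_name((250, 230, 30)))  # "yellow" with this palette
```

Because the palette is small and hand-picked, every sampled pixel maps to a simple, familiar name rather than an obscure one.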
I next worked to implement motion detection using Python and OpenCV. I first recreated the initial app by allowing users to click the screen to get a color name. This recreation used the average RGB value of a group of pixels around the touch point rather than a single pixel, which proved to be more accurate. This group of pixels is indicated by a yellow circle that appears when tapping the screen. I then worked to implement image tracking. I used motion detection in conjunction with visual recognition so the camera could estimate both the direction and the magnitude of motion. I used this motion estimate to move the yellow circle that marks the user’s touch point.
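Two ideas from this step can be sketched in plain numpy. First, averaging the pixels inside the yellow circle instead of reading one pixel. Second, a cheap motion estimate: the app combines motion detection with visual recognition, but frame differencing alone already gives a direction, since pixels that brightened are roughly where the object arrived and pixels that darkened are where it left. Both function names and parameters here are illustrative assumptions, not the app’s actual code:

```python
import numpy as np

def average_color(frame, x, y, radius=5):
    """Average RGB inside a circular region centered on the touch point."""
    h, w = frame.shape[:2]
    ys, xs = np.ogrid[:h, :w]
    mask = (xs - x) ** 2 + (ys - y) ** 2 <= radius ** 2
    return tuple(int(v) for v in frame[mask].mean(axis=0))

def motion_shift(prev_gray, curr_gray, threshold=25):
    """Estimate (dx, dy) from frame differencing: the shift between the
    centroid of pixels that brightened and the centroid of pixels that
    darkened. A simple stand-in for the app's motion detection."""
    diff = curr_gray.astype(int) - prev_gray.astype(int)
    arrived = diff > threshold    # object moved into these pixels
    left = diff < -threshold      # object moved out of these pixels
    if not arrived.any() or not left.any():
        return 0.0, 0.0
    ys_a, xs_a = np.nonzero(arrived)
    ys_l, xs_l = np.nonzero(left)
    return xs_a.mean() - xs_l.mean(), ys_a.mean() - ys_l.mean()

# Uniform frame: the circular average matches the single-pixel color exactly
frame = np.full((20, 20, 3), (10, 20, 30), dtype=np.uint8)
print(average_color(frame, 10, 10, radius=3))   # (10, 20, 30)

# A bright 3x3 square moves 3 pixels to the right between frames
prev = np.zeros((20, 20), dtype=np.uint8)
curr = np.zeros((20, 20), dtype=np.uint8)
prev[5:8, 5:8] = 200
curr[5:8, 8:11] = 200
print(motion_shift(prev, curr))                 # (3.0, 0.0)
```

The estimated shift is what would nudge the yellow marker circle so it stays on the object the user tapped.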
As I worked on the presentation and appearance of the app, I wanted to keep visual accessibility as the focus, so I worked to ensure that the user would have options to set contrast as well as text size. I decided to have the web app open to a screen showing the instructions and accessibility options. After exiting this menu, the web app shows the camera feed along with the color names. Because some users may want more specific color names, I added the options of “generalized name” and “specific name.” To aid visual accessibility, I also added the option of a zoomed-in preview, shown in the corner of the screen, of the colors being sampled.
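The zoomed-in preview amounts to cropping a small patch around the sample point and upscaling it without blurring, so each sampled pixel stays a crisp block of solid color. A minimal sketch, assuming a numpy frame; `zoom_preview` and its parameters are illustrative, not the app’s actual code:

```python
import numpy as np

def zoom_preview(frame, x, y, half=8, scale=4):
    """Crop a (2*half)-pixel square around the sample point and upscale it
    with nearest-neighbor interpolation for display as a corner preview."""
    h, w = frame.shape[:2]
    y0, y1 = max(0, y - half), min(h, y + half)
    x0, x1 = max(0, x - half), min(w, x + half)
    patch = frame[y0:y1, x0:x1]
    # np.kron repeats each pixel into a scale x scale block (no smoothing)
    return np.kron(patch, np.ones((scale, scale, 1), dtype=patch.dtype))

frame = np.zeros((32, 32, 3), dtype=np.uint8)
frame[16, 16] = (200, 50, 50)
preview = zoom_preview(frame, 16, 16)
print(preview.shape)  # (64, 64, 3)
```

Nearest-neighbor upscaling is a deliberate choice here: a smoothed zoom would blend neighboring colors together, which is exactly what a user checking a color wants to avoid.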