Designing Learning with Humans in Mind
- kimread
- Jun 24
Updated: Sep 23
In a world where so much learning happens on digital platforms, one question is essential: How do we ensure our online learning environments are welcoming, intuitive, and effective for everyone, especially those who have been excluded by design in the past?
One example of our work in this area is a usability project that tested a learning management system (LMS) with Education Support Professionals (ESPs), a group often underserved in traditional K-12 professional development settings. What we learned through the process informed both the design of the LMS and our course design for ESPs. Through simple usability testing, we helped ensure that the technology was truly a bridge to learning rather than an obstacle.

Why Usability Testing?
As UX expert Leah Buley puts it, “UX is a force for good.” When we apply user experience principles to learning, we reduce the friction between what learners are trying to do—grow their skills, connect with others, put their learning into practice—and the technology that helps them do it.
Usability testing doesn’t require a big budget or a complex research team. At its core, it’s about inviting real people to use your site or platform while you observe, listen, and learn. You find out very quickly what’s working, what’s frustrating, and what’s unintentionally excluding learners. It can often lead to simple fixes with big impact.
Inclusive Design Starts with Listening
Our approach was grounded in the principles of design justice and inclusive design. These frameworks ask: Who isn’t here? Who feels uninvited? Whose voice is missing?
We then developed a usability test protocol that included scripted prompts and opportunities for participants to "think aloud" as they navigated the LMS. We watched how participants responded to the visual layout, how they made navigation choices, how they searched for help, what confused them, and what delighted them.
And then we listened.

What We Learned, and Changed
Some of the most valuable insights came from watching users try to complete tasks that we thought were simple. For example:
- Learners didn’t always recognize icons or know what the top navigation bar was for, so we added a demo video to the course welcome email that explained those features.
- Some couldn’t find help easily, so we added clearer links and embedded support in more places.
Importantly, we didn’t assume that every piece of feedback needed to be implemented literally. Instead, we looked for patterns and root causes, and used those to inform meaningful design improvements.
Humanizing Technology, One Test at a Time
If we want our learning platforms to serve all of our intended learners, we need to center learners and test our assumptions. Building thoughtful, well-run evaluation of technology into your learning plan offers a simple, powerful way to do just that.
It also reminds us that the “problem” is never the user. The problem is often the design. Do you have designs you want to improve? Let’s chat about how we can support you.