The Good Tech Companies - Why EdTech Fails Neurodiverse Students and How Shafaq Bajwa Aims to Fix It
Episode Date: December 8, 2025. This story was originally published on HackerNoon at: https://hackernoon.com/why-edtech-fails-neurodiverse-students-and-how-shafaq-bajwa-aims-to-fix-it. Shafaq Bajwa reveals why EdTech often fails neurodiverse learners and how human-centered, empathetic design can redefine meaningful educational tools. Check more stories related to programming at: https://hackernoon.com/c/programming. You can also check exclusive content about #human-centered-design, #neurodiversity-in-edtech, #neurodiversity, #classroom-empathy, #edtech-scalability, #special-needs-technology, #universal-design-for-learning, #good-company, and more. This story was written by: @jonstojanjournalist. Learn more about this writer by checking @jonstojanjournalist's about page, and for more stories, please visit hackernoon.com. Shafaq Bajwa, a data scientist turned special needs classroom assistant, exposes the gap between scalable EdTech and the real needs of neurodiverse learners. Her experience shows that independence, data, and efficiency models often clash with human learning. She argues for patient, empathetic, context-rich tools that support emotional safety, productive struggle, and individualized growth.
Transcript
This audio is presented by Hacker Noon, where anyone can learn anything about any technology.
Why EdTech fails neurodiverse students and how Shafaq Bajwa aims to fix it.
By Jon Stojan, journalist. The educational technology sector is expanding rapidly,
with a growing focus on creating tools for neurodiverse learners.
Yet, a fundamental disconnect often exists between the products developed in sterile engineering
environments and the dynamic, complex needs of students in the classroom.
Shafaq Bajwa, a data scientist and software engineer, experienced this gap firsthand after founding
Techno Stars, an interactive educational platform, and now works on the front lines as a learning
support assistant at Moorcroft School in London. With a background that spans software
development at Dubai Islamic Bank to earning an MSc in data science, Bajwa's journey from building
technology to implementing it in a special needs classroom offers a critical perspective. Her experience
reveals how the tech industry's core values, scalability, efficiency, and data-driven independence
can clash with the principles of inclusive education, which demand patience, collaboration,
and a deep understanding of human context. The myth of independence. A central goal in technology design
is to foster user autonomy. However, this focus on individual achievement can misunderstand
the collaborative reality of learning for many neurodiverse students, for whom progress is often
built on supported interaction and trust, a core tenet of the neurodiversity paradigm. Bajwa observed this
directly in her work. In the tech industry, products often aim to make users completely independent,
but this doesn't always work for many of her students. At Moorcroft, I've learned that real
learning happens through trust, guidance, and collaboration, she explains. The assumption that independence
is the ultimate goal overlooks the power of connection. Frameworks like universal design for learning,
UDL, emphasize creating multiple pathways for engagement, which often involve structured support
rather than isolation. For many learners, real growth stems from shared experiences. Bajwa notes,
progress comes from shared experiences and relationships, not isolation. True independence grows
through connection and support, not by removing it. This insight suggests that EdTech should
function less as a replacement for human interaction and more as a facilitator of it,
fostering relationships that build confidence and security.
The tension of scalability. The economic model of venture-backed technology,
which powered Bajwa's own startup in the Plan9 incubator, prizes rapid growth and scalability.
This approach is often fundamentally at odds with the patient, resource-intensive,
and highly personalized needs of neurodiverse learners.
While the impact investing market has grown to $715 billion,
the pressure for quick returns remains a powerful force.
Venture-backed tech models focus on rapid growth and scalability, aiming to reach large markets
quickly and show fast returns. However, working with neurodiverse students has shown me that
meaningful impact often requires the opposite approach, time, patience, and deep personalization,
Bajwa states. This creates a tension, as inclusive education demands empathy and adaptation,
a philosophy echoed in the concept of philanthropic patient capital. The challenge is to reconcile
market demands with human-centered design. The needs of these learners can't be standardized or
rushed, and progress is measured in individual milestones, not user metrics. This creates a tension.
Scalable products prioritize efficiency, while inclusive education demands empathy and adaptability,
Bajwa concludes. This reflects a broader shift toward strategies like disability lens investing,
which prioritizes inclusion alongside financial goals. Data's missing context. Data science excels at
identifying what is happening, but often struggles to explain why. In a classroom environment,
this limitation can lead to flawed conclusions about student behavior without the essential context
provided by human observation. This gap has led to the development of mixed-method approaches like
big-thick blending, which combines quantitative data with qualitative insights. Bajwa recounts a scenario
where data alone would have been misleading. In one classroom, a student's data might have shown frequent
disengagement, leaving tasks incomplete and avoiding eye contact, which could easily be logged
as a lack of focus or motivation. But being there in person, I noticed subtle cues. It wasn't
disinterest, it was sensory overload, she says. This experience underscores the need for empathy and
data interpretation, a key theme in human computer interaction, HCI, research, which increasingly
uses mixed methods to contextualize user behavior. By adjusting the environment, engagement improved
dramatically. Bajwa reflects, data alone would have missed that context. My presence made it clear
that the why behind the behavior was emotional and sensory, not behavioral. This experience
reminded me that data needs empathy and observation to tell the full story. Redefining successful
outcomes. Software development operates in sprints measured in weeks, an agile framework that
prioritizes rapid iteration and immediate results. This timeline is starkly different from the pace of
human development, where a significant breakthrough can take months of quiet, consistent effort.
This disparity exposes a fundamental flaw in the tech world's obsession with instant gratification.
Working in education has completely changed how I view progress. In software development,
success is often measured by quick deliverables and visible results within short sprints,
Bajwa explains. Her classroom experience has reshaped this perception entirely. She notes that a
student's growth can take months of quiet effort, a process better valued by metrics like social
return on investment, SROI, which assesses long-term quality of life improvements. This radical
shift in perspective highlights the importance of patience. This experience has taught me that
real success isn't always immediate or measurable, and that patience, consistency, and trust are just
as valuable as speed, Bajwa says. It is a lesson in valuing slow, meaningful progress,
a process that requires tools designed to support productive persistence over time.
The invisible user experience. A developer coding an application
sees a logical user journey, but in a classroom, that journey is filtered through a complex
layer of emotions, moods, and contextual challenges.
This emotional landscape is often invisible during the development process, yet it is the
most critical part of the user experience.
This gap highlights the value of methodologies that prioritize lived experience.
The most critical part that's invisible to a developer is the emotional and contextual layer of use.
When you're coding, you imagine a logical, consistent user journey, but in reality, every student brings unique moods, needs, and challenges that can change minute to minute, Bajwa says.
This aligns with principles from feminist HCI, which uses personal narratives to uncover biases in design, and is supported by research into universally accessible instructional models.
This realization redefined usability for Bajwa.
I've seen how a child's frustration, excitement, or confusion can completely reshape how they interact
with technology.
It made me realize that usability isn't just about clean interfaces or smooth performance.
It's about emotional safety, flexibility, and human connection, she reflects.
To truly understand the user experience, developers must connect with the deeply human behaviors
that shape it.
The value of productive struggle. In engineering,
a failed process or point of friction is something to be eliminated.
The tech industry's drive for seamless, frictionless user journeys reflects this ethos.
However, in learning, moments of struggle are often where deep understanding and resilience are
forged. By designing away all challenges, technology risks creating passive users instead of active
learners. The tech industry often designs for smooth, flawless user journeys, minimizing friction
so users never get stuck. But in learning, those moments of struggle are where real understanding
takes root, Bajwa notes. A key risk in AI-driven education is that students may develop
learned helplessness if systems intervene too quickly, preventing them from engaging in productive
struggle. In the classroom, Bajwa has seen how guided challenges build confidence, a concept often
misaligned with how intelligent tutoring systems detect unproductive struggle. Tech's obsession with
seamless success can unintentionally erase these growth opportunities. To truly support deep learning,
we need tools that make space for safe struggle, reflection, and recovery, not just instant results,
she asserts. This means designing for resilience, not just efficiency. The limits of digital
sensors. A programmer's world is mediated through a screen, but a special needs classroom is
a rich, multi-sensory environment. The most critical data points are often non-digital cues,
a shift in tone, a tensing of muscles, a change in breathing, that current technology cannot
fully capture. While research explores real-time physiological measures, studies often lack
neurodivergent participants, a significant gap highlighted in reviews of cognitive load research.
The most critical missing ingredient is the sensory and emotional context. At Moorcroft,
I rely on things no sensor or algorithm can yet fully capture.
A change in tone of voice, body tension or the way a student's breathing shifts when they're
overwhelmed or engaged, Bajwa emphasizes.
Emerging systems using Bayesian immediate feedback learning, BIFL, attempt to adapt to physiological
signals, but they are still in early stages.
This sensory layer is vital for creating truly adaptive tools.
These signals tell me far more than a data dashboard ever could.
They show the emotional state behind the action, not just the action itself, Bajwa says.
The future of adaptive technology depends on its ability to interpret these human layers,
not to replace empathy, but to inform a more responsive design.
Beyond the engineering mindset. An engineering mindset is trained to systematize, streamline,
and optimize for efficiency.
While invaluable in many contexts, this approach can conflict with the messy, nonlinear,
and deeply personal nature of human learning.
A moment of hesitation or exploration, seen as inefficient by an algorithm, may be a crucial
part of a student's cognitive process. Bajwa recalls a sequencing game where her instinct was to
remove unnecessary steps. My engineering instinct was to streamline the interface and optimize the
process for efficiency. But in that moment, the student needed time to explore, make mistakes,
and process each step at their own pace. Rushing or simplifying the task would have robbed them of a
valuable learning experience, she remembers. That conflict taught me that the engineering mindset of
efficiency and predictability has limits. Human learning is messy, nonlinear, and deeply personal,
Bajwa reflects. Success isn't always about optimization. Sometimes it's about patience, presence,
and honoring the learner's rhythm. This lesson highlights the need for a more holistic approach
to ed tech development, one that balances technical rigor with profound human empathy.
Bajwa's journey from a technology founder to a classroom aide underscores a critical message for the EdTech industry.
Building truly effective tools for neurodiverse learners requires moving beyond metrics of speed and scale.
It demands a new paradigm grounded in patience, direct observation, and a deep appreciation for the complex, individual paths that define human learning.
Thank you for listening to this Hackernoon story, read by artificial intelligence.
Visit Hackernoon.com to read, write, learn and publish.
