EBA Reader
EBA Reader is a guided read-along experience inside Exam Bytes Academy that helps children read for longer by making reading feel supported rather than lonely, and by adding lightweight comprehension prompts directly into the reading flow.
Context
Exam Bytes Academy is an 11+ learning ecosystem, but literacy quietly determines whether everything else works. If a child avoids reading, comprehension and speed improve more slowly, confidence drops, and other subjects become harder because the instructions and language are a constant tax.
The human problem shows up at home. Parents end up feeling like they are constantly pushing rather than supporting. That dynamic creates tension and makes reading feel like a battleground instead of a habit.
The product goal was to reduce the loneliness of reading and keep the child moving through text without turning the experience into another test.
Discovery
This started from observation, not a roadmap. I kept seeing children reject books, and the excuses were repetitive: "I do not have time", "it is boring", "I do not want to". The important detail was that the rejection was not absolute. When someone read with them, or when reading became more interactive in class, they were suddenly willing. They liked stories. They did not like struggling through text alone.
I ran a small test that was easy to commit to: five minutes a day. I made it interactive in class and they did it consistently. That confirmed the real issue. It was not that they did not like reading. It was that they did not like reading when they felt unsupported.
The best signal came next. Children gave specific friction points in plain language: "Sometimes we do not know what words mean. Sometimes we do not know how to pronounce things." That feedback was a product map. It pointed directly to the moments where the experience breaks.
What I decided to build (and why)
The solution was to make reading feel like someone is with you, without making it feel like you are being tested. I moved the experience toward a read-along format that combines guidance, pacing, and lightweight interaction.
- Guided read-along sessions: a simple flow children can follow without getting stuck, closer to a podcast or video than a worksheet.
- A scalable media pipeline: an automated Python pipeline that turns text into audio using an open-source TTS engine, muxes to MP4, and produces content fast enough to iterate without manual editing for every piece.
- Comprehension inside the story: prompts embedded directly into the reading flow so the session pauses to ask a lightweight question, then returns later with an explanation. This keeps comprehension in-context instead of turning it into a separate quiz.
- Low-friction progress: basic progress tracking and parent visibility so routines can be supported without micromanagement.
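As a sketch of what the text-to-audio-to-MP4 step could look like: the specific tool choices below (espeak-ng as the TTS engine, ffmpeg for muxing, a `background.png` still frame) are illustrative assumptions, not necessarily what the real pipeline uses.

```python
import subprocess
from pathlib import Path

def build_pipeline_commands(text_file: Path, out_dir: Path) -> list[list[str]]:
    """Build the shell commands for one text -> audio -> MP4 run.

    Tool choices (espeak-ng for TTS, ffmpeg for muxing) are illustrative;
    any open-source TTS engine with a CLI would slot in the same way.
    """
    stem = text_file.stem
    wav = out_dir / f"{stem}.wav"
    mp4 = out_dir / f"{stem}.mp4"
    return [
        # 1. Synthesise narration from the story text.
        ["espeak-ng", "-f", str(text_file), "-w", str(wav)],
        # 2. Mux the narration with a looping still image into an MP4.
        ["ffmpeg", "-y", "-loop", "1", "-i", "background.png",
         "-i", str(wav), "-shortest", "-c:v", "libx264",
         "-c:a", "aac", str(mp4)],
    ]

def run_pipeline(text_file: Path, out_dir: Path) -> None:
    """Execute each stage, failing fast if a tool errors."""
    out_dir.mkdir(parents=True, exist_ok=True)
    for cmd in build_pipeline_commands(text_file, out_dir):
        subprocess.run(cmd, check=True)
```

Keeping command construction separate from execution makes each stage inspectable and easy to swap, which is what lets the pipeline iterate without manual editing per piece.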
The guiding principle was to reduce friction at the exact moment it appears. If the child feels supported, they stay in the story. If they stay in the story, the habit gets a chance to form.
Rejected alternatives
- Assign reading and expect consistency: cheap, but it failed the support requirement and repeated the same home tension.
- Static content pages with a progress bar: simple, but it does not change the experience of struggling alone in the moment.
- A separate comprehension quiz after reading: measurable, but it adds friction and makes reading feel like a test.
- Buying a third-party read-along platform: faster to deliver, but it reduces integration and control over the loop design and measurement.
What shipped
- A read-along session flow that children could follow without feeling stuck
- A media pipeline that generated audio and video content fast enough to ship and iterate regularly
- Embedded comprehension prompts with later explanations to keep understanding inside the story flow
- Basic progress tracking and parent visibility to support routines with less chasing
Analytics: Events
- reader_session_started: Tracks intent and entry. This shows whether children are choosing to begin a session and which content is pulling them in.
- reader_session_completed: Tracks whether the read-along loop holds. This is the clearest indicator that the experience feels supportive rather than exhausting.
- reader_session_abandoned: Tracks where the session breaks. This helps distinguish content boredom from friction, pacing issues, or comprehension overload.
- reader_prompt_shown: Tracks how often comprehension support appears. This helps calibrate prompt frequency so the story is not interrupted too often.
- reader_prompt_answered: Tracks engagement with comprehension. This can indicate whether prompts are at the right level and whether interaction increases abandonment or improves completion.
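A minimal event shape for this taxonomy might look like the sketch below. The `session_id` field and the contents of `properties` (content id, position in the story, and so on) are assumptions about what a real schema would carry, not the production definition.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# The five event names from the measurement plan above.
EVENTS = {
    "reader_session_started",
    "reader_session_completed",
    "reader_session_abandoned",
    "reader_prompt_shown",
    "reader_prompt_answered",
}

@dataclass
class ReaderEvent:
    name: str
    session_id: str
    # Properties such as content_id or position_seconds are assumptions
    # about what the real schema carries.
    properties: dict = field(default_factory=dict)
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    def __post_init__(self) -> None:
        # Reject names outside the agreed taxonomy so dashboards
        # never have to reconcile near-duplicate event strings.
        if self.name not in EVENTS:
            raise ValueError(f"unknown event: {self.name}")
```

Validating event names at creation time keeps the taxonomy closed, so every downstream rate calculation can trust the five names above.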
How I used these signals
- Start to completion rate by content to identify which stories and formats sustain attention.
- Abandon timing to tune pacing, prompt placement, and session length.
- Prompt answered rate to validate that prompts feel lightweight and supportive rather than test-like.
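The three signals above can be derived from a flat event log. This is a sketch: representing each event as a dict with a `"name"` key is an assumption for illustration, not the production pipeline.

```python
from collections import Counter

def reader_metrics(events: list[dict]) -> dict:
    """Compute completion, abandonment, and prompt-answer rates
    from a flat log of events, each a dict with a "name" key."""
    counts = Counter(e["name"] for e in events)
    started = counts["reader_session_started"]
    shown = counts["reader_prompt_shown"]
    return {
        # Share of started sessions that reach the end of the story.
        "completion_rate": (
            counts["reader_session_completed"] / started if started else 0.0
        ),
        # Share of started sessions that break off partway.
        "abandon_rate": (
            counts["reader_session_abandoned"] / started if started else 0.0
        ),
        # Share of shown prompts that children actually engage with.
        "prompt_answer_rate": (
            counts["reader_prompt_answered"] / shown if shown else 0.0
        ),
    }
```

Segmenting the same calculation by content id or by abandon timestamp would give the per-story and pacing views described above.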
AI usage
AI sped up documentation and iteration. It helped draft the PRD and user stories, generate comprehension prompt ideas at the right reading level, and sanity-check measurement plans so I could focus time on the flow and content quality.
Next steps
- Vocabulary support such as tap-to-define for unfamiliar words
- Pronunciation support for tricky words and names
- Difficulty adaptation so prompts and pacing match the child’s level
- Richer parent and tutor views that translate reading behaviour into simple, actionable signals